Google Look is a voice assistant that searches objects in the user's surroundings.
The earphone and companion app turn physical space into a searchable world, letting users search 3D objects using AI technology.
Unity, Vuforia, Sketch, Principle, After Effects
Timeline: 14 Weeks,
September — December 2018
Camera and circular laser pointer for tracking and pointing at nearby objects
Bone-conduction earphone to hear the search results
AI is nearby yet always feels far away. It is like how my personal friendships are built:
I don't live near my close friends, so we chat online a lot and trust each other.
When I need help, they come to walk with me.
After experimenting with Unity in AR, I realized that screens are no longer limited to 2D space but are evolving into 3D objects. My market research showed that the digital and physical worlds are merging through AR technology.
Interaction on a 2D screen
Digital vs. physical screens
(Logitech 2D laser presentation tool)
Silent mode: press the AR ball on a physical object for 2 seconds
Needs: “I want to search what I can make.”
Solution: Google Look collects your personal data and helps you make smart choices by curating results, acting as a guidebook.
Needs: “I want to compare products in order to buy quickly.”
Solution: Google Look enables users to compare products and make decisions quickly in the store, bridging the online and offline shopping experiences.
Our relationship to the digital world has changed. No longer differentiating between our on and off screen experiences, social connections, or IRL/URL identities, we’ve crossed a threshold where the ‘real’ and the ‘virtual’ are almost indistinguishable.
As objects become searchable, I experimented with friends to test what information they want to learn when searching an object.
The machine looks up online documents such as Wikipedia, Google ads, and search history.
With the environment detected, the machine specifies the topic and keywords to search for the object.
Objects are compared in the real environment, and the AI narrows results further based on offline and online data.
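The three steps above can be sketched as a simple query-and-filter pipeline. This is only an illustrative sketch; all function names, fields, and data are hypothetical, not the actual Google Look implementation.

```python
# Hypothetical sketch of the search-narrowing pipeline described above.

def build_query(object_label, environment):
    """Step 2: the detected environment specifies the topic and keywords."""
    return f"{environment} {object_label}"

def narrow_results(results, environment):
    """Step 3: keep only results whose tags match the detected environment."""
    return [r for r in results if environment in r["tags"]]

# Example: a mug detected in a kitchen (mocked online lookup from step 1)
mock_results = [
    {"title": "Ceramic mug review", "tags": ["kitchen", "shopping"]},
    {"title": "Mug 3D model",       "tags": ["design"]},
]
query = build_query("mug", "kitchen")          # -> "kitchen mug"
filtered = narrow_results(mock_results, "kitchen")
print(query, len(filtered))                    # -> kitchen mug 1
```

The point of the sketch is that environment detection turns a generic object label into a context-specific query, so the assistant can answer with fewer, more relevant results.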
“We become what we behold. We shape our tools and then our tools shape us.”
- John Culkin, Digital Media Academic
Talking with creators made me rethink: what if 3D objects were searchable, or data were scattered around us?
A researcher's job is not just online search; it is also about emotional, relational, and observational understanding of users, such as visiting them at home.
3D designers find research extremely hard when they only have a phone to search keywords before designing 3D objects.
A housewife always has to search for recipes online before going shopping. She writes lists on the fridge or asks store workers about what she needs.
Shopping and comparing happen on a daily basis, but can also be triggered by inspiration from others: people use others' ratings as inspiration, and the desire to shop arises naturally. I created a persona to narrow down the user needs. Efficiency and portability are key.
Overall, the feedback on this project was very positive. The Google Daydream team lead said, "Your design is doing subtraction, and that is your success: less is more. We really love it." Visitors came to my desk after the project was presented.
However, no project is ever truly finished.
I have several goals planned for further user testing.
ROI (net benefit / cost) of product development
Time it takes users to find the correct answer by talking to Look and then switching to another object
Low bounce rate and high customer task achievement = success
1. Shopping in a market with Look app prototypes
2. Counting successful shopping lists created within a set time
3. Voice-over tone testing by recording different scenes
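The metrics above can be made concrete with simple formulas. This is a hedged sketch of how I would compute them from test-session data; all function names and example numbers are invented for illustration.

```python
# Hypothetical helpers for the evaluation metrics listed above.

def roi(net_benefit, cost):
    """ROI of product development: net benefit divided by cost."""
    return net_benefit / cost

def task_success_rate(completed, attempted):
    """Share of shopping-list tasks completed within the test window."""
    return completed / attempted

def bounce_rate(bounced_sessions, total_sessions):
    """Share of sessions where the user gave up after one interaction."""
    return bounced_sessions / total_sessions

# Example session: 8 of 10 lists completed, 2 of 10 sessions bounced
print(task_success_rate(8, 10))  # -> 0.8
print(bounce_rate(2, 10))        # -> 0.2
```

A test run would count as a success when the bounce rate stays low and the task success rate stays high, matching the criterion stated above.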