Google Look: turning the offline world into searchable spaces.

My Role

• Worked with the Google Daydream team as an individual student contributor in a sponsored class, delivering both UX prototypes and industrial design products.
• After the class, I built a team with friends, including a scriptwriter and a 3D architect, to revisit the project and push it to a higher level.
• Voice Flow Design, User Storyboards, AR Prototypes

Timeline:
Sep-Dec 2018 (2020 version)

Team:
Eileen Deng, Xiangtai Sun

Result: Use Case Demo

What is Google Look?

Look is an AI-powered search tool, pairing an earphone product with an AR app, that saves your object search history as you simply look at things: you are essentially googling your surroundings with a smart assistant friend. It recognizes your environment by sharing your view through its camera. Customers who would rather not search on a phone or take pictures can get their results quickly by asking Google Look:

“Hey Look, which sauce is my favorite? Any recommendations?”

The final result was presented to the Daydream team, and the audience said they would buy this device if it were on the market.

Look and ask

Look assists you as you explore your environment: it searches and answers your questions in real time.

Smart Comparison

Users can simply ask their friend "Look" for personalized opinions based on their own preferences and behaviors, instead of sorting complex data out of piles of search results themselves.

Search to create

By talking with Look, users can create their own search results based on what they see and hold. For example: "What recipes can I make with the ingredients I have right now?"
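To make this concrete, here is a minimal Python sketch of how a search-to-create query could work: ingredients recognized in view are matched against a small recipe index and ranked by how well each recipe is covered. The recipe data and function names are hypothetical illustrations, not the product's actual implementation.

```python
# Minimal sketch: rank recipes by how many of their ingredients Look
# already sees in view. All recipes and names here are hypothetical.

RECIPES = {
    "scrambled sausages": {"sausage", "egg", "scallion"},
    "tomato pasta": {"pasta", "tomato", "garlic"},
    "fried rice": {"rice", "egg", "scallion", "soy sauce"},
}

def suggest_recipes(seen_ingredients):
    """Rank recipes by the share of ingredients already in view."""
    seen = set(seen_ingredients)
    scored = []
    for name, needed in RECIPES.items():
        coverage = len(needed & seen) / len(needed)
        scored.append((coverage, name, sorted(needed - seen)))
    return sorted(scored, reverse=True)  # best-covered recipes first

for coverage, name, missing in suggest_recipes({"egg", "sausage", "rice"}):
    print(f"{name}: {coverage:.0%} covered, missing {missing}")
```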

How to interact with Look?

You simply put on the I/O voice earpiece. Say you are looking at two pots: the earpiece answers your questions as you shop, as if a smart assistant were shopping alongside you.

You can ask the voice assistant questions while it learns your behaviors at the same time. Look helps you make decisions intelligently, going back through your previous search history to analyze it with you.
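As a rough illustration of that history-informed decision making, the sketch below picks between two recognized products by blending the user's purchase history with community ratings, echoing the "which sauce is my favorite?" question above. The data shape and the weights are assumptions made purely for illustration.

```python
# Hypothetical sketch: choose between two recognized products using the
# user's purchase history plus community ratings. Weights are invented.

from collections import Counter

purchase_history = ["sweet bbq sauce", "sweet bbq sauce", "smoky bbq sauce"]
community_rating = {"sweet bbq sauce": 4.6, "smoky bbq sauce": 4.1}

def score(product):
    # Blend personal familiarity with the crowd rating (weights assumed).
    familiarity = Counter(purchase_history)[product]
    return 0.6 * familiarity + 0.4 * community_rating.get(product, 0.0)

def recommend(left, right):
    return left if score(left) >= score(right) else right

print(recommend("sweet bbq sauce", "smoky bbq sauce"))  # -> sweet bbq sauce
```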

Talk to the Stakeholder - the Google Daydream team's needs

The Google Daydream team was interested in seamless future transitions between the real and digital worlds, to be achieved by uncovering new hybrid tools for creators by 2023. I decided to dive deeper into emerging technology, searching for and recording potential beyond current use cases.

Creating seamless transitions between the real and digital worlds with emerging technology

Who are the creators - I interviewed a group of people curious about making things

By interviewing experts in diverse fields who craft things, I came to understand how their backgrounds shape what they make with their hands. It was interesting to find that they all have one thing in common: before making a decision about researching, designing, or shopping, they want to see, feel, and touch things in real life. Searching online is not enough.

Searching 2D images is not enough; you want to see, feel, and touch things in reality before making a decision.

"Researcher is like a detective, we search from details of your your everyday life. We want to meet the user to get guided through in physical world instead of reading papers."
-ADP Researcher

"As a product designer, I wish I can search 3D objects without just going online. I want to see it and feel it before designing. "
-Lego Industrial Designer

Users struggle to get quick and meaningful responses

Summarizing the needs and goals of everyday creators who need contextual search in the 3D world, I identified these touchpoints from the research: people are unable to search in real time as their needs arise. I decided to break the findings down into three use cases.

1. I want to search and compare offline and online shopping quickly.
2. I want to discover new things to make by searching online.
3. I want to collect and share my search results more efficiently.

What if online search can be as natural as asking a close friend by your side?

Imagine you are shopping in the market with Look. You find two different BBQ sauces, and they both look great. You want to know which one is better and recommended. Look finds that the left one has better ratings from other people's reviews.

1. I shop freely with LOOK navigating.

2. LOOK compares the two BBQ sauces in my hand.

3. LOOK recommends the left one by pointing at the bottle.

Next, imagine you are back home after grocery shopping. You look at the ingredients in your fridge and wonder what to make tonight. Look recognizes the context in front of you and guides you through the cooking journey.

1. I want to make dinner, so I ask LOOK.

2. LOOK sorts out recipes based on my daily preferences.

3. I decide to cook scrambled sausages tonight, and LOOK guides me step by step.

Imagine you see something you are interested in, but you cannot search another person's belongings directly. Since Look can only recognize objects, it asks the owner for access, and they can quickly share the information to your bookmarks.

1. I like my friend's clothes and wonder if she can share a link with me.

2. She allows LOOK to share her clothes' information with me.

3. I receive the information and save it to my phone.

Voice Flow - taking it to the next level

I also explored the AI's natural motion, how it can express emotions to users in a natural way, and how voice can be routed on the AI side. With the smart camera, our AI can be intelligent and have a unique personality.

Like how you talk to your friend.

A friend who will help you choose wisely.

A friend who will teach you new skills.

A friend who will share information with you.
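One simplified way to prototype this voice flow is an intent router: each utterance is mapped to an intent (compare, create, share), which selects a response path. The keyword matching below is a toy stand-in for a real natural-language model, and every intent name and reply is hypothetical.

```python
# Toy sketch of the voice flow: map an utterance to an intent, then to a
# response path. A real system would use an NLU model, not keywords.

INTENT_KEYWORDS = {
    "compare": ["which", "better", "recommend"],
    "create": ["recipe", "make", "cook"],
    "share": ["share", "link", "send"],
}

RESPONSES = {
    "compare": "Comparing the items in view...",
    "create": "Looking up what you can make...",
    "share": "Asking the owner for permission to share...",
    "general_search": "Searching what you are looking at...",
}

def route(utterance: str) -> str:
    text = utterance.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(word in text for word in keywords):
            return intent
    return "general_search"

print(RESPONSES[route("Hey Look, which sauce is my favorite?")])
```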

The Look app helps you save your results.

Remember when we were looking at the pots? The app can also act as an input. You can turn your phone into an album that saves your search results, since people often prefer visual data over voice data. The app transcribes all of your voice data into text and visual hints.
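As a rough sketch of that album idea, each voice exchange could be stored as a text transcript plus a visual hint (the camera frame captured at that moment), so the phone can display search results as a browsable album. The data model below, including every field name, is a hypothetical illustration.

```python
# Hypothetical data model for the app's album: each voice exchange is
# saved as text plus a visual hint so results can be browsed later.

from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class AlbumEntry:
    question: str       # transcribed user utterance
    answer: str         # transcribed assistant reply
    snapshot_path: str  # camera frame captured at that moment
    saved_at: datetime = field(default_factory=datetime.now)

album = []  # list of AlbumEntry

def save_exchange(question, answer, snapshot_path):
    album.append(AlbumEntry(question, answer, snapshot_path))

save_exchange("Which pot is better?", "The left one, based on reviews.",
              "frames/pots_001.jpg")
print(album[0].question, "->", album[0].answer)
```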

Scan & Buy
By simply scanning the objects in front of them, users can quickly grab online links for what they see, save them, and access them again later.

Smart Assistant
The Look app assists your search journey seamlessly, without the pause of looking things up online. Look curates your search results.

Save and Compare
Look helps you compare your search results when you ask the AI assistant in your Look earphone.

Results & Testing with Unity

New technology is always hard to test, so I worked through many online tutorials, learning IBM Watson and Vuforia for Unity to try voice-driven image recognition. Meanwhile, I brought the device to friends who had just had a baby at home; we tested it in a real environment, acting out the scenario and watching everyone's responses.
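The image recognition side ran inside Unity with Vuforia (in C#), but the speech side can be sketched with IBM Watson's Python SDK. This is a minimal sketch assuming the ibm-watson package, with placeholder credentials and audio; it is not the exact setup of the class prototype.

```python
# Minimal sketch of speech-to-text with the ibm-watson SDK
# (pip install ibm-watson). Credentials and file paths are placeholders.

from ibm_watson import SpeechToTextV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

authenticator = IAMAuthenticator("YOUR_API_KEY")   # placeholder credential
stt = SpeechToTextV1(authenticator=authenticator)
stt.set_service_url("YOUR_SERVICE_URL")            # placeholder instance URL

# Transcribe a recorded question, e.g. "which sauce is my favorite?"
with open("question.wav", "rb") as audio:
    result = stt.recognize(audio=audio, content_type="audio/wav").get_result()

transcript = result["results"][0]["alternatives"][0]["transcript"]
print(transcript)
```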

Google Look successfully responded to her questions naturally, with ideal answers.

Object Recognition

Object + Environment

Object + Environment

Reflection on process

Voice user interfaces allow the user to interact flexibly with a system through voice commands and multi-modal gestures, not limited to buttons.
But they are still limited by the technology and by privacy issues around where information can be accessed.
It is like an RPG game: you lead the story to its ending through your choices.
