Google Look

Timeline:
Sep-Dec 2018 (2020 version)

Role:
Product Designer

Team:
Scriptwriter, 3D Architect

Look is an AI-empowered voice-assistant earphone that turns the offline world into searchable spaces.
I worked with the Google Daydream team as an individual contributor for UX & industrial design in the sponsored class. After delivering to Google Daydream's industrial designers, I built a team with a scriptwriter and a 3D architect to push the final quality to a higher level.

Process Deck

- Google Daydream Team's Needs -

01. THE CHALLENGE

Creating seamless transitions between the real and digital worlds with emerging technology

The Google Daydream team was interested in creating seamless transitions between the real and digital worlds by uncovering new hybrid tools for creators by 2023. I decided to dive deeper into emerging technologies, exploring and recording their potential beyond current use cases.

- Final Design Solutions -

Curious about your surroundings? Now you can talk to a smart assistant!
An AI earphone that tracks the objects around you.

Look and ask

Look assists you as you explore your environment: it searches and answers your questions in real time.

Smart Comparison

Users can simply ask their friend "Look" for personalized opinions based on their preferences and behaviors, without having to sort complex data out of piles of search results.

Search to create

By talking with Look, users can create their own search results based on what they see and hold. For example: "What recipes can I make with the ingredients I have right now?"

- Research -

03. MARKET RESEARCH

Personalization & real-time responses are rising needs, but they are not yet part of search experiences
1. Mobile devices allow users to search and learn anywhere and anytime.

2. Phone cameras enable users to search based on what they see; users can track nearby objects and look them up on the phone, but this is not widely used yet.

3. Smart speakers are limited by the technology's ability to understand and maintain context, so you cannot have dynamic exchanges that start, stop, and pick up again later.

04. STIMULUS & BRAINSTORMING

Searching 2D images is not enough; you want to see, feel, and touch things in reality before making a decision.

"Researcher is like a detective, we search from details of your your everyday life. We want to meet the user to get guided through in physical world instead of reading papers."

"As a product designer, I wish I can search 3D objects without just going online. I want to see it and feel it before designing. "

05. IDEATIONS

Users struggle to get quick and meaningful responses

1. I want to search and compare offline and online shopping options quickly.
2. I want to discover how to make new things by searching online.
3. I want to collect and share my search results more efficiently.

06. STORYBOARDING

What if online search can be as natural as asking a close friend by your side?

Intuitive Searching

1. I shop freely while LOOK navigates for me.

2. LOOK compares the two BBQ sauces in my hand.

3. LOOK recommends the left one by pointing at the bottle.

Contextual Searching

1. I ask LOOK what I can make for dinner.

2. LOOK sorts out recipes based on my daily preferences.

3. I decide to cook scrambled sausages tonight, and LOOK guides me step by step.

Shareable Searching

1. I like my friend's clothes and wonder if she can share a link with me.

2. She allows LOOK to share her clothes' details with me.

3. I receive the information and save it to my phone.

07. VOICE FLOW

Based on the interview insights, I broke the experience down into three user scenarios and use cases that contextualize searching: to buy, to make, and to share (see the sketch after the sample utterances below).

“Hey Look, I want to know which one I should buy.”
“Hey Look, I want to know what I can make from what I have.”
“Hey Look, I want to share my own stuff with others seamlessly.”
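
The sketch below shows how these three kinds of utterances could be routed to their scenarios; the keyword-based matcher and every name in it are illustrative assumptions, not the actual assistant logic.

```python
# Hypothetical routing of Look's three voice-flow scenarios: buy, make, share.
# Keyword matching is a stand-in for real intent recognition.

INTENT_KEYWORDS = {
    "buy": ["buy", "which one", "compare", "price"],
    "make": ["make", "recipe", "cook"],
    "share": ["share", "send", "link"],
}

def route_intent(utterance: str) -> str:
    """Return 'buy', 'make', 'share', or 'unknown' for a voice query."""
    text = utterance.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return intent
    return "unknown"

if __name__ == "__main__":
    for query in [
        "Hey Look, I want to know which one I should buy.",
        "Hey Look, I want to know what I can make from what I have.",
        "Hey Look, I want to share my own stuff with others seamlessly.",
    ]:
        print(route_intent(query), "->", query)
```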

Scan & Buy

By simply scanning the objects in front of them, users can quickly pull up online links based on what they see, then save those links and re-access them later.

Smart Assistant

The Look app assists your search journey seamlessly, without pausing to look things up online. Look curates your search results.

Save and Compare

Look helps you compare your search results: just ask the AI assistant in your Look earphone.

08. VISUAL REPRESENTATION

The AI understands your questions and responds with sounds & visual effects

09. TESTING

Unity + Vuforia:
Object recognition + voice commands

Object Recognition

Object + Environment

Object + Environment
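
To ground the test setup, here is a minimal sketch of the logic the prototype exercised: pairing a recognized object label with a voice command to form a contextual query. The actual prototype ran in Unity with Vuforia (in C#); this Python version and every name in it are illustrative assumptions, not the shipped implementation.

```python
# Toy model of the tested flow: object recognition output + a voice command
# are combined into one contextual search query.

from dataclasses import dataclass

@dataclass
class RecognizedObject:
    label: str         # e.g. "BBQ sauce"
    confidence: float  # recognizer score between 0 and 1

def build_query(voice_command: str,
                detections: list[RecognizedObject],
                min_confidence: float = 0.6) -> str:
    """Attach the most confident recognized object to the spoken request."""
    confident = [d for d in detections if d.confidence >= min_confidence]
    if not confident:
        return voice_command  # fall back to a plain voice search
    target = max(confident, key=lambda d: d.confidence)
    return f"{voice_command} ({target.label})"

if __name__ == "__main__":
    detections = [RecognizedObject("BBQ sauce", 0.92),
                  RecognizedObject("shelf", 0.40)]
    print(build_query("Which one should I buy?", detections))
    # -> "Which one should I buy? (BBQ sauce)"
```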

10. PRODUCT DESIGN

Product form factors

11. PROTOTYPING PROCESS

Reflection
Voice user interfaces let users interact flexibly with a system through voice commands and multi-modal gestures, not just buttons.
But they are still limited by the technology and by privacy concerns about where the information can be accessed.
It is like an RPG game: your choices lead the story to its ending.
