Google Look

Look is an AI-empowered voice assistant earphone that turns the offline world into searchable spaces.

Timeline: Sep-Dec 2018 (2020 version)

Role: Product Designer

Team: Script Writer, 3D Artist

During the class, I delivered a user scenario video and concept to Google Daydream product designers. We checked in weekly to ideate on possibilities for emerging technologies. After the project, I built a team, working closely with a script writer, product designer, and 3D modeler, to push the user story to another level.

Full Process Deck

- Understand the Google Daydream team's needs -

01. THE CHALLENGE

Creating seamless transitions between the real and digital worlds with emerging technology

The Google Daydream team was interested in creating seamless transitions between the real and digital worlds by uncovering new hybrid tools for creators in 2023. I decided to dive deeper into emerging technology, searching for and recording its potential beyond current use cases.

Curious about your surroundings? Now you can talk to a smart assistant!
An AI earphone that tracks surrounding objects.

- Our Solutions -

Save and Compare

Look helps you compare search results by asking the AI assistant in your Look earphone.

Smart Assistant

The Look app assists your search journey seamlessly, so you never have to pause in the moment to look something up online. Look curates your search results.

Scan & Buy

By simply scanning the objects in front of them, users can quickly pull up online links based on what they see, save them, and re-access them later.

- Research -

03. MARKET RESEARCH

Personalization & real-time responses are trending needs, but are not yet seen in search experiences

1. Mobile devices allow users to search and learn anywhere and anytime.

2. Phone cameras enable users to search based on what they see; users can track surrounding objects and look them up on the phone, but this is not widely used yet.

3. Voice speakers are limited by the technology's ability to understand and maintain context, so you cannot have dynamic exchanges that start, stop, and pick up again later.

04. STIMULUS & BRAINSTORMING

Searching 2D images is not enough; you want to see, feel, and touch things in reality before making a decision.

"Researcher is like a detective, we search from details of your your everyday life. We want to meet the user to get guided through in physical world instead of reading papers."

"As a product designer, I wish I can search 3D objects without just going online. I want to see it and feel it before designing. "

05. IDEATIONS

Users struggle to get quick and meaningful responses

1. I want to search and compare offline and online shopping quickly.
2. I want to discover how to make new things by searching online.
3. I want to collect and share my search results more efficiently.

06. STORYBOARDING

What if online search could be as natural as asking a close friend by your side?

Intuitive Searching

1. I shop freely with LOOK navigating.

2. LOOK compares the two BBQ sauces in my hand.

3. LOOK recommends the left one by pointing at the bottle.

Contextual Searching

1. I want to make dinner, so I ask LOOK.

2. LOOK sorts out recipes based on my daily preferences.

3. I decide to cook scrambled sausages tonight, and LOOK guides me step by step.

Shareable Searching

1. I like my friend’s clothes and wonder if she can share a link with me.

2. She allows LOOK to share her clothing details with me.

3. I receive the information and save it to my phone.

07. VOICE FLOW

Based on the interview insights, I broke the experience down into three user scenarios and use cases that contextualize searching: to buy, to make, and to share.

“Hey Look, which one should I buy?”
“Hey Look, what can I make from what I have?”
“Hey Look, I want to share my own stuff with others seamlessly.”
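
To make the flow concrete, here is a minimal sketch of how these three utterances could be routed to the buy / make / share scenarios. This is an illustration only, not the product's actual logic; the keyword patterns and function names are assumptions.

```python
# Illustrative sketch only: routing the three "Hey Look" utterances to the
# buy / make / share search scenarios with simple keyword matching.
# All names and patterns here are assumptions, not the product's actual NLU.

import re

INTENT_PATTERNS = {
    "buy":   re.compile(r"\b(buy|which one|compare)\b", re.IGNORECASE),
    "make":  re.compile(r"\b(make|cook|recipe)\b", re.IGNORECASE),
    "share": re.compile(r"\b(share|send)\b", re.IGNORECASE),
}

def route_utterance(utterance: str) -> str:
    """Return which Look scenario an utterance falls into, or 'fallback'."""
    for intent, pattern in INTENT_PATTERNS.items():
        if pattern.search(utterance):
            return intent
    return "fallback"

if __name__ == "__main__":
    for text in [
        "Hey Look, which one should I buy?",
        "Hey Look, what can I make from what I have?",
        "Hey Look, I want to share my own stuff with others seamlessly.",
    ]:
        print(text, "->", route_utterance(text))
```

In the real assistant this routing would be handled by a speech/NLU service rather than keyword matching, but the branching into three contextual search modes is the same idea.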

08. VISUAL REPRESENTATION

The AI understands your questions and responds with sounds & visual effects

09. TESTING

Unity Vuforia:
Object recognition + voice commands

Object Recognition

Object + Environment

Object + Environment
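
As a rough illustration of what the Unity + Vuforia tests exercised, the sketch below re-expresses the core interaction in Python: an object-recognition result supplies the context in which a voice command is answered. The `RecognizedObject` type and `handle_voice_command` function are stand-ins introduced for illustration; they are not Vuforia or Unity speech APIs.

```python
# Illustrative sketch only: the prototype logic behind the Unity + Vuforia test,
# re-expressed in Python. An object-recognition result supplies the context in
# which a spoken command is answered. These names are stand-ins, not real
# Vuforia or Unity speech APIs.

from dataclasses import dataclass
from typing import Optional

@dataclass
class RecognizedObject:
    label: str          # e.g. "BBQ sauce (left bottle)"
    confidence: float   # tracker confidence, 0.0-1.0

def handle_voice_command(command: str, context: Optional[RecognizedObject]) -> str:
    """Answer a voice command using the currently tracked object as context."""
    if context is None or context.confidence < 0.5:
        return "I can't see anything clearly enough to help with that yet."
    text = command.lower()
    if "compare" in text:
        return f"Comparing {context.label} with similar products online..."
    if "buy" in text:
        return f"Here is a link to buy {context.label}."
    return f"You are looking at {context.label}."

if __name__ == "__main__":
    tracked = RecognizedObject(label="BBQ sauce (left bottle)", confidence=0.92)
    print(handle_voice_command("Hey Look, which one should I buy?", tracked))
```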

10. PRODUCT DESIGN

Product form factors

11. PROTOTYPING PROCESS

Reflection

Voice user interfaces allow the user to interact flexibly with a system through voice commands and multi-modal gestures that are not limited to buttons.
But they are still limited by the technology and by privacy concerns about where the information can be accessed.
It is like an RPG game, where your choices lead the story to its ending.

Next Projects / 2018-2020 CASE STUDIES

🚗 Yoohoo: Autonomous carpool system for solo travelers / 2018

💡 Google Look: Voice UI to search nearby with AI assistant / 2018-2020

🍔 frog Design: Redesign of a global food brand
📈 Virtualitics: Data analysis internship with AI algorithms and VR visualizations