Google Look:
Voice assistant to search wisely

Sponsor:
Google Daydream

Team:
Google Daydream
& Individual Project

Duration:
14 weeks

Role:
UX & Product Design

Tools:
Unity, Sketch & Principle,
After Effects,
Solidworks & Keyshot

Curious about your surroundings? Now you can talk to a smart assistant!

Outcome

Look is an AI-empowered voice assistant earphone that turns the offline world into searchable spaces.

01.
The Challenge

Creating seamless transitions between the real and digital worlds with emerging technology

The Google Daydream team was interested in future seamless transitions between the real and digital worlds, uncovering new hybrid tools for creators in 2023. I decided to dive deeper into emerging technologies, searching for and recording potential beyond current use cases.

02.
Research

Personalization and real-time responses are growing needs, but they are not yet part of search experiences

1. Mobile devices allow users to search and learn anywhere, anytime.
2. Phone cameras enable users to search based on what they see; they can track nearby objects and look them up on the phone, but this is not widely used yet.
3. Voice speakers are limited in their ability to understand and maintain context, so you cannot have dynamic exchanges that start, stop, and pick up again later.

03.
Interviews

Searching 2D images is not enough; you want to see, feel, and touch things in reality before making a decision.

I conducted 3-4 expert interviews with researchers and designers. I learned that searching is not just about reading data and numbers but also about extracting information from the real-world context.

"Researcher is like a detective, we search from details of your your everyday life. We want to meet the user to get guided through in physical world instead of reading papers."

"As a product designer, I wish I can search 3D objects without just going online. I want to see it and feel it before designing. "

Users struggle to get quick and meaningful responses

Continuing with 5 more interviews with users who rely on phone searches to learn, I identified shopping, discovering, and sharing through search as the most valuable use cases and needs:

1. I want to quickly search and compare offline and online shopping.
2. I want to discover how to make new things by searching online.
3. I want to collect and share my search results more efficiently.

04.
Define Pain Points

Primary persona and scenario


05. Storyboard

What if online search could be as natural as asking a close friend by your side?

Intuitive Searching

1. I shop freely with LOOK navigating.

2. LOOK compares the two BBQ sauces in my hand.

3. LOOK recommends the left one by pointing at the bottle.

Contextual Searching

1. I want to make dinner, so I ask LOOK.

2. LOOK sorts out recipes based on my daily preferences.

3. I decide to cook scrambled sausages tonight, and LOOK guides me step by step.

Shareable Searching

1. I like my friend’s clothes and wonder if she can share a link with me.

2. She allows LOOK to share her clothes with me.

3. I receive the information and save it to my phone.

06.
Voice Flow

Personalized conversation

Based on the interview insights, I broke the experience down into 3 user scenarios and use cases that contextualize searching: to buy, to make, and to share.

“Hey Look, I want to know which one I should buy.”


“Hey Look, I want to know what I can make from what I have.”


“Hey Look, I want to share my own stuff with others seamlessly.”
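
To make the three scenarios concrete, here is a minimal sketch of how such voice commands could be routed to the buy / make / share intents. The intent names and keyword rules are hypothetical illustrations, not the actual LOOK implementation.

```python
# A minimal sketch of routing the three "Hey Look" scenarios to intents.
# Intent names and keyword rules are hypothetical, for illustration only.

INTENT_KEYWORDS = {
    "search_to_buy":   ["buy", "which one", "compare", "price"],
    "search_to_make":  ["make", "cook", "recipe"],
    "search_to_share": ["share", "send", "link"],
}

def route_intent(utterance: str) -> str:
    """Map a transcribed voice command to one of the three search scenarios."""
    text = utterance.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return intent
    return "fallback_clarify"  # ask the user to rephrase

if __name__ == "__main__":
    for command in [
        "Hey Look, I want to know which one I should buy.",
        "Hey Look, I want to know what I can make from what I have.",
        "Hey Look, I want to share my own stuff with others seamlessly.",
    ]:
        print(command, "->", route_intent(command))
```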

07. Testing

Object-Recognition + Voice command

Object Recognition

Object + Environment

Object + Environment
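
As a rough illustration of the object-recognition + voice-command tests above, the sketch below combines a detected object label with the spoken question to form one contextual query. The data class, field names, and confidence threshold are assumptions for illustration, not the tested prototype code.

```python
# A minimal sketch of "object recognition + voice command": a detected object
# label and the spoken question are merged into one contextual search query.
# All values and thresholds below are hypothetical.

from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str         # e.g. "BBQ sauce"
    confidence: float  # recognition confidence from the camera model
    location: str      # rough environment context, e.g. "grocery store"

def build_search_query(spoken_question: str, detection: DetectedObject) -> str:
    """Merge what the user asked with what the camera sees."""
    if detection.confidence < 0.6:
        return spoken_question  # fall back to the plain voice query
    return f"{spoken_question} ({detection.label} at {detection.location})"

if __name__ == "__main__":
    obj = DetectedObject(label="BBQ sauce", confidence=0.87, location="grocery store")
    print(build_search_query("Which one should I buy?", obj))
```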

08. Visual Personality of AI

The AI understands your questions and responds with sounds & visual effects

Service System

How it works

Product design

Product forms

App design

With the app, users can track the data from their searches

1. Visual guideline on the system

Location: Store
“I want to check my searches visually.”

With a Voice UI guideline on the lock screen, users can learn how to communicate while they are searching with LOOK.

2. Real-time Data tracking

Location: Home
“What did I search for, and where can I find it?”

Look brings voice search back into reality, so users are able to retrieve the information and data they wanted.
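
A minimal sketch of this real-time tracking idea, assuming a simple in-memory store: each search is saved with the object, the location, and a timestamp so the app can answer “what did I search for, and where can I find it?”. The record fields and the store are hypothetical.

```python
# A minimal sketch of real-time data tracking: each voice search is saved
# with the object, location, and timestamp and can be retrieved later in the
# app. Field names and the in-memory store are hypothetical.

from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class SearchRecord:
    query: str
    object_label: str
    location: str
    timestamp: datetime = field(default_factory=datetime.now)

class SearchHistory:
    def __init__(self) -> None:
        self._records: List[SearchRecord] = []

    def save(self, record: SearchRecord) -> None:
        self._records.append(record)

    def find_by_location(self, location: str) -> List[SearchRecord]:
        """Answer 'what did I search for, and where can I find it?'"""
        return [r for r in self._records if r.location == location]

if __name__ == "__main__":
    history = SearchHistory()
    history.save(SearchRecord("Which BBQ sauce should I buy?", "BBQ sauce", "store"))
    for record in history.find_by_location("store"):
        print(record.timestamp, record.query)
```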

Reflection

ROI (net benefit over cost) of product development

Bounce rate: the time it takes users to find the correct answer by talking to Look before switching to another object.

A low bounce rate and high customer task achievement count as success.


Task types:
1. Shopping in a market with Look app prototypes
2. Counting the successful shopping lists created within a set time
3. Voice-over tone testing by recording different scenes
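
As a rough sketch of how these metrics could be computed from test-session logs, the example below derives the average time to a correct answer (the bounce-rate proxy) and the task success rate. The session fields and numbers are hypothetical examples, not the study data.

```python
# A minimal sketch of the testing metrics: average time to the correct answer
# (bounce-rate proxy) and task success rate. Session data is hypothetical.

from dataclasses import dataclass
from typing import List

@dataclass
class TestSession:
    seconds_to_answer: float  # time talking to Look before the correct answer
    task_completed: bool      # e.g. shopping list finished in the given time

def average_time_to_answer(sessions: List[TestSession]) -> float:
    return sum(s.seconds_to_answer for s in sessions) / len(sessions)

def task_success_rate(sessions: List[TestSession]) -> float:
    return sum(s.task_completed for s in sessions) / len(sessions)

if __name__ == "__main__":
    sessions = [
        TestSession(18.0, True),
        TestSession(42.5, True),
        TestSession(75.0, False),
    ]
    print(f"Avg time to answer: {average_time_to_answer(sessions):.1f} s")
    print(f"Task success rate:  {task_success_rate(sessions):.0%}")
```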

