sharon dayoung lee



abbi: Voice Assistant for Airbnb
Check-in process innovation through voice user interface

Spring 2019
VUI, UX, branding, motion graphics

4 weeks group project with Allissa Chan and Maddy Cha



‘abbi’ is a voice assistant for Airbnb that helps its users with the check-in process. Currently, Airbnb does not have a standardized check-in process and leaves it up to hosts to inform their guests, which frequently becomes a source of confusion for users. We wanted to address the frustration and confusion of check-in by having ‘abbi’ guide the user through it.

My Role

I worked on identifying use cases and pain points of the current Airbnb user journey as well as iterating on the motion of the voice user interface.



Motion Studies

Part 1 — Motion study the movement and intensity
Part 2 — Motion study looking at colors and shapes

These motion studies helped us think critically about the relationship between color, shape, and intensity of motion before we made any decisions about what kind of voice assistant product we wanted to design. Through them, we discovered similarities and patterns that we as a group identified as suitable for communicating certain states and emotions.


Identifying Use Cases

Once we familiarized ourselves with the similarities we found in the motion studies, we decided to create a voice assistant for Airbnb. We started by briefly analyzing the user journey through the app, which consisted of onboarding, reservation, check-in, and check-out.

We decided to optimize our voice assistant for the check-in process, where many Airbnb users expressed confusion and frustration due to the large volume of information they had to sort through. There is no universal check-in procedure; it is largely up to the host to send out check-in information through whatever medium they prefer, e.g. Google Docs, PDFs, or plain text.


Storyboarding Possible Scenarios

We then discussed possible user scenarios to determine what kinds of UI states we needed to create for the voice assistant. The storyboard was later used to create scenes for the demo video.


Design Testing

We tested our VUI design with users by asking them to 1) rate, on a scale of 1 to 10, how closely a motion resembled a certain action, and 2) define the action that best suited the motion. Based on these insights, we refined our final design.



There are already many VUIs on the market — Apple’s Siri, Microsoft’s Cortana, Samsung’s Bixby, and Amazon’s Alexa, to name a few. It’s natural for people to associate certain motions with certain actions because of their experience with these pre-existing VUIs. The big question for us was: how do we know whether people identify certain motions with their actions because they are used to them, or because the motions are intuitive and natural?