Background
I joined Adept.ai as the Lead Product Designer, working on how humans can best use AI agents to complete tasks for them. Two months later, the team got acquired* by Amazon.
At Amazon, my team of two was initially tasked with aligning the Alexa org with the Adept team. We then decided to design and prototype what we believed was the right interaction model for Adept's workflows.
Breaking away from the turn-by-turn chat
The challenge
It's cumbersome to have a back-and-forth conversation with a voice agent to get real-life tasks done.
How do we minimize back-and-forth steps with the customer?
Inventing a new way to use agents
We built a working prototype where the user can talk naturally, and the interface instantly turns that input into action and visual feedback on the screen.
Building a grocery cart
Me talking naturally to my grocery cart to add items
Ordering food
Asking questions about the menu while ordering food
Alexa+ Launch Demo
A few months later, our prototype, built by three people including myself, made it on stage for the Alexa+ launch demo.
