Alexa. Siri. Cortana. We’re talking to or at our machines. I walk into my office and say “Hey Google, what’s the weather?” or “Hey Google, when’s my first appointment?” When I’m driving in a strange town, it’s “Hey Google, navigate to the [fill in the blank] hotel.”

This kind of hands-free access to information is hugely helpful and hugely popular. But there’s a long way to go toward a general-purpose voice interface for every task we want to accomplish.

That said, we’re getting there. In this conversation with Central 1’s Alex Chan, we discuss the process of voice-enabling the high-volume queries that credit union members make, such as balance inquiries and balance transfers.

We cover what it takes to build an Alexa skill, the code that links Alexa’s natural language processing to the underlying application that executes the action.
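
To give a concrete flavour of that glue code, here is a minimal sketch of the back-end side of a skill, written as an AWS Lambda handler in Python. The "CheckBalanceIntent" name and the get_balance() helper are illustrative assumptions, not Central 1's actual implementation; the request and response shapes follow the standard Alexa Skills Kit JSON interface.

```python
# Minimal sketch of the Lambda-side glue for an Alexa skill.
# "CheckBalanceIntent" and get_balance() are hypothetical; the
# request/response envelopes are Alexa's documented JSON format.

def get_balance(member_id: str) -> float:
    """Stand-in for a call into the credit union's core banking system."""
    return 1234.56  # placeholder value for the sketch


def lambda_handler(event, context):
    request = event["request"]

    if request["type"] == "LaunchRequest":
        speech = "Welcome. You can ask for your account balance."
    elif (request["type"] == "IntentRequest"
          and request["intent"]["name"] == "CheckBalanceIntent"):
        # A real skill would identify the member via account linking,
        # not a hard-coded identifier.
        balance = get_balance(member_id="demo-member")
        speech = f"Your chequing balance is {balance:.2f} dollars."
    else:
        speech = "Sorry, I didn't catch that."

    # Standard Alexa response envelope: speak the text and end the session.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```

Alexa handles the natural language processing up front; by the time this code runs, the spoken request has already been resolved to an intent name, and the handler's job is simply to map that intent to the right back-end action and compose the spoken reply.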

Voice design, the process of imagining and codifying how the user interaction proceeds, is at the heart of a successful voice-enablement project. Alex takes us through that process. It sounds like fun.

While payments are a tiny fraction of today’s voice-based interactions, they’re coming along, too. Better design and broader participation are needed. As a recent (failed) demo proved, Siri can’t send me money if I’m not an Apple Pay Cash user.

Take a listen and get in touch if you have questions or comments. We'd love to hear from you!


Direct download: VoiceInterface_mixdown.mp3
