Amazon to humanise Alexa with whispers, pauses and emotion

We may be one step closer to the vision of how humans will interact with AI systems outlined in the Spike Jonze film “Her”, if Amazon’s latest set of skills is anything to go by.

Whether the idea of your Alexa device pausing, having a vocal range and breathing is your idea of a utopian or dystopian future, it’s happening – and quickly.

Now Amazon’s Alexa assistant is going to sound a lot more human.

Last week the company unveiled a number of new speaking skills that allow the iconic voice associated with the Echo or Dot to whisper, take a breath or pause, and adjust the rate, pitch and volume of its speech. Somewhat bizarrely, Amazon also says Alexa will be able to “bleep” out certain words when speaking.

Amazon introduced the changes alongside a rollout of “speechcons” – canned interjections previously available only in the US – with new ones (e.g. “Blimey,” “Bob’s your uncle” and “Donnerwetter”) created specially for the UK and German developer communities as a more engaging way of communicating.

These latest updates to Alexa’s way of communicating were offered to developers via Alexa’s Speech Synthesis Markup Language (a standardised markup language commonly dubbed “SSML”), which they use to code speech patterns into applications. The five new SSML tags allow programmers to play with emotion, pronunciation, intonation, timing and expletive bleeps when creating responses in skills.
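To give a flavour of how these tags are used, here is an illustrative SSML snippet of the kind a developer might return from a skill. The wording of the spoken text is invented for illustration; the tags themselves (whispering, emphasis, prosody, pauses, expletive bleeps and speechcon-style interjections) correspond to the features described above.

```xml
<speak>
    <!-- Whispered delivery -->
    <amazon:effect name="whispered">I can whisper this part.</amazon:effect>
    <!-- Stress a phrase -->
    <emphasis level="strong">And really stress this bit.</emphasis>
    <!-- Adjust rate, pitch and volume of speech -->
    <prosody rate="slow" pitch="low" volume="loud">
        Or slow down, drop the pitch and speak up.
    </prosody>
    <!-- Insert a pause -->
    <break time="500ms"/>
    <!-- Bleep out a word -->
    You little <say-as interpret-as="expletive">rascal</say-as>.
    <!-- A UK speechcon interjection -->
    <say-as interpret-as="interjection">Bob's your uncle!</say-as>
</speak>
```

In an Alexa skill, markup like this would typically be returned as the `outputSpeech` of a response, with the text-to-speech engine rendering the tags rather than reading them aloud.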

Amazon maintains, however, that its latest development of Alexa is still to be taken seriously. The company has set limits on how far developers can alter the rate, pitch and volume of her voice, to keep them focused on the end goal: making Alexa more human.

Regardless, it’s a move that depends entirely on the developer community embracing it, and with 12,000 skills already in the marketplace for users to navigate, time will tell how soon Alexa will be pausing, breathing and bleeping in our homes.