TechLife Australia

Amazon on the future of voice assistants

Where you’ll find more voice options in the future, and the big way Alexa will get smarter about conversations.


Eric King, General Manager of Amazon Alexa Europe, has been helping to grow Amazon’s voice services since the company first opened the system up to developers, and has led the charge to expand the range of Alexa Skills and hardware partners for Amazon Voice Services that have defined Alexa’s rise to the top of the voice-assistant charts.

Are there areas of the tech world that you think voice control would be suited for, but it hasn’t really permeated yet?

Where we see customers adopting voice most aggressively and at the fastest rate is in areas where, frankly, it makes their lives easier – where life is easier when you don’t have to fumble for a phone in order to do something. We see our next logical advancement to that happening in the car. That is an area where we’re investing heavily today, together with companies like BMW, Toyota, Ford and SEAT. There’s a whole set of companies that are interested in these car scenarios where customers are typically doing one of four things that voice [control] would make easier.

They’re navigating, they’re searching for entertainment, they’re communicating with people outside of the car, and they’re controlling things within the car by voice. It might be windshield wipers. It might be windows. It might be lights.

Another set of scenarios is what I would call ‘on the go’ scenarios. So Fitbit was one. It was natural. You’ve got [the Versa 2] on your wrist. You’re running. You want to do a quick check and not be fumbling through which buttons to press and when – but simply to ask, ‘How many calories have I burned?’ It’s one of those simple but delightful experiences where we see this taking off.

And another ‘on the go’ scenario would be headphones. Being able to have a set of headphones while you’re walking around to ask for directions is a pretty cool feature we haven’t heretofore seen in most headphones.

In terms of new scenarios beyond that, I really think the sky’s the limit. There’s an Alexa-enabled bicycle – an electric bike. I wouldn’t have predicted that 12 months ago. But here we are.

Outside of product control, are there some broader life areas where you think voice assistants could be so useful as to become standard in the near future?

One area, which is at the top of the mind for us, is that kids love Alexa for games. But as a parent myself, I like to use Alexa to help my kids learn multiplication tables. I’ve just relocated to Luxembourg and my kids have to learn German and French, in addition to English. They’re 10- and 12-year-old kids, who are sort of overwhelmed. But there’s a number of Alexa skills which are helping them do their regular quizzing, for instance. That’s a scenario where we’re interested in investing, but it’s a scenario where we have already seen a bunch of partners invest.

Also, probably once a week, I get an email from someone who’s older, somebody who’s in an advanced stage of their life, or someone who might not be as mobile as they used to be, either due to their age or due to a disability or due to something else that happened, where Alexa is providing not only a utility, but, in some cases – I won’t use the word ‘companionship’, I think it might be a loaded term. But whether it’s getting an Audible book, or listening to music without having to get off the couch, or simply turning on the back porch light when it’s a struggle for you to do it but you’ve got a device to help you – we see those scenarios happening all the time.

There’s an interesting use case around Alexa being a kind of care assistant for someone who is perhaps suffering from dementia or a similar condition, where it could remind them to turn off a smart oven if it detects that it’s on.

It’s funny you bring that exact scenario up. We’ve invested recently in a technology called Hunches. Just like you might have a hunch that you may have left the oven on, Alexa can have a hunch that you may have left the oven on, or the door unlocked, or the back porch light on.

That is technology that is live today in the US, and it’s on the road map for release elsewhere. It’s optional – if you elect to have Alexa do this, you can. Or you’ve got a regular routine in which you do certain things with your smart devices – if you break that routine, you can elect to have Alexa remind you about it.

What challenges do you think will have to be overcome in order to make voice control used more universally, outside of the areas where it’s already popular?

One area that we’re investing heavily in is making it more conversational. We use a term inside Amazon called ‘conversation AI’, which is really the ability to use whatever dialect, whatever slang, to be able to talk to Alexa more like a trusted friend than a computer.

The other thing is just really nailing the scenarios, the use cases where we truly believe it’s going to make a difference. So I mentioned cars before. But there are elements of local car control – communications, navigation and entertainment – which you might trigger in an environment where you may not always be online. So how do we handle that scenario if you’re in a tunnel, or if you have turned the car off and still want to use it?

In the near future, you could make Alexa’s voice extremely realistic if you wanted to. Do you think you’ll have to ease people into the idea of interacting with a computer that can respond to them in a totally natural way?

I think we’re striking a good balance today. The concept of ‘conversational’ doesn’t necessarily equate to ‘realistic’. Really, what we’re trying to do is to make sure that Alexa is responsive, that it understands what you’re saying if you speak with an accent from this part of London or that part of London, and understands the intent behind what you’re asking for.

That’s another big area of investment for us. Even if you didn’t ask for it in a robotic, mechanical way, we want to make sure that we understand what was behind the question. We’re doing a lot of work to help understand that, and serve up the right response – whether it’s a response that we’ve developed ourselves and that Alexa understands, or it’s something that a third party has developed.

The intent thing is interesting because context within a conversation is something we’re only just starting to see appear in voice assistants…

It’s a big area of focus for us. There’s both context in terms of what you said, and what you haven’t said. The first step of that is to make Alexa not necessarily have to restart every time you have a conversation with her. We sometimes call this context-aware, but that certainly is an area where we’re deeply investing.

Voice assistants have always been tied to the cloud. In the future, do you see them becoming less reliant on being always online?

Part of the reason we use the cloud is because of the computing processing power we have, and the ability to get better at voice because of what we can do there, versus what we are able to do on a wrist, for example.

Another reason is that it enables us to get into smaller and smaller form factors. Even an Echo Dot has very little compute in it – almost none. It just sends everything up and back down again.

I will say that there are investments we’re making to create local experiences so that you don’t need to be connected to the cloud all the time for it to work, but the vast improvements we’ll continue to make with Alexa will be driven from cloud-based technologies – even if non-cloud-connected devices help us to get into more out-of-reach scenarios.

Amazon recently introduced its first Echo for the car – the Echo Auto, seen here just in front of the windscreen
