Voice of treason: could hackers target smart gadgets?

Alexa and Siri may be all ears to your requests, but others could be listening in, says Margi Murphy

The sound of homeowners yanking their Amazon Echo from the wall socket could be heard across the nation this week, after one device was caught listening to a family conversation and sending it to another person. The recipient, a colleague of the owner, called to tell them to switch off Alexa immediately.

It is not the first gaffe for the smart speaker. In 2017 a six-year-old girl became an internet sensation after a YouTube clip showed her ordering a $170 (£127) doll’s house and a packet of biscuits. Days later a broadcaster on California’s CW-6 uttered the words “I love when the little girl says ‘Alexa: order me a doll’s house’,” kicking a number of Amazon Echos across the state into action. Amazon promised to refund the transactions. Then there was the Burger King advert that deliberately tried to activate viewers’ Google Home speakers and read out a description of its Whopper. It is not just the paranoid who are growing concerned that hackers might listen in or clone their voice.

A picture of Mark Zuckerberg, the king of technology himself, revealed that he tapes up his laptop microphone jack.

The newfound ability to control our lives using voice alone is liberating, but there are risks associated with it.

Simon Edwards, of global cyber security company Trend Micro, says it is a legitimate concern. “A voice is just zeros and ones to a computer. That means we are able to manipulate voices all the time – like pop stars with auto-tune. So why wouldn’t someone use this manipulation for nefarious purposes?”
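Edwards’s point can be made concrete in a few lines of code. The sketch below is purely illustrative Python, not any real product’s tooling: a pure tone stands in for actual speech, and the recording is pitch-shifted by simple resampling, the same basic idea, at its crudest, behind effects such as Auto-Tune.

```python
# A recorded voice is just numbers, so it can be altered like any data.
# Illustrative sketch only: a pure tone stands in for real speech.
import numpy as np

SAMPLE_RATE = 16_000  # samples per second, a common rate for speech

# One second of "voice": a 220 Hz tone instead of an actual recording.
t = np.linspace(0, 1.0, SAMPLE_RATE, endpoint=False)
voice = 0.5 * np.sin(2 * np.pi * 220 * t)

def pitch_shift(samples: np.ndarray, factor: float) -> np.ndarray:
    """Crude pitch/speed shift by reading the samples at a new rate."""
    positions = np.arange(0, len(samples) - 1, factor)
    return np.interp(positions, np.arange(len(samples)), samples)

higher = pitch_shift(voice, 1.5)  # read the samples 1.5x faster
print(len(voice), len(higher))    # fewer samples: same sound, higher pitched
```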

Edwards, a security expert at one of the world’s largest enterprise security companies, refuses to put an Amazon Echo in his home, yet he’s aware of the potential it has to help our day-to-day lives. “My wife is disabled, she has multiple sclerosis and would find one of those things quite useful for controlling the environment around her,” he says. “She uses voice-activated banking because her hand shakes, making it hard to type. However, you could easily replay that. You would still need the more traditional security elements to make it completely safe.”
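As a rough illustration of the “more traditional security elements” Edwards mentions, here is a hedged sketch: a voiceprint match alone can be defeated by a replayed recording, so a system might also demand a freshly generated challenge phrase that yesterday’s recording cannot contain. The function names and word list are invented for the example, not any bank’s actual system.

```python
# Hypothetical sketch: pair a voiceprint match with a one-time challenge
# phrase, so a replayed recording of an old session is not enough.
import secrets

WORDS = ["apple", "river", "seven", "orange", "cloud", "ninety"]

def new_challenge(n: int = 3) -> str:
    """A fresh phrase per login attempt, to be spoken aloud by the caller."""
    return " ".join(secrets.choice(WORDS) for _ in range(n))

def authenticate(voiceprint_matches: bool, spoken: str, expected: str) -> bool:
    # Both factors must pass: the voice sounds right AND says today's phrase.
    return voiceprint_matches and spoken == expected

challenge = new_challenge()
print(challenge)                                             # e.g. "river cloud seven"
print(authenticate(True, challenge, challenge))              # genuine caller: True
print(authenticate(True, "old recorded phrase", challenge))  # replay attack: False
```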

Here lies the problem. While voice-activated gadgets serve a seemingly novel purpose, the way we speak is now becoming a hot trend in security. HSBC, HMRC and TalkTalk are among the businesses to incorporate voice security into their services. Software company Nuance, which is valued at $4bn, specialises in algorithms that can detect when someone isn’t who they say they are.

Brett Baranek, the company’s resident voice biometrics expert, says that businesses are “under pressure” to switch to more secure forms of authentication.

The average Briton experiences several security measures a day, if not an hour, including Pins or fingerprint ID to unlock smartphones, or security questions and passwords when logging into online or telephone banking.

“The consumer today is still using a very insecure method of verifying their identity, a mechanism that was conceived in a world where there was no internet,” Baranek says.

“Something that was identified in the Fifties and applied in the 20th century is not fit for purpose any more.”

Voice recognition does not always get it right, however. In 2017 a BBC journalist managed to fool HSBC’s voice system using his twin.

On the surface, this appears a disaster and a PR nightmare for the bank, but it could be perceived as an all-round success.

Most fraud is at the hands of the victim’s family or friends, who will most likely have access to personal details and can pass a Pin or security questionnaire with flying colours, so voice recognition creates a bigger obstacle.

There will be instances where voice hackers crack into accounts, but the scale will be much smaller, costing companies a lot less when they have to sort out the mess. In a world where fraud levels are growing by double digits, it is clear why businesses might opt for voice recognition.

Baranek says: “The fact that a fraudster can go on the dark web and buy millions of personal details like email addresses, passwords and social security numbers for £20 shows that the current security system is broken. That wasn’t the case five years ago, but it has really spiked to epic proportions where breaches of millions of accounts do not seem unusual any more.”

With voice recognition, there isn’t a database of voices waiting to be pilfered – meaning companies could eliminate the sweeping breaches we have become accustomed to. However, there has been much talk about “deepfakes” and Russian trolls potentially manipulating voices and videos to make it appear as if high-profile politicians are saying sensational things.

“I don’t want to make it sound like voice biometrics are invincible,” Edwards says. So do we need to whisper around Alexa? Or muffle our voices when we’re standing next to a microphone-packed Samsung TV?

While we should not be afraid of gadgets that record our voice, we should be very aware of the technology that underpins them, Edwards adds.

“Alexa’s ability to understand accents shows you how good a system is becoming. But what this machine is doing is converting your voice to numbers.

“You wouldn’t need to steal somebody’s voice: just the pattern recognition algorithm that can detect them.”
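What “converting your voice to numbers” typically means in speaker verification is a fixed-length numeric fingerprint, often called a voiceprint or embedding, compared against the one stored at enrolment. A minimal sketch follows, with made-up random vectors standing in for what a real speaker-recognition model would produce from audio.

```python
# Minimal sketch of voiceprint matching: random vectors stand in for the
# embeddings a real speaker-recognition model would compute from audio.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Score how closely two voiceprints point in the same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
enrolled = rng.normal(size=256)                            # stored at sign-up
same_speaker = enrolled + rng.normal(scale=0.3, size=256)  # noisy new sample
impostor = rng.normal(size=256)                            # a different voice

THRESHOLD = 0.7  # illustrative; real systems tune this against error rates
print(cosine_similarity(enrolled, same_speaker) > THRESHOLD)  # True
print(cosine_similarity(enrolled, impostor) > THRESHOLD)      # False
```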

[Picture caption: Voice-controlled devices such as Amazon’s Echo have become hugely popular]
