Voice of treason: could hackers target smart gadgets?
Alexa and Siri may be all ears to your requests, but others could be listening in, says Margi Murphy
The sound of homeowners yanking their Amazon Echo from the wall socket could be heard across the nation this week, after one device was caught listening to a family conversation and sending it to another person. The recipient, a colleague of the owner, called to tell them to switch off Alexa immediately.
It is not the first gaffe for the smart speaker. In 2017 a six-year-old girl became an internet sensation after a YouTube clip showed her ordering a $170 (£127) doll’s house and a packet of biscuits. Days later a broadcaster on California’s CW-6 uttered the words “I love when the little girl says ‘Alexa: order me a doll’s house’,” kicking a number of Amazon Echos across the state into action. Amazon promised to refund the transactions. Then there was the Burger King advert that deliberately tried to activate viewers’ Google Home speakers and read out a description of its Whopper. It is not just the paranoid who are growing concerned that hackers might listen in or clone their voice.
A picture of Mark Zuckerberg, the king of technology himself, revealed that he tapes over his laptop’s microphone jack.
The newfound ability to control our lives using voice alone is liberating, but there are risks associated with it.
Simon Edwards, of global cyber security company
Trend Micro, says it is a legitimate concern. “A voice is just zeros and ones to a computer. That means we are able to manipulate voices all the time – like pop stars with auto-tune. So why wouldn’t someone use this manipulation for nefarious purposes?”
Edwards refuses to put an Amazon Echo in his own home, yet he is aware of the potential it has to help our day-to-day lives. “My wife is disabled, she has multiple sclerosis and would find one of those things quite useful for controlling the environment around her,” he says. “She uses voice-activated banking because her hand shakes, making it hard to type. However, you could easily replay that. You would still need the more traditional security elements to make it completely safe.”
Herein lies the problem. While voice-activated gadgets serve a seemingly novel purpose, the way we speak is now becoming a hot trend in security. HSBC, HMRC and TalkTalk are among the businesses to incorporate voice security into their services. Software company Nuance, which is valued at
$4bn, specialises in algorithms that can detect when someone isn’t who they say they are.
Brett Baranek, the company’s resident voice biometrics expert, says that businesses are “under pressure” to switch to more secure forms of authentication.
The average Briton encounters several security measures a day, if not an hour: PINs or fingerprint ID to unlock smartphones, and security questions and passwords when logging into online or telephone banking.
“The consumer today is still using a very insecure method of verifying their identity, a mechanism that was conceived in a world where there was no internet,” Baranek says.
“Something that was identified in the Fifties and applied in the 20th century is not fit for purpose any more.”
Voice recognition does not always get it right, however. In 2017 a BBC journalist managed to fool HSBC’s voice system using his twin brother.
On the surface, this appears to be a disaster and a PR nightmare for the bank, but it could be perceived as an all-round success.
Most fraud is committed by the victim’s family or friends, who are most likely to have access to personal details and can pass a PIN check or security questions with flying colours, so voice recognition creates a bigger obstacle.
There will be instances where voice hackers crack into accounts, but the scale will be much smaller, costing companies a lot less when they have to sort out
the mess. In a world where fraud levels are growing by double digits, it is clear why businesses might opt for voice recognition.
Baranek says: “The fact that a fraudster can go on the dark web and buy millions of personal details like email addresses, passwords and social security numbers for £20 shows that the current security system is broken. That wasn’t the case five years ago, but it has really spiked to epic proportions where breaches of millions of accounts do not seem unusual any more.”
With voice recognition, there isn’t a database of voices waiting to be pilfered – meaning companies could eliminate the sweeping breaches we have become accustomed to. However, there has been much talk about “deepfakes” and Russian trolls potentially manipulating voices and videos to make it appear as if high-profile politicians are saying sensational things.
“I don’t want to make it sound like voice biometrics are invincible,” Edwards says. So do we need to whisper around Alexa? Or muffle our voices when we’re standing next to a microphone-packed Samsung TV?
While we should not be afraid of gadgets that record our voice, we should be very aware of the technology that underpins them, Edwards adds.
“Alexa’s ability to understand accents shows you how good a system is becoming. But what this machine is doing is converting your voice to numbers.
“You wouldn’t need to steal somebody’s voice: just the pattern recognition algorithm that can detect them.”
‘A voice is just zeros and ones to a computer. That means we are able to manipulate voices all the time’
Voice-controlled devices such as Amazon’s Echo, below, have become hugely popular