Tech firms called upon to help fight terrorism
LOS ANGELES – British officials have renewed demands for stricter regulation of the software industry to prevent the online services it develops from becoming a haven for terrorists.
Conversations on WhatsApp, Telegram and Signal can’t be tapped by police as easily as traditional phone calls or emails, hindering law enforcement’s ability to monitor terrorists coordinating activities. Though evidence is often scant, politicians have suggested that several attacks in Europe over the last couple of years could have been prevented if investigators had been able to spy on secure online chats.
“We cannot allow this ideology the safe space it needs to breed,” British Prime Minister Theresa May said in the wake of a terrorist rampage that left seven people dead in London on Saturday.
Officials around the world have called on companies that produce chat apps to find ways to be more accommodating to government investigators. And they’ve also demanded that social media services do more to stop the spread of offensive material, such as beheading videos or terrorist recruitment messages.
It’s been a challenge, though, for lawmakers to strike a balance between the need for secure communications and law enforcement’s need to gather evidence. Pressure on the software industry last year led to “encouraging” gains for the counterterrorism community, European leaders said last week. What the latest criticism might bring about is unclear.
The tech industry has gotten more aggressive with encryption.
Encryption programs scramble the contents of messages or other files using a formula that integrates special passwords. Only people who know the secret phrase can decrypt or unscramble the information.
With hackers increasingly trying to get at people’s data, app makers have turned to encryption as a way to prevent data breaches. For example, if hackers managed to infiltrate WhatsApp systems, they still wouldn’t be able to read people’s conversations, assuming the encryption is set up correctly. In that situation, only the users in a conversation have the requisite keys.
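The key-based scrambling described above can be sketched in a few lines of code. This is a toy illustration only (real apps like WhatsApp and Signal use vetted protocols such as the Signal Protocol, not this scheme); the function names and the HMAC-based keystream here are illustrative assumptions. The point it demonstrates is the one in the article: only someone holding the same secret key can recover the message.

```python
import hashlib
import hmac
from itertools import count

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom byte stream from a shared secret.
    Toy construction for illustration, not for real-world use."""
    out = b""
    for counter in count():
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"),
                        hashlib.sha256).digest()
        if len(out) >= length:
            return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    # XOR each plaintext byte with the key-derived stream:
    # without the key, the ciphertext looks like random bytes.
    ks = keystream(key, nonce, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

decrypt = encrypt  # XOR is its own inverse, so decryption is the same operation

key = hashlib.sha256(b"shared secret phrase").digest()
nonce = b"msg-0001"
ciphertext = encrypt(key, nonce, b"meet at noon")
print(decrypt(key, nonce, ciphertext))  # b'meet at noon'
```

Deriving a fresh keystream per message (via the nonce) mirrors, very loosely, why an intercepted ciphertext is useless to an eavesdropper, hacker or investigator alike, who lacks the key.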
Governments are supportive of such technology, and many regulations now either mandate it or absolve firms of liability for making encryption part of their security procedures.
“Everybody who works in IT recognizes encryption is one of the lines of defense,” said Nigel Hawthorne, European privacy lead for Campbell, Calif., cybersecurity startup Skyhigh Networks.
But encryption also acts as a block on mass online surveillance. Authorities had come to rely on combing through online data to identify threats, just as they once did with letters, radio transmissions and phone calls.
The majority of Internet traffic is now encrypted, which limits the amount of information accessible to snooping hackers and investigators. But plenty of data remains within reach, including most email, because the companies providing the service typically hold the decryption keys too. That means they can turn over messages when ordered to by a court or law enforcement body.
Authorities have been crafty. National Security Agency files leaked in 2013 indicated that agency experts had found ways to defeat encryption and access some discussions on the Internet.
But exploiting security gaps and coming up with workarounds have been viewed as inefficient and insufficient. Continued complaints from officials in the US and Europe suggest that they want broader access to online conversations.
“If you are law enforcement, you want to have all the powers you can think of, and there’s no doubt full encryption stymies that,” Hawthorne said.
Political pressure has led to action from tech companies.
The wave of criticism following attacks in London and in Manchester has come with mostly generalities about what officials desire from the tech industry.
That might be because many of the easiest-to-solve issues have been addressed. A slew of Islamic State attacks has led Facebook, Twitter, YouTube and other platforms to pledge quicker removals of uploads by terrorists. Each of the services bans violent imagery or posts that incite violence or hatred.
The companies have deployed technology to identify previously banned material, and they’ve expanded their moderation teams to respond faster when users flag something objectionable.
Politicians still want the companies to be more aggressive about developing software that automatically catches never-before-seen terrorist material. But that’s a challenge that lacks an overnight solution, technologists say.
There’s a growing acknowledgment, including from Congress, that encryption shouldn’t be prohibited. May and other leaders may instead want tech companies to help devise surveillance measures that could be effective despite messages being untappable.
Authorities may seek new regulations that allow them to hack into apps or gadgets. In some cases, the tech companies could be party to the legal hacking – perhaps being forced to swap the encryption formula so that any messages in a thread from that point forward are readable by investigators.
Governments could ask software vendors to be more forthcoming about what information is left unscrambled. Or there could be rules requiring social media companies to proactively turn over data when certain suspects reach out to a new contact.
Collaboration also may be possible in helping authorities access in real time whatever data are legally available, said Daniel Weitzner, principal research scientist at MIT’s Computer Science and Artificial Intelligence Lab.
Weitzner said a discussion about data retrieval speeds would be more fruitful than “dead-end arguments about putting back doors” in encryption technology – arguments the tech world has fought because any vulnerability opened to law enforcement could be discovered and exploited by criminal hackers as well.
The European Commission and British ministers have said that in the coming weeks they would introduce specific legislative proposals on apps, data and encryption.
– Los Angeles Times/TNS