RANSOMWARE ATTACK REVEALS BACKDOOR RISK, ADVOCATES SAY
Cybertools allegedly stolen from NSA used to craft WannaCry
Privacy experts are calling the global ransomware attack that hit 150 countries a prime example of why requiring tech companies to build backdoors into their software is a bad idea: those digital keys are in danger of being stolen.
“This is a fine example of the difficulty of keeping secrets,” said Cooper Quintin, a staff technologist with the Electronic Frontier Foundation, a digital liberties non-profit in San Francisco.
The WannaCry ransomware attack hit Friday and was contained relatively quickly, but not before it infected at least 200,000 computers. The software exploited a flaw in the Windows operating system that Microsoft and others said was stolen from the National Security Agency, or from a group believed to be affiliated with it, and that is thought to have been part of a U.S. cyber-attack arsenal. The NSA has said it did not create ransomware tools, but it has not addressed whether the exploitable flaw the ransomware was built on came from stolen NSA cyber tools.
The fact that these tools appear to have been stolen from a U.S. government-linked group and are now in the public domain has bolstered tech companies’ contention that security backdoors would do more harm than good, simply because such work-arounds risk ending up in criminal hands.
“This attack provides yet another example of why the stockpiling of vulnerabilities by governments is such a problem,” Brad Smith, Microsoft’s president and chief legal officer, said in a blog post. “Repeatedly, exploits in the hands of governments have leaked into the public domain and caused widespread damage.”
Government officials and law enforcement have pressed tech companies to write security keys into computer programs and operating systems that would aid them in gaining access to the email, networks or smartphones of suspected criminals.
This was the heart of the legal battle waged between Apple and the FBI for 43 days last year as the agency sought Apple’s help to write software that would aid it in breaking into an iPhone used by San Bernardino gunman Syed Rizwan Farook.
A bill proposed last August by Senators Richard Burr, R-N.C., and Dianne Feinstein, D-Calif., of the Senate Intelligence Committee would have required companies to provide technical support to get to encrypted data but didn’t specify how that would have to be done.
During its legal battle, Apple argued it shouldn’t be required to write code to allow the FBI to try to get into the iPhone because doing so was simply too dangerous: once written, the code could too easily be hacked, leaked and misused. In an op-ed in The Washington Post at the time, Apple senior vice president of software engineering Craig Federighi sounded eerily prescient about the damage such stolen tools could wreak.
“Great software has seemingly limitless potential to solve human problems — and it can spread around the world in the blink of an eye. Malicious code moves just as quickly, and when software is created for the wrong reason, it has a huge and growing capacity to harm millions of people,” he wrote.
The ransomware attack that started Friday is linked to code that originated with a U.S. government group but ended up in criminal hands. A group calling itself the Shadow Brokers said it stole the Windows exploits, tools that take advantage of vulnerabilities in software, and posted them online in April; Microsoft had issued a patch for those flaws.
But because the vulnerability also existed in older versions of Windows, including one Microsoft had stopped supporting, users who hadn’t applied the patch were left exposed when a hacker organization, now thought to be the same one behind the Sony Pictures Entertainment hack, used the flaw to create malware that paralyzed computers.
What happened this week won’t be lost on judges in the future should the government again try to get tech firms to build backdoor access, said Kristen Eichensehr, a UCLA law professor who specializes in national security law and cybersecurity.
“What we’ve seen happen with WannaCry lends credence to that, and certainly any court is going to take it into account. The government has shown that it itself is persistently incapable of keeping its tools secure,” she said.