THE DANGERS OF SPEED
The pursuit of ever-faster devices that connect the globe at lightning speed may come at a troubling price, a cybersecurity expert says
SAN FRANCISCO >> Cybersecurity researcher Paul Kocher believes we are all hooked on speed. Computer speed.
Kocher, an independent researcher considered a leader in cryptography, has worried over the price the computing world paid to pursue building faster devices that now connect the globe at lightning speed. But in January, the tech world got a glimpse of what that sacrifice may have cost: two hardware bugs known as Spectre and Meltdown, which affect virtually all smartphones, computers and cloud servers built since 1995. Because of an architectural flaw in most existing microprocessors, hackers can gain access to private data by breaking down a fundamental barrier between user applications, such as email, and the operating system.
Kocher discovered two variants of the Spectre bug last year, coincidentally at the same time as five other researchers in the United States, Europe and Australia who were independently pursuing the same flaw. Spectre is possible because of a core microprocessor feature called speculative execution, in which the chip tries to guess the user’s next action and runs it ahead of time to boost speed; Spectre, in short, tricks the chip’s speculative execution into handing over the user’s secret data.
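The mechanism behind Spectre’s first variant is small enough to sketch. The C fragment below follows the well-known “bounds check bypass” pattern described by Kocher and his co-discoverers; the function and array names here are illustrative, not taken from any real codebase.

```c
#include <stddef.h>
#include <stdint.h>

/* Illustrative sketch of the "bounds check bypass" pattern (Spectre
   variant 1). Architecturally this function is safe: an out-of-range
   x returns 0. Microarchitecturally, while the bounds check is still
   resolving, the processor may speculatively execute the loads with
   an attacker-chosen x, touching a cache line in probe[] that depends
   on a byte outside array1. A later cache-timing measurement over
   probe[] can then recover that byte. */
uint8_t victim_function(const uint8_t *array1, size_t array1_size,
                        const uint8_t *probe, size_t x) {
    if (x < array1_size) {            /* check the CPU may guess past */
        /* the index into probe[] encodes array1[x] as a cache line */
        return probe[array1[x] * 512];
    }
    return 0;                         /* architectural out-of-range result */
}
```

The point of the sketch is that nothing here violates the language’s rules: the secret never appears in any architectural result. The leak lives entirely in the cache side effects of instructions the processor executed on a guess and then threw away.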
The discovery and its public disclosure rocked the tech industry. Microprocessor companies such as Intel took a major blow to their share prices, and Congress asked Apple, Microsoft and Google about the delay in disclosing information about Spectre and Meltdown to the public.
From his home office in San Francisco, Kocher preached the old adage “slow and steady wins the race” when it came to the ever-relevant technological struggle between privacy and convenience. He advocated for computer and smartphone hardware makers such as Apple, Intel and Microsoft to give consumers the choice of a privacy-first computing device that may be slower than its competitors.
He remained optimistic about the future of data privacy, believing companies will be smart enough to figure out a balance between speed and privacy — or government regulations, backed by public frustration, will make computers safer, as they have done for building construction or airplanes.
This interview has been edited for length and clarity.
Q
How did you start looking for what turned out to be the Spectre bug?
A
I have a theory that we focus too much on performance and not enough on security. I know that might sound quite self-serving since I am a security
guy. But if you look at the broad picture trend right now, the global economy is losing somewhere near $1 trillion a year due to computer insecurity, and the benefits of performance improvements are nowhere near that high. So there’s some economic rationale there.
When I was looking at this question of how these computational environments are built, one of the people I spoke with mentioned that there is this thing called speculative execution, which is where a processor will make guesses about what’s going to happen next and try to get data ready and do computations before it’s 100 percent sure. As a security person, this makes me really uncomfortable, because people assume that processors are doing things in the exact order you requested. I started thinking, “How could this go wrong?”
Q
Some companies knew about the existence of Spectre and Meltdown for months, and in the end the public learned about them only because The Register (a British tech news website) broke the news. Why did it take so long for them to respond, and how well did they respond?
A
The general process after you have found a software bug is that you let the company that makes the product know, and after some period of time — around one to four months — they normally have a fix out. But there were a bunch of reasons why this bug didn’t work well in that process. One, hardware fixes take a lot longer, if they are possible at all. Two, the number of employees who had an absolutely legitimate need to know vastly outnumbered those who could keep a secret.
Q
How did we get to this point where a bug that affects nearly all computing devices in the world can exist?
A
For 60 years now in computer science, people’s careers have been made by making faster computers. As a society, we decided the upsides of having faster computers far exceeded the downsides. The idea that you can’t hurt performance in your quest for security is a reflection of priorities in the industry. If security is a second-class citizen, then it’s going to get traded away regularly.
Q
The trade-off seems to be showing up to some degree with Facebook right now. Is that a fair comparison?
A
Yeah, I think we are going through another round with Facebook. There is a betrayal of confidence. Facebook’s interest in investing in protecting your information is vastly smaller than yours. Theirs actually may be negative. Your loss doesn’t remotely translate into their loss. This results in massive underspending on security. If you can make money doing bad things, somebody will figure out a business doing that.
Q
Whether it be Facebook’s Cambridge Analytica fallout, the Equifax leak or the Yahoo leak, this seems to be so commonplace. What needs to happen?
A
I think it’s going to take a whole new shift, like the one we saw with airplanes, where we went from first making them fly to then making them safe. We have to eventually do that in
this industry. It’s a trade-off we see in structural engineering. We don’t build buildings where they are one twig away from collapsing. We don’t make airplanes where they are one engine away from crashing.
Q
But after all these leaks, it feels like data privacy is all so pointless. As someone who’s been in the industry for so long, do you share that pessimism?
A
I would have to push back on that. For one, it’s not newsworthy when systems are not broken. There have been architectures that have been very successful, like the chips embedded ubiquitously in debit and credit cards around the world. Those chips cost only a few cents each to make; they hold cryptographic keys that never leave the chip, and they authenticate you when you use the card. We see Apple doing that to some degree with the Secure Enclave in iPhones. It can manage some secrets and perform some high-security authentications, like Touch ID and Face ID, and keep that information segregated from the main processor.
Q
You believe computers are too fast for their own good now. Are you saying we should go back in time with our computers?
A
Not necessarily; I don’t mean taking an Apple IIe (personal computer) processor and sticking it in a MacBook. But there are a lot of things we do now that we could do with a computer from 20 years ago. It was plenty fast then for approving wire transfers, reading email, just about anything that involves human interaction and not a lot of graphics. We don’t have computers today that are suitable for approving wire transfers from a true security perspective. These machines aren’t designed for robust privacy protection; they are designed to play video games.