The Mercury News

THE DANGERS OF SPEED

The pursuit of faster devices that connect the globe at lightning rates may come at a troubling price, a cybersecurity expert says

By Seung Lee, slee@bayareanewsgroup.com

SAN FRANCISCO >> Cybersecurity researcher Paul Kocher believes we are all hooked on speed. Computer speed.

Kocher, an independent researcher considered a leader in cryptography, has worried over the price the computing world paid to pursue building faster devices that now connect the globe at lightning speed. But in January, the tech world got a glimpse into what that sacrifice may have cost: two hardware bugs known as Spectre and Meltdown, which impact virtually all smartphones, computers and cloud servers built since 1995. Because of an architecture flaw in most existing microprocessors, hackers are able to gain access to private data by breaking down a fundamental barrier between user applications, such as emails, and the operating system.

Kocher discovered two variants of the Spectre bug last year, coincidentally at the same time as five other researchers in the United States, Europe and Australia who were pursuing the same flaw. Spectre is possible because of a core microprocessor function called speculative execution, in which the chip tries to guess the user's next action to boost speed; Spectre, in short, tricks the chip's speculative execution into handing over the user's secret data.

The discovery and public reveal rocked the tech industry. Microprocessor companies such as Intel took a major blow in their share prices, and Congress asked Apple, Microsoft and Google about the delay in disclosing information about Spectre and Meltdown to the public.

From his home office in San Francisco, Kocher preached the old adage "slow and steady wins the race" when it came to the ever-relevant technological struggle between privacy and convenience. He advocated for computer and smartphone hardware makers such as Apple, Intel and Microsoft to give consumers the choice of a privacy-first computing device that may be slower than its competitors.

He remained optimistic about the future of data privacy, believing companies will be smart enough to figure out a balance between speed and privacy — or government regulations, backed by public frustration, will make computers safer, as they have done for building construction or airplanes.

This interview has been edited for length and clarity.

Q

How did you start looking for what turned out to be the Spectre bug?

A

I have a theory that we focus too much on performance and not enough on security. I know that might sound quite self-serving since I am a security guy. But if you look at the broad trend right now, the global economy is losing somewhere near $1 trillion a year to computer insecurity, and the benefits of performance improvements are nowhere near that high. So there's some economic rationale there.

When I was looking at this question of how these computational environments are built, one of the people mentioned there is this thing called speculative execution, which is where a processor will make guesses about what's going to happen next and try to get data ready and do computations already before it's 100 percent sure. As a security person, this makes me really uncomfortable because people assume that processors are doing things in the exact order you requested. I started thinking, "How could this go wrong?"

Q

Some companies knew about the existence of Spectre and Meltdown for months, and in the end the public learned about them only because The Register (a British tech news website) broke the news. Why did it take so long for them to respond, and how well did they respond?

A

The general process after you have found a software bug is that you let the company that makes the product know, and after some period of time — around one to four months — they normally have a fix out. But there were a bunch of reasons why this bug didn't work well in that process. One, hardware takes a lot longer to fix, if it can be fixed at all. Two, the number of employees who had an absolutely legitimate need to know vastly outnumbered those who could keep a secret.

Q

How did we get to this point where a bug that affects nearly all computing devices in the world can exist?

A

For 60 years now in computer science, people's careers have been made by making faster computers. As a society, we decided the upsides of having faster computers far exceeded the downsides. The idea that you can't hurt performance in your quest for security is a reflection of priorities in the industry. If security is a second-class citizen, then it's going to get traded away regularly.

Q

The trade-off seems to be showing up to some degree with Facebook right now. Is that a fair comparison?

A

Yeah, I think we are going through another round with Facebook. There is a betrayal of confidence. Facebook's interest in investing in protecting your information is vastly smaller than yours. Theirs actually may be negative. Your loss doesn't remotely translate to their loss. This results in massive under-spending on security. If you can make money doing bad things, somebody will figure out a business doing that.

Q

Whether it be Facebook's Cambridge Analytica fallout, the Equifax leak or the Yahoo leak, this seems to be so commonplace. What needs to happen?

A

I think it's going to take a whole new shift, like the one we saw with airplanes, where we went from first making them fly to then making them safe. We have to eventually do that in this industry. It's a trade-off we see in structural engineering. We don't build buildings where they are one twig away from collapsing. We don't make airplanes where they are one engine away from crashing.

Q

But after all these leaks, it feels like data privacy is all so pointless. As someone who’s been in the industry for so long, do you share that pessimism?

A

I would have to push back on that. For one, it's not newsworthy when systems are not broken. There have been architectures that have been very successful, like the chips embedded ubiquitously around the world in debit and credit cards. Those chips cost only a few cents each to make, hold cryptographic keys that never leave the chip, and authenticate you when you use the card. We see Apple doing that to some degree with the Secure Enclave in iPhones. It can manage some secrets and perform some high-security authentications like Touch ID and Face ID and keep that info segregated from the main processor.

Q

You believe computers are too fast for their own good now. Are you saying we should go back in time with our computers?

A

Not necessarily; I don't mean taking an Apple IIe (personal computer) processor and sticking it in a MacBook. There are a lot of things we can do now with a computer from 20 years ago. It was plenty fast then for approving wire transfers, reading email, just about anything that involves human interaction and not a lot of graphics. We don't have computers today that are suitable for approving wire transfers from a true security perspective. These machines aren't designed for robust privacy protection; they are designed to play video games.

KARL MONDON — STAFF PHOTOGRAPHER Cybersecurity researcher Paul Kocher discovered two variants of the Spectre bug last year, a flaw that allows hackers to trick processors into handing over secret data from users.
