PC Pro

WHAT AMBER RUDD NEEDS TO KNOW ABOUT ENCRYPTION

The home secretary Amber Rudd says she doesn’t need to know how encryption works to know that it’s good for terrorists. She’s wrong. Davey Winder explains why end-to-end encryption is vital


The home secretary is revelling in her own ignorance. In the aftermath of the Westminster Bridge attack, Amber Rudd demanded that “organisations like WhatsApp… don’t provide a secret place for terrorists to communicate with each other” and called for the security services to be given backdoor access.

A few months later, playing to the crowd at the Conservative party conference, the home secretary said she didn’t need to “understand how encryption works to understand how it’s helping the criminals”.

It’s hard to comprehend a situation in any other walk of life where someone would boast of their own ignorance. Would the CEO of Ford take the stage at the company’s shareholders’ meeting and declare he knew little about the workings of the combustion engine? Would a police commissioner admit that they don’t really grasp the law?

Understanding how encryption works isn’t the same as being the person who chooses cryptography as their Mastermind specialist subject. Nobody’s suggesting that Amber Rudd must be able to tell the difference between 3DES and SHA-2. However, having a grip on what encryption is, how it’s used and how weakening it in any way would have an impact far beyond its intended targets is vital if you are the home secretary.

While Rudd insists that she’s going to “engage with the security services” to determine the best method of combating the “end-to-end encryption threat”, if she engaged with people who work in the IT security field, she would be incredibly hard-pressed to find anyone who agreed with her proposed solution to kick in the backdoor.

So here, for Amber Rudd and your good selves, is an explanation of what she needs to know about encryption and why weakening encryption is a truly terrible idea.

How does end-to-end encryption work?

Encryption can’t easily be explained in a book, let alone a four-page article, but here’s the skinny.

The ancient Greeks used a “scytale” to encode military communications more than 2,000 years ago. A scytale was a wooden pole with a message written on a parchment spiral-wrapped around it. When unrolled, the message was scrambled, and could only be read properly if wrapped around another pole of the same size. This is how encryption works: the pole is the key, the wrapped message is the plaintext and the unwrapped is the ciphertext. The ancient Greeks also came up with proper ciphers, where codes used substitutions or transpositions that could only be “unlocked” if both parties knew the code key, or cipher. Think of a cipher as a modern algorithm and the history lesson is complete.
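To make the idea concrete, here’s a minimal Python sketch of the scytale as a transposition cipher; the pole’s circumference (the number of letters per turn) plays the part of the key. It’s an illustration of the principle, not something you’d use for real security.

```python
def scytale(text, turns):
    """Write the message in rows of `turns` letters, read it off in columns."""
    # pad so the strip wraps around the pole a whole number of times
    padded = text.ljust(-(-len(text) // turns) * turns, "_")
    return "".join(padded[i::turns] for i in range(turns))

message = "SEND MORE SOLDIERS"
ciphertext = scytale(message, 4)                       # wrap around a 4-letter pole
plaintext = scytale(ciphertext, len(ciphertext) // 4)  # a same-sized pole unscrambles it
print(ciphertext)   # gibberish without the right pole
print(plaintext)    # SEND MORE SOLDIERS__ (padding aside)
```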

As to modern encryption, we have a symmetric variety that requires both parties to have access to the same key to both encrypt and decrypt the text, and an asymmetric variety better known as “public-key encryption”. Public-key encryption has a public key that can be known, and used, by anyone to scramble a conversation. This conversation can only be unscrambled if you have the other key of the pair, the private or secret one that isn’t known to anyone else. At least, that’s the theory.
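For the Python-minded, here’s a minimal sketch of public-key encryption using the widely used `cryptography` package; the algorithm and parameters here are just one reasonable choice, not the only one. Anyone can encrypt with the public key, but only the holder of the matching private key can decrypt.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# the padding scheme used for RSA encryption; both sides must agree on it
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()      # safe to hand out to anyone

ciphertext = public_key.encrypt(b"meet at noon", oaep)   # anyone can do this
plaintext = private_key.decrypt(ciphertext, oaep)        # only the key holder can
print(plaintext)   # b'meet at noon'
```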

In practice, that private key is often managed by the service provider or encryption platform you are using. This is the weak link in the chain, as the provider could theoretically eavesdrop on your encrypted conversation. It could also pass that key to law enforcement if a court order forced it to do so.

This is where end-to-end encryption enters the scene. Rather than the service provider managing and storing your private key, it really is private, known only to you and stored on your endpoint device, such as the smartphone you’ve installed WhatsApp on. The provider can’t turn the key over to law enforcement because it doesn’t have it. All it has is the public key, which is useless without the secret key pairing. This becomes even more secure when you consider that end-to-end encrypted messaging apps also employ “forward secrecy” protocols that create new keys for every conversation.
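As a rough illustration of forward secrecy, here’s a sketch (again using the `cryptography` package, and greatly simplified compared with the Signal protocol WhatsApp actually uses) in which two parties derive a fresh shared key from throwaway key pairs. Once those key pairs are discarded, old messages can’t be decrypted even if a long-term key later leaks.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# each party generates a throwaway (ephemeral) key pair for this conversation
alice = X25519PrivateKey.generate()
bob = X25519PrivateKey.generate()

# each combines its own private key with the other's public key...
alice_shared = alice.exchange(bob.public_key())
bob_shared = bob.exchange(alice.public_key())
assert alice_shared == bob_shared   # ...and both arrive at the same secret

# the raw secret is stretched into a one-off message key, after which the
# ephemeral private keys can simply be thrown away
message_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"this conversation only").derive(alice_shared)
```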

These “keys” are, at heart, just very large numbers, often derived using hash functions. A hash is a function used to map data of arbitrary size to data of fixed size. In simpler, home-secretary-sized terms, the key will be a value that is almost impossible to determine without knowing the original numbers used to create it. So, if the original input is 25,225 and the hashing algorithm value is 150, you multiply the one by the other to arrive at 3,783,750. It’s a big number from which it’s very hard to derive the input value (25,225) without knowing the multiplier (150). When we talk about the size of encryption keys in bits, by the way, we are really talking about how many binary digits make up that number. A 128-bit key has around 340,000,000,000,000,000,000,000,000,000,000,000,000 possible combinations, for example.
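A couple of lines of Python make both points concrete: a hash squeezes any input down to a fixed-size value, and the number of possible 128-bit keys is genuinely astronomical.

```python
import hashlib

# whatever the input length, SHA-256 always produces a 256-bit (64 hex digit) value
print(hashlib.sha256(b"attack at dawn").hexdigest())
print(hashlib.sha256(b"a much, much longer message" * 1000).hexdigest())

# the keyspace of a 128-bit key
print(2 ** 128)   # 340282366920938463463374607431768211456, roughly 3.4 x 10**38
```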

Can end-to-end encryption be broken?

Things are never that black and white. Although the encryption itself is solid, there are attack methodologies that could be employed. An attacker could con the sender of an encrypted message into delivering it to them instead of the intended recipient, using a combination of social engineering and/or malware on a compromised device. This scenario is known as a “man-in-the-middle attack”.

The public key used to encrypt the message would correspond to the private key held by the attacker, who could then decrypt and copy the message before encrypting it with the intended recipient’s public key and sending it on to them. This way both sender and recipient would, if all went to plan, be unaware that any snooping had occurred.

“The provider can’t turn over your private key to law enforcement because it doesn’t have it”
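Using the same hypothetical RSA setup sketched earlier, the whole relay fits in a few lines; the only real trick is persuading the sender to use the attacker’s public key in the first place.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

bob = rsa.generate_private_key(public_exponent=65537, key_size=2048)
mallory = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Alice has been tricked into thinking Mallory's public key belongs to Bob
intercepted = mallory.public_key().encrypt(b"meet at noon", oaep)

# Mallory reads the message, then re-encrypts it with Bob's real public key
snooped = mallory.decrypt(intercepted, oaep)
forwarded = bob.public_key().encrypt(snooped, oaep)

print(bob.decrypt(forwarded, oaep))   # b'meet at noon' -- and nobody is the wiser
```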

That assumes the messaging system doesn’t already have something in place to defend against such attacks, such as a character string generated from the public keys of both users, who can then confirm with each other that these strings match before the encrypted message is sent. Because a man-in-the-middle has substituted at least one of those keys, the strings generated at each end wouldn’t match, and so the attack would be foiled before it could begin. WhatsApp does just this using unique 60-digit identifiers, and if that’s too much hassle there’s a QR code that can be scanned instead if both users are in the same room.
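WhatsApp’s real safety numbers are derived from both parties’ identity keys inside the Signal protocol, but a toy version shows the principle: each end computes a short string from the two public keys it believes are in play, and if an attacker has swapped one of them, the strings won’t match.

```python
import hashlib

def safety_code(my_public_key: bytes, their_public_key: bytes) -> str:
    # sort so both ends feed the keys in the same order and get the same string
    material = b"".join(sorted([my_public_key, their_public_key]))
    digest = hashlib.sha256(material).digest()
    # turn the first few bytes into a short number you can read out over the phone
    return f"{int.from_bytes(digest[:8], 'big'):020d}"

# if a man-in-the-middle has substituted a key, the two sides disagree
print(safety_code(b"alice-key", b"bob-key"))       # what Bob computes
print(safety_code(b"alice-key", b"mallory-key"))   # what Alice computes when duped
```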

Noise and Signal

This has become the real “problem” for home secretary Amber Rudd and the security wonks, who see messaging services such as WhatsApp as empowering terrorists and criminals. Despite what you may have read in the mainstream media, and indeed in some online publications that should have known better, there is no WhatsApp backdoor. What was described as such was a potential vulnerability to man-in-the-middle attacks, and one that remains disputed by security experts.

The reported flaw revolves around how WhatsApp handles changes to your contacts’ public keys. A change can happen legitimately if your contact changes phone or reinstalls the app, but it can also happen if an attacker impersonates a user to intercept messages. Unless the user enables the “Show security notifications” function from the Security settings, WhatsApp will make these changes silently without informing the user. WhatsApp uses the “Signal Protocol” for end-to-end encryption, but it doesn’t prevent the message being sent if your contact’s key changes; if you’ve opted to receive security notifications, though, it will let you know afterwards. It’s all about usability rather than security, and, with a billion users, WhatsApp went for the former.

Does this make WhatsApp insecure? No, not really. However, the Signal app itself does block messages if there’s any doubt over the recipient’s key, which makes the Signal app more secure and so more likely to be used by those who really do want to keep their conversations secret.
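The difference between the two apps’ behaviour boils down to a single design decision, sketched below in simplified form. This is an illustration of the trade-off described above, not either app’s actual code.

```python
def on_key_change(app: str, notifications_enabled: bool = False) -> str:
    """What happens when a contact's encryption key suddenly changes."""
    if app == "signal":
        # security first: hold the message until the new key is verified
        return "message blocked until you confirm the contact's new safety number"
    if app == "whatsapp" and notifications_enabled:
        # usability first, with an optional heads-up after the fact
        return "message sent; 'security code changed' notice shown afterwards"
    return "message sent silently with the new key"

print(on_key_change("signal"))
print(on_key_change("whatsapp", notifications_enabled=True))
```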

This is a perfect example of why Amber Rudd should care about how encryption works: small details make a big difference when it comes to forming policy and crafting laws. It’s also why backdooring WhatsApp would make journalists, activists and ordinary folk who want secure methods of communication for perfectly legitimate reasons more vulnerable, and why it would have next-to-no impact upon the serious criminals or terrorists who most likely use more secure applications such as Signal anyway.

It’s almost as if the government is driven more by mass surveillance of its citizens than by specific surveillance of suspected terrorists…

Why ignorance isn’t bliss for Amber Rudd

If a government-approved encrypted messaging app existed, how many users do you think it would have? Nowhere near the billion that WhatsApp has, that’s for sure. In the event of any kind of fantasy backdooring of WhatsApp, a true end-to-end encryption replacement would soon take its place. How would usage of that be legislated against and enforced?

Besides, backdoors and eavesdropping on conversational content are not the only methods of surveillance. There’s the collection and analysis of metadata, for example. Metadata, data providing information about other data, is routinely collected by services that triumphantly herald the encryption of the conversation end-to-end. Yet metadata is misunderstood by far more people than encryption.

Edward Snowden once tweeted that people who have difficulty understanding what metadata is should just replace the term with “activity records” for instant clarity. The WhatsApp privacy policy states that it collects information when you install, access or use the service, including information about “your activity”: how you interact with other users, log files, device location data, IP addresses, contact phone numbers and so on. It also collects data about you from your contacts, of course, when they communicate with you or add you to their address books.
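To make those “activity records” tangible, here’s a purely hypothetical example of what one such record might look like; the field names and values are invented for illustration, not taken from WhatsApp’s systems.

```python
# a made-up activity record: no message content, yet it says a great deal
activity_record = {
    "sender": "+44 7700 900123",        # fictitious numbers, for illustration only
    "recipient": "+44 7700 900456",
    "timestamp": "2017-10-04T21:17:03Z",
    "duration_seconds": 1260,
    "sender_ip": "203.0.113.42",
    "approximate_location": "Manchester, UK",
    "device": "Android 7.0",
}
```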

While WhatsApp doesn’t store this on its own servers, it’s stored on your device and any server your device backs up to. It’s information that can paint a much more accurate portrait of your activity than you might think. If someone can reveal that you contacted a sex chat line on a specific date and for how long you were connected, does it matter that they don’t know the precise detail of the conversation? My partner would be less concerned with the detail than the broad strokes, and it’s largely the same for “big brother”, who can use such information as the basis for further investigations.

Here’s the thing: not understanding how end-to-end encryption works severely reduces your ability to legislate against it. Banning such encryption on an app-by-app basis would inevitably mean banning the open-source projects that power them. Just as there are many end-to-end messaging apps, there are many end-to-end encryption platforms. Ban one and people – especially the terrorist or criminal targets – will move to another, and another.

Ban the platforms behind them and you take the wrecking ball not only to messaging apps, but the financial and other secure services that rely on them. We can talk about the thin edge of the wedge, but the truth of the matter is that business relies upon encrypted data. Anything that weakens encryption would, therefore, be catastrophic in purely commercial terms. It would make Brexit look like a bank holiday.

If Amber Rudd thinks that the financial institutions would allow such a thing to happen, then her ignorance of how things work extends far beyond cryptography.

“Not understanding how end-to-end encryption works reduces your ability to legislate against it”
