The Guardian (USA)

Why Apple’s walled garden is no match for Pegasus spyware

Alex Hern

If you want to read more, please subscribe to receive TechScape in your inbox every Wednesday.

You will, by now, have heard about Pegasus. It’s the brand name for a family of spyware tools sold by the NSO Group, an Israeli outfit of hackers-for-hire who sell their wares to intelligence agencies, law enforcement, and militaries around the world.

An investigation by the Guardian and 16 other media organisations around the world into a massive data leak suggests widespread abuse of NSO Group’s hacking software by government customers. The company insists it is intended for use only against criminals and terrorists, but the investigation has revealed that journalists, human rights activists and opposition politicians are also being targeted. Since our phones are increasingly external brains, storing our lives in digital form, a successful deployment of Pegasus can be devastating. Messages, emails, contact details, GPS location, calendar entries and more can be extracted from the device in a matter of minutes.

On Sunday, the Guardian and its media partners began to publish the results of the investigation into the NSO Group, Pegasus, and the people whose numbers appear on the leaked list:

The presence of a number in the data does not reveal whether there was an attempt to infect the phone with spyware such as Pegasus, the company’s signature surveillance tool, or whether any attempt succeeded. There are a very small number of landlines and US numbers in the list, which NSO says are “technically impossible” to access with its tools – revealing that some targets were selected by NSO clients even though they could not be infected with Pegasus.

There’s a lot more to read on our site, including the fact that the numbers of almost 200 journalists were identified in the data; links to the killing of Jamal Khashoggi; and the discovery that a political rival of Narendra Modi, the autocratic leader of India, was among those whose number was found in the leaked documents.

But this is a tech newsletter, and I want to focus on the tech side of the story. Chiefly: how the hell did this happen?

The messages are coming from inside the house

Pegasus affects the two largest mobile operating systems, Android and iOS, but I’m going to focus on iOS here for two reasons: one is a technical problem that I’ll get to in a bit, but the other is that, although Android is by far the most widely used mobile OS, iPhones have a disproportionately high market share among many of the demographics targeted by the customers of NSO Group.

That’s partly because they exist predominantly in the upper tiers of the market, with price tags that keep them out of the reach of most of the world’s smartphone users but still within the reach of the politicians, activists and journalists potentially targeted by governments around the world.

But it’s also because they have a reputation for security. Dating back to the earliest days of the mobile platform, Apple fought to ensure that hacking iOS was hard, that downloading software was easy and safe, and that installing patches to protect against newly discovered vulnerabilities was the norm.

And yet Pegasus has worked, in one way or another, on iOS for at least five years. The latest version of the software is even capable of exploiting a brand-new iPhone 12 running iOS 14.6, the newest version of the operating system available to normal users. More than that: the version of Pegasus that infects those phones is a “zero-click” exploit. There is no dodgy link to click, or malicious attachment to open. Simply receiving the message is enough to become a victim of the malware.

It’s worth pausing to note what is, and isn’t, worth criticising Apple for here. No software on a modern computing platform can ever be bug-free, and as a result no software can ever be fully hacker-proof. Governments will pay big money for working iPhone exploits, and that motivates a lot of unscrupulous security researchers to spend a lot of time trying to work out how to break Apple’s security.

But security experts I’ve spoken to say that there is a deeper malaise at work here. “Apple’s self-assured hubris is just unparalleled,” Patrick Wardle, a former NSA employee and founder of the Mac security developer Objective-See, told me last week. “They basically believe that their way is the best way.”

What that means in practice is that the only thing that can protect iOS users from an attack is Apple – and if Apple fails, there’s no other line of defence.

Security for the 99%

At the heart of the criticism, Wardle accepts, is a solid motivation. Apple’s security model is based on ensuring that, for the 99% – or more – for whom the biggest security threat they will ever face is downloading a malicious app while trying to find an illegal stream of a Hollywood movie, their data is safe. Apps can only be downloaded from the company’s own App Store, where they are supposed to be vetted before publication. When they are installed, they can only access their own data, or data a user explicitly decides to share with them. And no matter what permissions they are given, a whole host of the device’s capabilities are permanently blocked off from them.

But if an app works out how to escape that “sandbox”, then the security model is suddenly inverted. “I have no idea if my iPhone is hacked,” Wardle says. “My Mac computer on the other hand: yes, it’s an easier target. But I can look at a list of running processes; I have a firewall that I can ask to show me what programs are trying to talk to the internet. Once an iOS device is successfully penetrated, unless the attacker is very unlucky, that implant is going to remain undetected.”
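To make that contrast concrete: the checks Wardle describes can be run on a Mac with tools that ship with the operating system. Here is a minimal Python sketch wrapping the stock ps and lsof utilities on macOS – purely illustrative, and the point is precisely that nothing comparable can be run on a stock, non-jailbroken iPhone.

```python
# Illustrative only: the kind of inspection possible on macOS but not
# on a stock iPhone. Wraps the standard `ps` and `lsof` utilities.
import subprocess

# List every running process with its PID, owner and executable name.
processes = subprocess.run(
    ["ps", "-axo", "pid,user,comm"],
    capture_output=True, text=True, check=True,
)
print(processes.stdout)

# List every process that currently holds an open network connection.
# No check=True here: lsof exits non-zero when it cannot inspect every
# process, which is routine without elevated privileges.
connections = subprocess.run(
    ["lsof", "-i", "-n", "-P"],
    capture_output=True, text=True,
)
print(connections.stdout)
```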

A similar problem exists at the macro scale. An increasingly common way to ensure critical systems are protected is to use the fact that an endless number of highly talented professionals are constantly trying to break them – and to pay them money for the vulnerabilities they find. This model, known as a “bug bounty”, has become widespread in the industry, but Apple has been a laggard. The company does offer bug bounties, but for one of the world’s richest organisations, its rates are pitiful: an exploit of the sort that the NSO Group deployed would command a reward of about $250,000, which would barely cover the cost of the salaries of a team that was able to find it – let alone have a chance of out-bidding the competition, which wants the same vulnerability for darker purposes.
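To put rough numbers on that economics, here is a back-of-envelope sketch; every figure except the roughly $250,000 bounty cited above is an illustrative assumption, not a reported one.

```python
# Back-of-envelope: Apple's bounty vs the cost of finding a zero-click
# chain. All figures are illustrative assumptions except the bounty.
apple_bounty = 250_000      # approximate reward cited above

team_size = 4               # assumed: senior researchers on the team
annual_cost = 250_000       # assumed: fully loaded cost per researcher
months = 9                  # assumed: time to build a working chain
research_cost = team_size * annual_cost * months / 12

broker_offer = 2_000_000    # assumed: the order of magnitude exploit
                            # brokers have publicly advertised for
                            # zero-click mobile chains

print(f"research cost ≈ ${research_cost:,.0f}")                        # ≈ $750,000
print(f"bounty covers {apple_bounty / research_cost:.0%} of it")       # 33%
print(f"a broker pays {broker_offer / apple_bounty:.0f}x the bounty")  # 8x
```

On those assumptions the bounty covers about a third of the cost of finding the bug, while the gray market pays several times more – which is the gap described above.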

And those security researchers who do decide to try to help fix iPhones are hampered by the very same security model that lets successful attackers hide their tracks. It’s hard to successfully research the weaknesses of a device that you can’t take apart physically or digitally.

In a statement, Apple said:

There are ways round some of these problems. Digital forensics does still work on iPhones – despite, rather than because of, Apple’s stance. In fact, that’s the other reason why I’ve focused on iPhones rather than Android devices here: while the NSO Group was good at covering its tracks, it wasn’t perfect. On Android devices, the relative openness of the platform seems to have allowed the company to successfully erase all its traces, meaning that we have very little idea which of the Android users who were targeted by Pegasus were successfully infected.

But iPhones are, as ever, trickier. There is a file, DataUsage.sqlite, that records what software has run on an iPhone. It’s not accessible to the user of the device, but if you back up the iPhone to a computer and search through the backup, you can find the file. The records of Pegasus had been removed from that file, of course – but only once. What the NSO Group didn’t know, or perhaps didn’t spot, is that every time some software is run, it is listed twice in that file. And so by comparing the two lists and looking for inconsistencies, Amnesty’s researchers were able to spot when the infection landed.
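Amnesty International’s Security Lab has published both its methodology and an open-source Mobile Verification Toolkit (MVT) that automates checks like this one. The sketch below shows the rough shape of the cross-check in Python, assuming an unencrypted local backup; the backup layout and the Core Data table names (ZPROCESS and ZLIVEUSAGE) follow MVT’s public documentation, but column details vary across iOS versions, so treat it as an outline rather than a working forensic tool.

```python
# A minimal sketch of the DataUsage.sqlite cross-check, assuming the
# table names documented by Amnesty's Mobile Verification Toolkit.
import hashlib
import sqlite3
from pathlib import Path

# iTunes/Finder backups store each file under the SHA-1 hash of
# "<domain>-<relativePath>", in a folder named after the hash's first
# two hex digits. "<device-udid>" is a placeholder for your backup folder.
BACKUP = Path.home() / "Library/Application Support/MobileSync/Backup/<device-udid>"

def backup_path(domain: str, relative_path: str) -> Path:
    file_id = hashlib.sha1(f"{domain}-{relative_path}".encode()).hexdigest()
    return BACKUP / file_id[:2] / file_id

db = backup_path("WirelessDomain", "Library/Databases/DataUsage.sqlite")

con = sqlite3.connect(db)
# A program that runs leaves a row in ZPROCESS and usage rows in
# ZLIVEUSAGE that point back at it. A usage row whose parent process
# row has been deleted is exactly the inconsistency described above.
# (Core Data timestamps count seconds from 2001-01-01, hence the offset.)
orphans = con.execute(
    """
    SELECT lu.Z_PK, lu.ZHASPROCESS,
           datetime(lu.ZTIMESTAMP + 978307200, 'unixepoch')
    FROM ZLIVEUSAGE AS lu
    LEFT JOIN ZPROCESS AS p ON p.Z_PK = lu.ZHASPROCESS
    WHERE p.Z_PK IS NULL
    """
).fetchall()

for row_id, proc_ref, seen_at in orphans:
    print(f"usage row {row_id} references a deleted process ({proc_ref}) at {seen_at}")
```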

So there you go: the same opacity that makes Apple devices generally safe makes it harder to protect them when that safety is broken. But it also makes it hard for the attackers to clean up after themselves. Perhaps two wrongs do make a right?

Pegasus can infect a phone through ‘zero-click’ attacks, which do not require any interaction from the phone’s owner to succeed. Composite: AFP via Getty
