Mac Format

Apple’s big privacy push

What is Apple doing to keep your data safe and sound?

- Written By Alex Blake

Apple has always been known for its pro-privacy stance. Back in 2010, Steve Jobs laid it out for anyone who wasn’t sure: “Privacy means people know what they’re signing up for, in plain English, and repeatedly.”

But even by Apple’s own lofty standards, these days the company is really pushing the privacy envelope. What prompted this renewed focus on privacy? And what is the world’s biggest tech company doing to keep your data safe?

The first of those questions is answered relatively easily: because Apple’s rivals keep screwing up. From the Facebook Cambridge Analytica scandal to Google hoovering up your conversations, the other big Silicon Valley tech firms seem hell-bent on handing Apple PR victories. And Apple is not one to turn down a free PR victory.

However, this goes beyond mere marketing. Apple isn’t just saying it respects your privacy without the evidence to back it up (we’re looking at you, Facebook). Practically everything Apple does, in both its hardware and software, is geared towards collecting as little of your data as possible. Even when Apple does take in your information, most of it is encrypted, anonymised or both. And recently, the company’s been making sure the world knows it.

macOS Catalina and iOS 13

At WWDC 2019, one of Apple’s key privacy features was Sign In with Apple. Apple pitched this as an alternative to websites and services that harvest your email address and other data, and build up a profile of you through their apps.

Sign In with Apple was floated as something very different. Rather than having to fill out forms with your personal data or sign up with a social media account, Sign In with Apple lets you join a service using your Apple ID and Face ID or Touch ID. No passwords to remember – typically the weak link in user security – just dependable biometrics.

When an app or website does demand a name and email address, Apple’s solution creates a random email address for you to use instead. This links back to your Apple ID address so that you can continue to receive app updates, but your original email address remains private.
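To picture how that kind of email relay works, here’s a minimal sketch. Everything in it — the domain, the function names, the in-memory registry — is hypothetical, illustrating the idea rather than Apple’s actual implementation:

```python
import secrets

def make_relay_address(real_email: str, registry: dict) -> str:
    """Create a random, single-purpose address that masks the real one.
    The domain and registry here are stand-ins for the relay service."""
    relay = f"{secrets.token_hex(8)}@relay.example.com"
    # The mapping back to the real inbox is held by the relay service,
    # never handed to the app or website asking for an address.
    registry[relay] = real_email
    return relay

def forward(relay: str, registry: dict) -> str:
    """Mail sent to the relay address gets delivered to the real inbox."""
    return registry[relay]
```

The point is that the app only ever sees the random address; the link back to your real inbox lives with the relay service, which can sever it at any time.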

Apple hasn’t stopped there. Sign In with Apple uses two-factor authentication for an extra layer of security. And any time an app asks for your email address in order to log in, a box for Sign In with Apple must also be present above it.

So why has Apple gone to such lengths when most people are probably happy to sign in the usual way? As Craig Federighi said in announcing the feature, with rival services, “Your personal information sometimes gets shared behind the scenes, and these logins can be used to track you. We wanted to solve this.”

That’s not the only new privacy feature – macOS Catalina has its own fair share too. The new Find My app, which combines Find My iPhone with Find My Friends, uses anonymous, encrypted Bluetooth signals to effectively crowdsource the location of your devices. Not even Apple can ever know your Mac’s ID or location.

Also new in Catalina is an enhanced data privacy system. According to Apple, “Apps must now get your permission before directly accessing files in your Documents and Desktop folders, iCloud Drive and external volumes.” If an app wants to capture your keyboard presses or a photo or video of your screen, it’ll need your permission too.

And iOS 13 is getting a slew of other privacy features: you’ll get notifications if an app is using your location data in the background; you’ll be able to disable location metadata in photos uploaded to social media sites; and Apple is blocking apps from inferring your location using Wi-Fi and Bluetooth data. Features like these are not particularly headline-grabbing, but they reinforce all the more the idea that Apple’s commitment to your privacy isn’t just savvy marketing.

Siri and security

Siri is another example where Apple has begun to tighten up its security in light of the bad press its rivals are getting. Both Amazon and Google have landed in hot water over privacy concerns centred on their smart assistants, and it’s clear Apple is mindful of similar worries when it comes to Siri.

That’s taken a step up recently with the introduction of the T2 Security Chip included in Apple’s Macs from 2018 or later. This chip performs various functions, one of which is enabling “Hey Siri” to work on your Mac. Having particularly sensitive Siri functions – those that are constantly listening for you to say the “Hey Siri” trigger phrase – protected in the T2 chip demonstrates how seriously Apple is taking the privacy implications that come with enhanced Siri functionality. If your device is always listening for the trigger phrase, you don’t want it to be easily hackable.

Siri has always stood out for Apple’s attempts to make it into a high-quality virtual assistant without relying on reams and reams of your personal information. The argument from some quarters, though, has been that this approach hamstrings Siri. “Machine learning experts, all they want is data,” said one former Apple employee quoted by Wired. “But by its privacy stance, Apple basically puts one hand behind your back.”

Apple vehemently disagrees with this view, with Craig Federighi saying “there has been a false narrative, a false trade-off” that says you can’t build a good virtual assistant without harvesting a lot of user data and uploading it to the cloud. As an example, Federighi says app suggestions are built on user data that’s stored on your phone and never shared elsewhere.

If Apple does need to use your data for Siri, it uses what is known as Differential Privacy. In short, this scrambles your data with mathematical noise so that it can’t be personally identified, then encrypts it end-to-end when it is sent to Apple’s servers. The end result is that Apple gets the data it needs to train and improve Siri without identifying you in any way.

This issue came to the fore recently, when it was reported that some Apple contractors have been able to listen to Siri recordings from Apple’s users in order to improve Siri. Amazon’s Alexa and Google Assistant have similar processes in place, and all three companies have clearly thought about the privacy implications. Amazon says of the Alexa review process: “Employees do not have direct access to information that can identify the person or account as part of this workflow,” while Google also says it does not use identifiable information in these recordings.

What about Apple? In explaining how its contractors listen to Siri recordings, Apple explained your Siri requests are not associated with your Apple ID and are tied to a random identifier in order to anonymise them. The very idea of human beings listening to your Siri recordings may sound scary, but Apple says less than 1% of daily Siri activations are used to grade Siri’s performance, and most are only a few seconds long.

Whether it’s Siri’s protections or new features coming soon to your Mac and iOS device, it’s clear Apple takes your privacy very seriously. No company avoids touching your data entirely (anonymised or not), but if you’re going to put your trust in one, Apple is your best bet.


[Image: iPhone features like Face ID help protect your privacy.]
