Fast Company

Apple’s Privacy Problem

APPLE INVENTED PRIVACY PROBLEMS ON MOBILE DEVICES. AND NOW ONLY APPLE CAN FIX THEM.

BY MARK WILSON

The company set the stage for the surveillance economy. Now, only Apple can fix it.

“What happens on your iPhone stays on your iPhone.” The message was printed 14 stories high, in simple black and white, on the side of a building at this year’s Consumer Electronics Show in Las Vegas. The proclamation was quintessential Apple: a bold spectacle, a well-timed verbal play, and a calculated jab at Google, Amazon, and every other competitor about to show off its latest products on the world’s biggest stage. It was also misleading. Apple, after all, practically laid the groundwork for the surveillance economy with its powerful App Store.

Through a certain lens, the iPhone is one of the most secure devices in the world. Its contents are encrypted by default. Any data that Apple collects through services such as Maps is assigned to random identifiers (rather than being tied to users’ IDs) that are periodically reset. Unlike Google’s Chrome browser, Apple’s Safari doesn’t track users across the web, which means the company could be leaving billions of dollars in revenue on the table by not harvesting users’ data.

But that doesn’t stop the 2 million or so apps in the App Store from spying on iPhone users and selling details of their private lives. It’s not just Facebook and Google that are using their iOS apps to hoover up your personal information for the benefit of marketers or back-alley data brokers. Beneath the App Store lies a flourishing ecosystem of businesses devoted to collecting, analyzing, and profiting from user data.

“Tens of millions of people have data taken from them—and they don’t have the slightest clue,” says Will Strafach, founder of the San Francisco–based cybersecurity firm Guardian. His company released a report last fall that identified 24 popular iOS apps—including the image-hosting service Photobucket and real estate portal Homes.com—that contained code from data-monetization firms, which can collect location information as often as every 15 seconds, even when an app is closed. Guardian has spotted similar code in hundreds of other iOS apps.

An investigation by The New York Times last December uncovered nine seemingly innocuous apps, including WeatherBug and a gas-savings app, GasBuddy, that routinely gave precise user-location information to more than 40 different data-monetization companies. The Wall Street Journal studied 70 iOS apps in February and found several that were delivering deeply private information, including heart rates and fertility data, to Facebook through an analytics tool in the social media company’s software developer kit.

To shed light on these murky practices, Guardian is launching the subscription-based Guardian Firewall app this month. The iOS app encrypts user data through a personal VPN, blocks apps from passing private information to third parties, and alerts users—via push notifications—of any attempts to send their data outside an app. Early testers have taken to Twitter to report their shock at how many apps Guardian Firewall has stopped from passing data elsewhere.

But it should come as no surprise to Apple, which has helped grease the wheels of mobile-data sharing since the launch of the App Store in 2008.

From the start, Apple prioritized an app ecosystem that was easy to use, consolidating any software you might have on your phone into a single venue, the App Store, over which it exerted complete control. To get their software in, developers had to conform to Apple’s squeaky-clean microcosm of a free market.

The company designed a dead-simple interface that, to this day, allows users to sign away contacts, location data, and camera and microphone access with a single tap as they install an app. Apple also created efficient APIs—the software connecting its hardware to outside apps—to provide third-party developers access to sensitive user information. Meanwhile, iPhone apps are not required to encrypt their transmissions. “Apple was well known for usability before it was known for privacy,” says Riana Pfefferkorn, associate director of surveillance and cybersecurity at the Stanford Center for Internet and Society.

Apple’s most consequential decision, though, may have been to emphasize apps based on popularity. With thousands of apps suddenly coming online, the company wanted to give users an easy way to navigate the App Store. It created lists of the most popular paid and free apps, by category, setting them up to become viral hits. App pricing soon became a race to the bottom. Software, which had traditionally sold for $20, $30, or $50, cost one-tenth of that in the App Store, or it was just free. The cheaper the app was, the greater the chance it would become a chart-topping impulse download.

This economic model, however, doesn’t support the kinds of teams needed to create good—or even decent—software. “Even if an app is 99¢, that price is not going to be enough,” says Cade Diehm, a designer who helped develop the secure communications app Signal and is now the lead designer at the digital-rights nonprofit Tactical Tech. So developers compensate by selling user data, Diehm says.

Today, it’s routine for developers to insert a bit of code into their software that sends user information directly to outside companies. These data-marketing firms are generally tight-lipped about what they pay, but one company, Huq, advertises that it shells out as much as $1.10 for the location data of every 100 monthly active users, which means an app with a million users could make $11,000 a month from Huq. That’s not much, but developers rarely stop at just one data-mining customer. Guardian has seen apps incorporate as many as eight separate location-data trackers, and has identified at least 100 data-monetization firms active on iOS.

In other cases, the data-marketing company simply owns the app. WeatherBug, for example, is owned by ad platform GroundTruth. And the two biggest advertising companies on the web—Google and Facebook—provide analytics and other software for more than 600,000 iOS apps, according to research firm Apptopia, which allows them to peek inside a third of all apps in the App Store. For the most part, this all happens with users’ permission. They let apps access their location information, contacts, or microphone for legitimate reasons. But rarely do they have any idea how their data is then passed around.

In the wake of the Cambridge Analytica scandal and the European Union’s General Data Protection Regulation, which took effect last year, Apple has increased its efforts to protect users. In 2016, it helped shield user identities from marketers by making the unique advertiser ID hardwired into the iPhone more complex. A year later, it created tiered app permissions, which allow users to specify that an app can access certain data only when it’s open. (Google has since copied this feature in Android Q.)

To be part of the App Store, iOS developers also have to agree to Apple’s App Review guidelines, which state that apps cannot create shadow profiles of users (piecing bits of behavioral data together to deeply profile them) and that developers are responsible for the data practices of any analytics software in their products. As of 2018, apps are encouraged—but not required—to “clearly and explicitly” tell users what they collect and how they use it.

But Apple operates more as matchmaker—connecting users and apps, letting them shake hands over APIs and terms and conditions—than as enforcer. It does not audit apps’ data practices, nor does it police the language of developers’ terms and conditions. If it comes to light that an app is in breach of these guidelines (and it’s not a case of malintent), Apple gives the developer time to fix the problem. It does not inform users of the issue, and more often than not, the app can stay in the App Store during the process. Notably, Apple does not punish apps for past privacy violations, so a company such as Facebook, which allowed Cambridge Analytica to create shadow profiles of its users via its iOS app, remains seemingly untouchable. (In lieu of commenting for this article, Apple provided Fast Company with the equivalent of its App Review guidelines. It declined to say why Facebook has not been banned for enabling data mining by Cambridge Analytica.)

Apple is beginning to acknowledge the problems on its platform. “We have to admit when the free market is not working,” Tim Cook said in an interview with Axios last November. The Apple CEO has formally called on Congress to pass legislation to protect consumers, which could set higher standards for app data brokers and hold them accountable for privacy violations. But punting these issues to regulators obscures the fact that Apple is already the sole ruler of the App Store.

Some developers say that any move Apple might make to clamp down on their work—by banning their use of outside analytics software and encouraging app makers to charge more—would cause them to abandon iOS. “Apple doesn’t really have a choice other than to allow apps that are ad supported,” says a data scientist at a major iOS gaming company, speaking on condition of anonymity. “Were they to move to a premium-only model, vast numbers of iPhone and iPad users would flee to Android.”

A game of chicken with developers, let alone Facebook, would be dangerous. But if Apple wants what happens on your iPhone to, finally, actually stay on your iPhone, it may be a game that it needs to play.

ILLUSTRATIONS BY EIKO OJALA
