Business Standard

APPLE EXPANDS BET ON CUTTING EDGE PRIVACY TECHNOLOGY

Differential privacy allows companies to analyze data without learning too much about users

- ROBERT MCMILLAN 8 July

Last year, Apple Inc. kicked off a massive experiment with new privacy technology aimed at solving an increasingly thorny problem: how to build products that understand users without snooping on their activities.

Its answer is differential privacy, a term virtually unknown outside of academic circles until a year ago. Today, other companies such as Microsoft Corp. and Uber Technologies Inc. are experimenting with the technology.

The problem differential privacy tries to tackle stems from the fact that modern data-analysis tools are capable of finding links between large databases. Privacy experts worry these tools could be used to identify people in otherwise anonymous data sets.

Two years ago, researchers at the Massachusetts Institute of Technology discovered shoppers could be identified by linking social-media accounts to anonymous credit-card records and bits of secondary information, such as the location or timing of purchases.

“I don’t think people are aware of how easy it is getting to de-anonymize data,” said Ishaan Nerurkar, whose startup LeapYear Technologies Inc. sells software for leveraging machine learning while using differential privacy to keep user data anonymous.

Differentially private algorithms blur the data being analyzed by adding a measurable amount of statistical noise. This could be done, for example, by swapping out one question (have you ever committed a violent crime?) with a question that has a statistically known response rate (were you born in February?). Someone trying to find links in the data would never be sure which question a particular person was asked. That lets researchers analyze sensitive data such as medical records without being able to tie the data back to specific people.
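To make the mechanism concrete, the short Python sketch below is a hypothetical illustration of the classic randomized-response idea described above, not Apple's actual implementation: each respondent is asked the sensitive question only some of the time and an innocuous question with a known answer rate (roughly one person in twelve is born in February) the rest of the time, and the analyst corrects for that noise in aggregate.

import random

def randomized_response(true_answer, p_real=0.5, decoy_yes_rate=1/12):
    # With probability p_real, answer the sensitive question truthfully;
    # otherwise answer the innocuous question ("were you born in February?"),
    # whose yes-rate (about 1/12) is known in advance.
    if random.random() < p_real:
        return true_answer
    return random.random() < decoy_yes_rate

def estimate_true_rate(responses, p_real=0.5, decoy_yes_rate=1/12):
    # Observed yes-rate = p_real * true_rate + (1 - p_real) * decoy_yes_rate,
    # so the population rate can be recovered without exposing any individual.
    observed = sum(responses) / len(responses)
    return (observed - (1 - p_real) * decoy_yes_rate) / p_real

# Simulated survey: 10,000 people, 30% of whom would truthfully answer "yes".
truths = [random.random() < 0.30 for _ in range(10_000)]
noisy_answers = [randomized_response(t) for t in truths]
print(round(estimate_true_rate(noisy_answers), 3))  # prints roughly 0.30

Any single answer remains deniable, yet the overall rate can still be estimated to within a small sampling error, which is the trade-off the article goes on to describe.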

Differential privacy is key to Apple’s artificial intelligence efforts, said Abhradeep Guha Thakurta, an assistant professor at the University of California, Santa Cruz. Mr. Thakurta worked on Apple’s differential-privacy systems until January of this year.

Apple has faced criticism for not keeping pace with rivals such as Alphabet Inc.’s Google in developing AI technologies; those rivals have made giant leaps in the image- and language-recognition software that powers virtual assistants and self-driving cars.

While companies such as Google have access to the massive volumes of data required to improve artificial intelligence, Apple’s privacy policies have been a hindrance, and some blame them for turning the company into a laggard in AI-driven products such as Siri.

“Apple has tried to stay away from collecting data from users until now, but to succeed in the AI era they have to collect information about the user,” Mr. Thakurta said. Apple began rolling out the differential-privacy software in September, he said. Users must elect to share analytics data with Apple before it is used.

Originally used to understand how customers use emojis or new slang expressions on the phone, the technique is now being extended to cover Apple’s collection and analysis of web browsing and health-related data, Katie Skinner, an Apple software engineer, said at the company’s annual developers conference in June.

The company is now receiving millions of pieces of information daily—all protected via this technique—from Macs, iPhones and iPads running the latest operating systems, she said.

“Apple believes that great features and privacy go hand in hand,” an Apple spokesman said via email.

Google, one of differential privacy’s earliest adopters, has used it to keep Chrome browser data anonymous. But while the technology is good for some types of analysis, it suffers where precision is required. For example, experts at Google say it doesn’t work in so-called A/B tests, in which two versions of a webpage are tested on a small number of users to see which generates the better response.

“In some cases you simply can’t answer the questions that developers want answers to,” said Yonatan Zunger, a privacy engineer at Google. “We basically see differenti­al privacy as a useful tool in the toolbox, but not a silver bullet.”

Researchers are coming up with “surprisingly powerful” uses of differential privacy, but the technology is only about a decade old, said Benjamin Pierce, a computer science professor at the University of Pennsylvania. “We’re really far from understanding what the limits are,” he said.

Differential privacy has seen wider adoption since Apple first embraced it. Uber employees, for example, use it to improve services without being overexposed to user data, a spokeswoman said via email.

Microsoft is working with San Diego Gas & Electric Co. on a pilot project to make smart-meter data available to researcher­s and government agencies for analysis, while making sure “any data set cannot be tied back to our customers,” said Chris Vera, head of customer privacy at the utility.

The U.S. Census Bureau confronted the problem of links between data sets a decade ago. By 2005, the bureau was worried large databases outside its control could be used to de-anonymize census participants, said John Abowd, chief scientist at the bureau. After meeting with some of the creators of differential privacy, the bureau became a proponent.

In 2008 the Census Bureau released its first product to use the technology, a web-based data-mapping portal called OnTheMap, and the bureau is now “making an intense effort to apply differential privacy to the publication of the 2020 census,” Mr. Abowd said.

