Gulf News

Data has value too, let us not overreact

- By Cass R. Sunstein

The horrendous actions by Cambridge Analytica, a voter-profiling company, and Aleksander Kogan, a Russian-American researcher, raise serious questions about privacy, social media, democracy and fraud.

Amid the justified furore, one temptation should be firmly resisted: for public and private institutions to lock their data down, blocking researchers and developers from providing the many benefits that it promises — for health, safety, and democracy itself.

The precise facts remain disputed, but according to reports, here’s what happened. Kogan worked at Cambridge University, which has a Psychometrics Centre. The Centre purports to be able to use data from Facebook (including “likes”) to ascertain people’s personality traits. Cambridge Analytica and one of its founders, Christopher Wylie, attempted to work with the Centre for purposes of voter profiling. It refused, but Kogan accepted the offer.

Without disclosing his relationship to Cambridge Analytica, Kogan entered into an agreement with Facebook, which agreed to provide data to him — solely for his own research purposes. Kogan created an app, called “this is your digital life”.

Offering a personality prediction, the app described itself on Facebook as “a research app used by psychologists”. About 270,000 Facebook users agreed to disclose their data (again, for research purposes).

By sharing data with Cambridge Analytica, Kogan violated his agreement with Facebook. According to one report, he ended up providing more than 50 million user profiles to Cambridge Analytica, not for academic research, but to build profiles for partisan political uses.

Armed with those profiles, Cambridge Analytica worked with members of the Ted Cruz and Donald Trump campaigns in 2016. Among other things, the firm helped to model voter turnout, identify audiences for fund-raising appeals and advertisements, and specify the best places for Trump to travel to increase support.

As early as 2015, Facebook learnt that Kogan was sharing his data and demanded that Kogan, Cambridge Analytica, and Wylie cease using, and destroy, all the information they had obtained. They certified that they had done so.

That was a lie — which recently led Facebook to suspend all three from its platform. Facebook was careful to add, “People knowingly provided their information, no systems were infiltrated, and no passwords or sensitive pieces of information were stolen or hacked.”

All this raises numerous questions — some of which involve difficult trade-offs with respect to privacy and competing values. Aware of the risks, Facebook emphasises that all apps requesting detailed user information have to “go through our App Review process, which requires developers to justify the data they’re looking to collect and how they’re going to use it — before they’re allowed to even ask people for it.”

Sufficient safeguards

In view of Kogan’s misconduct, it’s reasonable to ask whether that process contains sufficient safeguards. An external review panel might well be a good addition, and continuous monitoring of how app developers use Facebook data seems important.

But let’s not overreact. Authorised use of that data can do a great deal of good.

For example, Genes for Good, from the University of Michigan, is using a Facebook app to help combat diabetes, cancer, and heart disease. It seeks to learn how genes interact with the environment to produce — or not to produce — serious illness. There’s tremendous potential there.

A more immediate response to health problems is HealthTap, an app that permits users to type questions into Facebook’s Messenger and to obtain free responses from doctors — or to see answers from doctors to questions similar to their own.

Facebook data can also illuminate the democratic process: Pew found that disagreement comes most often from party leaders — and that it is far more common from Republicans than from Democrats.

Sure, those aren’t the most surprising findings, but there is far more to learn about polarisation and partisanship — and Facebook’s data will prove exceedingly valuable.

It is true, of course, that social media users should have a great deal of control over whether and how their information is used, and that app developers should be sharply constrained in their ability to share data.

The US government has faced, and solved, similar problems: Data.gov discloses a great deal of information, with more than 230,000 data sets involving health, safety, travel, energy, and the environment. Available apps, made possible by that information, are helping people to save money and to avoid health risks.

For social media providers, including Facebook, the fiasco underlines the need for more careful vetting of all developers who seek access to their data. But it would be a mistake to take the fiasco as a reason to keep treasure troves of information out of the hands of people who can provide immensely valuable services with it.

Cass R. Sunstein is a columnist. He is the editor of Can It Happen Here? Authoritarianism in America and a co-author of Nudge: Improving Decisions About Health, Wealth and Happiness.
