Khaleej Times

Post it, don’t expect FB to protect your rights

Digital media companies should not be allowed to take over the Internet


Facebook has had a few bad weeks. The social media giant had to apologise for failing to protect the personal data of millions of users from being accessed by the data mining company Cambridge Analytica. And now Mark Zuckerberg, Facebook’s chief executive, is due to appear at congressional hearings on Tuesday and Wednesday over the political consultancy’s use of customer data.

In some ways, social media has been a boon for human rights — most obviously for freedom of speech.

Previously, the so-called “marketplace of ideas” was technically available to all (in “free” countries), but was in reality dominated by the elites. While all could equally exercise the right to free speech, we lacked equal voice. Gatekeepers, especially in the form of the mainstream media, largely controlled the conversation.

But today, anybody with Internet access can broadcast information and opinions to the whole world. While not all will be listened to, social media is expanding the boundaries of what is said and received in public. The marketplace of ideas is effectively bigger, broader and more diverse.

Social media played a major role in co-ordinating the massive protests that brought down dictatorships in Tunisia and Egypt, as well as large revolts in Spain, Greece, Israel, South Korea, and the Occupy movement. More recently, it has facilitated the growth of the #MeToo and #neveragain movements.

The bad and the ugly

But the social media “free speech” machines can create human rights difficulties. Those newly empowered voices are not necessarily desirable voices.

The UN recently found that Facebook had been a major platform for spreading hatred against the Rohingya in Myanmar, which in turn led to ethnic cleansing and crimes against humanity.

Video sharing site YouTube seems to automatically guide viewers to the fringiest versions of what they might be searching for. A search on vegetarianism might lead to veganism; jogging to ultra-marathons; Donald Trump’s popularity to white supremacist rants; and Hillary Clinton to 9/11 trutherism.

YouTube, via its algorithm’s natural and probably unintended impacts, “may be one of the most powerful radicalising instruments of the 21st century”, with all the attendant human rights abuses that might follow.

Business model and human rights

Human rights abuses might be embedded in the business model that has evolved for social media companies in their second decade.

Essentially, those models are based on the collection of users’ data and its use for marketing purposes. And the data they hold is extraordinary in its profiling capacities, and in the consequent unprecedented knowledge base and potential power it grants to these private actors.

Indirect political influence is commonly exercised, even in the most credible democracies, by private bodies such as major corporations. This power can be partially constrained by “antitrust laws” that promote competition and prevent undue market dominance.

Anti-trust measures could be used to hive off Instagram from Facebook, or YouTube from Google. But these companies’ power essentiall­y arises from the sheer number of their users: in late 2017, Facebook was reported as having more than 2.2 billion active users.

Power through knowledge

In 2010, Facebook conducted an experiment by randomly deploying a nonpartisan “I voted” button into 61 million feeds during the US mid-term elections. That simple action led to 340,000 more votes, about 0.14 per cent of the US voting population. Such a margin can swing an election, and a bigger deployment would lead to even more votes.

So Facebook knows how to deploy the button to sway an election, which would clearly be lamentable. However, the mere possession of that knowledge makes Facebook a political player. It now knows the button’s political impact, the types of people it is likely to motivate, which party is favoured by its deployment or non-deployment, and at what times of day.

It might seem inherently incompatible with democracy for that knowledge to be vested in a private body. Yet the retention of such data is the essence of Facebook’s ability to make money.

Micro-targeting

A study has shown that, from an analysis of 70 “likes”, a computer knows more about a person’s personality than their friends or flatmates do; from 150 likes, more than their family; and from 300 likes, it can outperform one’s spouse.

This enables the micro-targeting of people for marketing messages — whether those messages market a product, a political party or a cause. This is Facebook’s product, from which it generates billions of dollars. It enables extremely effective advertising and the manipulation of its users. This is so even without Cambridge Analytica’s underhanded methods.

Advertising is manipulative: that is its point. Yet it is a long bow to label all advertising as a breach of human rights.

Advertising is available to all with the means to pay. Social media micro-targeting has become another battleground where money is used to attract customers and, in the political arena, influence and mobilise voters.

While the influence of money in politics is pervasive — and probably inherently undemocratic — it seems unlikely that spending money to deploy social media to boost an electoral message is any more a breach of human rights than other overt political uses of money.

Yet the extraordinary scale and precision of its manipulative reach might justify differential treatment of social media compared to other advertising, as its manipulative political effects arguably undermine democratic choices.

As with mass data collection, it may eventually be concluded that such reach is simply incompatible with democratic and human rights.

Finally, there is the issue of the spread of misinformation.

Perhaps social media’s purpose – the posting and sharing of speech – cannot help but generate a distorted and tainted marketplac­e of fake ideas that undermine political debate and choices, and perhaps human rights.

What next?

It is premature to assert that the very collection of massive amounts of data is irreconcilable with the right to privacy (and even with rights relating to democratic governance). Similarly, it is premature to decide that micro-targeting manipulates the political sphere beyond the bounds of democratic human rights.

Finally, it may be that better speech and corrective technology will help to undo fake news’ negative impacts: it is premature to assume that such solutions won’t work.

At the very least, we must now seriously question the business models that have emerged from the dominant social media platforms. Maybe the Internet should be rewired from the grassroots, rather than be led by digital oligarchs’ business needs. —The Conversation

Sarah Joseph is Director, Castan Centre, Monash University

