Mail & Guardian

Fake news: Censorship’s no solution

The public, not private companies or governments, should limit dangerous digital deception

- Madeleine de Cock Buning & Miguel Poiares Maduro

Today, debates about public issues play out on social media, people receive their news via digital platforms and politicians pitch their policies using this same media. The internet is our new public square.

In the public square of old, journalists and editors served as gatekeepers and acted as referees. Human news aggregators set the agenda and provided audiences with credible information and diverse views. We trusted them because of the professionalism and integrity of their editorial processes.

In the new public sphere, this model of journalism, and its role in sustaining democracy, has become obsolete. Traditional media no longer play a dominant gatekeeping and agenda-setting role. Fake news reaches myriad jurisdictions at once.

But so can public and private measures that censor speech. The challenge is to redefine the parameters of civil discourse in the new public sphere without restricting pluralism. Recent examples highlight the risk of throwing the baby out with the bathwater.

Despite the ominous headlines, the influence of fake news on political decision-making appears to be limited. The Reuters Institute for the Study of Journalism at the University of Oxford says the reach of such content is largely restricted to groups of believers seeking to reinforce their own views and prejudices. But that does not make digital deception any less dangerous. Fake news feeds — and is fed by — polarisation, and, paradoxically, the more it is discussed, the more disruptive it becomes.

Fake news undermines trust in all forms of media and reinforces the view that it is impossible to discern fact from fiction. When people don’t know what they can believe, journalists’ ability to police the powerful is weakened. This trend will only worsen as “deep-fake news” — bogus images and videos that appear real — becomes more ubiquitous.

Clearly, the vulnerabilities of the digital public sphere must be addressed. Some argue that the solution is to block questionable websites or demote search results. Facebook, for example, censors duplicitous posts and has created an election “war room” to fight disinformation. Google and Twitter have considered similar steps, and all three are under pressure to give authorities access to the private data of users who publish fake news or make defamatory statements. But we believe that these steps are deeply misguided.

At the heart of any strong democracy is a political consensus and arbitration that depends on the public’s ability to debate and disagree. It is not up to private entities or public institutions to censor this process. Rather, we should strive to ensure that citizens have access to a broad array of opinions and ideas.

Freedom of expression and media freedom include the right to receive and impart information without interference. Studies show that most people prefer reliable and pluralistic news sources; the policymakers’ job is to enable them to realise this preference.

A March 2018 report to the European Commission by the high-level group on fake news and online disinformation, which one of us (De Cock Buning) chaired, offered a roadmap, and the commission’s recent action plan provides a good starting point. But more needs to be done.

There is no silver bullet to combat disinformation. Only multistakeholder approaches that spread responsibility across the news ecosystem, and take into account the fundamental rights involved, can provide adequate defences against it.

For example, professional media must do more to guarantee the veracity of their coverage. Fact-checking technology can help, as long as it is kept free of political and economic influence.

Big Tech is starting to take responsibility by committing to a code of practice based on the 10 key principles from the high-level report. But Big Tech can contribute in other ways, for example by providing client-based interfaces for curating legitimate news, ensuring diversity in social media timelines, and prioritising the reposting of fact-checked information.

Platforms can also improve transparency in how they use data and code algorithms. Ideally, these algorithms should give consumers more control over editorial preferences and integrate editing and fact-checking applications developed by reliable media organisations.

Platforms must clearly identify news sources, especially paid political or commercial content. We also need more international collaboration and better jurisdictional rules to ensure that laws and regulations protect victims of fake and offensive news without restricting free speech or undermining the rights of whistleblowers. These conflicts should not be legally settled where only one party has effective access to justice.

Platform companies should cooperate with schools, civil society groups and news organisations to strengthen public media literacy, helping people distinguish fake news from real.

Only consumers can marginalise fake news. We cannot allow private companies or governments to decide what people should know. The history of democracy is clear on this point: pluralism, not private or public censorship, is the best guarantor of truth. — © Project Syndicate

Madeleine de Cock Buning was chair of the European Commission’s high-level group on fake news and online disinformation. Miguel Poiares Maduro was a member of the European Commission’s group on media freedom and pluralism.

Accountability: Platforms also have the responsibility of identifying news sources and improving transparency. Photo: Toby Melville/Reuters
