The Guardian (USA)

Facebook ‘lacks willpower’ to tackle misinformation in Africa

- Jason Burke in Johannesburg

Facebook has been accused of failing to invest sufficiently to combat misinformation as it pursues rapid growth in Africa, where the Covid pandemic has highlighted the outsize role played by social media in online discourse.

Traditional media and governments have an increasingly limited ability to control information flows on the continent, as social media platforms including Facebook seek to expand rapidly, though largely without fanfare.

“Facebook are losing users left, right and centre in the global north, so where are the new users coming from? The global south,” said Anri van der Spuy, a senior researcher at Research ICT Africa, a thinktank.

Sub-Saharan Africa has a population of 1.1 billion, and internet use, at an average of about 30%, is three times higher than a decade ago.

Toussaint Nothias, research director at the Digital Civil Society Lab of Stanford University, who has worked extensively on Facebook, said it was “generally accepted” that Facebook had launched an “aggressive expansion” in the global south to win new users following a decline in the developed world.

“Africa has a young, growing population and so offers opportunities for Facebook to become an entry to the internet, via Facebook, WhatsApp, Instagram or whatever. That can be monetised down the line,” he said.

Many – but not all – academic studies have linked Covid vaccine hesitancy with misinformation circulating on social media in Africa, as elsewhere.

In some parts of the continent, such as South Africa, hesitancy was the biggest challenge facing vaccination campaigns.

Dr Matshidiso Moeti, WHO regional director for Africa, has talked of an “infodemic”, which she defines as “a glut of information with misinformation in the mix [which] makes it hard to know what is right and real”.

False information circulating on social media included claims that black people cannot contract Covid-19 or that it can be cured with steam or traditional remedies such as herbal tea. Conspiracy theories describing plots by western companies or governments to test vaccines in Africa or slow demographic growth have also spread widely.

“The regulation side is very problematic,” said van der Spuy. “It has not been resolved in the global north either but the risks are much bigger in the south … you don’t have the same safety net of literacy skills and ability to crosscheck nor the safeguard of adequate policies or capable institutions … Facebook is investing in addressing some of these challenges, but not nearly enough.”

Facebook relies on an expanding network of hundreds of third-party factcheckers across Africa to initiate investigations and respond to complaints from users. If concerns are found to be justified, warnings are attached to posts, which are also downgraded in the algorithms that direct traffic. Some accounts are taken down.

A spokesperson for Meta, which owns Facebook, described misinformation as a complex and constantly evolving societal challenge for which there is no “silver bullet”.

But, they said, Facebook now employed a global team of 40,000 working on safety and security, including 15,000 people who review content in more than 70 languages, among them Amharic, Somali, Swahili and Hausa.

This helped the company “debunk false claims in local languages, including claims related to elections and vaccines”.

“We’ve also made changes to our policies and products to ensure fewer people see false information and are made aware of it when they do, and have been highlighting reliable vaccine information through our global Covid-19 information centre,” the spokesperson said.

However, posts are not usually removed unless seen as directly encouraging violence or hate, leading to concerns that some may be viewed by large audiences even after being flagged as false or misleading.

“They do take things down occasionally but it takes a long time,” said Stuart Jones, director of the Centre for Analytics and Behavioural Change in South Africa, which monitors social media in the country.

Facebook claims that more than 95% of the time when people see a factchecking label, they don’t go on to view the original content.

Other platforms are also struggling to contain misinformation.

“Social media [in South Africa], especially Twitter, is dominated by anti-vaccine voices,” said Jones.

“We’ve not identified organised networks but dealing with people with very loud voices speaking often and very passionately. The pro-vaccine voices are more moderate and don’t get the same outrage and aren’t shared as much. So the algorithms kick in and it just all runs away.”

Frances Haugen, a former manager at Facebook turned whistleblower, has said that her concerns over an apparent lack of safety controls in non-English language markets, such as Africa and the Middle East, were a key factor in her decision to go public.

“I did what I thought was necessary to save the lives of people, especially in the global south, who I think are being endangered by Facebook’s prioritisation of profits over people,” Haugen told the Guardian last year.

Workers at factchecking organisations across Africa who spoke to the Guardian on condition of anonymity said they were confident their work made some difference but worried that the impact was very limited.

“What we do is important and does stop some people reading stuff that simply isn’t true. But I worry that it really is just a tiny fraction of what’s out there,” one said.

Some say it is difficult to judge to what extent Facebook’s downgrading of such posts in news feeds restricts exposure, and worry that the company has not released a breakdown of figures for funding of factchecking operations in Africa.

“There seems to be as little as possible real investment on the continent in terms of engaging people directly or hiring people with real local knowledge,” said Grace Mutung’u, a policy researcher and lawyer based in Nairobi, Kenya.

“It is a matter of accountability. If you take up such a huge responsibility in society, you should equally invest in solving the problems that come out of it. They have the resources; what is lacking is the willpower.”

Officials at the WHO say they are concerned about encrypted private applications such as WhatsApp, which remain “invisible”, as it is impossible to know what is being said or shared, and very difficult to intervene to stem the flow of false information.

WhatsApp is also owned by Meta. The company said it was taking steps to address the problem.

Nothias said that there was no easy or obvious solution to the problem of content moderation, but “simple things” such as committing greater resources would help.

“Currently, in comparison to the wealth of the company and its social responsibility … it is pretty minimal,” he said.

“They are just not taking it seriously enough, or putting enough money into it. When you consider it really just is a question of their social responsibility against their duty to their investors, it’s not so hard to understand. They are just a corporation.”

A Covid testing site in the Alexandra township in South Africa in April 2020. Photograph: Jérôme Delay/AP
Composite: Guardian Design/Getty Images/Facebook
