IT’S A WAR AND FACEBOOK’S NOW UNDER FIRE
A VIDEO link posted on Facebook on June 20 showed a man cooking human body parts in a pot over a wood fire.
In Cameroon, the footage went viral. Some Facebook users said the man was a cannibal and that the video was shot in the country’s English-speaking west, where separatist insurgents are fighting to create a breakaway state.
Local websites quickly debunked this notion. The man in the video was not a separatist fighter or cannibal, and the body parts were not real. The clip was taken on a Nigerian film set and uploaded to Instagram on June 17 by make-up artist Hakeem Onilogbo, who uses the platform to showcase his work.
But the video’s rapid spread raises questions about Facebook’s ability to police millions of posts each day and crack down on hate speech in a country where internet use is rising fast, social media are used for political ends and the company has no permanent physical presence.
The day the link was posted on Facebook, a member of the government brought the video to the attention of international diplomats in the capital, Yaounde, via Whatsapp.
Five days later, Cameroon’s minister for territorial administration cited it as justification for an army clampdown against the secessionists that was already under way in the Anglophone regions.
The minister, Paul Atanga Nji, compared the rebellion – over decades of perceived marginalisation by the French-speaking majority – to an Islamist insurgency waged by the Nigeria-based militant group Boko Haram which has killed 30 000 people.
“Boko Haram committed atrocities, but they did not cut up humans and cook them in pots,” the minister said in comments broadcast on state television.
Nji did not respond to requests for comment. Government spokesman Issa Tchiroma Bakary said that in future the government would work to verify information before commenting.
Facebook said the video had not been reported by users and that it could not comment further on the clip. It was no longer available on the site late last month.
A senior Facebook official said tackling misinformation in Cameroon was a priority for the company.
“We’re prioritising countries where we’ve already seen how quickly online rumours can fuel violence, such as Myanmar and Cameroon,” said Ebele Okobi, director of Africa Public Policy at Facebook.
Facebook is under fire for carrying misleading information, including in the US and Britain, and over posts against the Muslim Rohingya minority in Myanmar.
Sri Lankan authorities briefly banned Facebook this year because the government said it was fuelling violence between Buddhists and Muslims. In India, messages on Facebook-owned Whatsapp have been linked to attacks on religious minorities.
In Cameroon, Facebook has been used to incite violence and to make threatening posts. Simon Munzu, a former UN representative, said he was the target of death threats on Facebook after it was announced in July that he would help organise negotiations in the separatist conflict. Afraid, Munzu went to stay with friends.
Facebook removed the posts last month.
Esther Omam, who runs a non-governmental organisation called Reach Out, hid at a church and then fled to the Francophone region after receiving death threats from separatists following a peace march which she led.
“The crisis has destroyed my life and my family,” she said. “I cannot work anymore. My family is divided. My husband is elsewhere, my children are elsewhere.”
Facebook has no staff operating permanently in Cameroon and says it monitors the country from Britain and the US. It has an Africa-focused team that frequently visits the region, and has partnered with NGOs and civil society groups in Cameroon in recent months to combat hate speech.
This included paying several thousand dollars to help organise training sessions for journalists to spot falsehoods online, said representatives from two groups involved. Some groups also flag offensive posts to Facebook.
Facebook had removed pages and accounts related to the separatist conflict, and was working to slow the spread of kidnapping videos, the company said.
It declined to say how many people it had helping it in Cameroon, how much money it had invested or how many posts it had taken down. Reuters found dozens of pages with graphic images from Cameroon posted in recent months; some had been online for months.
One Facebook user posted a picture on July 18 of the decapitated body of a Cameroonian policeman lying in a gutter, and said the image gave him joy.
The same day, separatist spokesman Ivo Tapang applauded the killing of two Cameroonian soldiers and linked to a website raising funds for weapons. Tapang did not respond to requests for comment.
A Facebook spokeswoman said the company was unaware of the posts before they were pointed out but that they were both removed after review. It is against Facebook rules to celebrate suffering or crowd-fund for arms, she said.
Facebook has artificial intelligence that it uses globally to detect problematic posts. But in Cameroon, it does not have fact-checking companies to monitor posts – as it does in the US.
Leading civil society figures in Cameroon say Facebook needs more resources and faces an increasingly difficult task as internet use grows.
“It is not possible to stop misinformation on Facebook,” said Maximilienne Ngo Mbe, executive director of REDHAC, a civil society group that has organised training sessions and flags indecent posts to Facebook.
The number of people with internet access in Cameroon rose from 0.86 million in 2010 to 5.9 million in 2016, about a quarter of the population, according to the International Telecommunications Union, a UN agency.
The government shut down the internet in English-speaking regions for three months last year because of the unrest. After service resumed in April 2017, Facebook was the main outlet for people speaking out against the army crackdown, in which soldiers razed villages and shot dead unarmed civilians.
But misleading and hateful posts have persisted, groups that monitor posts say, echoing issues Facebook sees worldwide.
Facebook is not the only service facing a battle to tackle misinformation and hate speech. Offensive videos and images are posted on Twitter or transmitted by Whatsapp.
Whatsapp cannot view private, encrypted conversations, a Whatsapp spokeswoman said, so detecting hate speech there was harder. A Twitter spokeswoman said it prohibited the promotion of violence and encouraged users to flag those posts.
Simon Munzu, a former UN official who is campaigning for peace in the Anglophone regions of Cameroon, shows a threatening message posted against him on social media by separatists during an interview in Yaounde, Cameroon.