The Star Malaysia

FB in the soup over Holocaust remark

Zuckerberg slammed for saying posts denying massacre might not be deleted


NEW YORK: Denying the Holocaust happened is probably OK on Facebook. Calling for a mob to kill Jews is not.

Mark Zuckerberg’s awkward and eyebrow-raising attempt this week to explain where Facebook draws the line illustrates the complexities social media platforms face as they take on the unwanted role of referee in this age of online misinformation, manipulation and hate speech.

In an interview with Recode, Zuckerberg, who is Jewish, said posts denying the Nazi annihilation of six million Jews took place would not necessarily be removed.

He said that as long as posts were not calling for harm or violence, even offensive content should be protected.

While this has been a long-standing position at the company, Zuckerberg’s statement and his reasoning – that he does not think Holocaust deniers are “intentionally” getting it wrong – caused an uproar.

The Anti-Defamation League said Facebook had a “moral and ethical obligation” not to allow people to disseminate Holocaust denial.

Zuckerberg later tried to explain his words, saying in an email to Recode’s Kara Swisher that he personally found “Holocaust denial deeply offensive and I absolutely didn’t intend to defend the intent of people who deny that”.

Facebook, which has 2.2 billion users, disallows such things as nudity, the selling of guns, credible threats of violence and direct attacks on people because of their race, sex or sexual orientation.

Hours after the Facebook founder’s comments about Holocaust deniers were aired on Wednesday, the company announced that it would also start removing misinformation that could lead to bloodshed.

The policy will begin in Sri Lanka and expand to Myanmar, where Facebook users have been accused of inciting anti-Muslim violence.

But beyond those guidelines, there are large grey areas. What, exactly, qualifies as supporting terrorist groups versus merely posting about them? Or mocking someone’s premature death – something that is also prohibited?

If Facebook were to ban Holocaust denial, it might also be called on to prohibit the denial of other historical events such as the Armenian genocide or the massacre of Native Americans by European colonisers.

This, Facebook might argue, could lead to a slippery slope where it finds itself trying to verify the historical accuracy of users’ posts. So, where it can, Facebook stays out of policing content.

While thousands of Facebook moderators worldwide are assigned to review potentially objectionable content, aided by artificial intelligence, executives like to say the company does not want to become an “arbiter of truth” and instead tries to let users decide for themselves.

This is why fake news is not actually banned from Facebook, though you might see less of it these days, thanks to the company’s algorithms and third-party fact-checking efforts.

Instead, Facebook might label disputed news stories as such and show you related content that might change your mind.

YouTube recently started doing this, too. Twitter has been even more freewheeling in what sorts of content it allows, only recently ramping up a crackdown on hate and abuse.

“Facebook doesn’t want to put time and resources into policing content. It’s costly and difficult,” said Steve Jones, a professor of communications at the University of Illinois at Chicago.

“It’s a difficult job and I’m sure it is an emotionally draining job. Given the scale of Facebook, it would take many people to monitor what goes through that platform.”

Jones said he had his doubts that throwing more moderators (Facebook’s goal is to increase the number from 10,000 to 20,000 this year) and technology at the problem would make a difference.

Why these companies try to stay out of regulating speech goes back to their roots. They were all founded by engineers as tech companies that shun labels such as “media” and “editor”.

Facebook’s chief operating officer Sheryl Sandberg even said in an interview last year that, as a tech company, Facebook hires engineers – not reporters and journalists.

Then there is the legal shield. While a newspaper can be held responsible for something printed on its pages, Internet companies by law are not responsible for the content others post on their sites. If they start policing content too much – editing, if you will – tech companies risk becoming media companies.

Zeynep Tufekci, a prominent techno-sociologist, said on Twitter that the notion that you can “fight bad speech with good speech” doesn’t really work in a Facebook world, if it ever did.

“Facebook is in over its head,” she tweeted on Thursday, but she also confessed that “nobody has a full answer”. — AP

Centre of controversy: A file photo of two men walking past a piece of graffiti depicting Zuckerberg on the Israeli barrier separating the West Bank town of Bethlehem from Jerusalem. — AFP
