The Denver Post

FACEBOOK RIPPED ON HATE SPEECH

By Tracy Jan and Elizabeth Dwoskin

Francie Latour was picking out produce in a suburban Boston grocery store when a white man leaned toward her two young sons and, just loudly enough for the boys to hear, unleashed a profanity-laced racist epithet.

Reeling, Latour, who is black, turned to Facebook to vent, in a post that was explicit about the hateful words hurled at her 8- and 12-year-olds on a Sunday evening in July.

“I couldn’t tolerate just sitting with it and being silent,” Latour said. “I felt like I was going to jump out of my skin, like my kids’ innocence was stolen in the blink of an eye.”

But within 20 minutes, Facebook deleted her post, sending Latour a cursory message that her content had violated company standards. Only two friends had gotten the chance to voice their disbelief and outrage.

Experiences like Latour’s exemplify the challenges Facebook chief executive Mark Zuckerberg confronts as he tries to rebrand his company as a safe space for community, expanding on its earlier goal of connecting friends and family.

But in making decisions about the limits of free speech, Facebook often fails the racial, religious and sexual minorities Zuckerberg says he wants to protect.

The 13-year-old social network is wrestling with the hardest questions it has ever faced as the de facto arbiter of speech for the third of the world’s population that now logs on each month.

In February, amid mounting concerns over Facebook’s role in the spread of violent live videos and fake news, Zuckerberg said the platform had a responsibility to “mitigate the bad” effects of the service in a more dangerous and divisive political era. In June, he officially changed Facebook’s mission from connecting the world to community-building.

The company says it now deletes about 288,000 hate-speech posts a month.

But activists say that Facebook’s censorship standards are so unclear and biased that it is impossible to know what one can or cannot say.

The result: Minority groups say they are disproportionately censored when they use the social media platform to call out racism or start dialogues. In the case of Latour and her family, she was simply repeating what the man who verbally assaulted her children had said.

Censoring posts

Compounding their pain, Facebook will often go from censoring posts to locking users out of their accounts for 24 hours or more, without explanation – a punishment known among activists as “Facebook jail.”

“In the era of mass incarceration, you come into this digital space – this one space that seems safe – and then you get attacked by the trolls and put in Facebook jail,” said Stacey Patton, a journalism professor at Morgan State University, a historically black university in Baltimore. “It totally contradicts Mr. Zuckerberg’s mission to create a public square.”

In June, the company said that nearly 2 billion people now log onto Facebook each month. With the company’s dramatic growth comes the challenge of maintaining internally consistent standards as its content moderators are faced with a growing number of judgment calls.

“Facebook is regulating more human speech than any government does now or ever has,” said Susan Benesch, director of the Dangerous Speech Project, a nonprofit group that researches the intersection of harmful online content and free speech.

The company has promised to hire 3,000 more content moderators before the year’s end, bringing the total to 7,500, and is looking to improve the software it uses to flag hate speech, a spokeswoman said.

“We know this is a problem,” said Facebook spokeswoman Ruchika Budhraja, adding that the company has been meeting with community activists for several years. “We’re working on evolving not just our policies but our tools. We are listening.”

Two weeks after Donald Trump won the presidency, Zahra Billoo, executive director of the Council on American-Islamic Relations’ office for the San Francisco Bay area, posted to Facebook an image of a handwritten letter mailed to a San Jose mosque and quoted from it: “He’s going to do to you Muslims what Hitler did to the Jews.”

The post – made to four Facebook accounts – contained a notation clarifying that the statement came from hate mail sent to the mosque, as Facebook guidelines advise.

Facebook removed the post from two of the accounts – Billoo’s personal page and the council’s local chapter page – but allowed identical posts to remain on two others: the organization’s national page and Billoo’s public one. The civil rights attorney was baffled. After she re-posted the message on her personal page, it was again removed, and Billoo got a notice saying she would be locked out of Facebook for 24 hours.

“How am I supposed to do my work of challenging hate if I can’t even share information showing that hate?” she said.

Billoo eventually received an automated apology from Facebook, and the post was restored to the local chapter page – but not her personal one.

“Facebook jail”

Being put in “Facebook jail” has become a regular occurrence for Shannon Hall-Bulzone, a San Diego photographer. In June 2016, Hall-Bulzone was shut out for three days after posting an angry screed when she and her toddler were called a racist name as they walked to day care and her sister was called another one as she walked to work. Within hours, Facebook removed the post.

In January, a coalition of more than 70 civil rights groups wrote a letter urging Facebook to fix its “racially biased” content moderation system. The groups asked Facebook to enable an appeals process, offer explanations for why posts are taken down, and publish data on the types of posts that get taken down and restored. Facebook has not done these things.

Like most social media companies in Silicon Valley, Facebook has long resisted being a gatekeeper for speech. For years, Zuckerberg insisted that the social network had only minimal responsibilities for policing content.

In its early years, Facebook’s internal guidelines for moderating and censoring content amounted to only a single page. The instructions included prohibitions on nudity and images of Hitler, according to a trove of documents published by the investigative news outlet ProPublica. (Holocaust denial was allowed.)

By 2015, the internal censorship manual had grown to 15,000 words, according to ProPublica.

In Facebook’s guidelines for moderators, obtained by ProPublica in June and affirmed by the social network, the rules protect broad classes of people but not subgroups. Posts criticizing white or black people would be prohibited, while posts attacking white or black children, or radicalized Muslim suspects, may be allowed to stay up because the company sees “children” and “radicalized Muslims” as subgroups.

The company has acknowledged that minorities feel disproportionately targeted but said it could not verify those claims because it does not categorize the types of hate speech that appear or tally which groups are targeted.

As for Latour, the Boston mother was surprised when Facebook restored her post about the hateful words spewed at her sons, less than 24 hours after it disappeared. The company sent her an automated notice that a member of its team had removed her post in error. There was no further explanation.

The initial censoring of Latour’s experience “felt almost exactly like what happened to my sons writ large,” she said. The man had unleashed the racial slur so quietly that for everyone else in the store, the verbal attack never happened. But it had terrified her boys.

“They were left with all that ugliness and hate,” she said, “and when I tried to share it so that people could see it for what it is, I was shut down.”

Zahra Billoo, of the Council on American-Islamic Relations, says she posted a threatening letter received by a San Jose, Calif., mosque on four Facebook accounts. She was baffled when the company removed it from two and left it up on two others. (Nick Otto, Special to The Washington Post)
