USA TODAY US Edition

Activists on Facebook are ‘getting Zucked’

Activists say they are unfairly censored, call it ‘getting Zucked’

- Jessica Guynn

Some say site fails the very people it claims it’s trying to protect

It was spirit week, and Carolyn Wysinger, a high school teacher in Richmond, California, was cheerfully scrolling through Facebook on a break between classes. Her classroom, with its black-and-white images of Martin Luther King Jr. and Che Guevara and a “Resist Patriarchy” sign, was piled high with colorful rolls of poster paper, the whiteboard covered with plans for pep rallies.

A post from poet Shawn William caught her eye. “On the day that Trayvon would’ve turned 24, Liam Neeson is going on national talk shows trying to convince the world that he is not a racist.” While promoting a revenge movie, the Hollywood actor confessed that decades earlier, after a female friend told him she’d been raped by a black man she could not identify, he’d roamed the streets hunting for black men to harm.

For Wysinger, an activist whose podcast The C-Dubb Show frequently explores anti-black racism, the troubling episode recalled the nation’s dark history of lynching, when charges of sexual violence against a white woman were used to justify murders of black men.

“White men are so fragile,” she fired off, sharing William’s post with her friends, “and the mere presence of a black person challenges every single thing in them.”

It took just 15 minutes for Facebook to delete her post for violating its community standards for hate speech. And she was warned that if she posted it again, she’d be banned for 72 hours.

“Lawmakers often tell me we have too much power over speech, and frankly, I agree.” Mark Zuckerberg

Wysinger glared at her phone, but wasn’t surprised. She says black people can’t talk about racism on Facebook without risking having their posts removed and being locked out of their accounts in a punishment commonly referred to as “Facebook jail.” For Wysinger, the Neeson post was just another example of Facebook arbitrarily deciding that talking about racism is racist.

“It is exhausting,” she says, “and it drains you emotionally.”

Black activists say hate speech policies and content moderation systems formulated by a company built by and dominated by white men fail the very people Facebook claims it’s trying to protect. Not only are the voices of marginalized groups disproportionately stifled, Facebook rarely takes action on repeated reports of racial slurs, violent threats and harassment campaigns targeting black users, they say.

Many of these users now think twice before posting updates on Facebook or they limit how widely their posts are shared. Yet few can afford to leave the single largest and most powerful social media platform for sharing information and creating community.

So to avoid being flagged, they use digital slang such as “wypipo,” emojis or hashtags to elude Facebook’s computer algorithms and content moderators. They operate under aliases and maintain backup accounts to avoid losing content and access to their community. And they’ve developed a buddy system to alert friends and followers when a fellow black activist has been sent to Facebook jail.

They call it getting “Zucked,” and black activists say these bans have serious repercussions, not just cutting people off from friends and family for hours, days or weeks at a time, but often from the Facebook pages they operate for their businesses and nonprofits.

A couple of weeks ago, Black Lives Matter organizer Tanya Faison had one of her posts removed as hate speech. “Dear white people,” she wrote in the post, “it is not my job to educate you or to donate my emotional labor to make sure you are informed. If you take advantage of that time and labor, you will definitely get the elbow when I see you.” After being alerted by USA TODAY, Facebook apologized to Faison and reversed its decision.

“What we continue to see time and time again is what’s framed as race-neutral decision-making ends up being overtly hostile to the communities most in need of some of those free speech protections.” Brandi Collins-Dexter Senior campaign director at Color Of Change

“That’s the biggest thing, making sure we are in tune with this community and the way they actually speak about these topics, and making sure our policies are in line and in touch.” Neil Potts Public policy director at Facebook

“If I were to sit down with Mark Zuckerberg, the message I would want to get across to him is: You may not even realize how powerful a thing you have created. Entire revolutions could take place on this platform. Global change could happen. But that can’t happen if real people can’t take part.” Natasha Marin Seattle black anti-racism consultant and conceptual artist

‘Black people are punished on Facebook’

“Black people are punished on Facebook for speaking directly to the racism we have experienced,” says Seattle black anti-racism consultant and conceptual artist Natasha Marin.

Marin says she’s one of Facebook’s biggest fans. She created a “reparations” fund that has aided a quarter-million people with small donations: transportation to medical appointments and prescription costs for elderly folks, groceries or rent for single moms, supplies for struggling new parents. More recently, she started a social media project spreading “black joy” rather than black trauma.

She also was banned by Facebook for three days for posting a screenshot of a racist message she received.

“For me as a black woman, this platform has allowed me to say and do things I wouldn’t otherwise be able to do,” she says. “Facebook is also a place that has allowed things like death threats against me and my children. And Facebook is responsible for the fact that I am completely desensitized to the N-word.”

Seven out of 10 black U.S. adults use Facebook and 43% use Instagram, according to the Pew Research Center. And black millennials are even more engaged on social media. More than half – 55% – of black millennials spend at least one hour a day on social media, 6 percentage points higher than all millennials, while 29% say they spend at least three hours a day, 9 points higher than all millennials, Nielsen surveys found.

The rise of #BlackLivesMatter and other hashtag movements shows how vital social media platforms have become for civil rights activists. About half of black users turn to social media to express their political views or to get involved in issues that are important to them, according to the Pew Research Center.

These hashtag movements, coming against the backdrop of an upsurge in hate crimes, have helped put the killings of unarmed African Americans by police officers on the public agenda, along with racial disparities in employment, health and other key areas.

“There should be policies and community standards that overtly support that kind of work,” Marin says. “Maybe Mark Zuckerberg needs to sit down with a bunch of black women who use Facebook and just listen.”

How Facebook judges which speech is hateful

For years, Facebook was widely celebrated as a platform that empowered people to bypass mainstream media or oppressive governments to directly tell their story. Now, in the eyes of some, it has assumed the role of censor.

With more than a quarter of the world’s population on Facebook, the social media giant says it’s wrestling with its unprecedented power to judge what speech is hateful.

All across the political spectrum, from the far right to the far left, Facebook gets flak for its judgment calls. To help sort what’s allowed and what’s not, it relies on a 40-page list of rules called “Community Standards,” which were made public for the first time last year.

Facebook defines hate speech as an attack against a “protected characteristic,” such as race, gender, sexuality or religion. And each individual or group is treated equally. The rules are enforced by a combination of algorithms and human moderators trained to scrub hate speech from Facebook.

From July to September 2018, Facebook removed 2.9 million pieces of content that it said violated its hate speech rules, more than half of which was flagged by its technology.

The tag team of algorithms and moderators frequently makes mistakes when flagging and removing content, Facebook acknowledges. And it has taken steps to try to make its system more accountable. Last year, Facebook began allowing users to file an appeal when their individual posts are removed. This year, the company plans to introduce an independent body of experts to review some of those appeals.

In late 2017 and early 2018, Facebook explored whether certain groups should be afforded more protection than others. For now, the company has decided to maintain its policy of protecting all racial and ethnic groups equally, even if they do not face oppression or marginalization, says Neil Potts, public policy director at Facebook. Applying more “nuanced” rules to the daily tidal wave of content rushing through Facebook and its other apps would be very challenging, he says.

Potts acknowledges that Facebook doesn’t always read the room correctly, confusing advocacy and commentary on racism and white complicity in anti-blackness with attacks on a protected group of people. Facebook is looking into ways to identify when oppressed or marginalized users are “speaking to power,” Potts says. And it’s conducting ongoing research into the experiences of the black community on its platform.

“That’s, on its face, the type of speech we want to encourage, but words and people aren’t perfect, so it doesn’t always come across as that,” he says.

Facebook wants to make sure its policies “reflect how people speak about these topics.” “That’s the biggest thing,” Potts says, “making sure we are in tune with this community and the way they actually speak about these topics.”

‘Another slap in the face’

Ayo Henry, a mother of four from Providence, Rhode Island, says Facebook’s policies could not be more out of touch.

Last year, Henry was cut off by a kid on a bike wearing a Confederate sweatshirt when she pulled into the parking lot of a sandwich shop. She honked her horn. He responded twice with a racial slur.

She restrained herself in front of her children, but a few weeks later, after leaving roller derby practice, Henry spotted the kid again, wearing the same sweatshirt. The boy tried to pedal away. She pulled out her phone.

He apologized, explaining he “wasn’t in a good mood that day.” She realized how young he was as his body trembled and hands shook. She tried to offer him some motherly advice on why he should not use racial slurs.

Henry’s video of the exchange was viewed more than 2 million times on Facebook. Within 48 hours, Facebook took the footage down, saying it ran afoul of its hate speech rules. Henry appealed the decision but Facebook refused to reverse it.

In the meantime, her Messenger inbox filled with hundreds of racial slurs, derogatory messages and threats that she would be raped or killed. Yet each time Henry tried to privately share the video with her friends on Messenger, Facebook blocked her.

“Social media is supposed to be a way that people can come together and be able to communicate relatively freely,” she says. “For us, it has become just another slap in the face.”

Civil rights groups push for audit, accountability

Early on, the Black Lives Matter social justice movement turned to Facebook as an organizing tool. Yet its organizers say they were soon set upon by bands of white supremacists who targeted them with racial slurs and violent threats. In 2015, Color Of Change, which was formed after Hurricane Katrina to organize racial justice campaigns on the internet, began pressuring Facebook to stop the harassment of black activists by hate groups.

Chanelle Helm, a Black Lives Matter organizer from Louisville, Kentucky, says the threats intensified in the form of doxxing – posting organizers’ addresses, phone numbers and photos on the internet. Faison, a founding member of the Sacramento chapter of Black Lives Matter, was stalked. “It got a lot more serious,” Helm says. “They were threatening folks with doxxing of family members.”

Facebook removed a group responsible for some of the harassment, but Color Of Change and other civil rights groups say they struggled to get the company to address other complaints. Late last year, The New York Times reported that Facebook had hired a Republican opposition research firm to discredit Color Of Change and other Facebook critics.

“What we continue to see time and time again is what’s framed as race-neutral decision-making ends up being overtly hostile to the communities most in need of some of those free speech protections,” says Brandi Collins-Dexter, senior campaign director at Color Of Change.

In 2016 and again in 2017, civil rights and other groups wrote letters urging Facebook to conduct an independent civil rights audit of its content moderation system and to create a task force to institute the recommendations.

Last May, Facebook agreed to an audit as it was trying to control the damage from revelations that a shadowy Russian organization posing as Americans had targeted unsuspecting users with divisive political messages to sow discord surrounding the 2016 presidential election. One of the main targets of the Internet Research Agency on Facebook was African Americans. The same day Facebook gave in to demands from civil rights groups, it announced a second audit into allegations of anti-conservative bias led by former Sen. Jon Kyl, an Arizona Republican.

There are few signs of progress in how Facebook deals with racially motivated hate speech against the African American community or the erasure of black users’ speech, says Steven Renderos, senior campaign manager at the Center for Media Justice.

Civil rights organizations say they’ve largely given up on Facebook voluntarily taking steps to protect black users, calling instead on Congress and the Federal Trade Commission to regulate the company. Color Of Change has asked Zuckerberg and COO Sheryl Sandberg to take part in a civil rights summit this spring, but they have not agreed. Color Of Change also is pushing a resolution at Facebook’s shareholder meeting in May to replace Zuckerberg as chairman of the board.

“At the end of the day, Facebook hasn’t tackled one of the biggest issues of most interest to the civil rights community, which is how it deals with content moderation and how the platform will become a place that civil rights are protected,” Renderos says. “We, and a lot of organizations that we work with, are frankly tired of waiting for Facebook to decide what changes it’s going to make for itself.”

‘I don’t think Facebook cares’

In February, Wysinger decided not to risk being booted off Facebook by republishing her post about Neeson, the actor. Just days before her 40th birthday, she did not want to get thrown in Facebook jail and miss the chance to celebrate with family and friends. But, she says, she wants Facebook to know that, in silencing black people, the company is causing them harm.

“Facebook is not looking to protect me or any other person of color or any other marginalized citizen who are being attacked by hate speech,” she says. “We get trolls all the time. People who troll your page and say hateful things. But nobody is looking to protect us from it. They are just looking to protect their bottom line.”

BRITTANY HOSEA-SMALL FOR USA TODAY Carolyn Wysinger, a teacher in Richmond, California, says censoring on Facebook “drains you emotionally.”
