San Francisco Chronicle

Battle over hate speech intensifies at Reddit

- By Suhauna Hussain

For years Jefferson Kelley watched hate bloom in his treasured online spaces.

When Kelley, a Reddit moderator, booted hateful users off threads where Black people discussed sensitive personal experiences, racial slurs piled up in his inbox. Crude remarks about women filled the comment sections under his favorite “Star Trek” GIFs. The proliferation of notorious forums, including one that perpetuated a vicious racist stereotype about Black fathers, stung Kelley, a Black father himself.

Kelley and other moderators repeatedly pleaded with the company to back them up and take stronger action against harassment and hate speech. But Reddit never quite came through.

Then, all of a sudden, that seemed to change. When Reddit announced this month it is shutting down a noxious pro-President Trump group that had violated the site’s rules for years, Kelley could scarcely believe it.

Reddit’s move to overhaul content policy and ban some 2,000 subreddits, or forums, is one of the most sweeping enforcement actions the company has taken to date. To Kelley and other Black moderators, it was a sign that the company might finally begin real work to stem the flow of harassment and abuse they faced on a daily basis.

The bans — which coincided with a wave of aggressive moves by other large internet services including Facebook and YouTube — came after hundreds of Reddit moderators signed a letter urging the company to take racism seriously. It also followed the resignation this month of Alexis Ohanian, one of Reddit’s co-founders, from the company’s board of directors. Ohanian, who said he had been moved by the protests over the killing of George Floyd, asked that his board seat be filled by a Black candidate.

Tech companies have long been under fire for allowing false information and discriminatory ideologies to spread, and for weak or inconsistent enforcement of policies against hate speech and harassment. Hesitant to provoke backlash from conservative critics and far-right agitators, leaders have often argued that their services are neutral grounds akin to public spaces, and pointed to “free speech” values as reason for their inaction.

But the rapid approach of a presidential election amid a global pandemic and a nationwide movement over Floyd’s killing have engineered a tipping point. The math has changed, and tech services have seemingly, as one journalist quipped, “decided that the grief they’re getting for tolerating hate is more trouble than the grief they’d get for not tolerating hate.”

For moderators, who had spent years trying in vain to get the ear of Reddit’s leaders, the effect of this sudden shift was as if the brick wall they’d been pushing on suddenly transformed into a swinging door.

When Kelley started lurking on Reddit in 2014, he was there mostly for the “Star Trek” content. After several years participating enthusiastically in the r/StarTrekGIFs forum, he took charge of it, volunteering as an unpaid moderator in 2016. Reddit quickly became core to his social life. Kelley made GIFs and he made friends. He even started recording a podcast, “Beyond Trek,” with the people he met.

Kelley had always noticed the stream of hate, but when he began moderating the prominent Black People Twitter subreddit in 2017, the stream turned into a torrent.

Users mockingly labeled a Black student’s admission to Harvard Medical School an affirmative action case and promoted misleading, racist narratives about “black on black crime.”

The forum was supposed to provide respite from racism, so Kelley and its other moderators came up with new rules: Comments would initially be open to all, but if the bad-faith remarks piled up, the thread would be put in “Country Club” mode, in which only users the moderators manually verified could comment. (The name is a tongue-in-cheek reference to the history of Black people being excluded from country clubs.)

Although this tactic succeeded in improving discourse in the forum, Kelley, as its moderator, paid a price. On an average day, he might receive 50 messages containing “n—” or other racist words. (With Black Lives Matter protests surging, that number has only increased, he said.)

In its outlines, his story resembles those of legions of other moderators who manage enormous communities on Reddit. Like Kelley, many joined for friends and community, only to become disillusioned.

In 2015, moderators shut down more than 265 subreddits in protest of the company’s firing of Victoria Taylor, a then-employee of Reddit who served as a useful resource to moderators. The revolt was a culmination of mounting frustration that the company did not appreciate their work or provide proper moderation tools. Company co-founder Ohanian responded at the time, acknowledging the situation was handled poorly and promising to address moderators’ concerns.

@TheYellowRose, a moderator of the subreddit r/blackladies, told the Atlantic she and her fellow “mods” were harassed in the wake of the 2014 Black Lives Matter protests in Ferguson, Mo. Her team wrote an open letter titled, “We have a racist user problem and Reddit won’t take action.” She said the letter, signed by mods overseeing dozens of subreddits, received no response.

Over the years, Reddit occasionally quarantined or banned handfuls of subreddits and tweaked its policies in response to public backlash. Overall, the changes failed to stem the flow of harassment.

It wasn’t even clear that Reddit’s leadership considered that a goal. In 2018, when a user asked Reddit CEO Steve Huffman whether “obvious open racism, including slurs,” was against the company’s rules, Huffman said it wasn’t. (He added later that although racism was unwelcome, it wasn’t prohibited.)

It wasn’t until September 2019 that the company, in the course of removing a dozen white nationalist subreddits, more explicitly banned harassment and bullying.

When the Black Lives Matter movement succeeded in mobilizing millions of protesters following Floyd’s death, it unleashed pressure that had been building for years, pushing Reddit users long uncomfortable with the site’s culture to act for the first time and bringing executives to the table.

A moderator of the subreddit r/AgainstHateSubreddits, @DubTeeDub, was angered by what he saw as hypocrisy in Huffman’s somber public note affirming support for Black Lives Matter. Huffman wrote: “We do not tolerate hate, racism, and violence, and while we have work to do to fight these on our platform, our values are clear.”

@DubTeeDub drafted a letter demanding change. Hundreds of moderators including Kelley signed the June 8 open letter to Huffman and Reddit’s board. Within a day of publishing, @DubTeeDub received a message from @ggAlex, who introduced himself as Alex Le, the company’s vice president of product.

The introduction led to @DubTeeDub and other moderators of r/AgainstHateSubreddits being invited to a series of Zoom videoconference calls with Reddit’s paid administrators and executives. The sessions were presented as part of the company’s outreach to moderators fighting hate, Black users and other marginalized groups.

The response to the letter was notable because communities one wouldn’t normally expect to offer support did, said J. Nathan Matias, an assistant professor at Cornell who studies digital governance and behavior. “The music discussion community, the relationship advice community, the community for talking about swimming — you wouldn’t normally see communities like that as focused on social change,” he said, “so it’s actually a big deal.”

In 2015, when Reddit banned an offensive fat-shaming subreddit, the outcry over censorship from various communities was swift and intense. The front page of Reddit, which features the site’s most-engaged-with content, was plastered nonstop with posts decrying the ban; hundreds of imitation “Fat People Hate” subreddits popped up; users posted private information about Reddit admins who helped carry out enforcement.

But it’s clear the culture has changed drastically since then, @DubTeeDub said. “People are getting very tired of being associated with a website that has such a dominating hateful ideology,” he said.

Kelley is part of the shift: Although he has always fought hard to make his own Reddit communities hate-free, Kelley said he had never before devoted time to broader internal efforts.

Kelley joined several of the Zoom calls with Reddit administrators and executives. He’s accumulated a laundry list of ideas for how the company can better support moderators, based on his own experience. One suggestion: placing more obstacles in front of users who message mods. It’s not uncommon for someone to create six different Reddit accounts to spam a moderator’s inbox over and over. An extra identity-verification step might weed out people acting in bad faith, he said.

The “Black Fathers” subreddit provides a glaring example of Reddit’s inaction on racism over the years. The name suggests a space filled with posts by Black men attempting to do their daughters’ hair and other similarly wholesome content, Kelley said. But, in fact, the subreddit was meant as one big racist joke based on the stereotype of absent Black fathers. The moderators who created the subreddit many years ago restricted posting so that the only visible message was, “There doesn’t seem to be anything here.”

r/BlackFathers remained on the site for years. The company quarantined the group in 2015 but didn’t go as far as banning it until last month.

When asked about prolonged inaction on subreddits such as r/BlackFathers, Reddit pointed to a statement by Huffman.

He said that although the company had been getting better at enforcement and measurably reducing hateful experiences every year, “ultimately, it’s our responsibility to support our communities by taking stronger action against those who try to weaponize parts of Reddit against other people.”

Reddit is not the only internet service rethinking its responsibility to regulate content.

In late May, Twitter slapped warning labels on tweets by Trump that made false claims or glorified violence toward protesters, becoming the first company to challenge his pattern of lying and bullying on social media. On June 3, Snapchat said it would no longer promote Trump’s account in the “Discover” tab of the app. On June 18, Facebook removed dozens of ads placed by Trump’s reelection campaign for using Nazi imagery, and a week later the company said it would label or remove politicians’ posts when they violated rules — including posts by Trump.

Then, on the same day that Reddit handed down its bans, Amazon-owned streaming service Twitch temporarily suspended Trump’s channel over “hateful conduct,” and YouTube banned half a dozen prominent white supremacist channels, including those of David Duke and Richard Spencer.

Experts say the changes sweeping the industry have probably been spurred by a confluence of advertisers threatening to pull their ad dollars from big companies, negative press, internal pressure by employees, and diminishing public goodwill.

Photo: Dreamstime. Reddit announced last week it is shutting down a noxious pro-Trump group.
