Twitch will now police users’ behavior outside the platform
Amazon’s livestreaming service Twitch announced this week that it will enforce its conduct policy on extreme behavior that happens outside its platform. That includes deadly violence, membership in a hate group, terrorism, threats of mass violence, nonconsensual sexual activity, exploitation of children, threats against Twitch staff and any threats of violence at a Twitch event.
“Taking action against misconduct that occurs entirely off our service is a novel approach for both Twitch and the industry at large, but it’s one we believe — and hear from you — is crucial to get right,” Twitch wrote Wednesday in a blog post, detailing its new rules that apply to all Twitch users.
Twitch’s rules previously focused on streamers’ behavior on the platform, and while it had historically taken action against serious misconduct that happened off the platform, it didn’t specify this in its guidelines. The update is a response to multiple incidents, including the wave of #MeToo allegations that swept the gaming industry last year. When several women raised concerns about Twitch streamers, alleging misconduct, the company realized its policy on behavior that occurs outside the platform needed more clarity, Twitch spokesperson Gabriella Raila said.
The company wrote that “until now, we didn’t have an approach that scaled.” Previously, Twitch would review harassment that happened outside its platform and look at evidence before taking action, but it didn’t specifically address users who are leaders or members in hate groups or participate in other extreme behavior.
“There’s something quite provocative about this gesture at a time when major media companies like Twitter and Facebook were years late in deplatforming white supremacists and domestic terrorists who were openly spreading hate speech and inciting violence on their own social media pages,” said Laine Nooney, assistant professor and historian of video games at New York University.
The company has taken several steps in recent months to clean up its platform. In January, Twitch beefed up its policy against hateful images and explicitly banned the Confederate flag. It also remade a popular gaming emote, PogChamp, after the man pictured in the emote tweeted comments encouraging further violence after the Jan. 6 Capitol riot.
Twitch said it is hiring a third-party law firm to support investigations and that it increased the size of its internal team that works with law enforcement. The findings of investigations will be shared with the people involved, but will not be made public by Twitch. Those teams will also look for evidence to verify user reports.
People who report this behavior can submit evidence including direct links to public posts or uploaded content of the user breaking the rules. Twitch notes that screenshots can be edited, so they need to be supported with other verifiable evidence or confirmed as authentic by the law firm. Evidence could include police reports, rape kits, texts, emails, photos or corroboration of stories from third parties, according to Twitch’s Raila.
“In a way, Twitch’s revised guidelines push the platform toward reflecting community norms that are more akin to in-person social relations. I don’t need a friend to be violent in my home to have reason to break off a friendship; I merely need credible evidence that they were violent elsewhere,” Nooney said.