Twitter begins enforcing stricter content rules
Social media giant begins enforcing new rules to reduce abusive, hateful content on its site
SAN FRANCISCO >> Twitter on Monday suspended the accounts of two leaders of Britain First, a far-right political group whose anti-Muslim videos were retweeted by President Donald Trump last month.
The suspensions came on the same day Twitter started enforcing stricter rules aimed at combating hateful and abusive content. The move comes as the company and other social media firms ramp up efforts to police offensive content on their sites.
“We’re making these changes to create a safer environment for everyone,” Twitter said in a blog post.
Hours after the company announced it was enforcing its new rules, the accounts of Britain First’s leaders Paul Golding and Jayda Fransen were blocked, preventing the two from posting on the social media site. Britain First’s main account, @BritainFirstHQ, was also pulled down on Monday.
But even as they work to combat hate speech online, tech firms have faced criticism this year for how they interpret their online rules. While some groups applauded the enforcement of Twitter’s new rules on Monday, others accused the tech firm of censoring free speech.
In November, Oakland’s Muslim Advocates asked Twitter to pull down the accounts of Fransen and Britain First for violating rules about inciting violence against a particular group. Trump shared videos from the group that purportedly showed Muslims committing violent acts.
Muslim Advocates, which also sits on Twitter’s Trust and Safety Council, told the company in a letter that Britain First was trying to “perpetuate and fuel hatred and fear of Muslims.”
At first, Twitter said the tweets were newsworthy, but the company later clarified that they did not violate its media policy, which allows some forms of graphic violence.
On Monday, Twitter suspended the accounts. The company declined to comment on which accounts it took down and why, but noted that it is reviewing its new policy on hateful and abusive content.
Muslim Advocates applauded the decision by Twitter to pull down the accounts tied to Britain First.
“As social media sites have become the central organizing hub for America’s hate groups, companies like Twitter and Facebook have a responsibility to ensure that their platforms are not used to sow violence and hate,” said Madihha Ahussain, Muslim Advocates’ special counsel for anti-Muslim bigotry, in a statement.
But she also signaled that the work of combating abuse and hate on and off social media sites is far from over.
“Despite the progress that still needs to be made, today marks an important step toward reducing the online bigotry and violence that too often spills over to the rest of the country,” Ahussain said.
As Twitter started enforcing its new rules on Monday, some white nationalists and white supremacy groups criticized the tech firms for pulling down their social media accounts.
The accounts of white nationalist Jared Taylor and American Renaissance — a white supremacist magazine he founded — were also suspended by Monday. In a post on Gab, American Renaissance denied that it had violated Twitter’s online rules, including its new ones.
“This is pure, politically motivated censorship,” the group wrote.
Other well-known white nationalists, including National Policy Institute President Richard Spencer, remained on Twitter.
Among the new Twitter rules that went into effect on Monday was the permanent suspension of accounts affiliated with groups that use violence to advance their cause. To determine whether an account has ties to such groups, Twitter said it will look at users’ behavior, such as whether they’re funding a violent extremist group, promoting its acts or recruiting for it.
The new rules don’t apply to military or government entities or groups with elected public officials, the company said.
Twitter said it will also allow some hateful imagery and hate symbols in tweets, but users will be warned that it’s sensitive content.
“While we want people to feel free to share media that reflects their creativity or individuality, or to show what’s happening in the world, we will take action when it crosses the line into abuse towards a person, group, or protected category,” the company’s help center states.
Twitter, which allows users to display pseudonyms, said that it will also take down accounts that use their display name, profile bio or username to target a protected group of people with violent threats or abusive slurs.
With 330 million monthly active users, Twitter also acknowledged that its interpretation of the new rules may not always be perfect. “In our efforts to be more aggressive here, we may make some mistakes and are working on a robust appeals process,” Twitter said.