
Google-funded extremism report sidesteps YouTube

Scant reference to role of Google’s own social video platform

- Will Carless and Jessica Guynn

A Google-funded report examines the relationship between white supremacists and the internet, but it makes scant reference – all of it positive – to YouTube, the company’s platform that many experts blame more than any other for driving people to extremism.

The report, by Jigsaw, a “tech incubator” that has operated within Google for the past decade, draws from interviews with dozens of former extremists and describes how the internet is a breeding ground for hate groups.

Study after study has shown that YouTube serves as a megaphone for white supremacists and other hate groups and a pipeline for recruits. YouTube’s algorithm has been found to direct users to extreme content, sucking them into violent ideologies.

“They’re underemphasizing the role that their own technology and their own platforms have in pushing people towards extremism,” said Bridget Todd, a writer and host of the podcast “There are No Girls on the Internet.”

“Individuals certainly have a responsibility to not allow themselves to be engulfed in extremist content,” Todd said. “But if you’re a platform like Google, you can’t just emphasize the individual’s responsibility and completely obscure the fact that your massive platform has allowed online extremist content to fester and become so popular.”

YouTube’s ‘red pill’ videos

Like other tech platforms, YouTube has recently steered more resources toward content moderation. The company says it has vastly reduced views of supremacist videos and continues to develop countermeasures against hate speech.

But researchers who have watched people become radicalized via YouTube ask what has taken one of the world’s largest companies so long to react to the growing problem of homegrown extremism.

“When you talk to folks who were in the (white supremacist) movement, or when you read in the chat rooms these people talk in, it’s almost all about YouTube,” said Megan Squire, a computer science professor at Elon University who studies online extremism.

“Their ‘red pill’ moment is almost always on YouTube,” Squire said, referring to a term popular with the far right for the moment when people come to believe that white supremacists and other conspiracy theorists have been right all along.

Squire and others suggested several steps Google could immediately take to address the problems outlined in the Jigsaw report. It could provide funding for some of the anti-extremist nonprofits lauded there. Google could drastically ramp up moderation – Squire said it should increase tenfold. And it could fund academic research into how people are radicalized online.

The tech giant also could open up its data so academics can study platforms and their role in spreading extremist content, several experts said.

The Jigsaw report comes as bipartisan scrutiny of the nation’s leading tech companies is intensifying in Washington, D.C. Google has joined Twitter and Facebook in the spotlight, defending its policies and its record on everything from misinformation to hate speech.

In October, the Justice Department accused Google of violating antitrust laws by stifling competition and harming consumers in online search and advertising.

Little new in Google study

The Jigsaw report, titled “The Current: The White Supremacy Issue,” makes a few key points on how hate metastasizes online. “Lone wolves” – people who have carried out mass shootings and other violent hate crimes – are not alone at all, the report says. They are often connected via online platforms and communities.

The report outlines a growing “alt-tech ecosystem,” in which new social platforms like Gab and Parler attract white supremacis­ts kicked off Facebook and Twitter.

Jigsaw’s researchers detail how supremacists ensnare vulnerable people online with softer versions of their hateful worldview before introducing more extreme concepts.

None of this is new to those who monitor and study extremism.

“It feels very derivative and facile,” Squire said. “I learned nothing from reading this, and that’s disappointing.”

The Jigsaw report addresses such criticism, saying its conclusions won’t be new to victims of discrimination and hate crimes, but “we hope that it may still offer insightful nuance into the evolving tactics of white supremacists online that advance efforts to counter white supremacy.”

YouTube radicalization

Late in 2019, a group of academic researchers from Brazil and Europe published a groundbreaking study that examined radicalization on YouTube. By analyzing more than 72 million YouTube comments, the researchers were able to track users and observe them migrating to more hateful content on the platform.

They concluded that the long-hypothesized “radicalization pipeline” on YouTube does indeed exist, and that the platform’s algorithm sped up radicalization.

“We found a very strong effect,” said Manoel Horta Ribeiro, one of the main authors of the study. “People who were commenting on alt-right channels had previously commented on some of the more gateway channels. It was a pipeline.”

For years, YouTube executives ignored staff warnings that its recommendation feature, which aimed to boost the time people spend on the site and generate more advertising revenue, was fueling the spread of extremist content, according to published reports.

After an outcry from advertisers in 2017, YouTube banned ads from appearing alongside content that promotes hate or discrimination or disparages protected groups. YouTube limited recommendations on those videos and disabled features such as commenting and sharing. But it didn’t remove them. The company said the crackdown reduced views of supremacist videos by 80%.

Last year, YouTube made changes to its recommendation feature to reduce the visibility of what it calls “borderline content,” videos that brush up against its terms of service but do not break them. Also in 2019, it removed thousands of channels and tightened its hate speech policy to ban videos claiming any group is superior “in order to justify discrimination, segregation, or exclusion based on qualities like race, religion or sexual orientation.”

“Over the last several years we’ve taken steps to ensure that those who aim to spread supremacist ideology cannot do so on YouTube,” Alex Joseph, a YouTube spokesperson, said in a statement. “These interventions have had a significant impact, and our work here is ongoing.”

But YouTube still has issues. “The barn door isn’t just open, the horse is already out and it’s trampling babies,” said Talia Lavin, a writer and expert on white supremacists. “Now they want credit for shutting the barn door? I don’t think any credit is due.”

Mourners gather for the funeral of a victim of a mass shooting Feb. 24 in Offenbach am Main, Germany, in which police said the suspect promoted conspiracy theories on YouTube. THOMAS LOHNES/GETTY IMAGES
