Los Angeles Times

YouTube curtails QAnon videos

Site will ban content that targets specific people or groups.


YouTube will ban videos that promote QAnon and other conspiracy theories, but only if they target specific people or groups, seeking to crack down on potentially dangerous misinformation after criticism that the service helped these fringe movements expand.

The decision comes a week after Facebook Inc. said it would remove accounts associated with QAnon, a far-right movement that the FBI has reportedly labeled a domestic terrorism threat.

YouTube’s ban is an attempt to stamp out the conspiracy theory without hindering the massive volume of news and political commentary on its service. Rather than a blanket prohibition of QAnon videos or accounts, YouTube is expanding its hate and harassment policies to include conspiracies that “justify real-world violence,” the company said Thursday.

“Context matters, so news coverage on these issues or content discussing them without targeting individuals or protected groups, may stay up,” YouTube, a unit of Alphabet Inc.’s Google, wrote in a blog post.

Technology platforms have released a blitz of new rules to curb misinformation amid mounting momentum for movements such as QAnon.

Twitter Inc. recently said it would make it harder for people to find tweets supporting QAnon, while Etsy Inc. removed QAnon-related merchandise from its online marketplace.

Pressure for these companies to act has been building for months. YouTube already instituted a policy similar to Twitter’s, although it did not publicize it. Starting last year, the service began to treat QAnon videos as “borderline content,” meaning the clips are recommended and shown in search results less often. Views from recommendations on “prominent” QAnon videos have dropped 80% since then, the company said.

YouTube was a key driver of QAnon’s early popularity, said Angelo Carusone, president and chief executive of Media Matters for America, a nonprofit group that analyzes conservative misinformation.

A QAnon evangelist called PrayingMedic attracted almost 400,000 subscribers to his YouTube channel, for instance. And even after YouTube’s borderline content move last year, QAnon videos spread from the Google service to other sites. YouTube broadcasts about the conspiracy theory featured regularly in Facebook groups and pages, until Facebook’s recent ban. YouTube QAnon clips also continued to be shared on other niche services such as Parler.

Still, Carusone said YouTube’s efforts to slow the spread of the conspiracy theory have been relatively effective in recent months.

The tech platforms and QAnon supporters will now probably enter into a game of cat and mouse, in which users come up with new hashtags and different claims to evade automated filters. QAnon followers have proved particularly adept at this, Carusone said.

“There has never been a community where their participants are as adaptable,” he said.

A significant unanswered question is how well YouTube can identify videos designed to be less obvious upon initial inspection, Carusone added.

“It is very easy for them to identify explicitly identified QAnon content and accounts,” he said. “What they have not articulated is how well that can be applied to less explicit accounts.”
