USA TODAY US Edition

YouTube cracks down on extremist, problem videos

- Mike Snider @mikesnider USA TODAY

YouTube says it’s accelerating its efforts to combat extremist content online with a new tack: making such videos harder to find.

“While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now,” Kent Walker, senior vice-president and general counsel of Google, said in a blog post on Google.org and an op-ed in the Financial Times.

The prevalence of extremist content on YouTube became an issue again this month when it was revealed that one of the three attackers in the London Bridge terror incident June 3 had been influenced by YouTube videos of Ahmad Musa Jebril, a Dearborn, Mich., cleric popular with Islamic State fighters who has developed an international following in recent years. The three attackers, who were killed by police, drove a van into pedestrians on London Bridge, then got out of the van to stab others in a nearby market, killing eight and injuring dozens.

In March, Google and YouTube found themselves facing irate advertisers, many of whom pulled their business after finding their ads had played on videos promoting terrorism and extremist content on the video service. YouTube responded by establishing a 10,000-view requirement for entry into its YouTube Partner Program, which lets creators earn revenue from ads running on their videos.

The tech giant also improved its use of machine learning technology to prevent ads from automatically running alongside extremist or other violent content. Now it’s applying that research to train its staff, and it’s rolling out other measures, including adding warnings to extremist videos and disabling comments, steps that will make such videos harder to popularize.

“We will now devote more engineering resources to apply our most advanced machine learning research to train new ‘content classifiers’ to help us more quickly identify and remove such content,” Walker said.

YouTube will also add more “Trusted Flaggers,” human experts who help spot problem videos, Walker said. The roster of non-governmental organizations that help Google and YouTube find troublesome content will nearly double, with 50 NGOs joining the 63 current participants, and Google will support them with operational grants, he said.

Videos that contain inflammatory religious or supremacist content but do not clearly violate YouTube’s policies against encouraging terrorism will appear with a warning and will not be able to earn ad revenue. Nor will viewers be able to endorse or comment on them, Walker said. “That means these videos will have less engagement and be harder to find,” he said. “We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints.”

In addition to its Creators for Change program, launched last year to promote voices against hate and radicalization, YouTube will work with Jigsaw, an incubator within Google’s parent company, Alphabet, to redirect potential ISIS recruits to anti-terrorist videos.

Along with Facebook, Microsoft and Twitter, Google and YouTube are working to establish an international online terrorism forum. “Together, we can build lasting solutions that address the threats to our security and our freedoms,” Walker said.

