Los Angeles Times

Google targets hate videos

Firm takes steps to remove offensive clips from YouTube after some advertisers flee.

- By Rachel Spacek rachel.spacek@latimes.com

Months after some advertisers fled Google over concerns about ads appearing alongside YouTube videos that promote hate and extremism, the Internet giant has announced new steps aimed at tackling such content.

“There should be no place for terrorist content on our services,” Google said in a Sunday blog post outlining ways that it will identify problematic videos and remove them from YouTube — or at least stop them from being monetized and make them harder to find.

In March, after a report by the Times of London showed examples of ads appearing next to videos by homophobic British preacher Steven Anderson and American white supremacist David Duke, brands including AT&T, Verizon and Enterprise Rent-A-Car said they would halt or reduce deals to advertise with Google.

The uproar centered on ads placed on YouTube as well as websites and apps that use Google’s ad technology. It was a real concern for Google’s parent company, Alphabet Inc., which has struggled to generate significant profits outside of advertising.

In its Sunday blog post, Google said one way that it will fight extremist-related content is by devoting more resources to applying advanced machine-learning research. More than half of the terrorism-related content Google has removed in the last six months was found and assessed by video analysis models, it said; this step will build on that.

Google also said it plans to increase the number of independent experts in YouTube’s Trusted Flagger program, in which a tier of trusted people alert the company to problematic videos. It will add 50 expert nongovernmental organizations to the 63 that are already part of the program.

When it comes to videos that are troublesome but do not clearly violate the company’s policies, such as those that contain inflammatory supremacist content, Google said it will take a tougher stance. It said those videos will be preceded by a warning and will not have advertisements, will not be recommended and will not be eligible for comments.

“We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints,” the company said in the blog post, written by Google general counsel Kent Walker.

Finally, Google announced the expansion of two programs that try to sway people’s opinions: Creators for Change, which promotes YouTube videos that are against hate and radicalization, and the Redirect Method, which uses targeted online ads to reach potential Islamic State recruits and redirect them to anti-terrorism videos.

Seamus Hughes, deputy director of the Program on Extremism at George Washington University, said Google’s announcement is a positive step and signifies that “they are looking at this with fresh eyes.”

And since Google is such a large company, he said, it will lead other companies that host user-generated material to make further changes in fighting terrorism-related content.

AT&T was not moved by Google’s announcement.

“We are deeply concerned that our ads may have appeared alongside YouTube content promoting terrorism and hate,” it said, reiterating a statement it made in March. “Until Google can ensure this won’t happen again, we are removing our ads from Google’s nonsearch platforms.”

Bertrand Guay / AFP/Getty Images: A GOOGLE booth in Paris. “There should be no place for terrorist content on our services,” Google said.
