San Francisco Chronicle

Google, YouTube pushing to thwart zealotry

- By Daisuke Wakabayashi

YouTube has struggled for years with videos that promote offensive viewpoints but do not necessarily violate the company’s guidelines for removal. Now it is taking a new approach: Bury them.

The issue has gained new prominence amid media reports that one of the London Bridge attackers became radicalized by watching YouTube videos of an American Islamic preacher, whose sermons have been described as employing extremely charged religious and sectarian language.

On Sunday, Google, YouTube’s parent company, announced a set of policies to curb extremist videos. For videos that are clearly in violation of its community guidelines, such as those promoting terrorism, Google said it would quickly identify and remove them. The process for handling videos that do not necessarily violate specific rules of conduct is more complicated.

Under the policy change, Google said offensive videos that did not meet its standard for removal — for example, videos promoting the subjugation of religions or races without inciting violence — would come with a warning and could not be monetized with advertising, or be recommended, endorsed or commented on by users. Such videos were already not allowed to include advertising, but they were not restricted in any other way.

“That means these videos will have less engagement and be harder to find,” Kent Walker, Google’s general counsel and senior vice president, wrote in a company blog post. “We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints.”

Google, which has relied on automated video analysis for the removal of most of its terrorism-related content, said it would devote more engineers to help identify and remove potentially problematic videos. It also said it would enlist experts from nongovernmental organizations to help determine which videos were violent propaganda and which were religious or newsworthy speech.

The Mountain View company said it would rely on the specialized knowledge of groups with experts on issues like hate speech, self-harm and terrorism. The company also said it planned to work with counter-extremist groups to help identify content that tried to radicalize or recruit extremists.

By allowing anyone to upload videos to YouTube, Google has created a thriving service that appeals to people with a wide range of interests. But it has also become a magnet for extremists, who can reach a wide audience for their racist or intolerant views.

Google has long wrestled with how to curb that content without inhibiting the freedom that makes YouTube popular.

Part of the challenge is the sheer volume of videos uploaded. The company has said that more than 400 hours of video content is uploaded every minute, and YouTube has been unable to police that in real time. Users flag offensive videos for review, while the company’s algorithms comb the site for potential problems. Videos with nudity, graphic violent footage or copyrighted material are usually taken down quickly.

Companies throughout the tech industry are working on how to keep services for user-generated content open without allowing them to become dens of extremism. Like YouTube, social media companies have found that policing content is a never-ending challenge. Last week, Facebook said it would use artificial intelligence and human moderators to root out extremist content. Twitter said it suspended 377,000 accounts in the second half of 2016 for violations related to the “promotion of terrorism.”

In the aftermath of terror attacks in Manchester and London, British Prime Minister Theresa May criticized large Internet companies for providing the “safe space” that allows radical ideologies to spread. According to news media reports, friends and relatives of Khuram Shazad Butt, identified as one of the three knife-wielding attackers on London Bridge, were worried about the influence of YouTube videos of sermons by Ahmad Musa Jibril, an Islamic cleric from Michigan.

Jibril’s sermons demonstrate YouTube’s quandary because he “does not explicitly call to violent jihad, but supports individual foreign fighters and justifies the Syrian conflict in highly emotive terms,” according to a report by the International Center for the Study of Radicalization and Political Violence.

A spokesman for YouTube said the new policies were not the result of any single violent episode, but part of an effort to improve its service. Google did not respond to a question about whether Jibril’s videos would fall under Google’s guidelines for videos containing inflammatory language but not violating its policies. Jibril still has videos on YouTube, but without ads.

In its blog post, Google acknowledged that “more needs to be done” to remove terrorism-related content. YouTube said it would do more in “counter-radicalization” efforts, including targeting potential Islamic State recruits with videos that could change their minds about joining the organization. Google said that in previous counter-radicalization attempts, users clicked on ads at an “unusually high rate” to watch videos that debunk terrorism recruitment messages.

Google also announced a series of measures to identify extremist videos more quickly, an effort that the company started this year as YouTube tries to assure advertisers that it is safe for their marketing dollars.

YouTube came under fire this year when the Times of London and other news outlets found examples of brands that inadvertently funded extremist groups through automated advertising — a byproduct of YouTube’s revenue-sharing model that provides content creators a portion of ad dollars.

Brands such as AT&T and Enterprise Rent-A-Car pulled ads from YouTube. Google responded by changing the types of videos that can carry advertising, blocking ads on videos with hate speech or discriminatory content. Google also created a system to allow advertisers to exclude specific sites and channels in YouTube and Google’s display network.
