Business Standard

YouTube to ‘bury’ extremist videos

- DAISUKE WAKABAYASHI, Oakland (California), 19 June

YouTube has struggled for years with videos that promote offensive viewpoints but do not necessarily violate the company’s guidelines for removal. Now it is taking a new approach: Bury them.

The issue has gained new prominence amid media reports that one of the London Bridge attackers became radicalised by watching YouTube videos of an American Islamic preacher, whose sermons have been described as employing extremely charged religious and sectarian language.

On Sunday, Google, YouTube’s parent company, announced a set of policies aimed at curbing extremist videos on the platform. For videos that are clearly in violation of its community guidelines, such as those promoting terrorism, Google said it would quickly identify and remove them. The process for handling videos that do not necessarily violate specific rules of conduct is more complicated.

Under the policy change, Google said offensive videos that did not meet its standard for removal — for example, videos promoting the subjugation of religions or races without inciting violence — would come with a warning and could not be monetised with advertising, or be recommended, endorsed or commented on by users. Such videos were already barred from carrying advertising, but they were not restricted in any other way.

“That means these videos will have less engagement and be harder to find,” Kent Walker, Google’s general counsel and senior vice-president, wrote in a company blog post on Sunday. “We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints.”

Google, which has relied on computer-based video analysis for the removal of most of its terrorism-related content, said it would devote more engineering resources to help identify and remove potentially problematic videos. It also said it would enlist experts from non-governmental organisations to help determine which videos were violent propaganda and which were religious or newsworthy speech.

Google said it would rely on the specialised knowledge of groups with experts on issues like hate speech, self-harm and terrorism. The company also said it planned to work with counter-extremist groups to help identify content aimed at radicalising or recruiting extremists.

By allowing anyone to upload videos to YouTube, Google has created a thriving video platform that appeals to people with a wide range of interests. But it has also become a magnet for extremist groups that can reach a wide audience for their racist or intolerant views. Google has long wrestled with how to curb that type of content while not inhibiting the freedom that makes YouTube popular.

Part of the challenge is the sheer volume of videos uploaded to YouTube. The company has said that more than 400 hours of video content is uploaded to the site every minute, and YouTube has been unable to police that content in real time. Users flag offensive videos for review, while the company’s algorithms comb the site for potential problems. Videos with nudity, graphic violence or copyrighted material are usually taken down quickly.

Companies throughout the tech industry are working on how to keep platforms for user-generated content open without allowing them to become dens of extremism. Like YouTube, social media companies have found that policing content is a never-ending challenge. Last week, Facebook said it would use artificial intelligence combined with human moderators to root out extremist content from its social network. Twitter said it suspended 377,000 accounts in the second half of 2016 for violations related to the “promotion of terrorism”.

In the aftermath of terror attacks in Manchester and London, Prime Minister Theresa May criticised large internet companies for providing the “safe space” that allows radical ideologies to spread. According to news media reports, friends and relatives of Khuram Shazad Butt, identified as one of the three knife-wielding attackers on London Bridge, were worried about the influence of YouTube videos of sermons by Ahmad Musa Jibril, an Islamic cleric from Dearborn, Mich.

Jibril’s sermons demonstrate YouTube’s quandary because he “does not explicitly call to violent jihad, but supports individual foreign fighters and justifies the Syrian conflict in highly emotive terms”, according to a report by the International Center for the Study of Radicalization and Political Violence.

A spokesperson for YouTube said the new policies were not the result of any single violent episode, but part of an effort to improve its service. Google did not respond to a question about whether Jibril’s videos would fall under Google’s guidelines for videos containing inflammatory language but not violating its policies. Jibril still has videos on YouTube, but without ads.

In its blog post, Google acknowledged that “more needs to be done” to remove terrorism-related content from its service. YouTube said it would do more in “counter-radicalisation” efforts, including targeting potential Islamic State recruits with videos that could change their minds about joining the organisation. Google said that in previous counter-radicalisation attempts, users clicked on ads at an “unusually high rate” to watch videos that debunk terrorism recruitment messages.

Google also announced a series of measures aimed at identifying extremist videos more quickly, an effort that the company started this year as YouTube tries to assure advertisers that its platform is safe for their marketing dollars.

YouTube came under fire this year when The Times of London and other news outlets found examples of brands that inadvertently funded extremist groups through automated advertising — a byproduct of YouTube’s revenue-sharing model that provides content creators a portion of ad dollars.

Brands such as AT&T and Enterprise Rent-A-Car pulled ads from YouTube. Google responded by changing the types of videos that can carry advertising, blocking ads on videos with hate speech or discriminatory content. Google also created a system to allow advertisers to exclude specific sites and channels in YouTube and Google’s display network.