The Phnom Penh Post

YouTube announces plan ‘to fight online terrorism’

- Travis M Andrews

YOUTUBE has long struggled with the conundrum of how to police videos that advocate hateful ideologies but don’t specifically encourage acts of violence.

The basic problem is that such videos don’t break any of the platform’s specific guidelines. Banning some videos based on ideology and not others risks a slippery slope that could damage the primary appeal of YouTube: that users can upload their own content, so long as it’s not illegal, without fear of being censored.

On Sunday, Google, which owns YouTube, announced new policies to help police such content in a blog post by Kent Walker, Google’s general counsel and senior vice president, titled, “Four steps we’re taking today to fight online terror.” It also appeared as an op-ed in the Financial Times.

The first two steps focus on identifying and removing videos that specifically encourage terrorism. But, as Walker wrote, that isn’t always as simple as it sounds, particularly given the platform’s scale: as of 2012, one hour of content was uploaded each second, as AdWeek reported, which works out to roughly a century of video every 10 days.
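The arithmetic behind that figure checks out. A quick back-of-the-envelope check in Python, using only the numbers above:

    # At one hour of video uploaded per second (AdWeek's 2012 figure),
    # how much content arrives in 10 days?
    seconds_in_10_days = 10 * 24 * 3600          # 864,000 seconds
    hours_uploaded = seconds_in_10_days * 1      # one hour of video per second
    hours_per_year = 365.25 * 24                 # roughly 8,766 hours in a year
    print(hours_uploaded / hours_per_year)       # ~98.6 years, roughly a century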

“This can be challenging: a video of a terrorist attack may be informative news reporting by the BBC, or glorification of violence if uploaded in a different context by a different user,” Walker wrote.

Currently, YouTube uses a combination of video analysis software and human content flaggers to find and delete videos that break its guidelines.

The first step, Walker wrote, is to devote more resources “to apply our most advanced machine learning research” to the software, which means applying artificial intelligence so the system can learn over time what content breaks these guidelines.
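Walker’s post does not describe the models themselves. Purely as an illustration of the general approach, a supervised classifier that learns from human-labelled examples, a minimal sketch in Python with scikit-learn might look like the following; the training data is invented for the example and has no connection to YouTube’s actual systems:

    # Toy illustration: a classifier that "learns over time" from
    # human-labelled examples which content breaks the guidelines.
    # All data here is made up; YouTube's real systems are not public.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Hypothetical video descriptions labelled by human reviewers:
    # 1 = breaks guidelines, 0 = acceptable.
    texts = [
        "footage glorifying yesterday's attack",
        "news report analysing yesterday's attack",
        "recruitment message urging viewers to join the group",
        "travel vlog from a week in the mountains",
    ]
    labels = [1, 0, 1, 0]

    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(texts, labels)

    # New uploads get a score; high-scoring videos go to human review.
    print(model.predict_proba(["video celebrating the attack"])[0][1])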

The second step is to increase the number of “independent experts in YouTube’s Trusted Flagger Program”, which is composed of users who report inappropriate content directly to the company. Specifically, Google plans to add to the programme 50 experts from nongovernmental organisations whom it will support with operational grants to review content.

The third step, meanwhile, focuses on content that doesn’t actually break the site’s guidelines but nonetheless pushes hateful agendas, “for example, videos that contain inflammatory religious or supremacist content”.

Take Ahmad Musa Jibril, a Palestinian American preacher who espouses radical Islamic views in line with the beliefs of Islamic State, for example. A 2014 report by the International Center for the Study of Radicalization and Political Violence found that more than half of recruits to the militant group, also known as ISIS, follow Jibril on social media.

One of the London Bridge attackers became a follower of Jibril through social networks such as YouTube, the BBC reported.

But while these videos may help radicalise certain individuals, the ICSR report found that Jibril “does not explicitly call to violent jihad, but supports individual foreign fighters and justifies the Syrian conflict in highly emotive terms”.

Therefore, he doesn’t violate YouTube’s content guidelines.

Since YouTube cannot delete these videos and others of their kind, the company’s basic plan is simply to hide them as best it can.

“These will appear behind an interstitial warning and they will not be monetized, recommended or eligible for comments or user endorsements,” Walker wrote.
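In practice, that limited state amounts to a set of flags on the video. A hypothetical sketch in Python of what Walker describes, with field names invented for illustration:

    # Hypothetical sketch of the restricted state Walker describes.
    # Field names are invented; they are not YouTube's own.
    from dataclasses import dataclass

    @dataclass
    class VideoState:
        interstitial_warning: bool = False
        monetized: bool = True
        recommendable: bool = True
        comments_enabled: bool = True
        endorsements_enabled: bool = True   # likes and similar

    def apply_limited_state(video: VideoState) -> VideoState:
        # The video stays up but loses everything else.
        video.interstitial_warning = True
        video.monetized = False
        video.recommendable = False
        video.comments_enabled = False
        video.endorsements_enabled = False
        return video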

The final step is to use “targeted online advertising to reach potential ISIS recruits” and then redirect them “towards anti-terrorism videos that can change their minds about joining”.
