Las Vegas Review-Journal

COUNTERTERRORIST VIDEOS DISSUADE POTENTIAL RECRUITS


Researchers who study online extremism have spent years examining how extremist groups recruit new members to their cause.

Their ideas are grounded in a few truths about how extremist groups operate online, and how potential recruits respond. Here’s a rough guide for combating online radicalization.

Recognize the internet as an extremist breeding ground

The first step in combating online extremism is fairly obvious: recognize the extremists as a threat.

For the Islamic State, that began to happen in the last few years. After a string of attacks in Europe and the United States by people who had been indoctrinated in the swamp of online extremism, politicians demanded action. In response, Google, Facebook, Microsoft and other online giants began identifying extremist content and systematically removing it from their services, and have since escalated their efforts.

When it comes to fighting white supremacists, though, much of the tech industry has long been on the sidelines. This laxity has helped create a monster. In many ways, researchers said, white supremacists are even more sophisticated than jihadis in their use of the internet.

The earliest white nationalist sites date to the founding era of the web. For instance, Stormfront.org, a pioneering hate site, was started as a bulletin board in 1990. White supremacist groups have also been proficient at spreading their messages using the memes, language and style that pervade internet subcultures.

“The white nationalist scene online in America is phenomenally larger than the jihadists’ audience, which tends to operate under the radar,” said Vidhya Ramalingam, the co-founder of Moonshot CVE, a London-based startup that works with internet companies to combat violent extremism. “It’s just a stunning difference between the audience size.”

After the horror of Charlottesville, internet companies began banning and blocking content posted by right-wing extremist groups. So far their efforts have been hasty and reactive, but Ramalingam sees them as the start of a wider effort.

“It’s really an unprecedented moment where social media and tech companies are recognizing that their platforms have become spaces where these groups can grow, and have been often unpoliced,” she said. “They’re really kind of waking up to this and taking some action.”

Engage directly with potential recruits

If tech companies are finally taking action to prevent radicalization, is it the right kind of action? Extremism researchers said that blocking certain content may work to temporarily disrupt groups, but may eventually drive them further underground, far from the reach of potential saviors.

A more lasting plan involves directly intervening in the process of radicalization. Consider The Redirect Method, an anti-extremism project created by Jigsaw, a think tank founded by Google. The plan began with intensive field research. After interviews with many former jihadis, white supremacists and other violent extremists, Jigsaw discovered several important personality traits that may abet radicalization.

One factor is a skepticism of mainstream media. Whether drawn to the far right or to ISIS, people who are susceptible to extremist ideologies tend to dismiss outlets like The New York Times or the BBC, and they often go in search of alternative theories online.

Another key issue is timing. There’s a brief window between initial interest in an extremist ideology and a decision to join the cause, and after recruits make that decision, they are often beyond the reach of outsiders. For instance, Jigsaw found that by the time jihadis began planning their trips to Syria to join ISIS, they had fallen too far down the rabbit hole and dismissed any new information presented to them.

Jigsaw put these findings to use in an innovative way. It curated a series of videos showing what life is truly like under the Islamic State in Syria and Iraq. The videos, which weren’t filmed by news outlets, offered a credible counterpoint to the fantasies peddled by the group: people queuing up for bread, fighters brutally punishing civilians, and women and children being mistreated.

Then, to make sure potential recruits saw the videos at the right time in their recruitment process, Jigsaw used one of Google’s most effective technologies: ad targeting. In the same way that a pair of shoes you looked up last week follows you around the internet, Jigsaw’s counterterrorism videos were pushed to likely recruits.

Jigsaw can’t say for sure whether the project worked, but it found that people spent lots of time watching the videos, which suggested they were of great interest and perhaps dissuaded some viewers from extremism.

Moonshot CVE, which worked with Jigsaw on the Redirect project, put together several similar efforts to engage with both jihadis and white supremacist groups. It has embedded undercover social workers in extremist forums who discreetly message potential recruits to dissuade them. And lately it’s been using targeted ads to offer mental health counseling to those who might be radicalized.

“We’ve seen that it’s really effective to go beyond ideology,” Ramalingam said. “When you offer them some information about their lives, they’re disproportionately likely to interact with it.”

What happens online isn’t all that matters in the process of radicalization. The offline world obviously matters too. Dylann Roof, the white supremacist who murdered nine people in 2015 at a historically African-American church in Charleston, S.C., was radicalized online. But as a new profile in GQ Magazine makes clear, there was much more to his crime than the internet, including his mental state and a racist upbringing.

Still, these days just about every hate crime and terrorist attack is planned or in some way coordinated online. Ridding the world of all the factors that drive young people to commit heinous acts isn’t possible. But disrupting the online radicalization machine? With enough work, that may just be possible.
