Houston Chronicle

Tech industry combating online radicalization

By Farhad Manjoo

Law enforcement officials, technology companies and lawmakers have long tried to limit what they call the “radicalization” of young people over the internet.

The term has often been used to describe a specific kind of radicalization — that of young Muslim men who are inspired to take violent action by the online messages of Islamist groups like the Islamic State. But as it turns out, it isn’t just violent jihadis who benefit from the internet’s power to radicalize young people from afar.

White supremacists are just as adept at it. Where the pre-internet Ku Klux Klan grew primarily from personal connections and word-of-mouth, today’s white supremacist groups have figured out a way to expertly use the internet to recruit and coordinate among a huge pool of potential racists. That became clear two weeks ago with the riots in Charlottesville, Virginia, which became a kind of watershed event for internet-addled racists.

“It was very important for them to coordinate and become visible in public space,” said Joan Donovan, a scholar of media manipulation and right-wing extremism at Data & Society, an online research institute. “This was an attempt to say, ‘Let’s come out; let’s meet each other. Let’s build camaraderie, and let’s show people who we are.’”

Donovan and others who study how the internet shapes extremism said that even though Islamists and white nationalists have different views and motivations, there are broad similarities in how the two operate online — including how they spread their message, recruit and organize offline actions. The similarities suggest a kind of blueprint for a response — efforts that may work for limiting the reach of jihadis may also work for white supremacists, and vice versa.

In fact, that’s the battle plan. Several research groups in the United States and Europe now see the white supremacist and jihadi threats as two faces of the same coin. They’re working on methods to fight both, together — and slowly, they have come up with ideas for limiting how these groups recruit new members to their cause.

Their ideas are grounded in a few truths about how extremist groups operate online, and how potential recruits respond. After speaking to many researchers, I compiled this rough guide for combating online radicalization.

Recognize the internet as an extremist breeding ground.

The first step in combating online extremism is kind of obvious: recognizing the extremists as a threat.

For the Islamic State, that began to happen in the last few years. After a string of attacks in Europe and the United States by people who had been indoctrinated in the swamp of online extremism, politicians demanded action. In response, Google, Facebook, Microsoft and other online giants began identifying extremist content and systematically removing it from their services, and have since escalated their efforts.

When it comes to fighting white supremacists, though, much of the tech industry has long been on the sidelines. This laxity has helped create a monster. In many ways, researchers said, white supremacists are even more sophisticated than jihadis in their use of the internet.

The earliest white nationalist sites date back to the founding era of the web. For instance, Stormfront.org, a pioneering hate site, was started as a bulletin board in 1990. White supremacist groups have also been proficient at spreading their messages using the memes, language and style that pervade internet subcultures. Beyond setting up sites of their own, they have more recently managed to spread their ideology to online groups that were once largely apolitical, like gaming and sci-fi groups. And they’ve grown huge. “The white nationalist scene online in America is phenomenally larger than the jihadists’ audience, which tends to operate under the radar,” said Vidhya Ramalingam, the co-founder of Moonshot CVE, a London-based startup that works with internet companies to combat violent extremism. “It’s just a stunning difference between the audience size.”

After the horror of Charlottesville, internet companies began banning and blocking content posted by right-wing extremist groups. So far their efforts have been hasty and reactive, but Ramalingam sees them as the start of a wider effort.

“It’s really an unprecedented moment where social media and tech companies are recognizing that their platforms have become spaces where these groups can grow, and have been often unpoliced,” she said. “They’re really kind of waking up to this and taking some action.”

Engage directly with potential recruits.

If tech companies are finally taking action to prevent radicalization, is it the right kind of action? Extremism researchers said that blocking certain content may work to temporarily disrupt groups, but may eventually drive them further underground, far from the reach of potential saviors.

A more lasting plan involves directly intervening in the process of radicalization. Consider The Redirect Method, an anti-extremism project created by Jigsaw, a think tank founded by Google. The plan began with intensive field research. After interviews with many former jihadis, white supremacists and other violent extremists, Jigsaw discovered several important personality traits that may abet radicalization.

One factor is a skepticism of mainstream media. Whether drawn to the far right or to ISIS, people who are susceptible to extremist ideologies tend to dismiss outlets like The New York Times or the BBC, and they often go in search of alternative theories online.

Another key issue is timing. There’s a brief window between initial interest in an extremist ideology and a decision to join the cause — and after recruits make that decision, they are often beyond the reach of outsiders. For instance, Jigsaw found that when jihadis began planning their trips to Syria to join ISIS, they had fallen too far down the rabbit hole and dismissed any new information presented to them.

Jigsaw put these findings to use in an innovative way. It curated a series of videos showing what life is truly like under the Islamic State in Syria and Iraq. The videos, which weren’t filmed by news outlets, offered a credible counterpoint to the fantasies peddled by the group — they show people queuing up for bread, fighters brutally punishing civilians, and women and children being mistreated.

Then, to make sure potential recruits saw the videos at the right time in their recruitment process, Jigsaw used one of Google’s most effective technologies: ad targeting. In the same way that a pair of shoes you looked up last week follows you around the internet, Jigsaw’s counterterrorism videos were pushed to likely recruits.

Jigsaw can’t say for sure if the project worked, but it found that people spent lots of time watching the videos, which suggested they were of great interest, and perhaps dissuaded some from extremism.

Moonshot CVE, which worked with Jigsaw on the Redirect project, put together several similar efforts to engage with both jihadis and white supremacist groups. It has embedded undercover social workers in extremist forums who discreetly message potential recruits to dissuade them. And lately it’s been using targeted ads to offer mental health counseling to those who might be radicalized.

“We’ve seen that it’s really effective to go beyond ideology,” Ramalingam said. “When you offer them some information about their lives, they’re disproportionately likely to interact with it.”

What happens online isn’t all that matters in the process of radicalization. The offline world obviously matters too. Dylann Roof — the white supremacist who murdered nine people at a historically African-American church in Charleston, S.C., in 2015 — was radicalized online. But as a new profile in GQ Magazine makes clear, there was much more to his crime than the internet, including his mental state and a racist upbringing.

Still, just about every hate crime and terrorist attack these days is planned or in some way coordinated online. Ridding the world of all of the factors that drive young men to commit heinous acts isn’t possible. But disrupting the online radicalization machine? With enough work, that may just be possible.

White nationalists and counterprotesters clash in Charlottesville, Va., on Aug. 12. There are similarities between how Islamists and white nationalists operate online, researchers said. Those can be used to limit recruitment’s reach.
Edu Bayer / New York Times