Toronto Star

A global network of angry young men

Experts sound alarm about growing threat from those radicalized online

MAY WARREN, STAFF REPORTER

On one of the worst days in Toronto’s history — as victims lay in hospital beds and families received devastating news about loved ones who didn’t survive — they cheered.

Across the internet, members of a deeply misogynistic subculture who call themselves incels, short for involuntary celibates, welcomed the news of what’s now known as the van attack.

One even hailed the killer as their “new saint,” according to images of posts provided to the Star by the Southern Poverty Law Center, a U.S. legal advocacy non-profit. “Joyous day,” said another.

The 26-year-old driver of the white van who plowed down pedestrians on a busy North York stretch of Yonge Street that day in April 2018 identified himself as part of this “movement” without borders, and said he was “radicalized” online and inspired by its cult figures to complete his “mission,” according to a transcript of an interview with police recently made public.

Alek Minassian will stand trial without a jury in Toronto on 10 counts of murder and 16 counts of attempted murder next year.

Incels, experts say, are a rising threat, part of a global far-right ecosystem of angry young men who have been radicalized online and committed a rash of recent attacks from Christchurch, New Zealand, to El Paso, Texas. Not all of these men are incels, but they are part of this larger web. And, experts argue, not enough has changed since the van attack on the part of the tech companies and platforms that provide the forums for this hatred.

The Southern Poverty Law Center started tracking incels, under the banner of male supremacy, in 2018. There have always been misogynists, says its intelligence project director, Heidi Beirich. But this is something new.

“What has happened over recent years is that members of the white supremacist milieu or the alt-right, whatever you want to call it, have increasingly come to their extremism by an online pathway that almost always involves extreme misogyny,” she said over the phone from the centre’s head office in Montgomery, Ala.

“You now have this embedded radicalization of largely, but not entirely, young white men, who harbour deep hatred of women, and often that’s coupled with a lot of extremist beliefs,” she adds. “This is becoming an increasing domestic terrorism threat.”

Neo-Nazis in past eras, while not recognizing women as their equals, saw them as worthy of protection. “This new crop of extremists is definitely not doing that,” Beirich notes.

The term incel was coined in the early 1990s by a Canadian woman who created a community for lonely people struggling to find connection. But it has since been overtaken by men with a deep hatred of women who find solace online.

They believe that women who refuse to have sex with them deserve violence. The Stacys (attractive women) prefer the Chads (attractive men) over the incels, who are at the bottom but deserve to be at the top. In the middle are the “normies,” regular people.

This ideology has “accelerated into multiple attacks” in recent years, starting with self-proclaimed incel Elliot Rodger, who killed six people and injured 14 near the University of California, Santa Barbara, in 2014, Beirich says.

Police also found the 29-year-old killer in the summer 2018 Danforth shooting had a possible interest in incel culture, though they didn’t find a clear motive or association with terrorist or hate groups. A nearly year-long investigation into the shooting revealed Faisal Hussain had a copy of a misogynistic manifesto Rodger left behind.

There’s been a surge in attacks tied to the broader network of far-right internet hate over the last few months. These men did not call themselves incels, but are part of the larger trend of angry young men radicalized online.

The members of this diffuse global community meet, communicate and inspire each other online, on forums ranging from niche message boards to mainstream sites used by billions.

The Christchurch shooter — who killed 51 people — streamed part of his March attack on two mosques on Facebook Live. He also penned a white supremacist manifesto that was shared on Twitter and 8chan, an anonymous message board popular with racists and misogynists.

The following month, a man in Poway, Calif., posted a racist, anti-Semitic letter on 8chan before allegedly killing one and injuring three in a shooting at a synagogue on the last day of Passover.

Then in August, a Dallas man drove to a busy Walmart in El Paso. It’s been reported in U.S. media that police believe he was inspired by Christchurch, deliberately targeted Hispanic people and posted a racist anti-immigrant manifesto on 8chan before the attack. Twenty-two people were killed and 24 injured, including a two-month-old baby whose parents died trying to shield him from the bullets. The shooting is being investigated by the FBI as a possible domestic terrorist attack and hate crime.

After El Paso, 8chan’s own founder called for it to be shut down, according to the New York Times. Cloudflare, an internet security company, cut off its support in August and the site, described on its Twitter profile as “The Darkest Reaches of the Internet,” is now offline.

Just a week later, a 21-year-old Norwegian man allegedly killed his sister and stormed a local mosque, wounding one person. The Guardian reported that, this time, he left messages on a new message board called Endchan, saying he was inspired by Christchurch and El Paso. It’s being investigated as an act of terrorism. Endchan has been offline in recent days. After the attack, its administrators tweeted they’d been recently hit by “a large influx of 8chan refugees … drastically changing the pace in which the site operates.”

It can be hard to squash every smaller site that takes in “people who’ve been booted off Facebook and Twitter with hateful views,” Beirich says. As soon as one cracks down or goes dark, the worst people on it pop up somewhere else.

But smaller sites have fewer users and are often “preaching to the choir,” Beirich adds. More mainstream sites like Facebook, Twitter and Google (which owns YouTube) can have a huge impact and reach billions.

It’s those bigger tech companies that need to step up, as they’re the places where new people will be recruited and radicalized, she says.

Until the August 2017 white supremacist rally in Charlottesville, Va., the tech companies were not recognizing white and male supremacist hate as a problem, Beirich adds. But “now the kind of conversation we’re having is, why is your implementation so terrible?”

The worst posts cheering on the killer on the day of the van attack were from a now defunct niche website called incel.me. Minassian said in the police interview that he was “radicalized” on Reddit and 4chan. Hours before the attack, he posted on 4chan using coded incel language announcing an imminent attack, hoping to inspire others. But it was on the much more mainstream Facebook that he left his last message.

Reddit took steps to curtail incels in November 2017 by taking down a forum devoted to them, and that fall announced a new policy to ban content that incites, encourages or glorifies violence.

“Communities focused on this content and users who post such content will be banned from the site,” added a spokesperson for the company.

Representa­tives from 4chan did not respond to requests from the Star for comment. Requests for comment to an administra­tor email and Twitter account associated with 8chan were not returned.

A Facebook spokesperson responded that “individuals and organizations who spread hate, attack or call for the exclusion of others on the basis of who they are have no place on our services.”

The social network’s policy on dangerous individuals and organizations states that they do not allow those “who are engaged in ‘organized hate.’ ” It continues to review “individuals, pages, groups and content” that breach its community standards. YouTube Canada spokesperson Nicole Bell wrote in an email that hate speech and content that promotes violence have “no place” on the platform, and the company has “heavily invested” in both humans and technology to quickly detect, review and remove this content.

“Since the Toronto van attack in 2018, we’ve been taking a close look at our approach toward hateful content in consultation with dozens of experts in subjects like violent extremism, supremacism, civil rights and free speech, and as a result of that consultation we announced major changes in June to tackle these issues,” she added.

A spokesperson for Twitter referred the Star to their global policy strategist’s U.S. congressional testimony from June 2019, where he explained they’ve suspended more than 1.5 million accounts for violations related to terrorism from August 2015 to 2018 and have seen a steady decrease in terrorist organizations trying to use their service over the years.

Stephanie Carvin, an assistant professor of international relations at Carleton University, acknowledges finding, reviewing and removing this content can be tough. But, she notes, it’s been done before.

“There’s always going to be dark corners of the internet, but we were pretty successful about taking down Islamic State propaganda,” she says.

“The far right is far more affluent and it’s far less cohesive. But still, it should be easy to identify the nodes of these networks.”

Carvin says anonymous online communities provide a forum for these men that pushes them toward violence.

She says social media companies need to do a better job of enforcing their own terms and conditions. “It’s hard, but you run a business, is this how you want your business being used?” she asks.

The companies depend on user reporting, artificial intelligence and human judgment calls by moderators to enforce their policies. But there have been multiple reports that this is not done consistently.

A 2017 investigation by ProPublica found “uneven” enforcement of Facebook’s hate speech policies, and after asking the social media giant about its handling of 49 offensive posts, the company acknowledged its content reviewers had made the wrong call on almost half of them. Another investigation by the U.S. non-profit earlier that year found Facebook’s policies tend to favour governments and elites over individuals with less power. Reuters found more than 1,000 examples of posts, comments and pornographic images attacking the Rohingya and other Muslims that were still on Facebook in 2018, despite Mark Zuckerberg’s assurances that the company was cracking down.

CNBC reported in August that Twitter users have been switching their country location to Germany, where local laws require companies to pull down Nazi content quickly, in order to escape online anti-Semitism and racism they are still experiencing on the site.

In his interview with police, Minassian said he did not deliberately target women. He said he just saw a crowded area and decided to “go for it.” But he referred to two incel mass killers in the interview transcript: Rodger and Chris Harper-Mercer, who killed 10 people at an Oregon community college in 2015. He said he communicated with both online, and himself inspired a man in Edmonton to commit an attack.

These claims have not been independently verified by the Star.

In the last year or so since the van attack, law enforcemen­t agencies have begun to recognize incels as a new public safety threat, Carvin says.

CSIS referred to the van attack, as well as the 2016 Quebec City mosque shooting, in its 2018 annual public report, published in June, under the heading of “Right Wing Extremism.”

The move, Carvin says, signals a new priority.

While Christchur­ch catalyzed this shift, the van attack “may have been the start of the momentum,” she says.

The incel ideology is not as clear-cut as that of other terrorist movements, “and more of a collection of random grievances aimed at women generally,” she adds. But, she notes, under the Criminal Code, terrorist acts can be committed “in whole or in part for a political, religious or ideological purpose, objective or cause … with the intention of intimidating the public.”

“These guys are drawing their ideas, in part,” online, she says.

That this threat is not being taken more seriously, by companies and society as a whole, is because of the normalization of rape culture and violence against women, says Nicolette Little, a critical media studies researcher at the University of Calgary.

“If you look at some of the uproar that might happen around what people more standardly think of as a terrorist attack and compare it to the kind of uproar or lack thereof around this kind of thing, I think that’s a really interesting point to consider,” she says.

“It seems like these events happen, like this van attack, and there is sort of the willingness to do something, but then it just fades so quickly.”

At the same time, she cautions against giving these men too much oxygen, and even questions the continued use of the name they’ve given themselves.

“I think we might want to step away from the term incel,” Little says. “And start calling them what they are, which is really angry, loathsome misogynists, who are doing terrible things out of a strange mix of self-loathing and hatred of women.”

COURT EXHIBIT: Alek Minassian, left, said he was “radicalized” online in an interview with Toronto police Det. Rob Thomas.
ROBYN BECK AFP/GETTY IMAGES FILE PHOTO: Elliot Rodger, who, in 2014, killed six and injured 14 in California, inspired many incel attacks.
