The Bakersfield Californian

White supremacists riling up thousands over social media

BY AMANDA SEITZ

WASHINGTON — The social media posts are of a distinct type. They hint darkly that the CIA or the FBI is behind mass shootings. They traffic in racist, sexist and homophobic tropes. They revel in the prospect of a “white boy summer.”

White nationalists and supremacists, on accounts often run by young men, are building thriving, macho communities across social media platforms like Instagram, Telegram and TikTok, evading detection with coded hashtags and innuendo.

Their snarky memes and trendy videos are riling up thousands of followers on divisive issues including abortion, guns, immigration and LGBTQ rights. The Department of Homeland Security warned Tuesday that such skewed framing of the subjects could drive extremists to violently attack public places across the U.S. in the coming months.

These types of threats and racist ideology have become so commonplace on social media that it’s nearly impossible for law enforcement to separate internet ramblings from dangerous, potentially violent people, Michael German, who infiltrated white supremacy groups as an FBI agent, told the Senate Judiciary Committee on Tuesday.

“It seems intuitive that effective social media monitoring might provide clues to help law enforcement prevent attacks,” German said. “After all, the white supremacist attackers in Buffalo, Pittsburgh and El Paso all gained access to materials online and expressed their hateful, violent intentions on social media.”

But, he continued, “so many false alarms drown out threats.”

DHS and the FBI are also working with state and local agencies to raise awareness about the increased threat around the U.S. in the coming months.

The heightened concern comes just weeks after a white 18-year-old entered a supermarket in Buffalo, N.Y., with the goal of killing as many Black patrons as possible. He gunned down 10.

That shooter claims to have been introduced to neo-Nazi websites and a livestream of the 2019 Christchurch, New Zealand, mosque shootings on the anonymous, online messaging board 4Chan. In 2018, the white man who gunned down 11 at a Pittsburgh synagogue shared his antisemitic rants on Gab, a site that attracts extremists. The following year, a 21-year-old white man who killed 23 people at a Walmart in the largely Hispanic city of El Paso, Texas, shared his anti-immigrant hate on the messaging board 8Chan.

References to hate-filled ideologies are more elusive across mainstream platforms like Twitter, Instagram, TikTok and Telegram. To avoid detection by artificial intelligence-powered moderation, users don’t use obvious terms like “white genocide” or “white power” in conversation.

They signal their beliefs in other ways: a Christian cross emoji in their profile or words like “anglo” or “pilled,” a term embraced by far-right chatrooms, in usernames. Most recently, some of these accounts have borrowed the pop song “White Boy Summer” to cheer on the leaked Supreme Court draft opinion on Roe v. Wade, according to an analysis by Zignal Labs, a social media intelligence firm.

Facebook and Instagram owner Meta banned praise and support for white nationalist and separatist movements on its platforms in 2019, but the shift to subtlety makes the posts difficult to moderate. Meta says it has more than 350 experts, with backgrounds ranging from national security to radicalization research, dedicated to ridding its platforms of such hateful speech.

“We know these groups are determined to find new ways to try to evade our policies, and that’s why we invest in people and technology and work with outside experts to constantly update and improve our enforcement efforts,” David Tessler, the head of dangerous organizations and individuals policy for Meta, said in a statement.

A closer look reveals hundreds of posts steeped in sexist, antisemitic, racist and homophobic content.

In one Instagram post identified by The Associated Press, an account called White Primacy appeared to post a photo of a billboard that describes a common way Jewish people were exterminated during the Holocaust.

“We’re just 75 years since the gas chambers. So no, a billboard calling out bigotry against Jews isn’t an overreaction,” the pictured billboard said.

The caption of the post, however, denied gas chambers were used at all. The post’s comments were even worse: “If what they said really happened, we’d be in such a better place,” one user commented. “We’re going to finish what they started someday,” another wrote.

The account, which had more than 4,000 followers, was immediately removed Tuesday after the AP asked Meta about it. Meta has banned posts that deny the Holocaust on its platform since 2020.

U.S. extremists are mimicking the social media strategy used by the Islamic State group, which turned to subtle language and images across Telegram, Facebook and YouTube a decade ago to evade an industry-wide crackdown on the terrorist group’s online presence, said Mia Bloom, a communications professor at Georgia State University.

“They’re trying to recruit,” said Bloom, who has researched social media use by both Islamic State terrorists and far-right extremists. “We’re starting to see some of the same patterns with ISIS and the far-right. The coded speech, the ways to evade AI. The groups were appealing to a younger and younger crowd.”

MATT ROURKE / AP FILE
An investigator works at the scene after a mass shooting at a supermarket in Buffalo, N.Y., May 16. A white man who gunned down 11 at the supermarket shared his antisemitic rants on Gab, a site that attracts extremists.
