Cult of Columbine endures online
School shooters glorified by some on social media
When two Colorado students murdered 12 classmates and a teacher at Columbine High School on April 20, 1999, they committed the first school shooting of the internet era.
Facebook, the iPhone and YouTube had not yet been invented. Yet the traces left online by Eric Harris and Dylan Klebold took hold in each evolving online technology – chat rooms, social media, video – and 25 years later spark obsessive interest among a generation that wasn’t even alive in 1999.
A dark subculture latched onto the details of the killers’ ’90s online life – including their plans of terror and lyrics to their favorite song, “Stray Bullet” – and the investigative reports that followed.
Today, researchers find a cult of Columbine thriving among young internet users.
TikTok profiles with the shooters’ names and photos are festooned with hearts and ribbons. Videos splice together old footage and stills of the shooters. First-person-shooter game simulations of the Columbine massacre regularly pop up on TikTok, where they sometimes fetch hundreds of thousands of views. Social media posts feature Roblox characters dressed to look like the killers.
While some platforms say they work to find and eliminate violent content, online experts and school shooting survivors continue to say the megacompanies haven’t done enough.
Social media companies are “raking in the money,” Anne Marie Hochhalter, who was shot and paralyzed during the Columbine attack, told USA TODAY.
“They’re not going to do anything about it because then the traffic will go down.”
The subculture’s real-life effects have been deadly. Later shooters, including those at Sandy Hook Elementary School in 2012 and Virginia Tech in 2007, studied and emulated the Columbine attack. Documents posted online by copycats frequently mention Columbine.
In January, a 17-year-old student in Iowa shot six people, killing one, before turning the gun on himself. Shortly before, he posted a photo on TikTok of himself in a bathroom stall with a duffel bag, with the words, “Now we wait.”
The post was set to music, an old electronic-metal song from the ’90s called “Stray Bullet.”
Columbine imagery easy to find
Much of the online content related to Columbine today is created by young people or designed to appeal to them, found researchers Moustafa Ayad and Isabelle Frances-Wright at the Institute for Strategic Dialogue.
“The abundance and types of materials that the killers produced resonates with young people in a way that we haven’t really seen with other school shootings,” Frances-Wright said, “which has allowed it to perpetuate and live on all these years later.”
Setting up accounts pretending to be minors, Ayad and Frances-Wright quickly found 127 videos glorifying a range of mass shooters on TikTok and X, formerly Twitter. One TikTok video featured the Columbine shooters in fictional Disney posters. It had amassed nearly 400,000 views in three months.
TikTok has since taken down the videos the researchers flagged. The company doesn't tolerate such content, a TikTok spokesperson wrote in an email, and employs more than 40,000 trust and safety professionals to moderate content, spending more than $2 billion this year "to provide a safer platform."
Yet just this week, after a quick search, Frances-Wright provided USA TODAY with several TikTok videos showing school shootings, including some that used Columbine imagery.
Though the researchers have found videos appearing to show gameplay on the Roblox platform, where users play scenarios mostly built by other users, it is unclear whether those games ever appeared on Roblox itself. The characters may have been designed using Roblox without ever being uploaded.
A spokesperson for Roblox said the company has strict community standards prohibiting “content portraying, glorifying, or supporting terrorist and violent extremist organizations.”
“We have a dedicated team focused on proactively identifying and swiftly removing such content as well as banning the individuals who create it,” the spokesperson said in an email.
Videos on TikTok and elsewhere try to evade detection by blending content that glorifies mass shooters with legitimate educational content, or by using the killers’ nicknames or other coded language. Statements like “I don’t condone” or hashtags like “true crime” or “fake” are added to mislead platform moderators. Some accounts switch privacy settings so posts are only available to followers.
Despite the attempts at camouflage, accounts are still frequently banned. So the profiles redirect their followers to less-moderated platforms like Discord and Telegram, Ayad and Frances-Wright found. That's where the darkest content circulates: open glorification of mass shooters, violent gore and hate speech are shared more freely, and indoctrination and radicalization are more likely to occur.
In closed discussion groups on Discord, for example, players share tips on building mass casualty simulation games and how to make gaming avatars that look like the Columbine shooters, the researchers said.
Discord and Telegram did not respond to requests for comment.
The deadly allure of the dark side
Ryan Broll, associate professor of sociology and anthropology at the University of Guelph in Ontario, Canada, says the internet is an accelerant for “dark fandom” fixated on the perpetrators of violent acts.
“These communities usually form online because they are inherently deviant communities and they can more easily find people who share interest in these topics online,” said Broll, who studied a Columbine subreddit. “Although people have always been interested in crime and violence, the internet is essential to the size and longevity of dark fandoms.”
Finding kinship online can normalize violent urges, says Peter Langman, a psychologist and author of “Why Kids Kill: Inside the Minds of School Shooters.”
He pointed to a website devoted to the Columbine massacre and other mass murders. In a recent chat about which serial killer or mass murderer they related to the most, most said the Columbine killers. “I also relate to Eric and Dylan,” commented one person. “Like most people lol.”
Three registered users of the website have gone on to commit mass murders, Langman said.
Of the hundreds of school shootings across the United States in the last 25 years, Columbine remains the most influential, researchers Jenni Raitanen and Atte Oksanen from the Emerging Technologies Lab at Tampere University in Finland found. They attribute the shooting's enduring influence to the oft-cited idea that it was retribution for bullying.
“The Columbine perpetrators claimed that their massacre was a political act, conducted in the name of other oppressed students,” the researchers wrote in a 2018 paper. In essence, they claimed to be carrying out their attack in the name of angry, disaffected and angst-ridden youth everywhere.
Those ideas were long ago debunked. Columbine’s former principal Frank DeAngelis says much of the Columbine content portrays the shooters falsely. The FBI concluded the killers, who said in home videos that they hoped the attack would inflict “the most deaths in U.S. history,” were driven by a desire for mass carnage and lasting notoriety, not teenage angst.
Yet today’s online subculture celebrates many of the same false claims.
“The two killers of Columbine are heroes to some of these kids and they shouldn’t be,” said DeAngelis, who retired in 2014 and assists communities across the country after mass shootings. “It scares me.”
Kris Mohandie, a forensic psychologist who has assessed youth offenders influenced by Columbine, said susceptible young people can be shaped by the content they interact with and produce online. They are drawn to the Columbine shooters “because it aligns with dark impulses and their sense of alienation, and what they think looks cool,” Mohandie said.
That’s certainly what happened to Lindsay Souvannarath. Then 23, the Illinois student was drawn to the Columbine story via chat rooms and forums. A budding artist and novelist, she sought feedback from her peers online and eventually fell headfirst into the subculture.
Souvannarath met her co-conspirator when he commented on some of her artwork, and the two swapped music recommendations, fashion tips and theories about the Columbine attackers, she told “The Night Time” podcast – before planning a deadly attack on a mall in Halifax, Nova Scotia, on Valentine’s Day, 2015.
“We thought we were actually them somehow,” Souvannarath told the podcast. “Like their spirits had found their ways to us, and we were them.”
Souvannarath was arrested as she flew into Halifax the day before the planned mass shooting. She pleaded guilty to conspiracy to commit murder and is serving a life sentence.
A ‘collective fail’ for social media
Hochhalter, the Columbine survivor, called on Facebook five years ago to take down pages glorifying the shooters, saying she feared they would inspire others.
One of the pages had more than 2,000 likes and said its “mission” was to “never forget and always honor these heroes.”
Facebook deleted the pages within hours, saying they breached the company’s rules.
But as the 25th anniversary of the massacre approached, Hochhalter said the online communities of “Columbiners” have only multiplied. She and other survivors have received abuse and death threats and have had to call in the FBI to investigate their tormentors, she said.
On the platforms where the ideas spread, almost nothing is being done, she said.
“These people who are at the helm of the social media companies … those are the true extremists,” Hochhalter said. “Because they’re allowing all of it to happen.”
Mohandie says the policies and enforcement at most social media companies are “grossly inadequate.”
Social media companies, some of which are worth billions of dollars, “have an ethical and social responsibility to do more,” he said. “They get a collective fail. All of them.”
After years of building robust content moderation systems, social media companies facing political pressure and economic headwinds have pulled back on gatekeeping, making it less likely that content crossing the line will be caught.
What’s more, this is just the kind of content – dark and edgy – that is “algorithmically sticky,” said Natasha Zinda, a content creator and activist. Posts that push right up to and even beyond a platform’s conduct rules are often exactly the sort of content that gets clicks, she said.
“Algorithms like to push hate,” she said. “Our internet, and our culture on the internet, is all about engagement – whether it’s good or bad.”
The Institute for Strategic Dialogue researchers agreed. Some platforms are doing better than others when it comes to moderating content, Frances-Wright said. But none is doing enough.
Better moderation and supervision need to take place at every stop in the radicalization pipeline, Frances-Wright said: from platforms where simulations are created, to TikTok, where they are distributed, to secret spaces like Discord and Telegram, where they are openly discussed and new plots are planned.
Zinda also noted that the last layer of defense is parents.
As a parent herself, Zinda said she appreciates how difficult it is for parents to control what their children view. But every parent whose child is gaming or spending significant time online needs to prioritize talking with them about what they are doing and monitoring their internet activity.
“It’s a click away,” Zinda said. “And you need to be talking with your kids daily about what that is.”