Times-Call (Longmont)

Study: YouTube recommendations send violent gun videos to 9-year-olds

- By David Klepper

WASHINGTON >> When researchers at a nonprofit that studies social media wanted to understand the connection between YouTube videos and gun violence, they set up accounts on the platform that mimicked the behavior of typical boys living in the U.S.

They simulated two nine-year-olds who both liked video games. The accounts were identical, except that one clicked on the videos recommended by YouTube, and the other ignored the platform’s suggestions.

The account that clicked on YouTube’s suggestions was soon flooded with graphic videos about school shootings, tactical gun training videos and how-to instructions on making firearms fully automatic. One video featured an elementary school-age girl wielding a handgun; another showed a shooter using a .50 caliber gun to fire on a dummy head filled with lifelike blood and brains. Many of the videos violate YouTube’s own policies against violent or gory content.

The findings show that despite YouTube’s rules and content moderation efforts, the platform is failing to stop the spread of frightening videos that could traumatize vulnerable children — or send them down dark roads of extremism and violence.

“Video games are one of the most popular activities for kids. You can play a game like ‘Call of Duty’ without ending up at a gun shop — but YouTube is taking them there,” said Katie Paul, director of the Tech Transparency Project, the research group that published its findings about YouTube on Tuesday. “It’s not the video games, it’s not the kids. It’s the algorithms.”

The accounts that followed YouTube’s suggested videos received 382 different firearms-related videos in a single month, or about 12 per day. The accounts that ignored YouTube’s recommendations still received some gun-related videos, but only 34 in total.

The researchers also created accounts mimicking 14-year-old boys; those accounts also received similar levels of gun- and violence-related content.

One of the videos recommended for the accounts was titled “How a Switch Works on a Glock (Educational Purposes Only).” YouTube later removed the video after determining it violated its rules; an almost identical video popped up two weeks later with a slightly altered name; that video remains available.

A spokeswoman for YouTube defended the platform’s protections for children and noted that it requires users under 17 to get a parent’s permission before using the site; accounts for users younger than 13 are linked to a parental account. “We offer a number of options for younger viewers,” the company wrote in an emailed statement, “... which are designed to create a safer experience for tweens and teens.”

Along with TikTok, the video sharing platform is one of the most popular sites for children and teens. Both sites have been criticized in the past for hosting, and in some cases promoting, videos that encourage gun violence, eating disorders and self-harm. Critics of social media have also pointed to the links between social media, radicalization and real-world violence.

The perpetrators behind many recent mass shootings have used social media and video streaming platforms to glorify violence or even livestream their attacks. In posts on YouTube, the shooter behind the 2018 attack on a school in Parkland, Fla., that killed 17 wrote “I wanna kill people,” “I’m going to be a professional school shooter” and “I have no problem shooting a girl in the chest.”

The neo-Nazi gunman who killed eight people earlier this month at a Dallas-area shopping center also had a YouTube account that included videos about assembling rifles, the serial killer Jeffrey Dahmer and a clip from a school shooting scene in a television show.

In some cases, YouTube has already removed videos identified by researchers at the Tech Transparency Project, but in other instances the content remains available. Many big tech companies rely on automated systems to flag and remove content that violates their rules, but Paul said the findings from the Project’s report show that greater investments in content moderation are needed.

In the absence of federal regulation, social media companies must do more to enforce their own rules, said Justin Wagner, director of investigations at Everytown for Gun Safety, a leading gun control advocacy organization. Wagner’s group also said the Tech Transparency Project’s report shows the need for tighter age restrictions on firearms-related content.

“Children who aren’t old enough to buy a gun shouldn’t be able to turn to YouTube to learn how to build a firearm, modify it to make it deadlier, or commit atrocities,” Wagner said in response to the Tech Transparency Project’s report.

Similar concerns have been raised about TikTok after earlier reports showed the platform was recommending harmful content to teens.

TikTok has defended its site and its policies, which prohibit users younger than 13. Its rules also prohibit videos that encourage harmful behavior; users who search for content about topics including eating disorders automatically receive a prompt offering mental health resources.
