The Guardian (USA)

Unboxing, bad baby and evil Santa: how YouTube got swamped with creepy content for kids

- Mark Bergen

Harry Jho worked out of a 10th-storey Wall Street office, in which one corner was stacked with treadmill desks and another was filled with racks of colourful costumes and a green screen for filming nursery rhymes. He worked as a securities lawyer. With his wife, Sona, Jho also ran Mother Goose Club, a YouTube media empire.

Sona had produced short children’s segments for public-access TV stations before the couple decided to branch out on their own. As educators – the Jhos once taught English in Korea – they saw television’s pedagogical flaws. To learn words, kids should see lips move, but Barney’s mouth never did. Baby Einstein mostly showed toys. The Jhos, who were Korean American, had two young children, and noticed how few faces on kids’ TV looked like theirs.

So they started Mother Goose Club, investing in a studio and hiring a diverse set of actors to don animal costumes and sing Itsy Bitsy Spider and Hickory Dickory Dock. It was like Teletubbies, only less trippy and inane. The Jhos planned to sell DVDs to parents, ginning up interest for a possible TV show. YouTube offered a convenient place to store clips, and, in 2008, Jho started an account there, not thinking much of it.

Two years in, he started checking the account’s numbers after leaving work. One thousand views. He checked the next day. Ten thousand. He couldn’t find many other videos for kids on YouTube. Maybe, instead of television, he thought, we can be the first to do this.

It was the spring of 2011 when he received an email from someone at YouTube, a division of Google. Jho read it but did not believe it. He had long since given up on trying to speak to a human from the company. Once, at an event, an employee had handed him a business card, which he thought was a promising sign until he looked down to see the email address – support@google.com – and no name. Now, a YouTube employee was extending an invitation to Google’s Manhattan office. At the meeting, they showed Jho plans for the site’s forthcoming redesign and shared some tips. Finally, Jho asked the question he was itching to ask: “Why did you call us?”

“You might be the biggest YouTuber in New York,” the staffer replied.

This was news to Jho. He and his wife – softly spoken professionals who wore glasses and sensible clothes – looked more like PTA parents than YouTube influencers. They were accidental stars on an online platform that would accidentally build the world’s largest kids’ entertainment service. In 2010, the world first met the iPad, a handy device for frazzled parents of toddlers, with an easy-to-use YouTube app. Soon enough, YouTube would add an autoplay function that mechanically teed up one video after another. After the Google meeting, the Jhos saw even more traffic on their channel. YouTube let them into the company’s ads programme.

In 2012, YouTube switched its ranking and recommendation system to favour videos that kept viewers watching longer, and very quickly Mother Goose Club got company. It began with Blucollection (now Blu Toys Club Surprise), an anonymous account that only posted videos of a man’s hands scooting toy figurines across a floor. The Jhos watched as these clips appeared in the sidebar next to theirs, one by one. Similar videos followed, carpeting the entire sidebar. Then they saw these videos take over YouTube.

***

Parents and bureaucrats have always cared what kids are watching. In the 1970s, a federation of advocates and educators who had helped put Sesame Street on air pushed for tighter regulation of commercial activity on children’s TV in the US, worried that kids could not distinguish programmes from ads. Saturday morning cartoons were forbidden to pitch products. A 1990 US law, the Kidvid rules, went further, requiring broadcasters reaching children to air a certain number of hours of educational programming and placing time limits on how often commercials were aired. Networks tried bending the rules, but regulators held up the threat of licence removal.

Then the internet arrived. An alarmist 1995 Time cover showed a blond boy at a keyboard, his eyes lit in horror-schlock glow, above the menacing word “Cyberporn”. “When kids are plugged in,” Time asked, “will they be exposed to the seamiest side of human sexuality?” Lawmakers governing the modern internet were so focused on threats of sex and violence that they ignored other concerns, such as the balance of educational content in media and the potential developmental impacts of unchecked consumerism. Privacy activists, worried about the creeping panopticon of web trackers like Google’s cookies, pushed US Congress to regulate children’s browsing.

Websites were openly inviting kids to share the sorts of personal details marketers valued. “Good citizens of the Web,” read a promotional site for the movie Batman Forever, “help Commissioner Gordon with the Gotham census.” A minor victory for activists came in 1998 with the Children’s Online Privacy Protection Act (Coppa), which prohibited websites from collecting information from those under 13 for use in targeted advertising. But the law gave enforcement to a different agency (the Federal Trade Commission) from the one overseeing television (the Federal Communications Commission) and had none of the rules concerning educational programming or commercials that TV had. Old media also had rules about talent. Certain states, including California, restricted the hours child actors could appear in TV or movies and set safeguards for their earnings. The internet didn’t.

But kids were clearly heading online, and the massive kids’ entertainment complex was eagerly coming with them. YouTube had seen this migration early. Before Google acquired the video service in 2006, YouTube vice-president Kevin Donahue, a former Cartoon Network producer, had pitched YouTube’s founders on a kids’ version of their site. They directed him to the company lawyer, who shot down the idea. Child-protection rules required websites to perform legal acrobatics to pull something like this off, and YouTube was then so thinly staffed that it needed all its legal resources for copyright issues. YouTube required uploaders to check a box stating that they were over 13. The site’s terms of service declared it was only for people above that age, and so, on paper, it was.

Google had arrived at similar conclusions. Yahoo, its old archnemesis, once ran a kids’ website (Yahooligans!), and a few times a year someone at Google would propose a kid-friendly version of Google search. The idea never made it past the sticking point: how do we decide what is kid-friendly? Once YouTube joined Google – the search giant paid $1.65bn for the video site in 2006 – a few parents on staff spotted nursery rhymes, ABCs and toy clips clearly made for toddlers and fretted about their quality. “Kind of total crap,” recalled one mother there.

Any proposals for cleaning the crap had to go through Hunter Walk, YouTube’s top product manager. Walk embraced YouTube’s youth culture cachet and knew the kids’ world – he had interned at toy maker Mattel during business school and once worked at a children’s bookstore. Yet when colleagues pitched a kid-friendly YouTube, he said no. YouTube simply didn’t have enough premium kids’ material to make this anything but a lousy version of cable, he said. YouTube had some TV classics but called these “nostalgic”, not kids’ content. Staff knew juvenilia like Fred Figglehorn – the goofy YouTube persona of a Nebraska teen, who rose to early stardom on the site – but had convinced themselves that the audience was mostly teenagers, bored with TV, and that anyone under 13 watched with adult supervision, like the site’s small print said they must.

But enough stuff clearly made for young kids was piling on to the site that some at YouTube felt they had to do something. Some started working on a separate app for the youngest viewers. Others tried to promote the surplus of educational videos on YouTube for older kids – maths tutorials and quirky science explainers. Walk lobbied educators and politicians to let YouTube inside schools, promising that the quality material would rise to the top.

Yet as the company tried to promote its wholesome content, it was blindsided by a strange beast born within its walls, charging hard in another direction.

***

Each week YouTube’s marketing team sent around a “What’s Trending” report on the site’s emerging fads. The business team also monitored a chart of the site’s top 100 ad earners. One odd channel started landing in the trending reports and soaring up the earnings chart: DisneyCollectorBR.

This YouTuber never showed a face or a real name. A wildly popular video sets the camera on two dozen toy eggs from Disney franchises. A quiet voiceover announces each egg methodically before she unwraps them: “Mickey Mouuuuse … ” She peels back the foil casing with a soft, crisp sound. Then the chocolatey layer, a satisfying crackle. Then the tiny plastic capsule holding a toy, a treasure. Then another.

YouTube had never seen a force like DisneyCollectorBR. By the summer of 2014, the channel’s most popular video, a four-minute unwrapping of Kinder Eggs, had 90m views. Overall its videos were watched a whopping 2.4bn times. Tubefilter, an online ranking system for YouTube, placed DisneyCollectorBR as the third-most viewed channel, behind YouTube star PewDiePie and Katy Perry. Soon the channel claimed gold. A research firm estimated it raked in as much as $13m a year from YouTube ads. The videos contained something uncanny and new, tapping into neurons in children’s brains in a way that few fully understood. Certainly, no one at YouTube did. Unboxing videos had begun years before in tech reviewer circles, with footage treating iPods and smartphones as fetish items. Now the Kinder Surprise Egg, a marginal product developed in Italy, took on totemic significance. The Kinder Surprise Egg is banned in the US, with the authorities citing the small toys inside as choking hazards, so YouTubers chasing DisneyCollectorBR’s trend started buying these eggs on eBay, like contraband.

Fellow YouTubers developed a name for this strange trend: “The faceless ones.” Like earlier YouTube hits, these channels sought views using Google’s central corridor, search. Here’s the mishmash written beneath a DisneyCollectorBR video: “Princess egg, frozen eggs, Scooby doo, hello kitty, angry birds, sofia the first, winnie the pooh, toy story, playdoh surprise.” It’s a keyword soup. Titles for toy unboxing videos, another exploding trend, followed a similar logic: “Choco Toys Surprise Mashems & Fashems DC Marvel Avengers Batman Hulk IRON MAN.” Titles like these weren’t made for the intended viewer, or even their parents. These were made for algorithms – for machines to scrape and absorb. Disney, like many media giants, refused to put its prized material on YouTube. So when people typed “Frozen Elsa” or “Marvel Avengers” (Disney bought Marvel in 2009) into the YouTube search bar, the machines showed them the faceless ones.

Most of these faceless channels, like DisneyCollectorBR, were anonymous. Other early YouTubers typically sought fame with real names or at least faces. They had managers, agents, hangers-on, Twitter profiles. To earn ad money, YouTubers had to provide the company with a legal name and an email address, but YouTube liked to keep this information walled off from staff for security reasons. YouTube faced an unprecedented situation with DisneyCollectorBR: the people running the company knew next to nothing about its most popular channel.

Someone at YouTube now called Harry Jho with a different question: “Do you know who they are?”

Jho had never quit his Wall Street job, even after Google’s money began flowing in, because it never flowed steadily enough. Some months, during summers or holidays, the Jhos’ channel made $700,000 from YouTube ads. But at other times it dropped to $150,000. How could they hire a large staff and ensure steady salaries? If YouTube were their sole income, “we would have gone crazy from the stress”, Jho recalled.

YouTube’s inexplicable algorithms were another source of mounting stress. The company’s machines struggled to distinguish Jho’s type of programming from others. Once, the Mother Goose Club YouTube page was overtaken with promotions for a new horror film, an Exorcist spin-off. Right beside Skip to My Lou was a thumbnail of a demon-possessed girl shrieking. “We’re a kids’ channel,” Jho said. “No one wants to see that.” He tried in vain to complain to YouTube. Eventually, he found a fix: if he bought ads for his own channel to run on YouTube, not only did the Exorcist trailer disappear, but his traffic also shot up.

By 2014, Kinder Surprise Eggs had overrun YouTube, and the formula for getting into the “related videos” sidebar – and, thus, getting in front of kids – looked clear. The Jhos held a meeting in their Manhattan office. They looked at the columns of bright, keyword-stuffed videos from DisneyCollectorBR and its countless imitators.

“It’s really cheap to make these videos,” Jho observed. “We could set up a room. Go buy these toys for a couple thousand bucks.”

They looked at the columns again. Finally, a friend who was there in the office piped up. “This is just like porn,” he said. “This is toy porn.”

They dropped the proposal.

***

In 2015, with kids’ material ballooning on YouTube, the company introduced YouTube Kids, an app with bigger, bubblier buttons for smaller fingers and settings for parents including a built-in timer. The company unveiled it as “the first Google product built from the ground up with little ones in mind”. YouTube hoped that children would only watch on the app, not its main site. But that didn’t happen. And soon, kids’ content mutated into something even stranger than the faceless ones.

One popular YouTube channel, Webs & Tiaras, operated out of Quebec City and featured actors performing vaudeville antics dressed in cheap Halloween-store getup on a drab rowhouse street. They staged plots without dialogue – usually a romantic narrative between Spider-Man and Elsa from Frozen, the damsel in distress. The channel’s owner identified himself only as Eric, a pseudonym. Some YouTubers suspected bot traffic. But just as likely was that Webs & Tiaras was exploiting a perfect algorithmic storm: a huge surge in kids’ programmes, plus a continued vacuum of mainstream fare. Because Frozen and superhero franchises didn’t appear on YouTube in official form, any parents or kids typing “Elsa” or “Spiderman” into the site were shown popular entries from Webs & Tiaras – again and again. “Some of these are probably seen by the same child 50 times,” Phil Ranta, an executive with the digital studio that signed the Webs & Tiaras channel, told a reporter in 2017. “It really helps to juice those numbers.”

Once Webs & Tiaras hit on the magic formula of costumes plus the strange pairing of two popular kids’ search terms, like any good YouTuber, “you just keep repeating the thing that went viral,” said Ranta. Webs & Tiaras was deeply weird: videos depicted costumed Elsas with chicken feet or a “brain belly”. But Ranta, a former standup comic, insisted that the channel was “pretty harmless”, operating like old silent films or cosplay theatre for kids. And its plots, such as placing characters behind bars, were catchy. “You’re a little kid and you’re like: ‘Wow, I love Elsa. I love Spider-Man. What? They’re in jail?’” said Ranta. “‘That’s a story I’ve never heard before.’” Click.

With its success came a wave of imitators. Some borrowed tropes from YouTube pranksters – another hot trend – who competed in carrying out the most absurd stunts. At this point the superhero genre got even weirder. Elsa flushed Spider-Man down the toilet, “evil Santa” kidnapped Elsa, Spider-Man injected Elsa with strange liquids. Elsa often gave birth. “You half expect the scenarios to be porn setups,” a blogger wrote about the trend in February 2017. Prominent YouTubers started posting videos about the bewildering trend taking over the site. At Disney, an executive working for its digital network prepared a report that showed all of Disney’s promotional clips on YouTube generated about 1bn views a month. Amateur videos featuring Elsa, the report concluded, had 13bn monthly views.

That year, Harry and Sona Jho, the Mother Goose Club creators, noticed that one of YouTube’s top-trending terms was “bad baby”. That category included benign animated clips of defiant toddlers, as well as gross-out live-action fare showing kids overeating and puking. Toy Freaks, a channel launched by Greg Chism, a single father of two young girls in southern Illinois, specialised in the latter. Chism pranked his two school-age daughters, who were dressed as infants. He wore pacifiers with his daughters in videos, playing out “bad baby” scenarios that got huge traffic. One video showed one of the girls wiggling loose a tooth, screaming and spitting blood. (In the clip Chism calmly reassured his screaming daughter, but this wasn’t evident to people who only saw the video’s bloodiest stills.) Toy Freaks climbed YouTube’s charts.

In March 2017, the BBC ran a damning story. Shocked parents had found their toddlers watching violent, nightmarish clips on YouTube. An off-brand Peppa Pig tortured at the dentist. Mickey Mouse playing pranks with faeces. Minnie Mouse dismembered and bloodied. YouTube’s machines saw these only as children’s cartoons.

The superhuman artificial intelligence behind YouTube’s recommendations was often described as a “black box” system, since it operated in ways humans couldn’t fathom. To many at YouTube, the flood of disturbing kids’ material was a reminder that they didn’t have the box’s combination. “It took on a life of its own, and no one was really minding the store,” one person at the company recalled.

***

During the summer of 2017, a team at YouTube started looking closely at videos aimed at kids that felt problematic. The employees, even those accustomed to seeing bizarre, shocking material on YouTube, were floored. The success of Toy Freaks had inspired dozens of mimics (“replica content”, the company called it). Some used “keyword stuffing” to ride algorithmic waves such as “bad baby” – an old spammer’s tactic of filling a video with unrelated tags, for machines’ eyes only. When YouTube staff watched “bad baby” videos featuring minors, some felt a sickening discomfort. Several of the videos followed a trend of shaving young children’s faces on screen as punishment. (Real or fake shaving? It wasn’t clear.) Others had kids gorging to show distended bellies, a trope from porn. The company had long had rules against child exploitation and sexual fetishes. These videos didn’t break them, but came close. For years, YouTube had relied on parents to steer children towards YouTube Kids, yet the app’s relatively meagre traffic showed this wasn’t working.

Staff invented a new category (“borderline fetish”) and wrote policies for moderators and machines to detect videos that fell into it. YouTube made another label for footage that mixed children’s characters with “adult themes” – the screwy Peppa Pig fare and legions of Spider-Man-Elsa mashups.

In September 2017, a meeting was scheduled at headquarters with engineers, publicists, and Trust and Safety personnel. They were ordered to come up with plans to handle “problematic content” faster, and to communicate better with creators. Tech companies often named these kinds of crisis-response operations “war rooms”. This new group, executives decided, was to be a “constant war room”. One of its first battle plans involved purging the site of “borderline fetish” material such as Toy Freaks. Enough people had watched this footage and determined that children were either being ordered to appear on-screen or placed in uncomfortable situations.

And yet YouTube did not want to move too fast. The company was still recovering from advertising boycotts earlier that year, the result of ads appearing alongside extremist videos by neo-Nazis and terrorists. This scandal had deprived thousands of creators of income. Any rash changes, YouTube worried, might spook creators or send advertisers fleeing again. When one ad agency complained that autumn about troubling kids’ content, YouTube officials came back with a stock reply.

But then, an avalanche hit. James Bridle, a British author who wrote about drones and warfare, had turned their attention to kids. Bridle published a very long entry on the blogging site Medium with a catchy title, Something Is Wrong on the Internet.

Bridle’s writing was crisp and detailed, but their visuals told enough of a story. Bridle’s post first displayed stills from surprise-egg unboxings, nursery rhymes and Peppa Pig fakes – categories with tens of billions of views. Scroll down the article and everything looked worse. Still after candy-coloured still, disturbingly identical, each tailored for YouTube’s algorithm: “bad baby” offshoots, demented cartoons and even more surreal material, such as “wrong heads” – disembodied Disney figures floating on screen. There was a vast field of Toy Freaks replicas mixing pranks with Spider-Man-Elsa-superhero strangeness. “Industrialised nightmare production,” Bridle called it, before adding the kicker: “To expose children to this content is abuse. … And right now, right here, YouTube and Google are complicit in that system.”

Longtime YouTubers such as the Jhos had seen these trends rise, but most people – parents of toddlers, even Google employees – had no idea this sort of material existed. Within YouTube, which tracked everything online, staff saw a huge, disconcerting spike in Twitter activity about Bridle’s post. The Times wrote a story about advertisers who appeared on Greg Chism’s videos angrily pulling their money. Its title: “Child abuse on YouTube”. Subtitle: “Google makes millions from disturbing videos.”

After the Times article came out, YouTube executives held an emergency meeting, and ended up deleting more than 270 accounts, including Toy Freaks. Chism released a statement noting how troubled he was “that anyone would find inappropriate pleasure in our video skits”. Law enforcement in Illinois began investigating Chism for child endangerment, but Rich Miller, an Illinois police chief, admitted to BuzzFeed News, “finding the proper criminal aspect to being a bad parent at times is challenging”. Ultimately, no charges were brought against Chism.

***

Harry and Sona Jho began 2020 bracing for a painful shock. In the previous two years, YouTube had seriously cleaned up its act with regard to kids, rewiring its system to scrub the surreal Elsa-Spider-Man clips and other disturbing material. Advertisers were reassured. But regulators had finally come for YouTube – in 2019, the US Federal Trade Commission fined the company for violating children’s privacy laws.

That ruling meant YouTube couldn’t serve targeted ads on videos aimed at children under 13, which would diminish sales for kids’ channels. The Jhos planned to film more nursery rhyme videos for a reserve catalogue to make up for the ad shortage. Then the pandemic hit, and filming became impossible. Marketers paused spending everywhere, unsure how consumers would respond. The Jhos watched ad rates crater.

Quarantines, it turned out, were very good for their viewership. Kids stuck at home watched like crazy. By the end of 2020, half of the top 10 most viewed channels on all of YouTube were preschooler fare. A year into the pandemic, Harry Jho cautiously admitted the audience surge had helped Mother Goose Club. “It’s not rosy, but we’re not laying people off,” he said.

Regulation and public pressure had also forced YouTube to pay more attention to quality. YouTube stopped treating its Kids app as an algorithmic free-for-all and assigned staff to curate the selection. In a statement, YouTube said: “Over the past several years, we’ve partnered with child development specialists to develop age-appropriate experiences and protect kids at every stage of life.” The company started a fund for kids’ YouTubers and told creators it would finance videos that show traits such as humility, curiosity and self-control. YouTube said its system would reward clips that encouraged young viewers to go do things offline.

“This is about as healthy an algorithm environment as I’ve ever seen,” Jho admitted in 2021. To him, it felt as if YouTube had relinquished some of its blind faith in machines. It felt as if humans were actually involved.

This is an edited extract from Like, Comment, Subscribe: Inside YouTube’s Chaotic Rise to World Domination by Mark Bergen, published by Viking and available at guardianbookshop.co.uk


Time magazine’s 1995 cover. Photograph: Matt Mahurin/Time

Illustration: YouTube/Guardian Design
