San Francisco Chronicle (Sunday)

The making of a YouTube radical

By Kevin Roose

MARTINSBURG, W.Va. — Caleb Cain pulled a Glock pistol from his waistband, took out the magazine and casually tossed both onto the kitchen counter.

“I bought it the day after I got death threats,” he said.

The threats, Cain explained, came from right-wing trolls in response to a video he had posted on YouTube a few days earlier. In the video, he told the story of how, as a liberal college dropout struggling to find his place in the world, he had gotten sucked into a vortex of far-right politics on YouTube.

“I fell down the ‘alt-right’ rabbit hole,” he said in the video.

Cain, 26, recently swore off the alt-right nearly five years after discovering it, and has become a vocal critic of the offshoot of conservatism, which mixes racism, white nationalism and populism. He is scarred by his experience of being radicalized by what he calls a “decentralized cult” of far-right YouTube personalities, who convinced him that Western civilization was under threat from Muslim immigrants and cultural Marxists, that innate IQ differences explained racial disparities and that feminism was a dangerous ideology.

“I just kept falling deeper and deeper into this, and it appealed to me because it made me feel a sense of belonging,” he said. “I was brainwashed.”

Over years of reporting on internet culture, I’ve heard countless versions of Cain’s story: An aimless young man — usually white, frequently interested in video games — visits YouTube looking for direction or distraction and is seduced by a community of far-right creators.

Some young men discover far-right videos by accident, while others seek them out. Some travel all the way to neo-Nazism, while others stop at milder forms of bigotry.

The common thread in many of these stories is YouTube and its recommendation algorithm, the software that determines which videos appear on users’ home pages and in the “Up Next” sidebar next to a video that is playing. The algorithm is responsible for more than 70% of all time spent on the site.

The radicalization of young men is driven by a complex stew of emotional, economic and political elements, many having nothing to do with social media. But critics and independent researchers say YouTube has inadvertently created a dangerous on-ramp to extremism by combining two things: a business model that rewards provocative videos with exposure and advertising dollars, and an algorithm that guides users down personalized paths meant to keep them glued to their screens.

“There’s a spectrum on YouTube between the calm section — the Walter Cronkite, Carl Sagan part — and Crazytown, where the extreme stuff is,” said Tristan Harris, a former design ethicist at Google, YouTube’s parent company. “If I’m YouTube and I want you to watch more, I’m always going to steer you toward Crazytown.”

In recent years, social media services have grappled with the growth of extremism. Many have barred a handful of far-right influencers and conspiracy theorists, including Alex Jones of InfoWars, and tech companies have taken steps to limit the spread of political misinformation.

YouTube, whose rules prohibit hate speech and harassment, took a more laissez-faire approach to enforcement for years. The San Bruno company recently announced that it is updating its policy to ban videos espousing neo-Nazism, white supremacy and other bigoted views. The company also said it is changing its recommendation algorithm to reduce the spread of misinformation and conspiracy theories.

With 2 billion monthly active users uploading more than 500 hours of video every minute, YouTube’s traffic is estimated to be the second highest of any website, behind only Google.com. According to the Pew Research Center, 94% of Americans ages 18 to 24 use YouTube, a higher percentage than for any other online service.

Like many Bay Area tech companies, YouTube is outwardly liberal in its corporate politics. It sponsors floats at LGBT pride parades and celebrates diverse creators, and its chief executive endorsed Hillary Clinton in the 2016 presidential election. President Trump and other conservatives have said social networks are biased against right-wing views and have used takedowns like those announced by YouTube as evidence for those claims.

In reality, YouTube has been a godsend for hyper-partisans on all sides. It has allowed them to bypass traditional gatekeepers and broadcast their views to mainstream audiences, and has helped once-obscure commentators build lucrative media businesses.

It has also been a useful recruiting tool for far-right extremist groups. Bellingcat, an investigative news site, analyzed messages from far-right chat rooms and found that YouTube was cited as the most frequent cause of members’ “red-pilling” — an internet slang term for converting to far-right beliefs. A European research group, Vox-Pol, conducted a separate analysis of nearly 30,000 Twitter accounts affiliated with the alt-right. It found that the accounts linked to YouTube more often than to any other site.

“YouTube has been able to fly under the radar because until recently, no one thought of it as a place where radicalization is happening,” said Becca Lewis, who studies online extremism for the nonprofit Data & Society. “But it’s where young people are getting their information and entertainment, and it’s a space where creators are broadcasting political content that, at times, is overtly white supremacist.”

I visited Cain in West Virginia after seeing his YouTube video denouncing the far right. We spent hours discussing his radicalization. To back up his recollections, he downloaded and sent me his entire YouTube history, a log of more than 12,000 videos and more than 2,500 search queries dating to 2015.

These interviews and data points form a picture of a disillusioned young man, an internet-savvy group of right-wing reactionaries and a powerful algorithm that learns to connect the two. It suggests that YouTube may have played a role in steering Cain, and other young men like him, toward the far-right fringes.

It also suggests that, in time, YouTube is capable of steering them in very different directions.

From an early age, Cain was fascinated by internet culture. As a teenager, he browsed 4Chan, the lawless message board. He played online games with his friends and devoured videos of intellectuals debating charged topics like the existence of God.

The internet was an escape. Cain grew up in post-industrial Appalachia and was raised by his conservative Christian grandparents. He was smart, but shy and socially awkward, and he carved out an identity during high school as a countercultural punk. He went to community college, but dropped out after three semesters.

Broke and depressed, he resolved to get his act together. He began looking for help in the same place he looked for everything: YouTube.

One day in late 2014, YouTube recommended a self-help video by Stefan Molyneux, a Canadian talk show host and self-styled philosopher.

Like Cain, Molyneux had a difficult childhood, and he talked about overcoming hardships through self-improvement. He seemed smart and passionate, and he wrestled with big questions like free will while offering practical advice on topics such as dating and job interviews.

Molyneux, who calls himself an “anarcho-capitalist,” also had a political agenda. He was a men’s rights advocate who said feminism was a form of socialism and that progressive gender politics were holding young men back. He offered conservative commentary on pop culture and current events, explaining why Disney’s “Frozen” was an allegory about female vanity, or why the fatal shooting of an unarmed black teenager by a white police officer was proof of the dangers of “rap culture.”

Cain was a liberal who cared about social justice, worried about wealth inequality and believed in climate change. But he found Molyneux’s diatribes fascinating, even when he disagreed with them.

“He was willing to address young men’s issues directly, in a way I’d never heard before,” Cain said.

In 2015 and 2016, as Cain dived deeper into his YouTube recommendations, he discovered a universe of right-wing creators. Over time, he watched dozens of clips by Steven Crowder, a conservative comedian, and Paul Joseph Watson, a prominent right-wing conspiracy theorist who was barred by Facebook this year. He became entranced by Lauren Southern, a far-right Canadian activist, whom he started referring to as his “fashy bae,” or fascist crush.

These people weren’t all shouty demagogues. They were entertainers, building their audience with satirical skits, debates and interviews with like-minded creators. Some of them were part of the alt-right, a loose cohort of pro-Trump activists who sandwiched white nationalism between layers of internet sarcasm. Others considered themselves “alt-lite,” or merely anti-progressive.

If alienation was one ingredient in Cain’s radicalization, and persuasive partisans like Molyneux were another, the third was a series of product decisions YouTube made starting in 2012.

For years, the site’s recommendations algorithm had been programmed to maximize views, by showing users videos they were likely to click on. But creators had learned to game the system, inflating their views by posting videos with exaggerated titles or choosing salacious thumbnail images.

In response, YouTube’s executives announced that the recommendation algorithm would give more weight to watch time, rather than views. That way, creators would be encouraged to make videos that users would finish, users would be more satisfied and YouTube would be able to show them more ads.

The bet paid off. Within weeks of the algorithm change, the company reported that overall watch time was growing, even as the number of views shrank. According to a 2017 report, YouTube’s watch time grew 50% a year for three consecutive years.
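
As a rough illustration of that shift — a sketch only, not YouTube’s actual code — ranking candidate videos by predicted watch time instead of predicted clicks changes which videos rise to the top. The field names and numbers below are hypothetical:

    # Illustrative sketch; field names and values are hypothetical,
    # not YouTube's real system.
    def rank_by_clicks(candidates):
        # Older objective: surface whatever is most likely to be clicked.
        return sorted(candidates, key=lambda v: v["p_click"], reverse=True)

    def rank_by_watch_time(candidates):
        # Post-2012 objective: weight expected viewing time, so videos
        # people actually finish beat clickbait they abandon.
        return sorted(candidates,
                      key=lambda v: v["p_click"] * v["expected_minutes"],
                      reverse=True)

    videos = [
        {"title": "Exaggerated-title clip", "p_click": 0.9, "expected_minutes": 1.0},
        {"title": "Hour-long video essay", "p_click": 0.4, "expected_minutes": 35.0},
    ]
    print(rank_by_clicks(videos)[0]["title"])      # Exaggerated-title clip
    print(rank_by_watch_time(videos)[0]["title"])  # Hour-long video essay

Under the click-based ranking the exaggerated title wins; under the watch-time ranking the long video essay does, which is the incentive shift the creators responded to.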

A month after its algorithm tweak, YouTube changed its rules to allow all video creators to run ads alongside their videos and earn a portion of the revenue they generated. Previously, only popular channels that had been vetted by YouTube were able to run ads.

Neither change was intended to benefit the far right, and YouTube’s algorithm had no inherent preference for extreme political content. It treated a white nationalist monologue no differently from an Ariana Grande cover or a cake icing tutorial.

But the far right was well positioned to capitalize on the changes. Many right-wing creators already made long video essays, or posted video versions of their podcasts. Their inflammatory messages were more engaging than milder fare. And now that they could earn money from their videos, they had a financial incentive to churn out as much material as possible.

In 2015, a research team from Google Brain, Google’s much-lauded artificial intelligence division, began rebuilding YouTube’s recommendation system around neural networks, a type of AI that mimics the human brain. In a 2017 interview with the Verge, a YouTube executive said the new algorithm was capable of drawing users deeper into the platform by figuring out “adjacent relationships” between videos that a human would never identify.

The new algorithm worked well, but it wasn’t perfect. One problem, according to several current and former YouTube employees, was that the AI tended to pigeonhole users into specific niches, recommending videos that were similar to ones they had already watched. Eventually, users got bored.

Google Brain’s researchers wondered whether they could keep YouTube users engaged for longer by steering them into different parts of YouTube, rather than feeding their existing interests. And they began testing a new algorithm that incorporated a different type of AI, called reinforcement learning.

The new AI, known as Reinforce, was a kind of long-term addiction machine. It was designed to maximize users’ engagement over time by predicting which recommendations would expand their tastes and get them to watch not just one more video but many more.

Reinforce was a huge success. In a talk at an AI conference in February, Minmin Chen, a Google Brain researcher, said it was YouTube’s most successful launch in two years. Sitewide views increased nearly 1%, she said — a gain that, at YouTube’s scale, could amount to millions more hours of daily watch time and millions more dollars in advertising revenue per year. She added that the new algorithm was already starting to alter users’ behavior.

“We can really lead the users toward a different state, versus recommending content that is familiar,” Chen said.
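
In reinforcement-learning terms, the change Chen describes amounts to optimizing a discounted sum of engagement over a whole session rather than only the next click. A toy comparison, with hypothetical numbers and nothing drawn from Google’s actual system, shows why that objective favors recommendations that steer users somewhere new:

    # Toy illustration of short-term vs. long-term objectives; the rewards
    # are hypothetical minutes watched at each step of a viewing session.
    def immediate_value(rewards):
        # Greedy objective: only the very next video matters.
        return rewards[0]

    def long_term_value(rewards, gamma=0.9):
        # Reinforcement-learning-style objective: discounted sum over the
        # session, so a recommendation that leads to "not just one more
        # video but many more" scores higher even if it starts slower.
        return sum(r * gamma**t for t, r in enumerate(rewards))

    familiar = [8, 2, 0, 0]          # a familiar pick, then the user gets bored
    taste_expanding = [5, 6, 7, 8]   # slower start, but the session keeps going

    print(immediate_value(familiar), immediate_value(taste_expanding))   # 8 5
    print(round(long_term_value(familiar), 1),
          round(long_term_value(taste_expanding), 1))                    # 9.8 21.9

A system optimizing only the first number keeps serving familiar videos; one optimizing the discounted total prefers the recommendation that expands the user’s tastes and keeps the session going.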

In interviews, YouTube officials denied that the recommendation algorithm steered users to more extreme content. The company’s internal testing, they said, has found just the opposite — that users who watch one extreme video are, on average, recommended videos that reflect more moderate views. They declined to share this data or give any specific examples of users who were shown more moderate videos after watching more extreme videos.

The officials stressed, however, that YouTube realized it had a responsibility to combat misinformation and extreme content.

“While we’ve made good progress, our work here is not done, and we will continue making more improvemen­ts this year,” a YouTube spokesman, Farshad Shadloo, said in a statement.

By the night of Nov. 8, 2016, Cain’s transformation was complete. He spent much of the night watching clips of Clinton’s supporters crying after the election was called in Trump’s favor. His YouTube viewing history shows that at 1:41 a.m., just before bed, he turned on a live stream hosted by Crowder, with the title “TRUMP WINS!”

“It felt like a punk-rock moment, almost like being in high school again,” Cain said.

That year, Cain’s YouTube consumption had skyrocketed. He got a job packing boxes at a furniture warehouse, where he would listen to podcasts and watch videos by his favorite YouTube creators all day. He fell asleep to YouTube videos at night, his phone propped up on a pillow. In all, he watched nearly 4,000 YouTube videos in 2016, more than double the number he had watched the previous year.

Not all of these videos were political. Cain’s viewing history shows that he sought out videos about his other interests, including cars, music and cryptocurrency trading. But the bulk of his media diet came from far-right channels. And after the election, he began exploring a part of YouTube with a darker, more radical group of creators.

These people didn’t couch their racist and anti-Semitic views in sarcastic memes, and they didn’t speak in dog whistles. One channel run by Jared Taylor, editor of the white nationalist magazine American Renaissance, posted videos with titles like “‘Refugee’ Invasion Is European Suicide.” Others posted clips of interviews with white supremacists like Richard Spencer and David Duke.

In 2018, nearly four years after Cain had begun watching right-wing YouTube videos, a new kind of video began appearing in his recommendations.

These videos were made by left-wing creators, but they mimicked the aesthetics of right-wing YouTube, down to the combative titles and the mocking use of words like “triggered” and “snowflake.”

One video was a debate about immigration between Southern and Steven Bonnell, a liberal YouTuber known as Destiny. Cain watched the video to cheer on Southern, but he reluctantly declared Bonnell the winner.

Cain also found videos by Natalie Wynn, a former academic philosopher who goes by the name ContraPoints. Wynn wore elaborate costumes and did drag-style performances in which she explained why Western culture wasn’t under attack from immigrants, or why race was a social construct.

Unlike most progressives Cain had seen take on the right, Bonnell and Wynn were funny and engaging. They spoke the native language of YouTube, and they didn’t get outraged by far-right ideas. Instead, they rolled their eyes and made those ideas seem shallow and unsophisticated.

“I noticed that right-wing people were taking these old-fashioned, knee-jerk, reactionary politics and packaging them as edgy punk rock,” Wynn told me. “One of my goals was to take the excitement out of it.”

When Cain first saw these videos, he dismissed them as left-wing propaganda. But he watched more, and he started to wonder if people like Wynn had a point. Her videos persuasively used research and citations to rebut the right-wing talking points he had absorbed.

“I just kept watching more and more of that content, sympathizing and empathizing with her and also seeing that, wow, she really knows what she’s talking about,” Cain said.

Wynn and Bonnell are part of a new group of YouTubers who are trying to build a counterweight to YouTube’s far-right flank. This group calls itself BreadTube, a reference to the left-wing anarchist Peter Kropotkin’s 1892 book, “The Conquest of Bread.” It also includes people such as Oliver Thorn, a British philosopher who hosts the channel PhilosophyTube, where he posts videos about topics like transphobia, racism and Marxist economics.

The core of BreadTube’s strategy is a kind of algorithmic hijacking. By talking about many of the same topics that far-right creators do — and, in some cases, by responding directly to their videos — left-wing YouTubers are able to get their videos recommended to the same audience.

“Natalie and Destiny made a bridge over to my side,” Cain said, “and it was interesting and compelling enough that I walked across it.”

BreadTube is still small. Wynn, the most prominent figure in the movement, has 615,000 subscribers, a fraction of the audience drawn by the largest right-wing creators.

“Unfortunately the alt-right got a big head start on finding ways to appeal to white men,” said Emerican Johnson, a YouTuber who runs a left-wing channel called Non-Compete. “We’re late to the party. But I think we will build a narrative that will stand strong against that alt-right narrative.”

After the New Zealand shooting, Cain decided to try to help. He recently started his own YouTube channel — Faraday Speaks, in homage to the 19th century scientist Michael Faraday — where he talks about politics and current events from a left-wing perspective. He wants to show young men a way out of the far right before more white nationalist violence ensues.

“You have to reach people on their level, and part of that is edgy humor, edgy memes,” he said. “You have to empathize with them, and then you have to give them the space to get all these ideas out of their head.”

Shortly after his first video was uploaded, Cain began receiving threats from alt-right trolls on 4Chan. One called him a traitor, and made a reference to hanging him. That was when he bought the gun. Several weeks ago he moved out of West Virginia, and he is now working at a new job while he develops his YouTube channel.

What is most surprising about Cain’s new life, on the surface, is how similar it feels to his old one. He still watches dozens of YouTube videos every day and hangs on the words of his favorite creators. It is still difficult, at times, to tell where the YouTube algorithm stops and his personality begins.

Caleb Cain, who spent years consumed by what he calls a “decentralized cult” of far-right YouTube personalities, now is a vocal critic of extremism. (Photo: Justin T. Gellerson / New York Times)
