San Francisco Chronicle

YouTube’s a pawn as some game the news of tragedy

- By Kevin Roose

When Elmer Williams’ wife told him that a mass shooting had taken place at a church in Texas, he leaped into action. First, he skimmed a handful of news stories about the massacre. Then, when he felt sufficiently informed, he went into his home video studio, put on his trademark aviator sunglasses, and hit record.

Roughly an hour later, Williams, 51, a popular right-wing YouTube personality who calls himself the Doctor of Common Sense, had filmed, edited and uploaded a three-minute monologue about the Sutherland Springs church shooting to his YouTube page, which had roughly 90,000 subscribers. Authorities had not yet named a suspect, but that didn’t deter Williams, who is black, from speculating that the gunman was probably “either a Muslim or black.”

Later, after the shooter was identified as a white man named Devin Kelley, Williams posted a follow-up video. He claimed that Kelley was most likely a Bernie Sanders supporter associated with antifa — a left-wing antifascist group — who may have converted to Islam. Despite having no evidence for those claims, Williams argued them passionately, saying that photos of Kelley circulating online suggested that he was a violent liberal.

“Sometimes, you can tell a lot from a person’s picture,” Williams said.

I came across Williams’ videos several hours after the massacre, when one of them appeared prominently in YouTube’s search results about the shooting, alongside other videos making unverified claims that had been posted by pages with names like TruthNews Network and the Patriotic Beast.

While YouTube has long been a haven for slapdash political punditry, a certain type of hyper-prolific conspiracist has recently emerged as a dominant force. By reacting quickly and voluminously to breaking news, these rapid-response pundits — the YouTube equivalent of talk radio shock jocks — have climbed the site’s search results and exposed legions of viewers to their far-fetched theories.

In a phone interview from his home in Houston, Williams told me that he had created more than 10,000 YouTube videos over an eight-year period, posting as many as 20 monologues per day, and racking up an estimated 200 million views.

His hit productions have included fact-challenged videos like “Barack and Michelle Obama Both Come Out The Closet,” which garnered 1.6 million views, and “Hillary Clinton Is On Crack Cocaine,” which had 665,000. He was admitted to YouTube’s partner program, which allows popular posters to earn money by displaying ads on certain types of videos, and claims to have made as much as $10,000 a month from his channel.

“I like to call myself a reporter who reports the news for the common person,” Williams said.

Whether sensationalists like Williams are motivated by profit or by micro-celebrity, their success has become a vexing problem for companies like Facebook, Twitter and Google, which owns YouTube.

These companies sort and prioritize information for their users, and most have built ranking systems that boost news from mainstream outlets over stories from less credible sources. But those algorithms can be gamed in breaking news situations by users who work fast, uploading their videos in the valuable minutes between when news breaks and when the first wave of legitimate articles and videos appears.

“Before reliable sources put up stories, it’s a bit of a free-for-all,” said Karen North, a professor studying social media at the University of Southern California. “People who are in the business of posting sensationalized opinions about the news have learned that the sooner they put up their materials, the more likely their content will be found by an audience.”

The phenomenon is not limited to YouTube. After last month’s mass shooting in Las Vegas, a Facebook safety check page featured a story from a site called “AltRight News” that made false statements about the gunman, and Google’s search results displayed a conspiracy theory from 4chan, the notoriously toxic message board. After last month’s terrorist attack in New York City, a trending topic page on Twitter briefly featured a story from Infowars, a conservative site that is popular among the conspiracy-minded.

Conservatives have argued that YouTube unfairly targets their videos while allowing liberal channels, such as the Young Turks, to post heated commentary. And some dispute that there is any conscious gaming going on.

“There is absolutely no strategy,” said Paul Joseph Watson, an editor-at-large at Infowars and a popular YouTube personality who has 1.1 million subscribers. On the day of the Texas church shooting, one of Watson’s tweets appeared as a result in Google searches for the shooter’s name, although it has since disappeared.

Tech companies, already under fire for the ease with which they allowed Russia to interfere in last year’s election, have also vowed to take a harder stance on domestic misinformation. Twitter’s acting general counsel, Sean Edgett, told congressional investigators that the company would take steps to keep false stories from being featured on trending topic pages.

“It’s a bad user experience, and we don’t want to be known as a platform for that,” Edgett said.

YouTube, whose community guidelines prohibit hateful and threatening content, has begun using artificial intelligence to help identify offensive videos. But conspiracy theories don’t announce themselves, and machines can’t yet handle the complicated business of fact-checking.

In Williams’ case, human intervention seems to have been necessary. Last week, shortly after I asked YouTube some questions about Williams’ account, all of his videos disappeared, and his profile was replaced by a message saying, “This account has been terminated due to multiple or severe violations of YouTube’s policy prohibiting hate speech.”

Williams, who said he had recently left his job as an operations manager at a hazardous materials plant to focus on full-time punditry, has tangled with YouTube’s hate speech policies before. The company shut down one of his previous accounts for similar infractions, which he claimed cost him 250,000 subscribers and a lucrative income source.

“If YouTube didn’t punish me,” Williams said, “I could easily be making over $30,000 a month.”

YouTube said that Williams’ account was banned “as soon as it was flagged to us,” because its terms of service prohibit repeat rule-breakers from opening new accounts. It also said that its terms prohibit advertising from appearing on videos featuring “controversial and sensitive events, tragedies, political conflicts and other sensitive topics.”

Even before this week’s crackdown, Williams was branching out. He sells cell phone ringtones on his website, and was considering starting his own paid streaming service. Just hours after he was banned by YouTube, Williams posted a video on Vimeo, another video-hosting service. He pledged to keep insulting his favorite targets — Democrats, Hillary Clinton, Barack Obama — and not shy away from controversy, no matter what the policies said.

“I don’t want to be on YouTube anymore,” Williams said. “It’s too communist.”

Photo: Bryan R. Smith / New York Times. Partisan and incendiary newshounds often insert themselves into online search results by quickly posting commentary after major news events, like this October truck attack in Manhattan.
Photo: Chang W. Lee / New York Times. Part of the problem YouTube faces with attacks like the Manhattan one is that facts might not be known for hours, so it’s hard to tell which reports are false.
