YouTube’s defense against crisis has a flaw
Last week, Wikipedia received an unusual prompt.
For one thing, it dealt with sensitive subject matter. And it didn’t come from a person — it came from YouTube, the largest video site on the internet.
During an interview at South by Southwest in Austin, YouTube’s chief executive, Susan Wojcicki, announced that the company she leads would enlist Wikipedia’s help to deal with the proliferation of conspiracy theories and misinformation on its platform.
The plan was presented as just one of many ways that YouTube, which is owned by Google, would address mounting concerns about its content. But it highlighted a jarring dynamic: Here was Google, a company with revenue in excess of $100 billion last year, calling on a volunteer-built, donation-funded nonprofit organization to help it solve a crisis.
Specifically, she said, YouTube would soon begin experimenting with what it called “information cues” sourced from the online encyclopedia. The cues would appear as captions and article links beneath videos that dealt with topics related to popular conspiracy theories — she used the moon landing and “chemtrails” as examples.
Justin Brookman, director for consumer privacy and technology policy at Consumers Union, called the plan “a disingenuous cop-out that will make Wikipedia’s job harder.”
Others noted what seemed like an obvious point: Can’t anyone edit Wikipedia, including the conspiracy theorists themselves?
From the Wikimedia Foundation, which oversees Wikipedia, the first response was confusion: YouTube is doing what?
Katherine Maher, executive director of the Wikimedia Foundation, sounded as if she wouldn’t have minded a heads-up. “When the announcement came out, we were surprised that we hadn’t been contacted,” Maher said in an interview.
She had learned about YouTube’s plans at the same time as everyone else — including Wikipedia’s army of volunteer contributors, some of whom were not pleased with the idea that an internet colossus had casually declared that it would outsource one of its knottiest problems to a relatively small nonprofit organization.
“Wikipedia is not something that just exists,” Maher said. “It takes work and it requires labor.”
The main problem with YouTube’s presumptuous announcement, some suggested, is that Wikipedia is not necessarily geared toward breaking news — and conspiracy theories tend to move at lightning speed during times of crisis.