Editorial: Techs must muck out the ‘sewer part of the Internet’
Several major companies suspended advertising on YouTube last week amid revelations of disturbing videos targeting and featuring children. They did so, as it happens, around the time that Facebook promised to let users know whether it had fed them Russian propaganda and Google was straining to confine its news search to, well, news.
Such stories follow an increasingly familiar and depressing pattern: We are reminded that a technology company’s vast, unexamined output necessarily includes generous portions of garbage. Company executives vow to repair or redouble the automated processes that admitted the stuff in the first place. And they and we agree to nervously forget about it until the next time the information sewer backs up.
YouTube’s latest reckoning can be traced, perhaps fittingly, to a viral post on another online publishing platform, Medium. Under the straightforward title “Something is wrong on the internet,” author James Bridle sampled some of the troves of questionable videos designed, by some combination of people and algorithms, to rack up views by children and therefore ad revenue for their producers and the Google-owned site. The clips range from the merely strange and mindless, such as endless unwrappings of packages containing toys, to the much more troubling, such as violent, abusive and suggestive scenes involving children and popular characters.
Recent reports elsewhere have plumbed other dark corners of YouTube, including malicious videos on the YouTube Kids app, predatory commentary on videos featuring children and a malfunctioning system for reporting abuses. The advertiser backlash is the second the company has faced this year; in March, several prominent brands reacted after corporate messages were linked with racist and extremist videos.
YouTube’s vice president for product management, Johanna Wright, said in a recent statement that the company had removed 50 channels and thousands of videos with questionable depictions of children. It also took the half-measure of removing advertising from more than 3 million videos showing “family entertainment characters engaged in violent, offensive or otherwise inappropriate behavior.” The company promised to use “machine learning technology,” “trusted flaggers” and “thousands of people ... working around the clock” to catch more, with the caveat that some content is “nuanced or challenging to make a clear decision on.”
The trouble, of course, is that no one is making decisions about what’s on YouTube — slogan: “Broadcast Yourself” — until it’s already there. Likewise, even in offering a mild antidote to toxic content, Facebook is forcing its customers to do the work. The network promised to give some users who interacted with Russian propaganda an opportunity to visit its online help center for an answer to the question, “How can I see if I’ve liked or followed a Facebook page or Instagram account created by (Kremlin agitprop shop) the Internet Research Agency?” It’s a cagey and underwhelming response to misinformation that may have reached half the nation’s population with an intent no less dire than subversion of the government.
Google, for its part, could “de-rank” content from known Russian propaganda sources, a top executive recently suggested, in an effort to keep what he called “the sewer part of the Internet” out of news searches, though the company later backpedaled. Google has repeatedly tried and failed to contain fake news, with bogus search results drawing attention most recently in the wake of the mass shooting in a Texas church. Eric Schmidt, the chairman of Google parent Alphabet, told a security conference that when company algorithms attempt to distinguish between a “fact A” and “fact B” posited with equal vehemence, “It’s very difficult for us to understand truth.”
Truth is one casualty of systems for which, as Bridle put it, “human oversight is simply impossible.” Security is another. Next time we go searching for information about an election or distractions for our children, at least we can say we were warned.