Several advertisers pull YouTube campaigns
Several major companies suspended their advertising campaigns on YouTube last week after learning their ads were displayed on videos that appeared to sexualize children.
In distancing themselves from YouTube, the companies cited the video platform’s seeming inability to police its content and keep their ads from appearing alongside offensive videos. The companies included Deutsche Bank, German supermarket chain Lidl, sportswear company Adidas, candy makers Mars and Cadbury, and alcohol company Diageo, which produces Smirnoff vodka, Captain Morgan rum and Crown Royal whiskey.
The suspension was in response to an article published in the Times of London, which said the companies’ advertisements appeared on videos showing children in various states of undress, according to the Wall Street Journal.
Some of the videos, for example, featured “young girls filming themselves in underwear, doing the splits, brushing their teeth or rolling around in bed,” according to the Times.
While some of the videos appeared to be uploaded by the children themselves, the comments sections were filled with sexual remarks — including statements encouraging the children to perform sexual acts on camera.
A Mars spokesperson told Business Insider the company was “shocked and appalled” that its advertising appeared with “such exploitative and inappropriate content.” Likewise, a spokesperson for Lidl told Reuters that such content was “completely unacceptable” and that YouTube’s policies were “ineffective.”
The video platform, which is owned by Google, says that it forbids videos or comments that sexualize children. Its official policy states that posting such content “will immediately result in an account termination.” But one video showing a prepubescent girl in a nightgown racked up more than 6.5 million views and a number of lewd and sexual comments, the Times reported. Advertisements for several large brands ran with this video.
“There shouldn’t be any ads running on this content and we are working urgently to fix this,” a YouTube spokesman told Reuters.
Johanna Wright, YouTube’s vice president of product management, said in a statement the company will be taking an “even more aggressive stance” against videos aimed at sexualizing or harming minors.
But policing content and ensuring that advertising doesn’t run with offensive clips has been a long-running problem for the company.
YouTube released a similar statement in March, when several companies, including Coca-Cola, PepsiCo, Walmart, Dish Network, Starbucks and General Motors, stopped advertising after learning that their ads were running alongside videos featuring racist and anti-Semitic content.
YouTube also issued a statement in June, when the United Kingdom’s major political parties pulled their commercials after they appeared with videos that promoted “extremist ideology,” the Wall Street Journal reported.
The problem YouTube faces is twofold.
First is the overwhelming amount of content constantly being generated. The Guardian reported that 300 hours of video are uploaded every minute.
YouTube uses a combination of human and automated watchdogs to look for offensive content, but much of it slips through. There simply aren’t enough humans to monitor so much video, and critics say the protective algorithms often don’t work.
The second problem is how the ads are disseminated. Companies have three choices when placing their advertisements: They can be paired with a specific type of content, a particular set of keywords or a certain demographic profile. YouTube then automatically plays the ads with the corresponding videos.
But these categories can be misleading. The videos of young girls that attracted sexualized comments were not, on their face, sexual. So if a company requested that its ad play with family-friendly content, for example, there’s a good chance it could have ended up on one of these videos.