San Francisco Chronicle

Firms leave YouTube over ad placement

Campaigns suspended amid reports of video clips that sexualize children

By Travis M. Andrews

Several major companies suspended their advertising campaigns on YouTube on Friday after learning their ads were displayed on videos that appeared to sexualize children.

In distancing themselves from YouTube, the companies cited the service’s apparent inability to police its content and keep their ads out of offensive videos. The companies included Deutsche Bank, German supermarket chain Lidl, sportswear company Adidas, candy makers Mars and Cadbury, and alcohol company Diageo, which produces Smirnoff vodka, Captain Morgan rum and Crown Royal whiskey.

The suspensions came in response to an article published in the Times of London last week, which said the companies’ advertisements appeared on videos showing children in various states of undress, according to the Wall Street Journal. Some of these videos, for example, featured “young girls filming themselves in underwear, doing the splits, brushing their teeth or rolling around in bed,” according to the London Times.

While some of the videos appeared to be uploaded by the children themselves, the comments sections were filled with sexual remarks — including statements encouraging the children to perform sexual acts on camera.

A Mars representative told Business Insider the company was “shocked and appalled” that its advertising appeared with “such exploitative and inappropriate content.” Likewise, a Lidl spokesperson told Reuters such content is “completely unacceptable” and that YouTube’s policies were “ineffective.”

The video service, owned by Google, says that it forbids videos or comments that sexualize children. Its official policy states that posting such content “will immediately result in an account termination.” Regardless, one video showing a prepubescent girl in a nightgown racked up more than 6.5 million views and a number of lewd and sexual comments, the Times reported. Advertisements for several large brands ran with this video.

“There shouldn’t be any ads running on this content and we are working urgently to fix this,” a YouTube spokesman said on Friday, according to Reuters.

Johanna Wright, YouTube’s vice president of product management, said in a statement that the company would take an “even more aggressive stance” against videos aimed at sexualizing or harming minors.

But policing content and ensuring that advertising doesn’t run with offensive clips has been a long-running problem for the video service.

YouTube released a similar statement in March, when several companies, including Coca-Cola, PepsiCo, Walmart, Dish Network, Starbucks and General Motors, stopped advertising after learning that their ads were running alongside videos featuring racist and anti-Semitic content.

YouTube also issued a statement in June, when the United Kingdom’s major political parties pulled their commercials from the site after they appeared with videos that promoted “extremist ideology,” the Wall Street Journal reported.

The problem YouTube faces is twofold.

First is the overwhelming amount of content constantly being generated. Users watch 1 billion hours of video each day on the site. The Guardian reported that 300 hours of video are uploaded every minute.

YouTube uses a combination of human and automated watchdogs to look for offensive content, but much of it goes undetected. There simply aren’t enough humans to monitor so much video, and critics say the protective algorithms in place often don’t work.

“They work by correlating patterns within the content — such as the use of particular word combinations or image elements — that have previously been flagged by human content moderators as being violations of the platform content policies,” Ansgar Koene, senior research fellow at the University of Nottingham’s School of Computer Science, told Wired. “The algorithms are therefore incapable of detecting novel types of violations.”

The second problem is how the ads are disseminated. Companies have three choices when placing their advertisements, according to the Wall Street Journal. They can be paired with a specific type of content, a particular set of keywords or a certain demographic profile. YouTube then automatically plays the ads with the corresponding videos.

But these categories can be misleading. The videos of young girls that attracted sexualized comments were not, on their face, sexual. So if a company requested that its ads run with family-friendly content, for example, there is a good chance they ended up on one of these videos.

“We have to accept that under the current model of rapid, instant publishing, content moderation will never be completely perfect,” Koene told Wired. “If we really want to block all content that violates the platform rules, then we would have to move to a model where platform users submit content they want to publish to an editor for approval, as we do when publishing in journals. This would transform the current Web 2.0 platforms into traditional media channels.”
