Albuquerque Journal

After Facebook scrutiny, are Google, YouTube next?

Methods of serving ads could come under the microscope

BY RYAN NAKASHIMA AND MATT O’BRIEN

MENLO PARK, Calif. — Facebook has taken the lion’s share of scrutiny from Congress and the media about data-handling practices that allow savvy marketers and political agents to target specific audiences, but it’s far from alone. YouTube, Google and Twitter also have giant platforms awash in more videos, posts and pages than any set of human eyes could ever check. Their methods of serving ads against this sea of content may come under the microscope next.

Advertising and privacy experts say a backlash is inevitable against a “Wild West” internet that has so far escaped scrutiny. New examples keep surfacing in which unsuspecting advertisers have had their brands associated with extremist content on major platforms.

In the latest discovery, CNN reported that it found more than 300 retail brands, government agencies and technology companies had their ads run on YouTube channels that promoted white nationalists, Nazis, conspiracy theories and North Korean propaganda.

Child advocates have also raised alarms about the ease with which smartphone-equipped children are exposed to inappropriate videos and deceptive advertising.

“I absolutely think that Google is next and long overdue,” said Josh Golin, director of the Boston-based Campaign for a Commercial-Free Childhood, which asked the Federal Trade Commission to investigate Google-owned YouTube’s advertising and data collection practices earlier this month.

YouTube has repeatedly outlined the ways it attempts to flag and delete harmful videos, but its screening efforts have often missed the mark.

It also allows advertisers to avoid running ads on sensitive content that doesn’t violate YouTube guidelines but doesn’t fit with a company’s brand. Those methods appear to have failed.

“YouTube has once again failed to correctly filter channels out of our marketing buys,” said a statement Friday from 20th Century Fox Film, which learned that its ads were running on videos posted by a self-described Nazi. YouTube has since deleted the offending channel, but the Hollywood firm says it has unanswered questions about how it happened in the first place.

“All of our filters were in place in order to ensure that this did not happen,” Fox said, adding it has asked for a refund of any money shared with the “abhorrent channel.”

YouTube said Friday that it has made “significant changes to how we approach monetization” with “stricter policies, better controls and greater transparency” and said it allows advertisers to exclude certain channels from ads. It also removes ads when it’s notified they are running beside content that doesn’t comply with its policies. “We are committed to working with our advertisers and getting this right,” the company said.

So far, just one major advertiser, Baltimore-based retailer Under Armour, has said it withdrew its advertising in the wake of the CNN report, and the pause lasted only a few days last week after it was first notified of the problem. After its shoe commercial turned up on a channel known for espousing white nationalist beliefs, Under Armour worked with YouTube to expand its filters to exclude certain topics and keywords.

On the other hand, Procter & Gamble, which had kept its ads off YouTube since March 2017, said it had come back to the platform but drastically pared back the channels it would advertise on to under 10,000. It has worked on its own, with third parties, and with YouTube to create its restrictive list.

Photo (ASSOCIATED PRESS): Clockwise, from upper left: a Google sign, the Twitter app, the YouTube TV logo and the Facebook app.
