The Palm Beach Post

Google trains computers to spot offensive content


MOUNTAIN VIEW, Calif. — Over the years, Google trained computer systems to keep copyrighted content and pornography off its YouTube service. But after seeing ads from Coca-Cola, Procter & Gamble and Walmart appear next to racist, anti-Semitic or terrorist videos, its engineers realized their computer models had a blind spot: They did not understand context.

Now teaching computers to understand what humans can readily grasp may be the key to calming fears among big-spending advertisers that their ads have been appearing alongside videos from extremist groups and other offensive messages.

Google engineers, product managers and policy wonks are trying to train computers to grasp the nuances of what makes certain videos objectionable. Advertisers may tolerate use of a racial epithet in a hip-hop video, for example, but may be horrified to see it used in a video from a racist skinhead group.

That ads bought by well-known companies can occasionally appear next to offensive videos has long been considered a nuisance to YouTube’s business. But the issue has gained urgency in recent weeks, as The Times of London and other outlets have written about brands that inadvertently fund extremists through automated advertising — a byproduct of a system in which YouTube shares a portion of ad sales with the creators of the content those ads appear against.

This glitch in the company’s giant, automated process turned into a public-relations problem.
