Las Vegas Review-Journal (Sunday)

Google not swaying voters, but it could

By LEONID BERSHIDSKY

Long before artificial intelligence brings about the singularity, algorithms are already influencing our most important decisions, including which candidate to back in elections. The danger that this influence could go too far is real, and we probably need some well-considered regulatory intervention.

Last week, the U.S. psychologist Robert Epstein published a harsh article about Google’s alleged manipulation of its search suggestion feature. When you start typing in Google’s search window, it suggests ways to autocomplete the request. Epstein showed, for example, that when a user entered “Hillary Clinton is … ,” Google suggested finishing the sentence with “winning” or “awesome.” Other search engines, Bing and Yahoo, completed the sentence differently: “Hillary Clinton is a liar.”

Epstein went on to give other examples of the purported bias and claimed that his research showed that the manipulation of search suggestions could “shift between 800,000 and 3.2 million votes” in the U.S. presidential election.

There are several reasons to question Epstein’s claims. He has a history with Google, which blocked his website in 2012 after detecting a malware infection, and the dispute escalated into a public fight. Next, the article was published by Sputnik, a notorious Russian state-owned propaganda site. I suspect the choice of publication instantly hurt Epstein’s credibility with the mainstream press.

Finally, the most suggestive findings in Epstein’s piece are easily refuted. The suggestions are a moving target. I entered “Hillary Clinton is” into the Google search box on Wednesday and got “Hillary Clinton is dead” and “Hillary Clinton is toast” as the first results. The algorithm did suggest “awesome,” too, but then the suggestions for “Donald Trump is” were similar: “Donald Trump is dead” and “Donald Trump is orange” — but also “Donald Trump is going to win.”

The other search engines are harsher on both candidates. Bing’s suggestions included completing the Trump request with “a lying racist” and “the antichrist,” and the Clinton one with “a lying crook” and “a practicing witch.”

Epstein, however, is definitely on to something. Google isn’t hiding that its algorithm picks and chooses among its autocomplete suggestions.

On June 10, Tamar Yehoshua, who leads the company’s mobile search team, explained on Google’s official blog that the suggestion algorithm “is designed to avoid completing a search for a person’s name with terms that are offensive or disparaging” because results based merely on the frequency of real-life requests were “too often” of that nature.

Yehoshua made no apology for it. Instead, she asked users to let Google know when they ran across a suggestion they considered offensive. She also stressed that the suggestions didn’t determine the search — people could ignore them and look for whatever they wanted. That’s beside the point: interesting autocomplete options often divert users from their original query.
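The mechanism Yehoshua describes is easy enough to picture: rank completions by how often people actually type them, then suppress the ones that pair a person’s name with disparaging terms. The sketch below is a simplified assumption of how such a filter might work; the example data, the blocklist and the function names are my own illustrations, not Google’s code.

```python
# A minimal, hypothetical sketch of the kind of filtering described above:
# rank candidate completions by how often they are searched, then drop any
# that pair a person's name with a term on a "disparaging" blocklist.
# The data, blocklist and ranking here are illustrative assumptions only.

FREQUENT_COMPLETIONS = {
    "hillary clinton is": [("a liar", 9100), ("winning", 4800), ("awesome", 3500)],
    "donald trump is": [("orange", 7200), ("going to win", 5100)],
}

DISPARAGING_TERMS = {"liar", "crook", "racist"}  # assumed blocklist


def suggest(prefix: str, is_person: bool, limit: int = 4) -> list:
    """Return the most frequent completions, skipping disparaging ones for people."""
    candidates = sorted(FREQUENT_COMPLETIONS.get(prefix.lower(), []),
                        key=lambda pair: pair[1], reverse=True)
    suggestions = []
    for completion, _count in candidates:
        if is_person and any(term in completion for term in DISPARAGING_TERMS):
            continue  # frequency alone would surface this; the filter hides it
        suggestions.append(f"{prefix} {completion}")
        if len(suggestions) == limit:
            break
    return suggestions


print(suggest("Hillary Clinton is", is_person=True))
# -> ['Hillary Clinton is winning', 'Hillary Clinton is awesome']
```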

But Epstein is right about something more profound than the

