Las Vegas Review-Journal (Sunday)
Google not swaying voters, but it could
Long before artificial intelligence brings about the singularity, algorithms are already influencing our most important decisions, including which candidate to back in elections. The danger that this could go too far is real, and we probably need some well-considered regulatory intervention.
Last week, the U.S. psychologist Robert Epstein published a harsh article about Google’s alleged manipulation of its search suggestion feature. When you start typing in Google’s search window, it suggests ways to autocomplete the request. Epstein showed, for example, that when a user entered “Hillary Clinton is … ,” Google suggested finishing the sentence with “is winning” or “is awesome.” Other search engines, Bing and Yahoo, completed the sentence differently: “Hillary Clinton is a liar.”
Epstein went on to give other examples of the purported bias and claimed that his research showed that the manipulation of search suggestions could “shift between 800,000 and 3.2 million votes” in the U.S. presidential election.
There are several reasons to question Epstein’s claims. First, he has a history with Google, which blocked his website in 2012 because it detected a malware infection; the dispute escalated into a public fight. Next, the article was published by Sputnik, a notorious Russian state-owned propaganda site. I suspect the choice of publication instantly hurt Epstein’s credibility with the mainstream press.
Finally, the most suggestive findings in Epstein’s piece are easily refuted. The suggestions are a moving target. I entered “Hillary Clinton is” into the Google search box on Wednesday and got “Hillary Clinton is dead” and “Hillary Clinton is toast” as the first results. The algorithm did suggest “awesome,” too, but then the suggestions for “Donald Trump is” were similar: “Donald Trump is dead” and “Donald Trump is orange” — but also “Donald Trump is going to win.”
The other search engines are harsher on both. Bing’s suggestions included completing the Trump request with “a lying racist” and “the antichrist,” and the Clinton one with “a lying crook” and “a practicing witch.”
Epstein, however, is definitely on to something. Google isn’t hiding the fact that its algorithm picks and chooses among its autocomplete suggestions.
On June 10, Tamar Yehoshua, who leads the company’s mobile search team, explained on Google’s official blog that the suggestion algorithm “is designed to avoid completing a search for a person’s name with terms that are offensive or disparaging” because results based merely on the frequency of real-life requests were “too often” of that nature.
Yehoshua made no apology for it. Instead, she asked users to let Google know when they ran across a suggestion they considered offensive. She also stressed that the suggestions didn’t determine the search — people could ignore them and look for whatever they wanted. That’s beside the point: Interesting autocomplete options often divert users from their original queries.
But Epstein is right about something more profound than the