Google changes target ‘fake news’
Correcting autocomplete
Besides taking steps to block fake news from appearing in its search results, Google also has reprogrammed a popular feature that automatically tries to predict what a person is looking for as a search request is being typed. The tool, called “autocomplete,” has been overhauled to omit derogatory suggestions, such as “are women evil,” or recommendations that promote violence.
Google also is adding a feedback option that will enable users to complain about objectionable autocomplete suggestions so a human can review the wording.
Facebook, where fake news stories and other hoaxes have widely circulated on its social network, also has been trying to stem the tide of misleading information by working with The Associated Press and other news organizations to review suspect stories and set the record straight when warranted. Facebook also has provided its nearly 2 billion users ways to identify posts believed to contain false information, something that Google is now allowing users of its search engine to do for some of the news snippets featured in its results.
Why Google cares
Google began attacking fake news in late December after several embarrassing examples of misleading information appeared near the top of its search results. Among other things, Google’s search engine pointed to websites that incorrectly reported that then-President-elect Donald Trump had won the popular vote in the U.S. election, that President Barack Obama was planning a coup and that the Holocaust never occurred during World War II.
Only about 0.25 percent of Google’s search results were being polluted with falsehoods, Gomes said. But that was still enough to threaten the integrity of a search engine that processes billions of search requests per day largely because it is widely regarded as the internet’s most authoritative source of information.
“They have a lot riding on this, reputation wise,” said Lucy Dalglish, who has been tracking the flow of false information as dean of the University of Maryland’s journalism department. “If your whole business model is based on turning up the best search results, but those results turn up stuff that is total crap, where does that get you?”
To address the problem, Google began revising the closely guarded algorithms that generate its search results with the help of 10,000 people who rate the quality and reliability of the recommendations during tests. Google also rewrote its 140-page book of rating guidelines that help the quality-control evaluators make their assessments.
Google as referee
Fighting fake news can be tricky because in some cases what is viewed as being blatantly misleading by one person might be interpreted as being mostly true by another. If Google, Facebook or other companies trying to block false information err in their judgment calls, they risk being accused of censorship or playing favorites.
But doing nothing to combat fake news would probably have caused even bigger headaches. If too much misleading information appears in Google’s search results, the damage could extend beyond its reputation for reliability.