Google’s ‘knowledge panels’ raise questions about authority
‘Rich answers’ appearing more often on search results, but context is often lacking
Google’s “knowledge panels” materialize at random, as unsourced and absolute as if handed down by God:
Betty White is 94 years old; the Honda Civic is 2016’s best car; Taipei is the capital of — ahem — the “small island nation” of Taiwan.
If you’ve ever Googled a person, place or thing — surveys suggest you almost certainly have — then you’ve encountered these aggressive, boldfaced modules, one of Google’s many bids for your fleeting attention.
Since their quiet, casual introduction in 2012, knowledge panels and other sorts of “rich answers” have mushroomed across Google, appearing atop the results on roughly one-third of its 100 billion monthly searches, not only in response to simple, numerical queries like “Betty White age,” but also to more complex, nuanced questions like “capital of Israel” or “D.C.’s best restaurant.”
To Google, that’s proof of the power of its semantic search technology; to googlers, it’s a convenience that saves them a few clicks. But to a growing number of skeptics, it’s a looming public-literacy threat — one that arguably dwarfs the recent revelations that Facebook’s trending topics are curated by humans.
“It undermines people’s ability to verify information and, ultimately, to develop well-informed opinions,” said Dario Taraborelli, head of research at the Wikimedia Foundation and a social computing researcher who studies knowledge production online. “And that is something I think we really need to study and process as a society.”
For Taraborelli, the primary issue is that the panels aren’t terribly knowledgeable: they provide information but often leave out any context on where that information came from. That makes it difficult for readers to evaluate the accuracy of the statement.
They could just scroll down the page and click through some links, of course — but that becomes increasingly difficult as searchers migrate to voice and mobile, and as Google expands its rich-answer offerings without indicating which of its programs those results are drawn from.
There are “snippets,” for instance, that pull a portion of text from a cited web page in response to a question such as “how to lose weight.” There are maps, sourced from Google’s local search program, that will direct you to local businesses if you search something such as “best pizza D.C.”
These are all concerning, as they algorithmically confer a lot of unearned authority. (There’s no indication as to what makes a restaurant the “best,” for instance.)
But most pertinent to our interests are the modules and carousels linked to Google’s Knowledge Graph, an advanced database sourced largely from Wikipedia and constructed in part from user search patterns.
According to a 2015 analysis by the digital marketing firm Stone Temple Consulting, these knowledge panels, which are frequently unattributed, are one of the fastest-growing result types in Google’s arsenal.
In a 2012 blog post announcing the introduction of these modules, Google’s Amit Singhal rejoiced in the “critical first step” toward the future of search, an engine that “understands the world a bit more like people do.” Which is all well and good, until you get into subjects more complex than the time in Timbuktu.
Mark Graham, a geographer at the Oxford Internet Institute, recently did just that: he and colleague Heather Ford analyzed, in a paper published last month, how the city of Jerusalem was represented both on Wikipedia and in Google knowledge panels.
They found that while Wikipedia may explain the city’s contested geopolitical status in enormous depth — that portion runs to almost 1,500 words — the nuance was jettisoned completely when the article was deboned and ingested by Google.
“Google, through its data and algorithms, now controls how we interact with many facets of the cities we live in,” Graham warned. “So we should be asking whether we are happy ceding decisions about how we live our everyday lives to them.”
In fact, as Graham dug into other contested cities, he discovered that Google’s knowledge panels regularly, if inadvertently, make rather important decisions for us: Taiwan, you’ll remember, is described as if it were an independent nation, when only 22 countries actually recognize it as such. Meanwhile, Google corrects searches for “Londonderry,” Ireland’s fourth-largest city, to “Derry,” the (unofficial) term favoured by Irish nationalists.
Since Google often does not cite its sources — a ploy, Taraborelli says, to make it seem more authoritative — there’s no way for users to double-check “answers” for bias or error.
In its defence, Google has changed certain types of panels, which suggests it is aware of the sourcing problem. Medical queries now pull up proprietary editorial panels fact-checked by doctors at Google and the Mayo Clinic. Search for a food or recipe ingredient, and the accompanying knowledge panel will also link you to the U.S. Agriculture Department’s database on nutrition.
“Our goal is to be useful; we realize we’ll never be perfect, just as a person’s or library’s knowledge is never complete,” a Google spokesperson said in a statement. “We’re constantly working to improve search and to make searching with Google easier and results more accurate for people.”
Unfortunately, as long as Google has a commercial interest in appearing omniscient, it probably won’t work to improve knowledge panel transparency.
That burden will fall instead to people like Taraborelli and non-profits such as the Wikimedia Foundation, which is working on an open-licence, machine-readable knowledge base that will both source all of its statements and accommodate conflicting sources.
The hope is Google will begin pulling from that database and citing sources, instead of dumbing down Wikipedia.