LIFE IN A BUBBLE
With Google, we each have our very own echo chamber. Lana Hart examines news sharing in an age of confirmation bias.
Early every morning, before the household wakes, I take three items to a chair in the living room: my reading glasses, my cellphone and a cuppa.
I click on the Google icon and my three favourite search words immediately appear. I think: maybe today could be the day. Maybe something big happened overnight and it’s all starting to crumble now, at long last. I skim through the headlines. Yes, horrible … I can’t believe he said THAT … Oh, and what did so-and-so say about him? Another posturing threat to a foreign power? What a dick.
There aren’t many mornings when I’m not at least partially satisfied with my news. After all, Google knows what I like, and it gives me what I asked for, plus more. I didn’t need to type in “impeachment” or “haters”, because “Trump news today” provides all that I want.
I suppose I already know that there are algorithms and location-finding calculations working in the background to bring me my morning news.
After all, when I search on “driver’s licence test” I get results for New Zealand, not Mozambique, which is handy, right? And when I ask Google to tell me what’s on TV tonight, it doesn’t take me directly to the religious channels, in which I’ve never shown any online interest. No, Google has done its research on me, so to speak, and tailors my searches to what I prefer.
Despite my placid awareness that my search engine is starting to think like me, if I consider this too deeply I start to squirm a little in my seat, thanks to Eli Pariser.
The filter bubble
Pariser coined the term “filter bubble” in a 2011 book claiming that, far from the internet being an impartial tool delivering information to us objectively, the order of the suggestions (the key determinant of what we click on) is shaped by “signals” such as our search history, how long we spent on sites, when and where we are searching, and even what type of computer we are using. In fact, with 57 signals determining which Google links appear first, the same search can return different results as these and other factors change. In The Filter Bubble: What the Internet is Hiding From You, Pariser argues that the personalisation of our information moves us quickly “to a world where the internet is showing us what it thinks we want to see, but not necessarily what we need to see”.
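Pariser’s core point — that the same query can return a different ordering as the signals change — can be illustrated with a toy ranking function. The signal names, weights and example results below are invented for illustration; Google’s actual 57 signals and their weights are proprietary and not public.

```python
# Illustrative sketch of signal-weighted personalised ranking.
# The signals ("clicked_topics", "region") and weights are hypothetical.
def personalised_rank(results, user_signals):
    """Order search results by how well they match a user's inferred signals."""
    def score(result):
        s = 0.0
        # Boost pages on topics the user has clicked before (assumed signal).
        s += 2.0 * len(result["topics"] & user_signals["clicked_topics"])
        # Boost results local to the user's inferred location (assumed signal).
        if result["region"] == user_signals["region"]:
            s += 1.0
        return s
    return sorted(results, key=score, reverse=True)

results = [
    {"title": "Licence test guide (NZ)", "region": "NZ", "topics": {"driving"}},
    {"title": "Licence test guide (MZ)", "region": "MZ", "topics": {"driving"}},
    {"title": "Trump impeachment latest", "region": "US",
     "topics": {"politics", "trump"}},
]

# A New Zealand reader with a history of clicking political stories.
nz_reader = {"region": "NZ", "clicked_topics": {"trump", "politics"}}
ranked = personalised_rank(results, nz_reader)
# For this reader, the politics story outranks everything,
# and the NZ licence guide beats the Mozambican one.
```

Change the reader’s signals — a different region, a different click history — and the same three results come back in a different order, which is the bubble in miniature.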
The effects of this categorisation and prioritisation of information lead us to believe that most people think as we do (the “majority illusion”) and discourage more critical thinking. With less contact with contradictory perspectives, we tend to become intellectually and politically lazy, adopt group-think and ruminate on the same views.
Pariser argues that this personalised information cultivates “a kind of invisible propaganda, indoctrinating us with our own ideas, amplifying our desire for things that are familiar ... [with] less room for the chance encounters that bring insight and learning”.
The tendency to favour and use information that confirms our own beliefs — confirmation bias — is well understood in research of all kinds. Avoiding it is a continual challenge: researchers must guard against arriving at conclusions based solely on what they already believe while ignoring data that is inconsistent with those beliefs.
Carl Davidson, chief social scientist at Canterbury research agency Research First, says “a trap for learning is not to fool ourselves — if you want to be able to learn and improve, you have to entertain the possibility of being wrong. But we are wired to latch on to ideas and confirm what we already think we know. Most of us don’t entertain the possibility that these ideas could be wrong.”
Confirmation bias, Davidson explains, is associated with another kind of thinking faux pas called the Fundamental Attribution Error. “If I make a mistake, or someone I like makes a mistake or tells a lie, I attribute it to their state of mind or their condition at the time. Maybe they were tired, or busy, or not well briefed about the matter. But if someone who I disagree with makes the same mistake, I blame it on their attributes as a person: they are liars or stupid, for example.
“So we explain our own failings in terms of