LIFE IN A BUBBLE

In Google we have our very own echo chamber. Lana Hart examines news sharing in an age of confirmation bias.

Early every morning, before the household wakes, I take three items to a chair in the living room: my reading glasses, my cellphone and a cuppa.

I click on the Google icon and my favourite three searching words immediately appear. I think: maybe today could be the day. Maybe something big happened overnight and it’s all starting to crumble now, at long last. I skim through the headlines. Yes, horrible … I can’t believe he said THAT … Oh, and what did so-and-so say about him? Another posturing threat to a foreign power? What a dick.

There aren’t many mornings when I’m not at least partially satisfied with my news. After all, Google knows what I like, and it gives me what I asked for, plus more. I didn’t need to type in “impeachment” or “haters”, because “Trump news today” provides all that I want.

I suppose I already know that there are algorithms and location-finding calculations working in the background to bring me my morning news.

After all, when I search for “driver’s licence test” I get results for New Zealand, not Mozambique, which is handy, right? And when I ask Google to tell me what’s on TV tonight, it doesn’t take me directly to the religious channels, in which I’ve never shown any online interest. No, Google has done its research on me, so to speak, and tailors my searches to what I prefer.

Despite my placid awareness that my search engine is starting to think like me, if I consider this too deeply I start to squirm a little in my seat, thanks to Eli Pariser.

The filter bubble

Pariser coined the term “filter bubble” in a 2011 book claiming that instead of the internet being an impartial tool delivering information to us objectively, the order of the suggestions (the key determinant of what we click on) is shaped by “signals” such as our search history, how long we visited sites, when and where we are searching, and even what type of computer we are using. In fact, with 57 signals determining which Google links appear first, the same search can render different results as these and other factors change. In The Filter Bubble: What the Internet is Hiding From You, Pariser argues that the personalisation of our information moves us quickly “to a world where the internet is showing us what it thinks we want to see, but not necessarily what we need to see”.

The effects of this categorisation and prioritisation of information lead us to believe that most people think as we do (the “majority illusion”) and discourage more critical thinking. With less contact with contradictory perspectives, we tend to become intellectually and politically lazy, adopt group-think and ruminate on the same views.

Pariser argues that this personalised information cultivates “a kind of invisible propaganda, indoctrinating us with our own ideas, amplifying our desire for things that are familiar ... [with] less room for the chance encounters that bring insight and learning”.

Being Wrong

The tendency to favour and use information that confirms our own beliefs, known as confirmation bias, is well understood in research of all kinds; avoiding it is a continual challenge, so that researchers don’t arrive at conclusions based solely on what they already believe while ignoring data that is inconsistent with those beliefs.

Carl Davidson, chief social scientist at Canterbury research agency Research First, says “a trap for learning is not to fool ourselves — if you want to be able to learn and improve, you have to entertain the possibility of being wrong. But we are wired to latch on to ideas and confirm what we already think we know. Most of us don’t entertain the possibility that these ideas could be wrong.”

Confirmation bias, Davidson explains, is associated with another kind of thinking faux pas called Fundamental Attribution Error. “If I make a mistake or someone I like makes a mistake or tells a lie, I attribute it to their state of mind or their condition at the time. Maybe they were tired, or busy, or not well briefed about the matter. But if someone who I disagree with makes the same mistake, I blame it on their attributes as a person: they are liars or stupid, for example.

“So we explain our own failings in terms of
