REALITY CHECK

Idealog

Ben Mack and Elly Strang look at the consequences of the digital revolution

When the World Wide Web was first ushered into existence by founder Tim Berners-Lee in 1991, he and many others envisioned it as the dawn of a new era. For the first time in human history, people would be more connected than ever, while information would be free from corporate and government powers and democratically accessible to all. More than 20 years later, the world and everyone we know within it can be found at our fingertips. Of course, technology is much bigger than the internet, but this development has arguably been the most influential in terms of its impact on society in recent decades. And while there have been numerous positives, his vision for an egalitarian free-for-all hasn't quite panned out the way he'd hoped. From the influence of algorithms run by companies that hold enormous power, to the increasing threat of cybercrime as more devices connect to the internet, to the mental health of tech entrepreneurs, to signs of digital addiction among the general populace, to the perils of cyberbullying, to the lack of diversity in tech, Ben Mack and Elly Strang give the digital revolution a reality check.

For many years, scientists were under the impression that humans could only develop addictions to alcohol or drugs. This is understandable, as substance abuse is easy to identify: there are well-documented cases of people winding up sick, broke or dead.

However, recent research has revealed that other activities give people a hit of dopamine right in the pleasure centre that's associated with addiction. Case in point: the ping of your phone when you get a notification lights up the same area of the brain as when you take cocaine.

Psychologist Adam Alter, author of Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked, says addictive technology behaviours are so entrenched in society that we barely notice them, and yet most surveys are unanimous: a vast chunk of the population is hooked on their devices.

A study from the University of Hong Kong in 2014 estimated that 420 million people around the world are addicted to the internet. In 2017, this number is likely to have shot up even more.

Alter says the difference with technology addiction is there hasn't been much oversight into the way it has immersed itself so heavily into our daily lives, or the ramifications of this.

“We focus so much on its obvious upsides, which have profoundly disrupted our lives,” Alter says. “Many of its downsides have crept up. For example, email began as a low-level way of communicating from time to time, but now workers in many cultures feel tethered to email 24 hours a day. That didn't happen overnight, which is one reason why people haven't paid as much attention to it as a downside.”

And although substance addiction is far more likely to kill you, Alter says both substance and behavioural addictions share many of the same traits.

“They influence the brain in similar ways (though more strongly for substances) and they both treat psychological needs that aren't met otherwise, including boredom, anxiety, loneliness and depression,” he says.

Perhaps the most telling sign of this is that the late Steve Jobs, who invented the iPad, didn't actually allow his kids to use the device he had created.

“We limit how much technology our kids use in the home,” he told The New York Times.

Alter says this is akin to a drug dealer's ‘don't get high on your own supply’ mentality.

“[Jobs] recognised that children and teens struggle to interact with other people, to do their homework, and to generally avoid using screens when those screens are in front of them. The iPad, with all its captivating content, is especially difficult to resist,” he says.

Humans are naturally inclined to crave this hit of pleasure – be it from another person, an illicit substance or an object. But the more ominous side to technology is the fact that on the other side of these devices and apps, a human is handcrafting their features to be as addictive as possible.

On Netflix, the next episode is lined up to play automatically unless you tell it to stop. The effort required to keep watching is effectively zero, and it seems easier to binge-watch an entire season of Orange Is the New Black than to pull yourself out of the vortex.

On Snapchat, streaks appear depending on how many pictures two people have been sending back and forth to one another. If one person breaks the communication, the streak will end, encouraging addictive, frequent use of the app.

Alter says digital product designers intentionally build these rewards, such as likes, reposts, comments and shares, to be addictive. And in perhaps one of the most cleverly executed behavioural designs the world has seen, the trigger for this hit is not actually the technology itself, but other people. This means friends or followers are constantly prompting a person to continue using the service, and so the cycle continues.

“The possibility of these rewards is hard to resist in the same way that playing slot machines, with the promise of monetary rewards, is hard to resist,” he says. “They also create artificial goals – reach 1000 followers! Reach 100 likes! Conquer all 300 levels of this game! – which humans struggle to ignore once they exist.”

But it's important to note technology isn't all ominous, either. Just as alcohol can be enjoyed responsibly, so too can technology. And, when it comes down to it, Alter says it all depends on how it's commoditised by companies and consumed by individuals.

Those feeling addicted or overwhelmed just need to actively monitor their behaviour and try to cut down, he says. He recommends picking a certain time of day, like dinner time, to stop using devices with screens.

“In my experience, people enjoy this sacred period of time to such an extent that they extend that brief tech-free period to cover weekends and nights more generally.”

2016 was arguably the year social media turned sour, harnessed by the forces of darkness to subvert democracy and free and open expression.

But we should have seen this coming – at least, if we broke out of our “filter bubbles”, a phrase coined by Upworthy co-founder Eli Pariser to describe the way technology companies feed us information we are more likely to agree with. One of the main promises of the internet is that it allows us to learn about our world and engage with people who have different views. But, instead, we often use technology to create our own “safe spaces” and don't make an effort to engage or expand our knowledge. In fact, studies show that those who use social media are more likely to be lonely.

Vaughn Davis, owner and creative director at advertising agency The Goat Farm and host of Sunday Social on RadioLIVE, says exposure to dissenting views and diverse perspectives is key for a healthy democracy – and something technology could actually assist with.

“A diverse media landscape is a good thing for everyone,” he says. “All points of view are considered, and power, whatever its flavour, is more likely to be held to account … One of the key shortcomings of digital and in particular social media is the degree to which it's curated just for us,” he says. “Ten years ago, we would read a newspaper, or listen to a news bulletin, and hear stories we hadn't sought out. At worst, irrelevant content might make us tune out or turn the page. More often, though, we'd be exposed to ideas, stories and opinions that didn't align with our own. And that's a good thing.”

The digital space, such as Twitter and Facebook, is anything but, he says. “Digital media is the opposite of that. We click on the stories we think we'll find interesting, and that are unlikely to challenge our thinking. And the more we do that, the more the news site or social network we're on learns about us, and the more closely the stories it serves reflect our preferences and prejudices. The narrower the lens we see the world through, the worse off we all are.”

This isn't new, of course. Humans have always chosen media that matches their belief structure. But it's just in overdrive now as the algorithms take over. The collection of data – such as likes and dislikes, or the types of stories clicked on or sites visited – is also a concern, Davis says, especially because corporate giants like Facebook and Google are ultimately businesses before anything else.

“The dominance of global media companies like these two makes this a very real issue,” he says. “Both are controlled not by editors, but by algorithms. And the way in which each of them decides to promote certain content and suppress other content is a closely guarded commercial secret. This isn't a trivial issue. On the commercial front, Google was recently fined €2.4 billion for serving its own shopping service results ahead of its paying customers'. The editorial side is even more worrying. Facebook or Google could, for example, adjust their algorithms to make news about political party A appear more visible than stories about political party B. Suddenly, it would seem, everyone is talking about party A… they must have something going for them, right? But what if Facebook and Google decided to tweak the news to reflect not our own personal preferences, but someone else's agenda?

“Now I'm not saying either company is doing this. There doesn't seem to be much stopping them, though. If a newspaper editor is free to flatter one candidate and criticise another, why not a social network? The challenge as readers is to know when it's happening, and to understand that what we see online is anything but random.”
