▶ The minds behind addictive apps are having second thoughts about their work, writes Rhodri Marsden

The National – Arts & Lifestyle

Most of us will have a few regrets about decisions we’ve made in the workplace, but few of those decisions will have had the same impact as ones made by executives and product designers working in Silicon Valley.

With over 2 billion of us using smartphones and 3 billion connected to the internet, our daily choices are disproportionately affected by a small number of technology companies – perhaps more than the people working for those companies realise.

But a growing number of former employees have been going public with their unease.

The latest is former Facebook president Sean Parker, played by Justin Timberlake in the 2010 film The Social Network.

At an event last week run by American media company Axios, Parker was frank about his disillusionment with the service he helped to create.

“I don’t know if I really understood the consequences... of a network when it grows to a billion or 2 billion people,” he said. “It literally changes your relationship with society, with each other. God only knows what it’s doing to our children’s brains.”

As end users, we may occasionally ponder the effects of technology on our own lives, but it’s telling when the people instrumental in creating that technology start to develop misgivings.

As Silicon Valley severance packages tend to come with non-disclosure agreements, few former employees feel able to speak out, but some, like former Twitter engineer Leslie Miley, specifically declined a severance package in order to do so. In an interview last week with Bloomberg, Miley recounted how, in his role as product safety and security manager, he expressed fears to management over the huge proliferation of dormant Twitter accounts based in Russia and Ukraine as long ago as 2015, but his concerns weren’t addressed. “They were more concerned with growth numbers than fake and compromised accounts,” he said. Those accounts have since been used to spread pro-Russian propaganda.

The voices of discontent have a similar theme: that growth in numbers is paramount, and any collateral damage suffered as a consequence is of little interest. “Any improvement not based on a hard metric was not a respected use of time,” former Google software engineer Katy Levinson told Business Insider late last year. “Usability? Nobody cared. If you couldn’t measure it, nobody was interested in it.” When former Facebook employees came forward last month for an article in Vanity Fair entitled “What Have I Done”, it was a similar story.

“The half-trillion dollar public company,” concluded the writer, Nick Bilton, “is first and foremost a machine that turns users into revenue. Facebook’s prime directive is to maximise the number of people advertising on its platform.”

It’s the methods by which services have bred a dependence, or even an addiction, that seem to provoke the most guilt in their creators. We’re talking about the pull-to-refresh downward swipe that updates our feeds, the hectoring pop-ups and reminders, the bright red notification icons and auto-playing videos.

Justin Rosenstein, the technical lead for the implementation of the Facebook “Like” button, was candid about his feelings in an interview with The Guardian last month. “It is very common for humans to develop things with the best of intentions,” he said, “and for them to have unintended, negative consequences.”

Those consequences were restated by Sean Parker as he considered the motivations behind Facebook’s rapid development. “How do we consume as much of your time and conscious attention as possible?” he said. “It’s a social-validation feedback loop… you’re exploiting a vulnerability in human psychology. The inventors understood this consciously. And we did it anyway.”

Sara Wachter-Boettcher, whose new book, Technically Wrong, details many of the ways that technology has failed us, is fascinated by the emergence of people finally reckoning with the things that they helped to create. “[People working in tech] have been given too much of a free pass for unintended consequences,” she says. “There’s a lot of stuff that could have been anticipated had they paid more attention to people who were telling them, years ago, but they’ve been very insular and comfortable with not thinking about people who are unlike themselves.”

Occasionally we glimpse the nature of that Silicon Valley bubble. In 2016, a senior software engineer at Twitter, Brandon Carpenter, came under fire on Twitter for some changes that were being implemented. His response (“Wow people on Twitter are mean”) was baffling to those who had been complaining about the way Twitter’s policies had permitted and sustained abuse for many years, and for many people it summed up the disconnect between the people making the decisions and the people using the products.

Those decisions are seemingly taken, by and large, without wide consultation; Facebook’s founder, Mark Zuckerberg, has frequently been criticised for surrounding himself with yes-men, while other companies – notably Snapchat – are said to be deliberately structured to conceal the purpose of long-term goals from their employees.

“I want you to imagine walking into a room,” says former Google employee Tristan Harris at the opening of a TED talk he gave back in April. “A control room with a bunch of people, a hundred people, hunched over a desk with little dials, and that that control room will shape the thoughts and feelings of a billion people. This might sound like science fiction, but this actually exists right now, today.”

There’s no doubt that we see benefits from some of the promises and guarantees made by the likes of Google, Facebook and Twitter. But as behavioural designer Nir Eyal writes in his book Hooked: How to Build Habit-Forming Products, we now have infinite distractions competing for our attention. “Companies are learning to master novel tactics to stay relevant in users’ minds,” he says, “[and] their economic value is a function of the strength of habits they create.” Are those habits healthy?

The people who instilled them in more than a billion people have their doubts. For his part, Eyal has installed a timer to cut off his access to the internet for a certain number of hours a day. Go figure.

Justin Rosenstein, top, and Sean Parker, above. Bloomberg
