No substitute for people

The Northland Age - Opinion - Danielle van Dalen

The cracks are starting to show on the fancy computer algorithms we rely so heavily upon these days.

The recent news that Facebook was involved in sharing the personal information of quiz participants and their friends with Cambridge Analytica has illustrated the creeping influence algorithms have had on our lives. After the story broke, the New York Times reported that "the hashtag #DeleteFacebook appeared more than 10,000 times within a two-hour period," showing that people are starting to recognise that our increasing reliance on algorithms is not all positive. We need to remember that technology cannot be a substitute for relationships.

Algorithms are powerful tools for creating efficient and targeted processes used in our day-to-day lives. They do a lot of awesome things. Companies like Netflix and Uber harness them to ensure that the right information gets to the right people, like which movies to advertise for whom, or which driver to connect with which passenger. Unfortunately, as the Cambridge Analytica example suggests, there are also risks to the widespread use of algorithms and the personal data they so often rely upon, and privacy breaches are just one example. In many less-obvious areas of our lives, the algorithms are showing their fair share of bugs too.

In 2016, the Arkansas Department of Human Services began to use algorithms to determine healthcare options and supports for different people. The algorithm led, however, to significant changes and reductions in many people's medical care - one person even claimed "they were hospitalised because their care was cut."

A court eventually discovered that the software vendor had "mistakenly used a version of the software that didn't account for diabetes issues".

While not quite as life-or-death, companies like Shell have also been using algorithms to fill job roles, but are now finding that these algorithms are "bad at predicting rare events, such as when employees would excel at a task they haven't encountered before".

So while we've seen the potential of technologies like this and get understandably excited about the future possibilities and efficient systems we might create, these examples show that completely relying on these processes is not always the answer. Sometimes they get it wrong. It's easy to quickly trust the outputs of a computer, but as these examples show, we simply cannot afford to be so naïve. So how do we respond? While quitting Facebook seems like a good start for many - even futurist and technological entrepreneur Elon Musk deleted his company's accounts - becoming a technological hermit doesn't seem helpful, or even possible, anymore.

Besides, the efficiency-related benefits of algorithms are hard to give up. Instead, we need to be aware of their limitations and wary of their potential unintended consequences.

We need to remember that algorithms often fail to see the nuance that only human interaction can detect.

From social media to medical care or potential employment, let's introduce some more healthy suspicion when it comes to algorithms.

