Albany Times Union

AI makes a bad judge

- To comment: tuletters@timesunion.com

Imagine a form of technology so developed that law enforcement and the judicial system begin to rely on it not to investigate or punish past crimes but to prevent future crimes from happening, with dire consequences for those identified as felons-to-be.

Film fans will be familiar with this setup as the plot of Steven Spielberg's 2002 sci-fi thriller "Minority Report," starring Tom Cruise as the luckless "precrime" detective sent on the run after he's suddenly flagged as the culprit in a murder-yet-to-be. Spoiler alert: The movie ends with the precrime system being scrapped.

It is a truism of modern life that yesterday's science fiction is today's breaking headlines, as demonstrated in recent research into the widespread use of artificial intelligence programs to do, well, everything from journalism to cancer diagnostics. (Side note: This editorial was crafted by human hands, we swear.)

Of particular concern is the use of AI to help law enforcement, judges and the probation system "predict" the likelihood of recidivism by those convicted of crimes. A ProPublica report from as far back as 2016 detailed the already-abundant concerns from researchers as well as the U.S. Justice Department that algorithms used to produce "risk assessments" for states and localities across the nation were producing results that weren't especially accurate and seriously disadvantaged Black offenders.

Seven years is a long time in the development of this kind of technology, but new research from state University at Albany philosophy professor Jason D'Cruz and IBM artificial intelligence programmer Kush Varshney concludes that the technology hasn't developed to the point where it trumps human empathy.

As the Times Union's Kathleen Moore recently reported, Mr. D'Cruz and Mr. Varshney made the case that AI lags human understanding when it comes to gauging what Mr. D'Cruz referred to as "excusing conditions," the motivations that might explain why an otherwise law-abiding individual might end up kiting a check. Technology has a long way to go before an AI can be programmed with the kind of empathy that, imperfect though it might be, humans mix with data to make their assessments.

There's limited comfort to be found in the assurances from professionals such as Timothy Ferrara, Schenectady's probation director, who told Ms. Moore that many in his field are fully aware of the limitations of this sort of predictive technology, and retain the ability to override a machine-produced score they feel fails to account for improvements in an offender's conduct or environment.

Recent media attention to AI's rapid developments, such as a New York Times tech writer's memorable exchange with a chatbot that left reporter Kevin Roose as shaken as if he had been locked in a cell with a creepy adolescent, reminds us that our programs often reflect the imperfections of their creators. Humans need to remain in the driver's seat for any decision that affects an individual's path through the justice system, unless we want it to become the public-policy version of another tale of technology run amok, one bearing the title "Frankenstein."

Photo illustration by Tyswan Stewart / Times Union
