Algorithms aid government decisions
Computer algorithms are being used by government agencies to help make a wide range of potentially life-changing decisions about the treatment of Kiwis, according to a new report.
Police are using software to help predict whether people with a history of family violence are likely to commit crimes against their victims within two years.
Software is also being used by the Corrections Department to forecast the likelihood of inmates re-offending when they come up for parole hearings.
Computer forecasts help determine how vigorously visa applicants are screened and whether people are offered automatic tax refunds by Inland Revenue.
But humans, rather than computers, have the final say on "almost all" decisions, the report found.
An apparent exception: more than 20,000 people aged 15 to 24 have been automatically referred for help with qualifications or training from the Social Development Ministry, after being assessed by a computer algorithm called NEET as at risk of becoming long-term unemployed.
The report said police used a "static risk" algorithm to calculate the probability that a "family violence perpetrator" would commit a crime against a family member within two years, based on data police held, including the person's gender, past incidents of family harm and criminal history. All final decisions about "actions and interventions", however, were made at the discretion of police officers, it said.
Factors used to calculate the chances of offenders re-offending when they come up for parole include details of their prior offending, their age and gender, and the age at which they first offended.
"The risk scores generated by the algorithm are considered together with the opinions of relevant qualified professionals including case managers, probation officers and psychologists," the report said.
The algorithm "stocktake" was ordered by former digital services minister Clare Curran in May in the wake of concerns – denied by Immigration NZ – that the agency had been using algorithms to prioritise the deportation of overstayers based on factors including race.
Government "chief data steward" Liz Macpherson said the report showed how algorithms were helping agencies deliver better policies and services "but it also reminds us of the need to take care in their use".