‘Urgent’ stocktake to shed light on computer profiling
Kiwis should get a much better idea within a few months of how computers are being used by government agencies to make decisions about them.
An ‘‘urgent’’ stocktake has been ordered into the different ways that departments are crunching people’s data using algorithms, Government Digital Services Minister Clare Curran and Statistics Minister James Shaw said.
Computer-assisted decision making hit headlines last month when an Immigration New Zealand official said algorithms were being used to prioritise the deportation of overstayers based on factors such as race.
The department later denied it was engaged in racial profiling but has not responded to an Official Information Act request seeking clarification on seemingly conflicting information it gave on whether, why and how it was using data about people’s race and gender in deportation decisions.
The Corrections Department has used algorithms to help parole boards assess the likelihood of prisoners reoffending.
Algorithms are mathematical formulas used to express and forecast relationships between data.
The Ministry of Social Development (MSD) conducted an experiment in 2015 to see if it could use algorithms to better identify children at risk of abuse, before that trial was canned by then-minister Anne Tolley.
However, ministers are not necessarily aware of how agencies are using algorithms, with Immigration NZ’s possible profiling and the MSD trial only coming to ministers’ attention after the fact.
Curran said the first stage of the stocktake would be completed by August. A spokeswoman for Curran said those findings would be made public.
A new privacy law due to come into effect in Europe tomorrow, the General Data Protection Regulation, will give Europeans the right to an explanation when automated decisions are made about them.
Curran is leading a project by the D7 group of digitally advanced nations that could see Kiwis get similar rights.
Otago University Professor Colin Gavaghan, who is assisting the Government in that work, has cautioned that for such safeguards to be effective, people need to know when algorithms are being used in decisions affecting them.