Manawatu Standard

By the numbers

The government already knows plenty about us from all the data it collects. How would we feel if its agencies put that knowledge to greater use? Katie Kenny reports.


Even before we’re born, government agencies are collecting information which will help them better understand us, and others like us. Whether our parents were on a benefit, for example, our ethnicity, or where we live.

As we go about our daily lives – visiting a doctor, using social services, attending school, paying taxes, travelling – our digital footprint grows.

Advances in technology have enabled better collection and processing of huge amounts of digital information. Increasingly, this data is being used, via computer algorithms, to inform decisions made about us.

By the time a child is at school, they’ve shed enough data for Work and Income to identify their risk of long-term unemployment, and for the Ministry of Education to assess their eligibility for help getting to and from school.

But these are a fraction of what algorithms are capable of. A stocktake published last month found limited use of algorithms and no use of artificial intelligence (according to the report’s definitions of those things) by New Zealand government agencies.

Some within the tech community say the stocktake highlights a risk-averse government. But those with experience on the inside say a lack of education, rather than lack of expertise and willingness to innovate, is the real issue when it comes to getting projects over the line.

Algorithms are essentially a statistical tool for solving a problem or carrying out a task, and are now fundamental in data analysis. Drawing on historical data, these lines of code can model possible outcomes such as the likelihood of a criminal reoffending, a student dropping out of university, or a child being abused. Or what shows you want to watch on Netflix.
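
The stocktake does not publish any agency’s code, but the kind of risk-scoring model described above can be sketched in a few lines. The example below is a hypothetical illustration only: a simple logistic regression fitted to invented historical records, producing a probability rather than a decision.

    # Minimal sketch of a risk-scoring algorithm of the kind described above.
    # All feature names and data are invented for illustration only.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Historical records: each row is
    # [age_at_first_contact, prior_incidents, months_in_education]
    X_history = np.array([
        [16, 0, 110],
        [17, 3, 80],
        [15, 1, 95],
        [18, 5, 60],
        [16, 2, 100],
        [17, 4, 70],
    ])
    # 1 = the outcome occurred (e.g. long-term unemployment), 0 = it did not
    y_history = np.array([0, 1, 0, 1, 0, 1])

    # Fit a simple statistical model to the historical data
    model = LogisticRegression()
    model.fit(X_history, y_history)

    # Score a new case: the model returns a probability, not a decision
    new_case = np.array([[17, 2, 85]])
    risk = model.predict_proba(new_case)[0, 1]
    print(f"Estimated risk: {risk:.2f}")  # a human still decides what to do with it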

‘‘They have an essential role in supporting the services government provides, and help deliver new, innovative, and well-targeted policies for New Zealanders,’’ the stocktake said.

While there are many advantages to using algorithms, there are also risks, mainly relating to accuracy and bias. Amazon, for example, recently scrapped an automated recruitment process after it was found to favour men’s CVs.

‘‘The Government is acutely aware of the need to ensure transparency and accountability as interest grows regarding the challenges and opportunities associated with emerging technology such as AI,’’ then digital services minister Clare Curran said at the time the audit was announced, in May.

Between June and July this year, 14 agencies were asked to respond to a standard series of questions relating to their use of ‘‘operational algorithms’’ and to provide examples to illustrate that use.

The surveyed agencies included the ministries of education, health, justice and social development, as well as the Department of Corrections, police and Customs.

Government chief data steward Liz Macpherson says the stocktake found widespread use of algorithms.

Police use two algorithms to assess the risk of future offending. One calculates the probability that a family violence perpetrator will commit a crime against a family member within the next two years, based on data already held by police such as gender, past incidents of family harm, or criminal history. The other helps to predict whether violence is escalating or likely to occur again at a given scene.

The combination of the two models creates an ‘‘overall level of concern’’ for the safety of the people involved.
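
The report does not say how police combine the two scores. Purely as an illustration of the idea, two model probabilities might be mapped onto a banded level of concern along these lines; the thresholds and the combining rule here are invented.

    # Purely illustrative: combining two risk scores into a banded level of concern.
    # The real police models and thresholds are not public; these values are invented.

    def overall_level_of_concern(p_reoffend: float, p_escalation: float) -> str:
        """Map two model probabilities onto a simple concern band."""
        combined = max(p_reoffend, p_escalation)  # take the more worrying score
        if combined >= 0.7:
            return "high"
        if combined >= 0.4:
            return "medium"
        return "low"

    # The output informs, but does not replace, a human judgement call
    print(overall_level_of_concern(p_reoffend=0.35, p_escalation=0.55))  # -> "medium"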

Work and Income identifies young people at risk of long-term unemployment, so they can be offered support in terms of qualifications and training opportunities. That algorithm is based on data such as demographic information, whether a young person’s parents were on a benefit, school history, and notifications to Oranga Tamariki.

And the Ministry of Education uses software to calculate student eligibility for transport assistance and to develop the most efficient routes for school buses.
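
The ministry’s actual eligibility rules and routing software are not described in the stocktake. A hypothetical sketch of the two pieces, a distance-based eligibility check and a simple nearest-neighbour bus route, might look like this; the thresholds and coordinates are invented.

    # Illustrative sketch only: the Ministry's actual rules and routing software
    # are not described in the stocktake. Thresholds and data are invented.
    from math import dist

    def eligible_for_transport(distance_to_school_km: float, year_level: int) -> bool:
        """A hypothetical distance-based eligibility rule."""
        threshold_km = 3.2 if year_level <= 8 else 4.8
        return distance_to_school_km >= threshold_km

    def nearest_neighbour_route(depot, stops):
        """Order bus stops with a simple nearest-neighbour heuristic."""
        route, remaining, current = [], list(stops), depot
        while remaining:
            nxt = min(remaining, key=lambda s: dist(current, s))
            remaining.remove(nxt)
            route.append(nxt)
            current = nxt
        return route

    print(eligible_for_transport(4.1, year_level=6))                 # -> True
    print(nearest_neighbour_route((0, 0), [(2, 1), (1, 5), (0.5, 0.5)]))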

None of this – or anything else uncovered in the stocktake – counts as AI, according to Macpherson.

But Ali Knott, of Otago University’s AI and Law in New Zealand Project, says some of the tools now in use ‘‘can already be thought of as AI systems’’.

‘‘I’d say anything the report refers to as ‘machine learning’ algorithms can be considered as ‘AI’. A lot of ‘operational’ algorithms referred to in the report, that make decisions ‘based on large/complex data sets’, are machine learning algorithms and therefore AI.’’

When asked if they expected to develop algorithms that rely on AI in the future, eight agencies said yes, five said no (Oranga Tamariki, Ministry of Education, Department of Internal Affairs, Ministry of Justice, and Social Investment Agency), and one was unsure.

Syen Nik, head of machine learning at Jade Software in Christchurch, spent more than four years working on algorithms at the Ministry of Social Development (MSD). He left in March this year. When asked why, he says there were many reasons, but one was that ‘‘things could have happened quicker’’. Almost none of the models he developed ended up being used.

However, he says the stocktake shows progress.

During his time at the agency, his team came up against issues relating to transparen­cy, human rights, and privacy. ‘‘We should have dealt with those issues a lot earlier. It would have made the data scientist jobs much easier.’’

Now, there are frameworks in place for assessing the ethics of any new service or process. Those frameworks are a sound foundation for current and future models, Nik says.

‘‘We learnt that the business has to drive our work. Internal staff who will be using the output want to know what’s happening.’’

The challenge was getting them to see machine learning as helpful, rather than as a competitor. ‘‘Part of the job was educating [frontline staff], showing them these models can help them, and aren’t going to replace them.

‘‘Once they understand the models can help them achieve their performance indicators, they start to use them and like them. In turn, that helps the models improve.

‘‘So it’s a feedback loop. We were starting to see that at the time I left.’’

MSD is widely regarded as leading the public sector in its use of these technologies. ‘‘I get the feeling we were leading, and MSD still is,’’ Nik says. ‘‘But the gap is closing, quickly, which is a good thing.’’

At the end of the day, he says, these models are just serving up recommendations. He finds the current hype about them rather bemusing.

‘‘The use of data has always been there. Maybe at a different level, but it’s always been there and people have always used it to make decisions.’’

Quoting AI boffin Andrew Ng, he adds: ‘‘Worrying about machines taking over the world is like worrying about overpopulation on Mars.’’

Almost all participating agencies use operational algorithms to inform human decision-making, rather than to automate significant decisions, the stocktake found. ‘‘Humans, rather than computers, review and decide on almost all significant decisions made by government agencies.’’

While these tools are helping agencies deliver better and more efficient services, ‘‘there’s plenty of scope to lift our game’’, Macpherson says. ‘‘New Zealand has robust systems and principles in place around the safe use of data, but as techniques become more sophisticated we must remember to keep the focus on people and make sure the things we are doing are for their benefit.’’

The report’s recommendations include maintaining human oversight, involving those who will be affected, promoting transparency and awareness, regularly reviewing algorithms that inform significant decisions, and monitoring adverse effects.

‘‘Even the best algorithms can perpetuate historic inequality if biases in data are not understood and accounted for,’’ the stocktake said. Yet, it continued: ‘‘Few agencies reported any regular review process for existing algorithms to ensure they are achieving their intended aims without unintended or adverse effects.’’

When asked about this, Macpherson says that, while few agencies ‘‘explicitly referenced a review process for algorithms, there are a range of different safeguards and assurance processes that they did specify’’. Those include getting advice from experts, or employing a dedicated data steward.

The report found agencies could also benefit from a fresh perspective by looking beyond government for privacy, ethics, and data expertise. This could be achieved by bringing together a group of independent experts that agencies could consult for advice and guidance.

No decisions have been made about how the Government will respond to the report’s recommenda­tions, Macpherson says.

Evelyn Wareham, chief data and insights officer at the Ministry of Business, Innovation and Employment, says that, as one of the government’s largest and most complex policy and operational agencies, MBIE relies on good-quality evidence to inform decisions.

‘‘The data received by algorithms provides insights on a wide variety of operational and policy decisions,’’ she says, and ‘‘activity’’ is under way to ensure ‘‘transparency, accountability and best practice for algorithm use across MBIE’’.

Paul James, government chief digital officer and chief executive of the Department of Internal Affairs, says the Government is working with ‘‘communities, interest groups, business and other nations to ensure we are developing and making best use of tools to benefit New Zealanders’’.

‘‘Take-up of technologies is relatively advanced in New Zealand, but AI applications are still emerging.’’

