The Post

Family network data may identify at-risk children

- Jessica Long jessica.long@stuff.co.nz

Details collected from family trees could help government agencies identify at-risk families in potential child protection cases.

But using big data comes with a warning – used incorrectly, the insights could exacerbate bias and inequality.

Researchers from Te Pūnaha Matatini have been working with data collected for a tool which Oranga Tamariki could use to better predict the risk of child maltreatment when a case is first flagged.

Using a network of about 5 million relationships collected by social workers between 1996 and 2016, Shaun Hendy and a team of researchers tested the family networks – specifically, pulling information from birth records to determine close family ties.

It was ‘‘common sense’’, Hendy said. If a child had a family connection to someone who was violent or abusive, then that child was more at risk.
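At its simplest, that idea can be pictured as a small network check: represent close family ties as links between people, and flag a child whose immediate relative has a recorded history of violence. The sketch below is only an illustration of that reasoning – the names, data and one-step rule are invented here and are not the researchers' actual model.

```python
# Hypothetical illustration of the "family connection" idea, not the
# Oranga Tamariki tool or the Te Punaha Matatini model itself.

# Close family ties, e.g. derived from birth records (invented data).
family_ties = {
    "child_a": ["parent_1", "parent_2"],
    "child_b": ["parent_2", "parent_3"],
}

# People with a recorded history of violence or abuse (invented data).
flagged_adults = {"parent_3"}

def higher_risk(child: str) -> bool:
    """Treat a child as higher-risk if any close relative is flagged."""
    return any(relative in flagged_adults for relative in family_ties.get(child, []))

for child in ("child_a", "child_b"):
    print(child, "higher risk" if higher_risk(child) else "no flag")
# child_a no flag
# child_b higher risk
```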

Where a child had been abused, the tool took the guesswork out of identifying potential risks and sped up, or spotted, suitable interventions. It could also allow social workers to better allot their time with families.

‘‘It is much easier to identify the causality. It shifts the focus from the child to identifyin­g the abuser, the person who is most likely to cause the harm.

‘‘That means you might actually design a different intervention. So, maybe rather than removing the child completely from that family environment, maybe you simply act to ensure ... there is some monitoring of that relationship, or heightened attention paid.’’

But there were risks that statistical models and machine learning could entrench or amplify prejudice during an evaluation. ‘‘One of the risks would be that a social worker just uses this to reinforce their own biases. You would want to monitor the use of the model to make sure it was being used appropriately.’’

The research comes at a time when governments increasingly turn to data to build on and implement policies. Hendy and his team recommended independent monitoring of algorithms used by government agencies to mitigate potential ethical issues. ‘‘Our model has increased transparency, so it is easier for people to see what is going on.’’

Children’s Commissioner Andrew Becroft said big data could be helpful in identifying needs and the focus should be on early support and individualised care. But data could never replace personalised social work and detailed assessments.

‘‘I would not like to see already vulnerable children, families and their communities being stigmatised and pre-judged on the basis of an algorithmic approach.

‘‘What families need is effective, individualised assistance and support provided by government and community groups so that when they are struggling, they feel supported to seek out the help they need,’’ Becroft said.

The report, called Using family network data in child protection services, also said: ‘‘Combining predictive models with social worker expertise ... has the potential to improve decision-making but only with much thought and care can their use be ethically sound and socially beneficial.’’
