The Chronicle

Age verification a must to protect our children

- Andrew Wallace is the Federal Member for Fisher and Deputy Chair of the Parliamentary Joint Committee on Intelligence and Security

In a matter of weeks, my wife and I are set to become first-time grandparents. We could not be more excited. But as I look at how our world is changing, particularly online, I’m more worried than ever about how my kids will be able to protect their kids.

The US Surgeon-General issued a public advisory, warning his country that social media is driving a youth mental health crisis. And the number of children exploited and bullied online by both predators and peers is only getting worse.

That’s why I’ve fought so hard for age verification for online pornography and social media accountability.

We can’t expect kids to keep themselves safe online. Industry clearly has no interest in doing so either.

This is exacerbated by the Federal Government’s refusal to support age assurance legislation, despite the overwhelming support of parents, experts, the eSafety Commissioner and even their own backbenchers.

Beyond big porn, we’re watching social media run roughshod over business, media and democracy.

So how can we expect parents to take on these big tech platforms alone? Government has a duty to equip parents and police with the mechanisms they need to protect our kids and hold social media companies to account.

The Coalition’s proposal to create a new Commonwealth offence to criminalise posting material that depicts violence, drug offences or property offences is welcome news.

I agree – it’s time to outlaw the act of promoting crime online. But I think we’re missing the bigger picture.

We need to ask ourselves why this kind of material gets attention in the first place. Two answers: anonymity and algorithms.

We used to call social media a virtual town square, giving ordinary people the extraordinary opportunity to engage in a public-facing, worldwide setting.

But social media is becoming less a force for good than a facility for harm.

I’m not just talking about trolls and keyboard warriors who tweet mean things or post offensive memes.

Social media has become a labyrinth of sometimes untraceable channels, swamped with automated, anonymous figures who show little regard for others and face no accountability.

Anonymous perpetrators of violence use social media to bully, ‘sextort’ and harass their victims.

Anonymous predators use social media to groom, exploit, and abuse children.

Anonymous agents of state-sponsored groups and organised crime gangs assemble armies of operatives and automated bots delivering targeted and harmful content, in an effort to radicalise the vulnerable, terrorise dissidents and disrupt democracy.

Their greatest tool, beyond the impunity that anonymity affords, is the simple algorithm.

Algorithms are the complex rules which guide what we see online. They’re designed to keep us hooked.

The emphasis on engagement puts the pressure on content creators to publish increasingly extreme material for likes and shares.

Algorithms amplify our biases, desensitise us to borderline content, and remove the moderating influence and accountability afforded by peers, parents, and social norms.

As a result, harmful material saturates our newsfeeds, intent on consuming every square inch of our screens.

And it’s not just interference by foreign actors, or non-state actors building extreme political silos, which worries me. It’s the insidious way algorithms entrap our vulnerable kids in cycles of harmful content.

Two in five Australian kids see porn on their newsfeed without seeking it. “Thinspiration”, “hourglass abs”, and dangerous fad diet reels quickly roll into tips for purging and self-harm, driving an epidemic of eating disorders.

Foreign disinformation campaigns fade into extremist content for radicalisation and recruitment.

In the face of such evil, community notes, flags, and viewer discretion notices aren’t enough. We need to address social media algorithms now.

Some US states are already looking at forcing tech companies to publicise their algorithms and to restrict the algorithms from promoting harmful content or products.

In my inquiry into family, domestic and sexual violence in 2021, I recommended that the Federal Government implement a mandatory ID verification regime for social media platforms, to address the problems stemming from anonymity.

In the three years since, I’m more convinced than ever that the time is now for social media ID verification.

We face a grave threat online for which parents and police are entirely unequipped. With nearly three in four Australians already on social media, this issue needs more than a tweet and it needs more than tweaking.

Mandate age assurance for porn. Criminalise content which encourages crime. Regulate harmful social media algorithms. Verify ID for social media.

There’s a generation of our kids – our future adults – who depend on it.

