The Guardian Australia

Child sexual abuse content growing online with AI-made images, report says

- Katie McQue

Child sexual exploitation is on the rise online and taking new forms such as images and videos generated by artificial intelligence, according to an annual assessment released on Tuesday by the National Center for Missing & Exploited Children (NCMEC), a US-based clearinghouse for the reporting of child sexual abuse material.

Reports to the NCMEC of child abuse online rose by more than 12% in 2023 compared with the previous year, surpassing 36.2m reports, the organization said in its annual CyberTipline report. The majority of tips received were related to the circulation of child sexual abuse material (CSAM) such as photos and videos, but there was also an increase in reports of financial sexual extortion, when an online predator lures a child into sending nude images or videos and then demands money.

Some children and families were extorted for financial gain by predators using AI-made CSAM, according to the NCMEC.

The center received 4,700 reports of images or videos of the sexual exploitation of children made by generative AI, a category it only started tracking in 2023, a spokesperson said.

“The NCMEC is deeply concerned about this quickly growing trend, as bad actors can use artificial intelligence to create deepfaked sexually explicit images or videos based on any photograph of a real child or generate CSAM depicting computer-generated children engaged in graphic sexual acts,” the NCMEC report states.

“For the children seen in deepfakes and their families, it is devastating.”

AI-generated child abuse content also impedes the identification of real child victims, according to the organization.

Creating such material is illegal in the United States, as making any visual depictions of minors engaging in sexually explicit conduct is a federal crime, according to a Massachusetts-based prosecutor from the Department of Justice, who spoke on the condition of anonymity.

In total in 2023, the CyberTipline received more than 35.9m reports that referred to incidents of suspected CSAM, more than 90% of it uploaded outside the US. Roughly 1.1m reports were referred to police in the US, and 63,892 reports were urgent or involved a child in imminent danger, according to Tuesday’s report.

There were 186,000 reports regarding online enticement, up 300% from 2022; enticement is a form of exploitation involving an individual who communicates online with someone believed to be a child with the intent to commit a sexual offense or abduction.

The platform that submitted the most cybertips was Facebook, with 17,838,422. Meta’s Instagram made 11,430,007 reports, and its WhatsApp messaging service made 1,389,618. Google sent NCMEC 1,470,958 tips, Snapchat sent 713,055, TikTok sent 590,376 and Twitter reported 597,087.

In total, 245 companies submitted CyberTipline reports to the NCMEC out of the 1,600 companies around the world that have registered their participation with the cybertip reporting program. US-based internet service providers, such as social media platforms, are legally mandated to report instances of CSAM to the CyberTipline when they become aware of them.

According to the NCMEC, there is a disconnect between the volume of reporting and the quality of the reports submitted. The center and law enforcement cannot legally take action in response to some of the reports, including ones made by content moderation algorithms, without human input. This technicality can prevent police from seeing reports of potential child abuse.

“The relatively low number of reporting companies and the poor quality of many reports marks the continued need for action from Congress and the global tech community,” the NCMEC report states.

• In the US, call or text the Childhelp abuse hotline on 800-422-4453 or visit their website for more resources and to report child abuse or DM for help. You can also report child sexual exploitation at NCMEC’s CyberTipline. For adult survivors of child abuse, help is available at ascasupport.org. In the UK, the NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800, or Bravehearts on 1800 272 831, and adult survivors can contact Blue Knot Foundation on 1300 657 380. Other sources of help can be found at Child Helplines International.

Photograph: Cultura/Rex/Shutterstock. The NCMEC is a US-based clearinghouse for the reporting of child sexual abuse material.
