Northwest Arkansas Democrat-Gazette
Getting the picture
AI uses demand laws protecting kids
Anytime the term “loophole” comes up in conversation, it’s fair to question whether someone is up to some shenanigans.
Whether it involves a taxpayer doing intellectual gymnastics in search of favorable application of the tax code or an attorney twisting a state law like a pretzel to help a client, the suspicion is the same: justification of a behavior that a reasonable, objective observer would declare a violation.
Arkansas makes the sexual abuse of children and imagery depicting it illegal, thankfully. But now the rapidly expanding use of artificial intelligence yet again raises a discomforting question, the kind most Arkansans (we hope) hardly believe has to be asked: Are there forms of child abuse imagery that are legal?
That such a question has to be asked indicates the extent to which society has fallen short in applying important moral and legal standards to many uses of artificial intelligence.
A news story by the Northwest Arkansas Democrat-Gazette’s Tracy Neal described an investigation by Benton County law enforcement. Investigators found images of child sexual abuse on a man’s computer that had been generated using artificial intelligence. The man faces no state criminal charges in relation to those images.
He does, however, face criminal charges related to images that involved real children.
Purveyors of such horrid material are quickly adopting artificial intelligence to create synthetic child abuse images, law enforcement officers say. In October, the United Kingdom’s Internet Watch Foundation issued a report describing its earliest examination of AI’s use for such images in the spring of 2023. Those images had clear “tells” that they’d been artificially generated.
“Half a year on, we’re now in a position where the imagery is so lifelike, that it’s presenting real difficulties for even our highly trained analysts to distinguish,” the organization’s CEO, Susie Hargreaves, wrote in the report. She said by the fall perpetrators online displayed “jubilation that fantasies can be made to order.”
“What’s more concerning for me,” Hargreaves wrote, “is the idea that this type of child sexual abuse content is, in some way, ethical. It is not.”
What could possibly be ethical about it? In monitoring online discussions, those working to end child sexual abuse say perpetrators argue AI-generated content will eliminate victimization of real children in the making of pornographic images. But every aspect of sexual abuse involving children causes harm. Creation of “fake” imagery depicting such abuse will have the effect of normalizing conduct that should never be considered normal or acceptable.
It’s important to know, too, that AI-generated images don’t just involve fake individuals; experts say people are using the technology to make abuse images of actual children, whether it’s someone the image-maker knows personally or a famous child. So if the images aren’t real but they involve a child who is, does that mean that child isn’t victimized by their distribution? Hardly. They’re horribly damaging.
Locally, law enforcement officials say they need better laws to enforce against AI-generated content involving children.
“We have to make sure that it’s criminal because we can’t have people doing it,” said Tyler Dunn, a deputy prosecutor in Benton County. “It’s an evil intent.”
Proliferation of child sexual abuse images through AI will also produce a bigger haystack in which the needle of real-life child abuse may become much harder for law enforcement to detect, investigate and prosecute.
State Sen. Joshua Bryant, R-Rogers, said he’s been working on a bill to present in next year’s legislative session to criminalize using artificial intelligence to create child pornography.
Bryan Sexton, Benton County’s new prosecuting attorney, said the state’s laws need to catch up.
“The collection of AI-created images of child sexual abuse material is a disturbing slope that could cause victimization of actual children and deserves the attention of the Legislature now,” Sexton said.
A law enforcement official in Britain, responding to the Internet Watch Foundation’s report, shared Sexton’s concerns.
“We are seeing children groomed, we are seeing perpetrators make their own imagery to their own specifications, we are seeing the production of AI imagery for commercial gain — all of which normalizes the rape and abuse of real children,” said a statement from Ian Critchley, child protection lead for Britain’s National Police Chiefs’ Council.
The good news for now is that federal law prohibits sexual abuse images that appear to involve a minor. Arkansas law should follow suit.
Our bottom line is that public officials should look for ways to combat imagery depicting child sexual abuse. Protecting children is paramount.
The use and misuse of artificial intelligence is happening at breakneck speed. Our state Legislature cannot afford to dawdle in taking steps to protect the state’s children and prevent the normalization of abusive behaviors toward children.