Northwest Arkansas Democrat-Gazette

Getting the picture

AI uses demand laws protecting kids


Anytime the term “loophole” comes up in conversation, it’s fair to question whether someone is up to some shenanigans.

Whether it involves a taxpayer doing intellectual gymnastics in search of favorable application of the tax code or an attorney twisting a state law like a pretzel to help a client, the suspicion is the same: justification of a behavior that a reasonable, objective observer would declare a violation.

Arkansas makes the sexual abuse of children and imagery depicting it illegal, thankfully. But now the rapidly expanding use of artificial intelligence yet again raises a discomforting question, the kind most Arkansans (we hope) hardly believe has to be asked: Are there forms of child abuse imagery that are legal?

That such a question has to be asked indicates the extent to which society has fallen short in applying important moral and legal standards to many uses of artificial intelligence.

A news story by the Northwest Arkansas Democrat-Gazette’s Tracy Neal described an investigation by Benton County law enforcement. Investigators found AI-generated images of child sexual abuse on a man’s computer. The man faces no state criminal charges in relation to those images.

He does, however, face criminal charges related to images that involved real children.

Purveyors of such horrid images are quickly adopting artificial intelligence to create synthetic child abuse images, law enforcement officers say. In October, the United Kingdom’s Internet Watch Foundation issued a report that described its earliest examination of AI’s use for such images in the spring of 2023. Those images had clear “tells” that they’d been artificially generated.

“Half a year on, we’re now in a position where the imagery is so lifelike, that it’s presenting real difficulties for even our highly trained analysts to distinguish,” the organization’s CEO, Susie Hargreaves, wrote in the report. She said by the fall perpetrators online displayed “jubilation that fantasies can be made to order.”

“What’s more concerning for me,” Hargreaves wrote, “is the idea that this type of child sexual abuse content is, in some way, ethical. It is not.”

What could possibly be ethical about it? In monitoring online discussions, those working to end child sexual abuse say perpetrators argue AI-generated content will eliminate victimization of real children in the making of pornographic images. But come on: every aspect of sexual abuse involving children does harm. Creation of “fake” imagery depicting such abuse normalizes conduct that should never be considered normal or acceptable.

It’s important to know, too, that AI-generated images don’t just involve fake individuals; experts say people are using the technology to make abuse images of actual children, whether it’s someone the image-maker knows personally or a famous child. So if the images aren’t real but they involve a child who is, does that mean that child isn’t victimized by their distribution? Hardly. They’re horribly damaging.

Locally, law enforcement officials say they need stronger laws to prosecute AI-generated abuse content involving children.

“We have to make sure that it’s criminal because we can’t have people doing it,” said Tyler Dunn, a deputy prosecutor in Benton County. “It’s an evil intent.”

Proliferation of child sexual abuse images through AI will also produce a bigger haystack in which the needle of real-life child abuse may become much harder for law enforcement to detect, investigate and prosecute.

State Sen. Joshua Bryant, R-Rogers, said he’s been working on a bill to present in next year’s legislative session to criminalize using artificial intelligence to create child pornography.

Bryan Sexton, Benton County’s new prosecuting attorney, said the state’s laws need to catch up.

“The collection of AI-created images of child sexual abuse material is a disturbing slope that could cause victimization of actual children and deserves the attention of the Legislature now,” Sexton said.

A fellow law enforcement official in Britain, responding to the Internet Watch Foundation’s report, shared Sexton’s concerns.

“We are seeing children groomed, we are seeing perpetrators make their own imagery to their own specifications, we are seeing the production of AI imagery for commercial gain — all of which normalizes the rape and abuse of real children,” said a statement from Ian Critchley, child protection lead for Britain’s National Police Chiefs’ Council.

The good news for now is that federal law prohibits sex abuse images that appear to involve a minor. Arkansas law should follow.

Our bottom line is that public officials should look for ways to combat imagery depicting child sexual abuse. Protecting children is paramount.

The use and misuse of artificial intelligence are advancing at breakneck speed. Our state Legislature cannot afford to dawdle in taking steps to protect the state’s children and prevent the normalization of abusive behaviors toward them.
