
AI-generated nudes at Beverly Hills middle school expose gaps in the law

By Jon Healey, Los Angeles Times

If an eighth-grader in California shared a nude photo of a classmate with friends without consent, the student could conceivably be prosecuted under state laws dealing with child pornography and disorderly conduct.

If the photo is an AI-generated deepfake, however, it is not clear that any state law would apply in that situation.

That’s the dilemma facing the Beverly Hills Police Department as it investigates a group of students from Beverly Vista Middle School who allegedly shared photos of classmates that had been doctored with an artificial-intelligence-powered app.

According to the district, the images used real faces of students atop AI-generated nude bodies.

Lt. Andrew Myers, a spokesman for the Beverly Hills police, said that no arrests had been made and that the investigation was continuing.

Beverly Hills Unified School District Superintendent Michael Bregy said the district’s investigation into the episode was in its final stages.

“Disciplinary action was taken immediately and we are pleased it was a contained, isolated incident,” Bregy said in a statement, although no information was disclosed about the nature of the action, the number of students involved or their grade level.

He called on Congress to prioritize the safety of children in the U.S., adding that “technology, including AI and social media, can be used incredibly positively, but much like cars and cigarettes at first, if unregulated, they are utterly destructive.”

Whether the making of the fake nudes amounts to a criminal offense, however, is complicated by the technology involved.

Federal law includes computer-generated images of identifiable people in the prohibition on child pornography.

Although the prohibition seems clear, legal experts caution that it has yet to be tested in court.

California’s child pornography law does not mention artificially generated images. Instead, it applies to any image that “depicts a person under 18 years of age personally engaging in or simulating sexual conduct.”

Joseph Abrams, a Santa Ana criminal defense lawyer, said an AI-generated nude “doesn’t depict a real person.” It could be defined as child erotica, he said, but not child pornography. And from his standpoint as a defense lawyer, he said, “I don’t think it crosses a line for this particular statute or any other statute.”

“As we enter this AI age,” Abrams said, “these kinds of questions are going to have to get litigated.”

Kate Ruane, the director of the free expression project at the Center for Democracy & Technology, said that early versions of digitally altered child sexual abuse material superimposed the face of a child onto a pornographic image of someone else’s body. Now, however, freely available “undresser” apps and other programs generate fake bodies to go with real faces, raising legal questions that haven’t been squarely addressed yet, she said.

Still, she said, she had trouble seeing why the law wouldn’t cover sexually explicit images just because they were artificially generated. “The harm that we were trying to address [with the prohibition] is the harm to the child that is attendant upon the existence of the image. That is the exact same here,” Ruane said.

There is another roadblock to criminal charges, though.

In both state and federal cases, the prohibition applies only to “sexually explicit conduct,” which boils down to intercourse, other sex acts and “lascivious” exhibitions of a child’s private parts.

The courts use a six-pronged test to determine whether something is a lascivious exhibition, considering such factors as what the image focuses on, whether the pose is natural, and whether the image is intended to sexually arouse the viewer. A court would have to weigh those factors when evaluating images that weren’t sexual in nature before being “undressed” by AI.

“It’s really going to depend on what the end photo looks like,” said Sandy Johnson, senior legislative policy counsel of the Rape, Abuse & Incest National Network, the largest anti-sexual-violence organization in the United States. “It’s not just nude photos.”

The age of the children involved wouldn’t be a defense against a conviction, Abrams said, because “children have no more rights to possess child pornography than adults do.” But, like Johnson, he noted that “nude photos of children aren’t necessarily child pornography.”

Neither the Los Angeles County district attorney’s office nor the state Department of Justice responded immediately to requests for comment.

State lawmakers have proposed several bills to fill the gaps in the law regarding generative AI. These include proposals to extend criminal prohibitions on the possession of child porn and the nonconsensual distribution of intimate images (also known as “revenge porn”) to computer-generated images, and to convene a working group of academics to advise lawmakers on “relevant issues and impacts of artificial intelligence and deepfakes.”

Members of Congress have competing proposals that would expand federal criminal and civil penalties for the nonconsensual distribution of AI-generated intimate imagery.

At Tuesday’s meeting of the district Board of Education, Jane Tavyev Asher, the director of pediatric neurology at Cedars-Sinai, called on the board to consider the consequences of “giving our children access to so much technology” in and out of the classroom.

Instead of having to interact and socialize with other students, Asher said, students are allowed to spend their free time at the school on their devices. “If they’re on the screen all day, what do you think they want to do at night?”

Research shows that for children under age 16, there should be no social media use, she said. Noting how the district was blindsided by the reports of AI-generated nudes, she warned, “There are going to be more things that we’re going to be blindsided by, because technology is going to develop at a faster rate than we can imagine, and we have to protect our children from it.”
