AI-generated nudes at Beverly Hills middle school expose gaps in the law
If an eighth-grader in California shared a nude photo of a classmate with friends without consent, the student could conceivably be prosecuted under state laws dealing with child pornography and disorderly conduct.
If the photo is an AI-generated deepfake, however, it is not clear that any state law would apply in that situation.
That’s the dilemma facing the Beverly Hills Police Department as it investigates a group of students from Beverly Vista Middle School who allegedly shared photos of classmates that had been doctored with an artificial-intelligence-powered app.
According to the district, the images used real faces of students atop AI-generated nude bodies.
Lt. Andrew Myers, a spokesman for the Beverly Hills police, said that no arrests had been made and that the investigation was continuing.
Beverly Hills Unified School District Superintendent Michael Bregy said the district’s investigation into the episode was in its final stages.
“Disciplinary action was taken immediately and we are pleased it was a contained, isolated incident,” Bregy said in a statement, although no information was disclosed about the nature of the action, the number of students involved or their grade level.
He called on Congress to prioritize the safety of children in the U.S., adding that “technology, including AI and social media, can be used incredibly positively, but much like cars and cigarettes at first, if unregulated, they are utterly destructive.”
Whether the making of the fake nudes amounts to a criminal offense, however, is complicated by the technology involved.
Federal law includes computer-generated images of identifiable people in the prohibition on child pornography.
Although the prohibition seems clear, legal experts caution that it has yet to be tested in court.
California’s child pornography law does not mention artificially generated images. Instead, it applies to any image that “depicts a person under 18 years of age personally engaging in or simulating sexual conduct.”
Joseph Abrams, a Santa Ana criminal defense lawyer, said an AI-generated nude “doesn’t depict a real person.” It could be defined as child erotica, he said, but not child pornography. And from his standpoint as a defense lawyer, he said, “I don’t think it crosses a line for this particular statute or any other statute.”
“As we enter this AI age,” Abrams said, “these kinds of questions are going to have to get litigated.”
Kate Ruane, the director of the free expression project at the Center for Democracy & Technology, said that early versions of digitally altered child sexual abuse material superimposed the face of a child onto a pornographic image of someone else’s body. Now, however, freely available “undresser” apps and other programs generate fake bodies to go with real faces, raising legal questions that haven’t been squarely addressed yet, she said.
Still, she said, she had trouble seeing why the law wouldn’t cover sexually explicit images just because they were artificially generated. “The harm that we were trying to address [with the prohibition] is the harm to the child that is attendant upon the existence of the image. That is the exact same here,” Ruane said.
There is another roadblock to criminal charges, though.
In both state and federal cases, the prohibition applies just to “sexually explicit conduct,” which boils down to intercourse, other sex acts and “lascivious” exhibitions of a child’s private parts.
The courts use a six-pronged test to determine whether something is a lascivious exhibition, considering such factors as what the image focuses on, whether the pose is natural, and whether the image is intended to sexually arouse the viewer. A court would have to weigh those factors when evaluating images that weren’t sexual in nature before being “undressed” by AI.
“It’s really going to depend on what the end photo looks like,” said Sandy Johnson, senior legislative policy counsel of the Rape, Abuse & Incest National Network, the largest anti-sexual-violence organization in the United States. “It’s not just nude photos.”
The age of the children involved wouldn’t be a defense against a conviction, Abrams said, because “children have no more rights to possess child pornography than adults do.” But, like Johnson, he noted that “nude photos of children aren’t necessarily child pornography.”
Neither the Los Angeles County district attorney’s office nor the state Department of Justice responded immediately to requests for comment.
State lawmakers have proposed several bills to fill the gaps in the law regarding generative AI. These include proposals to extend criminal prohibitions on the possession of child porn and the nonconsensual distribution of intimate images (also known as “revenge porn”) to computer-generated images and to convene a working group of academics to advise lawmakers on “relevant issues and impacts of artificial intelligence and deepfakes.”
Members of Congress have competing proposals that would expand federal criminal and civil penalties for the nonconsensual distribution of AI-generated intimate imagery.
At Tuesday’s meeting of the district Board of Education, Jane Tavyev Asher, the director of pediatric neurology at Cedars-Sinai, called on the board to consider the consequences of “giving our children access to so much technology” in and out of the classroom.
Instead of having to interact and socialize with other students, Asher said, students are allowed to spend their free time at the school on their devices. “If they’re on the screen all day, what do you think they want to do at night?”
Research shows that children under age 16 should not be using social media, she said. Noting how the district was blindsided by the reports of AI-generated nudes, she warned, “There are going to be more things that we’re going to be blindsided by, because technology is going to develop at a faster rate than we can imagine, and we have to protect our children from it.”