Chattanooga Times Free Press

Spurred by teen girls, states move to ban deepfake nudes

- BY NATASHA SINGER

Caroline Mullet, a ninth grader at Issaquah High School near Seattle, went to her first homecoming dance last fall, a James Bond-themed bash with blackjack tables attended by hundreds of girls dressed up in party frocks.

A few weeks later, she and other female students learned that a male classmate was circulating fake nude images of girls who had attended the dance, sexually explicit pictures he had fabricated using an artificial intelligence app designed to automatically “strip” clothed photos of real girls and women.

Mullet, 15, alerted her father, Mark, a Democratic Washington state senator. Although she was not among the girls in the pictures, she asked if something could be done to help her friends, who felt “extremely uncomfortable” that male classmates had seen simulated nude images of them. Soon, Mark Mullet and a colleague in the state House proposed legislation to prohibit the sharing of AI-generated sexually explicit depictions of real minors.

“I hate the idea that I should have to worry about this happening again to any of my female friends, my sisters or even myself,” Caroline Mullet told state lawmakers during a hearing on the bill in January.

The state Legislature passed the bill without opposition. Gov. Jay Inslee, a Democrat, signed it last month.

States are on the front lines of a rapidly spreading new form of peer sexual exploitation and harassment in schools. Boys across the United States have used widely available “nudification” apps to surreptitiously concoct sexually explicit images of their female classmates and then circulated the simulated nudes via group chats on apps like Snapchat and Instagram.

Now, spurred in part by troubling accounts from teenage girls like Mullet, federal and state lawmakers are rushing to enact protections in an effort to keep pace with exploitative AI apps.

Since early last year, at least two dozen states have introduced bills to combat AI-generated sexually explicit images — known as deepfakes — of people under 18, according to data compiled by the National Center for Missing & Exploited Children, a nonprofit organization. And several states have enacted the measures.

Among them, South Dakota this year passed a law that makes it illegal to possess, produce or distribute AI-generated sexual abuse material depicting real minors. Last year, Louisiana enacted a deepfake law that criminalizes AI-generated sexually explicit depictions of minors.

URGENT ACTION NEEDED

“I had a sense of urgency hearing about these cases and just how much harm was being done,” said Rep. Tina Orwall, a Democrat who drafted Washington state’s explicit-deepfake law after hearing about incidents like the one at Issaquah High.

Some lawmakers and child protection experts say such rules are urgently needed because the easy availability of AI nudification apps is enabling the mass production and distribution of false, graphic images that can potentially circulate online for a lifetime, threatening girls’ mental health, reputations and physical safety.

“One boy with his phone in the course of an afternoon can victimize 40 girls, minor girls,” said Yiota Souras, chief legal officer for the National Center for Missing & Exploited Children, “and then their images are out there.”

Over the past two months, deepfake nude incidents have spread in schools — including in Richmond, Illinois, and Beverly Hills and Laguna Beach, California.

Yet few laws in the United States specifically protect people under 18 from exploitative AI apps.

That is because many current statutes that prohibit child sexual abuse material or adult nonconsensual pornography — involving real photos or videos of real people — may not cover AI-generated explicit images that use real people’s faces, said U.S. Rep. Joseph Morelle, D-N.Y.

Last year, he introduced a bill that would make it a crime to disclose AI-generated intimate images of identifiable adults or minors. It would also give deepfake victims, or parents, the right to sue individual perpetrators for damages.

“We want to make this so painful for anyone to even contemplate doing, because this is harm that you just can’t simply undo,” Morelle said. “Even if it seems like a prank to a 15-year-old boy, this is deadly serious.”

ANOTHER BILL INTRODUCED

U.S. Rep. Alexandria Ocasio-Cortez, D-N.Y., recently introduced a similar bill to enable victims to bring civil cases against deepfake perpetrators.

But neither bill would explicitly give victims the right to sue the developers of AI nudification apps, a step that trial lawyers say would help disrupt the mass production of sexually explicit deepfakes.

“Legislation is needed to stop commercialization, which is the root of the problem,” said Elizabeth Hanley, a lawyer in Washington who represents victims in sexual assault and harassment cases.

The U.S. legal code prohibits the distribution of computer-generated child sexual abuse material depicting identifiable minors engaged in sexually explicit conduct. Last month, the FBI issued an alert warning that such illegal material included realistic child sexual abuse images generated by AI.

Yet fake AI-generated depictions of real teenage girls without clothes may not constitute “child sexual abuse material,” experts say, unless prosecutors can prove the fake images meet legal standards for sexually explicit conduct or the lewd display of genitalia.

Some defense lawyers have tried to capitalize on the apparent legal ambiguity. A lawyer defending a male high school student in a deepfake lawsuit in New Jersey recently argued that the court should not temporarily restrain his client, who had created nude AI images of a female classmate, from viewing or sharing the pictures because they were neither harmful nor illegal. Federal laws, the lawyer argued in a court filing, were not designed to apply “to computer-generated synthetic images that do not even include real human body parts.” (The defendant ultimately agreed not to oppose a restraining order on the images.)

Now states are working to pass laws to halt exploitative AI images. This month, California introduced a bill to update a state ban on child sexual abuse material to specifically cover AI-generated abusive material.

And Massachusetts lawmakers are wrapping up legislation that would criminalize the nonconsensual sharing of explicit images, including deepfakes. It would also require a state entity to develop a diversion program for minors who shared explicit images to teach them about issues like the “responsible use of generative artificial intelligence.”


PRISON TIME

Punishments can be severe. Under the new Louisiana law, any person who knowingly creates, distributes, promotes or sells sexually explicit deepfakes of minors can face a minimum prison sentence of five to 10 years.

In December, Miami-Dade County police officers arrested two middle school boys for allegedly making and sharing fake nude AI images of two female classmates, ages 12 and 13, according to police documents obtained by The New York Times through a public records request. The boys were charged with third-degree felonies under a 2022 state law prohibiting altered sexual depictions without consent. (The state attorney’s office for Miami-Dade County said it could not comment on an open case.)

The new deepfake law in Washington state takes a different approach.

After learning of the incident at Issaquah High from his daughter, Mark Mullet reached out to Orwall, an advocate for sexual assault survivors and a former social worker. Orwall, who had worked on one of the state’s first revenge-porn bills, then drafted a House bill to prohibit the distribution of AI-generated intimate, or sexually explicit, images of either minors or adults. (Mullet, who sponsored the companion Senate bill, is now running for governor.)

Under the resulting law, first offenders could face misdemeanor charges, while people with prior convictions for disclosing sexually explicit images would face felony charges. The new deepfake statute takes effect in June.

“It’s not shocking that we are behind in the protections,” Orwall said. “That’s why we wanted to move on it so quickly.”

RUTH FREMSON/THE NEW YORK TIMES — On Wednesday, Caroline Mullet sits with her father, Washington State Sen. Mark Mullet, outside Issaquah High School in Issaquah, Wash. Caroline, a ninth grader, prompted her father to work on a bill to ban AI-generated sexually explicit images of minors. The ban is set to take effect in June.
RUTH FREMSON/THE NEW YORK TIMES — Washington State Rep. Tina Orwall poses last Tuesday at the marina in Des Moines, Wash. Orwall drafted Washington State’s new law prohibiting AI-generated sexually explicit images of minors.
