Sweeping new Florida law targets using AI to ‘nudify’ people in photographs

A Florida law is taking effect that targets AI-assisted sex crimes against adults and minors. (Photo illustration by Kaley Mantz/Fresh Take Florida)

GAINESVILLE – A sweeping new law in Florida that took effect Wednesday makes it illegal to produce sexual images of a person using artificial intelligence or similar technologies without their permission.

The new law also allows people whose photographs were manipulated that way to sue those responsible in civil court.

The law took effect this week, only two days after Marion County sheriff’s deputies arrested Lucius William Martin, 39, of Eustis, Florida, and accused him of using AI to produce nude images of the juvenile daughter of someone close to him and of her friend. The software Martin used digitally removed the girls’ clothing in pictures he downloaded from social media, according to court records.

Such tools can be used to “nudify” an otherwise innocent photograph.

Martin was arrested Monday and remains in the county jail in Ocala, facing eight felony counts of child pornography under Florida’s existing statutes and one count of trying to destroy evidence. The girl’s mother captured a screenshot of the images to give to authorities, the sheriff’s office said. A deputy said Martin reset his phone as he was being arrested to delete the evidence.

Martin couldn’t be reached immediately for comment because he was still in jail. A public defender was to be appointed for him Thursday ahead of his arraignment scheduled for next month, but no lawyer had yet been assigned to represent him.

The nude versions of the images on Martin’s phone included remnants of the girls’ digitally removed clothing and showed deformities on their arms and legs, which a deputy wrote in court records “is common on AI-generated imagery.” His phone also contained the same images of the girls, unaltered and wearing clothes, court records said.

Last year, singer Taylor Swift was the victim of fake, AI-generated nude images of her, also called “deepfakes,” that circulated on popular social media sites.

The Florida bill, sponsored by Republican Reps. Mike Redondo of Miami and Jennifer Kincart Jonsson of Bartow and known as the “sexual images” bill, passed the Legislature unanimously earlier this year and was signed into law by Gov. Ron DeSantis in May.

Rep. Michelle Salzman, R-Cantonment, is seen speaking about a bill that makes it illegal to produce sexual images of a person using artificial intelligence or similar technologies without their permission, during a House Judiciary Committee hearing on April 2, 2025, in Tallahassee, Fla. Salzman said the new law, which took effect Wednesday, Oct. 1, 2025, helps to hold accountable people who misuse powerful AI tools.
Florida Channel/Fresh Take Florida

Rep. Michelle Salzman, R-Cantonment, said during a House Judiciary Committee hearing earlier this year that her community in Florida’s Panhandle has suffered cases of AI-generated sexual images.

“Seeing this brought forward is a breath of fresh air,” she said. “AI is incredible. We need it. It does a lot of good, but with great power comes great responsibility, and a lot of folks aren’t taking responsibility for their actions.”

Key provisions of the new law criminalize using AI to generate a nude image of an actual person without their consent, as well as soliciting or possessing such images. The new felony carries a prison term of up to five years for each image and a fine of up to $5,000.

The new law was long overdue, said former Sen. Lauren Book, a leading advocate for sex crime victims. She said AI and popular software tools make it easy to create realistic images.

“Legislation is a crucial step in ensuring that our justice system can keep pace with technological advancements so that we are not lagging in protecting our children,” said Book, a child sex abuse survivor who founded “Lauren’s Kids,” a non-profit dedicated to stopping child sex abuse.

Such digitally altered images of children or teens are often used to extort families, said Fallon McNulty, executive director at the National Center for Missing and Exploited Children. Criminals can extract payment or sexual favors in exchange for agreeing not to distribute nude images to victims’ friends, classmates or family members.

The center’s CyberTipline, which started tracking reports involving generative AI in 2023, received 4,700 reports involving AI-generated images in its first year. In the first six months of 2025, McNulty said, the tipline received 400,000 such reports.

McNulty said mainstream software companies try to block and report illicit use of their programs, but some developers offer apps with no built-in safety measures.

Meta announced earlier this year that it was suing a Hong Kong company, the developer of an app called CrushAI, saying the company ran ads on Meta’s platforms to promote the app, which uses AI to create nonconsensual, sexualized nude images.

Lawmakers are always “trying to play catch up” when it comes to regulating AI, said Elizabeth Rasnick, an assistant professor at the Center for Cybersecurity at the University of West Florida, adding that they are “doing the best they can with what they currently have.”

“There's no possible way we can foresee how these tools are going to be used in the future,” Rasnick said. “The Legislature is always going to have to try to fill in whatever gaps there were after those gaps are discovered and exploited.”

Digitally altering images has been possible for decades using specialized image-editing tools, but the new AI programs can turn out sexual content in seconds with no special skills required, said Kevin Butler, a professor of computer science and director of the Institute for Cybersecurity Research at the University of Florida.

The new AI tools can take a photo posted on social media and “undress the whole family,” said Kyle Glen, commander of the Central Florida Internet Crimes Against Children Task Force. He praised the new law but noted that juvenile offenders – who may try to bully classmates by creating such images – often aren’t prosecuted criminally the first time they are caught.

“As much laws as we pass and as much software is out there, and technology that we use, bad guys are always a step ahead,” Glen said. “They're innovative and they're going to think of ways to get around law enforcement or exploit children, you know, if that's what they're infatuated with.”

This story was produced by Fresh Take Florida, a news service of the University of Florida College of Journalism and Communications. The reporter can be reached at maria.avlonitis@freshtakeflorida.com.