Miami Teens Arrested for Creating AI-Generated Nude Images of Classmates

March 14, 2024: Two teenagers from Miami, Florida, aged 13 and 14, were arrested on December 22, 2023, for allegedly creating and sharing AI-generated nude images of their classmates without consent.

According to a police report cited by WIRED, the teenagers used an unnamed “AI app” to generate explicit images of male and female classmates, ages 12 and 13.

The incident, which took place at Pinecrest Cove Academy in Miami, led to the suspension of the students on December 6 and was subsequently reported to the Miami-Dade Police Department.

The arrests and charges are believed to be the first of their kind in the United States involving the sharing of AI-generated nudes.

Under a 2022 Florida law that criminalizes the dissemination of deepfake sexually explicit images without the victim’s consent, the teenagers face third-degree felony charges, the same level of offense as car theft or false imprisonment.

As of this writing, neither the parents of the accused boys nor the investigator and prosecutor in charge of the case have commented.

Incidents of minors creating AI-generated nudes and explicit images of other children have become increasingly common in school districts across the country.

While the Florida case is the first known instance of criminal charges over AI-generated nude images, similar incidents have come to light elsewhere in the US and in Europe.

The impact of generative AI on child sexual abuse material (CSAM), nonconsensual deepfakes, and revenge porn has led various states to tackle the issue independently, as there is currently no federal law addressing nonconsensual deepfake nudes.

President Joe Biden has issued an executive order on AI asking agencies to report on banning the use of generative AI to produce CSAM. In parallel, both the Senate and the House have introduced legislation known as the DEFIANCE Act of 2024 to address the issue.

Although the naked bodies depicted in AI-generated fake images are not real, they can appear authentic, potentially leading to psychological distress and reputational damage for the victims.

The White House has called such incidents “alarming” and emphasized the need for new laws to address the problem.

The Internet Watch Foundation (IWF) has also reported that AI image generators are leading to an increase in CSAM, complicating investigations and hindering the identification of victims.