In Almendralejo, a small town in southern Spain, more than 20 teenage girls recently discovered they had received artificially generated naked images of themselves on their mobile phones.
These disturbingly realistic deepfake nudes were created by manipulating photographs originally posted on the girls’ Instagram accounts.
Now, parents and legal experts are confronting the question: Does the creation and distribution of these images constitute a crime under existing laws?
The explicit images were generated using ClothOff, an AI application, and subsequently disseminated in various WhatsApp groups.
The girls in the original photos were fully clothed, but the AI software altered the images so convincingly that the nudity appears real.
This has led to mounting concern among parents and legal authorities about whether these manipulated photos could be categorized as child pornography.
Miriam Al Adib, the mother of one of the affected girls, expressed her dismay on her Instagram account, stating, “The montages are super realistic, it’s very disturbing and a real outrage.”
She continued, “My daughter told me with great disgust: ‘Mum, look what they have done to me.’”
Al Adib also raised concerns that the altered images might even find their way to adult websites.
Fátima Gómez, another concerned parent, told Spanish news network Extremadura TV that her daughter had been blackmailed over social media. A boy demanded money from her, and when she refused, he sent her a deepfake nude.
The FBI warned of AI-generated “sextortion” in June this year, and several women have already fallen victim.
The Spanish National Police have investigated the case and identified several juveniles allegedly involved in creating and distributing the images.
The situation has also been referred to the Juvenile Prosecutor’s Office. The mayor of Almendralejo warned that what may have started as a “joke” could carry serious legal consequences.
ClothOff, the AI application in question, advertises itself with a disconcerting slogan: “Undress anybody, undress girls for free.” It allegedly charges €10 to generate 25 naked images.
The youngest girl involved was just 11 years old.
Existing laws provide poor coverage
Manuel Cancio, a professor of criminal law at the Autonomous University of Madrid, told Euronews that existing laws may not adequately address the issue.
“Since it is generated by deepfake, the actual privacy of the person in question is not affected. The effect it has (on the victim) can be very similar to a real nude picture, but the law is one step behind,” he explained.
Essentially, the fact that the image is fake, or ‘fictionalized’, complicates its legal treatment.
Legal experts are divided on how to categorize the crime. Some argue that it could fall under child pornography laws, leading to more severe penalties, while others suggest that the crime could be considered an offense against moral integrity.
Legal interpretations of the case would differ worldwide, and a similar debate would likely unfold had this happened in another country.
Children are being swept up in AI-related harm, with the World Economic Forum (WEF) recently drawing attention to a glaring lack of policy to protect children from AI risks.
US lawmakers have also raised concerns about AI’s interaction with children, and journalists in the UK have found that pedophile rings are generating disturbing illicit content using AI models.
Such cases are testing legislators, and they reveal once again that traditional lawmaking is simply too cumbersome to adapt to AI-related risks that could barely have been imagined a few years ago.