The attorneys general of all 50 US states are urging Congress to take action against the potential exploitation of children through AI-generated pornography.
In a letter addressed to Republican and Democratic leaders in both the House and Senate, attorneys general nationwide asked lawmakers to “establish an expert commission to study the means and methods of AI that can be used to exploit children specifically” and to expand current laws to include AI-generated child sexual abuse material.
“We are engaged in a race against time to protect the children of our country from the dangers of AI,” reads the letter, which was shared in advance with The Associated Press.
“Indeed, the proverbial walls of the city have already been breached. Now is the time to act.”
The initiative was led by South Carolina Attorney General Alan Wilson, who gathered support from all 50 states.
Wilson expressed optimism that bipartisan support for the issue could translate into legislative action. “Everyone’s focused on everything that divides us,” Wilson said.
He continued, “My hope would be that, no matter how extreme or polar opposites the parties and the people on the spectrum can be, you would think protecting kids from new, innovative and exploitative technologies would be something that even the most diametrically opposite individuals can agree on — and it appears that they have.”
The race to protect children from AI-related risk is on
This follows a statement by the World Economic Forum (WEF) calling upon policymakers to formulate protective measures to shield children from AI-related harm.
Paedophilic AI-generated images and images that appear to sexualize children are popping up across the internet, including in social media adverts.
Wilson also pointed to several risks that AI technologies bring, including creating “deepfake” scenarios involving minors or altering a real child’s image to depict abuse.
“Your child was never assaulted, your child was never exploited, but their likeness is being used as if they were,” he noted.
This has already happened to adults – the FBI recently issued a warning that malicious actors are using AI-generated explicit content in sextortion schemes.
Another concern he raised was the digital creation of fictitious children for exploitative purposes.
“The argument would be, ‘well I’m not harming anyone — in fact, it’s not even a real person,’ but you’re creating demand for the industry that exploits children,” said Wilson.
While Congress has yet to enact comprehensive AI legislation, some moves have been made within the tech industry to tackle the issue.
Platforms like Meta, OnlyFans, and Pornhub have started using an online tool called “Take It Down,” which lets minors flag explicit images and videos of themselves, including AI-generated content, and request their removal.
This is a global problem, and much work remains to ensure children are protected from the many risks AI poses.