AGs want platforms to help prevent the creation and spread of sexually exploitative AI-generated content
August 27, 2025
- Attorneys general from 48 states are urging major search and payment platforms to take stronger action against nonconsensual deepfake pornography.
- The coalition is pressing companies to disclose their current safeguards and commit to further steps to prevent the creation and spread of sexually exploitative AI-generated content.
- Officials say the technology is increasingly being used to harass, intimidate, and exploit victims, particularly women and girls.
A bipartisan coalition of 47 state attorneys general is sending letters to search platforms and payment processors, demanding stronger measures to combat the growing problem of nonconsensual deepfake pornography.
Deepfakes, AI-generated videos, images, or audio clips that appear authentic, are increasingly being used to create sexually explicit material without the consent of those depicted. Such content has been weaponized to bully, harass, and exploit victims, raising urgent legal and ethical concerns.
"Tools that allow people to generate intimate images and videos of real people without their consent can cause significant harm to the public, particularly to women and girls," California Attorney General Rob Bonta said. "As technology rapidly evolves, I am committed to engaging with industries to ensure we're all working together to guide AI toward positive potential that will benefit us, not hurt us."
The letters request information from major search engines and financial platforms about the steps they are taking to limit access to deepfake pornography and call on them to strengthen their policies. Search providers, for instance, can filter out results that promote deepfake creation tools, while payment processors can ensure they are not facilitating transactions for those selling such content.
A growing threat
"Deepfakes pose a growing threat to all of us, but especially to women and girls, and tech companies must do more to stop the spread of these harmful materials," said Vermont Attorney General Charity Clark. "Vermont law already includes non-consensual deepfakes in its revenge pornography statute. It's time for search engines and payment platforms to take responsibility for their role in spreading this harm and crack down on the proliferation of deepfakes. I am proud to have led these bipartisan letters."
In their letters, the coalition points to existing industry practices that can be deployed to address these deepfakes. For example, search engines already limit access to harmful content such as searches for "how to build a bomb" and "how to kill yourself."
The attorneys general urged these companies to adopt similar measures for searches such as "how to make deepfake pornography," "undress apps," "nudify apps," or "deepfake porn." The coalition also urged payment platforms to deny sellers the ability to use their services when they learn of connections to deepfake non-consensual intimate imagery tools and content, and to remove those sellers from their networks.
The bipartisan coalition behind the deepfake pornography crackdown spans nearly every U.S. state and territory, underscoring the widespread concern about the potential harms of artificial intelligence when left unchecked.