San Francisco sues 16 popular AI "undressing" sites over explicit deepfakes
The San Francisco City Attorney's office has initiated legal action against 16 popular AI-powered "undressing" websites. These platforms are allegedly used to generate non-consensual nude deepfakes of women and girls. The lawsuit, announced by City Attorney David Chiu, reveals that these sites collectively received over 200 million visits in the first half of 2024 alone.
Websites accused of violating multiple laws
The websites in question enable users to upload images of fully dressed individuals, which are then digitally manipulated using AI tools to simulate nudity. The lawsuit accuses these sites of contravening state and federal laws that prohibit revenge pornography, deepfake pornography, and child pornography. They are also alleged to have violated California's unfair competition law due to the significant harm they inflict on consumers.
Lawsuit seeks penalties and permanent shutdown
The lawsuit seeks civil penalties and also aims to permanently shut down these websites and bar their operators from creating deepfake pornography in the future. This legal action comes amid increased scrutiny of the creation and distribution of non-consensual nude images, driven largely by advances in generative AI. These developments have contributed to a rise in "sextortion" cases, in which individuals are blackmailed with explicit content created without their consent.
Chiu expresses horror over exploitation of women and girls
Chiu expressed shock at the scale of exploitation facilitated by these websites. He stated, "This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation." He emphasized that this is a significant issue society must address urgently.