AI-powered deepfake nude websites are targeted by San Francisco city attorney’s lawsuit

San Francisco City Atty. David Chiu, left, shown at a 2018 news conference, has sued 16 websites that use artificial intelligence to create deepfake nudes.
(Rich Pedroncelli / Associated Press)
San Francisco City Atty. David Chiu announced Thursday that his office is suing the operators of 16 AI-powered “undressing” websites that help users create and distribute deepfake nude photos of women and girls.

The lawsuit, which city officials said was the first of its kind, accuses the websites’ operators of violating state and federal laws that ban deepfake pornography, revenge pornography and child pornography, as well as California’s unfair competition law. The names of the sites were redacted in the copy of the suit made public Thursday.

Chiu’s office has yet to identify the owners of many of the websites, but officials say they hope to find their names and hold them accountable.

Chiu said the lawsuit has two goals: shutting down these websites and sounding the alarm about this form of “sexual abuse.”

On these websites, users upload photos of fully clothed real people, and artificial intelligence alters the images to simulate what those people would look like undressed. The sites create "pornographic" images without the consent of the people pictured, Chiu said during a Thursday morning news conference.

According to the lawsuit, one of the websites promotes the nonconsensual nature of the images, stating, “Imagine wasting time taking her out on dates, when you can just use [redacted website name] to get her nudes.”

The availability of open source AI models means that anyone can access and adapt AI-powered engines for their own purposes. One result: sites and apps that can generate deepfake nudes from scratch or “nudify” existing images in realistic ways, often for a fee.

Deepfake apps grabbed headlines in January when fake nude images of Taylor Swift circulated online, but many other, far less famous people were victimized before and after the pop star. “The proliferation of these images have exploited a shocking number of women and girls across the globe,” from celebrities to middle school students, Chiu said.

Through its investigation, the city attorney’s office found that the websites in question were visited more than 200 million times in just the first six months of 2024.

Once an image is online, it’s very difficult for victims to determine what websites were used to “nudify” their images because these images “don’t have any unique or identifying marks that link you back to websites,” said Yvonne R. Meré, San Francisco’s chief deputy city attorney.

It’s also very difficult for victims to remove the images from the internet.

Five Beverly Hills eighth-graders were expelled this year for creating and sharing deepfake nude images of 16 eighth-grade girls, superimposing the girls’ faces onto AI-generated bodies.

Chiu’s office said it has seen similar incidents at other schools in California, Washington and New Jersey.

“These images are used to bully, humiliate and threaten women and girls,” Chiu said. “The impact on victims has been devastating on their reputations, their mental health, loss of autonomy and, in some instances, causing individuals to become suicidal.”
