DeepNude app that turned photos of clothed women into nudes shuts down

The viral app took deepfakes to another level, but its Twitter account now says, "We don't want to make money this way."

Joan E. Solsman Former Senior Reporter

Programs like DeepNude, which use artificial intelligence to manipulate media, raise questions about consent and privacy as synthetic photos and videos become easier to create.

Getty Images

The creators of DeepNude, a desktop app that used artificial intelligence to alter a photo of a clothed woman into a fake nude image of her, have shut down the app and disavowed the software, a day after a news article focused attention on the program.

"We don't want to make money this way," said a message posted on the app's Twitter account, which still carries a bio describing the program as the "superpower you always wanted." 

"Surely some copies of DeepNude will be shared on the web, but we don't want to be the ones who sell it," the post continued. "The world is not yet ready for DeepNude." 

In addition, the program's website returned a blank page with the text "not found." 

The app is the latest form of media manipulation to raise questions about privacy and consent as artificial intelligence gets better at creating fake photos and videos. Though computer manipulation of media has existed for decades, programs like DeepNude and deepfake-video technology are making sophisticated fakes easier for average people to create, and making forgeries harder to identify with the unaided eye.

Saying that the app was originally created as entertainment, DeepNude's post discouraged use of the program and said that downloading the software from other sources or sharing it would violate its terms. The post also said DeepNude won't be released in other versions, and that nobody, including people who hold a license for a premium version, has permission to use it. (It's unclear how, or whether, DeepNude can enforce those terms; the creators weren't immediately reachable for comment.)

DeepNude, which launched as a downloadable Windows and Linux application on June 23, was the subject of a Vice article Wednesday.