The Washington Post
Democracy Dies in Darkness

This app had the ability to digitally ‘undress’ women. After backlash, its creator has pulled the plug.

Artificial intelligence creating ‘deepfake’ images has been harnessed to make a torrent of fake porn

July 1, 2019 at 12:00 p.m. EDT
(Sarah Hashemi/The Washington Post)

Adapted from a story by The Washington Post’s Taylor Telford.

While artificial intelligence to create “deepfake” images has posed a threat to national security, it has also been harnessed to make a torrent of fake porn, including widely circulated videos of celebrities such as Gal Gadot and Scarlett Johansson. Although sites including Reddit, Twitter and Pornhub have tried to ban pornographic deepfakes, they have had limited success. The technology is cheap and easily accessible, and the opportunities for use are limitless.

Now, an app developer who created an algorithm that can digitally undress women in photos has pulled the plug on the software after high traffic and a viral backlash convinced him that the world is not ready for it.

DeepNude used AI to present realistic approximations of what a woman — it was not designed to work on men — might look like without her clothes. Deepfake photos and videos often appear credible to the average viewer, prompting concerns by researchers and lawmakers about their potential to mislead the public, especially in the run-up to the 2020 election.

AI-generated videos that show a person’s face on another’s body are called “deepfakes.” They’re becoming easier to make and are being weaponized against women. (Video: Drew Harwell, Jhaan Elker/The Washington Post)

Last month, a doctored clip of House Speaker Nancy Pelosi (D-Calif.) that had been altered to make her slur her words went viral, drawing attention to how even poorly made videos can be used to spread political disinformation at alarming speeds.

“Deep-fake technologies will enable the creation of highly realistic and difficult to debunk fake audio and video content,” Danielle Citron, a law professor at the University of Maryland, testified before a House committee on the dangers of deepfakes in June. “Soon, it will be easy to depict someone doing or saying something that person never did or said. Soon, it will be hard to debunk digital impersonations in time to prevent significant damage.”


The DeepNude app

The free version of DeepNude placed a large watermark on images it generated. The $50 version, however, placed only a small stamp reading “FAKE” in the upper-left corner of the pictures. As the online magazine Motherboard noted, it could easily be cropped out.

When Motherboard first reported on the app on Thursday, its creator, a programmer who goes by “Alberto,” insisted he was “not a voyeur,” merely a technology enthusiast who was driven to create the app out of “fun and enthusiasm.”

“Also due to previous failures (other start-ups) and economic problems, I asked myself if I could have an economic return from this algorithm,” the programmer told Motherboard. “That’s why I created DeepNude.”


The app, which was available for Windows and Linux, was based on an open-source algorithm from the University of California at Berkeley, Alberto told Motherboard. DeepNude was trained to create convincing nudes using 10,000 images of naked women.

DeepNude’s creator said he mulled the ethics of his software but ultimately decided the same results could be accomplished through any number of photo-editing programs.

“If someone has bad intentions, having DeepNude doesn’t change much. ... If I don’t do it, someone else will do it in a year,” Alberto said.

Soon after Motherboard’s report, traffic caused the server to crash. Late Thursday, after further coverage and outrage on social media, Alberto took to Twitter to announce DeepNude’s end, saying the chances of people abusing the app were too high.

“We don’t want to make money this way,” the tweet read. “Surely some copies of DeepNude will be shared on the web, but we don’t want to be the ones who sell it.”

Pornographic deepfake images don’t technically count as revenge porn because they aren’t actual images of real women’s bodies, but they are still capable of causing psychological damage. California is considering a bill that would make pornographic deepfakes illegal — to date, it is the only state to pursue legislative action against them.