*Sigh* Deepfake technology continues to be the absolute dirt worst. But then, it probably shouldn’t come as a surprise that an idea birthed by a bunch of Reddit nerds for the purpose of face-swapping female celebrities onto pornstars could only get more disturbing, right?
As first discovered by Motherboard, there’s an app out there that uses deepfake AI to turn images of clothed women (it only works on women, because obviously) into realistic-looking nudes. It’s called DeepNude, and it cannot be overstated how much everyone involved in its creation needs to be chemically castrated.
DeepNude is currently available as a free download for Windows, with a $99 premium version that offers high-resolution images for ew ew ewewewewwww.
“This is absolutely terrifying,” said Katelyn Bowden, founder and CEO of the revenge porn activism organization Badass. “Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public.”
It should come as no surprise that the developer behind this app is a goddamn creep.
The dev behind DeepNude identifies himself simply as “Alberto” and says things like “I’m not a voyeur, I’m a technology enthusiast.” When asked why in the world he would ever create something so vile, he gave the old if-not-me-then-who excuse.
“I also said to myself: the technology is ready (within everyone’s reach),” he said. “So if someone has bad intentions, having DeepNude doesn’t change much… If I don’t do it, someone else will do it in a year.”
So yeah, deepfakes are officially out of hand, and apps like DeepNude are only accelerating the ease with which they can cause legit harm to one of the most targeted groups on the Internet. Fantastic.
UPDATE: Annnnnndddd it’s gone.
Following the Motherboard story, the creator of DeepNude has decided to take the app down. Citing a server overload and, oh yeah, the potential harm it may cause, Alberto released a statement via DeepNude’s Twitter account.
We created this project for users’ entertainment months ago. We thought we were selling a few sales every month in a controlled manner… We never thought it would become viral and we would not be able to control traffic. We don’t want to make money this way. Surely some copies of DeepNude will be shared on the web, but we don’t want to be the ones to sell it.
Did reading this make you feel dirty? Why would anyone need this? Have any thoughts? Let us know down below in the comments or carry the discussion over to our Twitter or Facebook.