
After 4 days on the market, the creator(s) of DeepNude, the AI that “undressed” women, retired the app following a glut of backlash from individuals including leaders of the AI community.

Although DeepNude’s algorithm, which constructed a deepfake nude image of a woman (not a person, a woman) based on a semi-clothed picture of her, wasn’t sophisticated enough to pass forensic analysis, its output was passable to the human eye once the company’s watermark over the constructed nude (in the free app) or the “FAKE” stamp in the image’s corner (in the $50 version of the app) was removed.

Thanks to applications such as Photoshop and the media’s coverage of deepfakes, if we don’t already question the authenticity of digitally produced images, we’re well on our way to doing so. If technology’s ability to create fake images, including nudes, well enough to fool the human eye isn’t new, why is this significant?

Because retiring the app didn’t make the algorithm go away. Quite the opposite: it’s back as an open source project on GitHub, making it more dangerous than it was as a standalone app. The downside of DeepNude becoming open source is that the algorithm can be trained on a larger dataset of nude images to increase (“improve”) the resulting nude image’s accuracy level. The upside for potential victims is that, so far, the algorithm is failing to meet expectations.

Update July 9, 7:55 p.m. EST: GitHub removed the DeepNude source code from its website.

Read more here.
