Deepfakes have become all too common in today’s world. What began as a Reddit account named ‘deepfakes’, which posted fake explicit celebrity videos created with deep learning, has since grown into a toxic tool for bullying and harassment. And now, some app makers are moving beyond the original concept of deepfakes. One such AI app, called DeepNude, uses neural networks to create nude images of women from their fully clothed pictures.
The app, DeepNude, works only on pictures of women, replacing their clothes with intimate body parts; as Vice reported, it does not work on men at all. The app also struggles with pictures where the subject is heavily clothed (e.g., winter wear) and works best on images of women in swimsuits and short dresses. Likewise, if a picture has poor lighting or an awkward angle, or if the image is animated, the app performs poorly.
The creator of DeepNude used pix2pix, an open-source algorithm developed at the University of California, Berkeley, based on Generative Adversarial Networks (GANs). The algorithm was trained on a large dataset that included nude pictures of more than 10,000 women, and it continues to self-learn and improve over time. The creator added that he also wants to make the app work on male pictures, but since nude pictures of women are more easily available online, he decided to build the female version first. The app is free to download and try, but charges $50 to remove the “FAKE” watermark from the generated image. It works both online and offline.
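For context, pix2pix is an image-to-image translation method: it trains a conditional GAN in which a generator G learns to map an input image x (here, a clothed photo) to an output image, while a discriminator D learns to distinguish real input–output pairs from generated ones. As described in the original pix2pix paper, the adversarial objective is combined with an L1 reconstruction term:

```latex
\mathcal{L}_{cGAN}(G, D) = \mathbb{E}_{x,y}\left[\log D(x, y)\right]
  + \mathbb{E}_{x,z}\left[\log\left(1 - D(x, G(x, z))\right)\right]

\mathcal{L}_{L1}(G) = \mathbb{E}_{x,y,z}\left[\lVert y - G(x, z) \rVert_1\right]

G^{*} = \arg\min_{G} \max_{D} \; \mathcal{L}_{cGAN}(G, D) + \lambda\, \mathcal{L}_{L1}(G)
```

The L1 term pushes the generator toward outputs close to the training targets, while the adversarial term pushes them toward being indistinguishable from real images; this is also why the method’s output quality depends so heavily on how closely an input resembles the training data.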
“I also said to myself: the technology is ready (within everyone’s reach). So if someone has bad intentions, having DeepNude doesn’t change much… If I don’t do it, someone else will do it in a year”, the DeepNude creator told Vice. He further added: “I’m not a voyeur, I’m a technology enthusiast. Continuing to improve the algorithm”.
Now, although the creator of the app claims to be “improving the AI algorithm”, the very concept of the app screams of misogyny and raises some major concerns. Firstly, the app promotes “revenge porn”, which has long been a serious problem for companies trying to get the deepfakes situation under control. DARPA, for instance, is known to be working on AI forensic tools to detect deepfakes.
Secondly, the very fact that the app charges $50 to remove the “FAKE” watermark reeks of hypocrisy on the creator’s part: it implies he knows people will pay to make the picture look real. And thirdly, although he claims he wants to “improve the AI algorithm”, he could have chosen any number of other app ideas to test it on.
With all the existing hate and controversy surrounding the toxic nature of deepfakes, the app only seems to be making the problem worse.