GitHub has banned code based on DeepNude, the app that used artificial intelligence to create fake nude images of women. The Microsoft-owned software development platform has confirmed that it will no longer allow DeepNude projects.
The confirmation came from Motherboard, the website that reported on DeepNude last month. Speaking to the website, GitHub said that the app’s code violated its rules against sexually obscene content. GitHub also clarified that it has removed multiple repositories, including the one managed by DeepNude’s creator.
It is worth mentioning that DeepNude was initially a paid app that created nude pictures of women using ‘deepfakes’, an AI-based technique. The women whose images it processed were never asked for consent. After Motherboard reported on the app, the development team shut it down, saying, “the probability that people will misuse it is too high.” But then, it was reported last week that copies of the app were accessible online, and some of them were on GitHub too.
Later that week, the DeepNude team also uploaded the core algorithm to the platform, though this was not the actual app interface. On one of the pages that have now been deleted, the team wrote, “The reverse engineering of the app was already on GitHub. It no longer makes sense to hide the source code. DeepNude uses an interesting method to solve a typical AI problem, so it could be useful for researchers and developers working in other fields such as fashion, cinema, and visual effects.”
We should also mention that GitHub’s guidelines clearly state that “non-pornographic sexual content may be a part of your project, or may be presented for educational or artistic purposes.” However, the platform draws a line at pornographic or obscene content and bans it outright.
It is worth pointing out that the concept of fake nude photos was not invented by DeepNude. People have been creating fake nude images using other tools like Photoshop for many years now. Moreover, the results from DeepNude were reportedly inconsistent and worked best when the subject was already wearing something like a swimsuit. But unlike Photoshop, DeepNude required no technical or artistic skill to produce such photos, and that is what set it apart.
Much has been said about how dangerous the impact of deepfakes can be. The technology can be used not only to create fake, non-consensual porn of women but also to harass them with those fake nudes. While copies of DeepNude may not stop appearing online, GitHub’s decision will make it harder for people to find the app.