GitHub is banning code from DeepNude, the app that used AI to create fake nude pictures of women. Motherboard, which first reported on DeepNude last month, confirmed that the Microsoft-owned software development platform won't allow DeepNude projects. GitHub told Motherboard that the code violated its rules against "sexually obscene content," and it has removed multiple repositories, including one that was officially run by DeepNude's creator.

DeepNude was originally a paid app that created nonconsensual nude pictures of women using technology similar to AI "deepfakes." The development team shut it down after Motherboard's report, saying that "the probability that people will misuse it is too high." However, as we noted last week, copies of the app were still accessible online — including on GitHub.

Later that week, the DeepNude team itself uploaded the core algorithm (but not the actual app interface) to the platform. "The reverse engineering of the app was already on GitHub. It no longer makes sense to hide the source code," wrote the team on a now-deleted page. "DeepNude uses an interesting method to solve a typical AI problem, so it could be useful for researchers and developers working in other fields such as fashion, cinema, and visual effects."

GitHub's guidelines say that "non-pornographic sexual content may be a part of your project, or may be presented for educational or artistic purposes." But the platform bans "pornographic" or "obscene" content.

DeepNude didn't invent the concept of fake nude photos — they've been possible through Photoshop, among other methods, for decades. And its results were inconsistent, working best with photos where the subject was already wearing something like a bikini. But Motherboard called them "passably realistic" under those circumstances, and unlike Photoshop, they could be produced by anyone with no technical or artistic skill.

Politicians and comm