An app called Deepnude, which used artificial intelligence to digitally remove the clothes from pictures of women and create fake nudes, has finally been withdrawn by its creators. The app had been on sale for $50. The move came in the face of public criticism prompted by a report in the online technology magazine Motherboard.
The technology behind the app was sophisticated, but that does not excuse the intent behind it. The application used neural networks to strip clothing from images of women, with the objective of making the subjects look realistically nude.
One of those protesting against the app was Katelyn Bowden, founder of the anti-revenge-porn campaign group Badass. Speaking to the BBC, she called the app “terrifying” and said: “Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public.”
In response to the adverse publicity, the developers agreed to withdraw the app, at least for now. “The probability that people will misuse it is too high,” the programmers wrote in a message on their Twitter feed. “We don’t want to make money this way.”
There is a wider issue at play: the rise of deepfakes and the dangerous role they can play as a disinformation tool. “Deepfake” (a portmanteau of “deep learning” and “fake”) images and videos set out to manipulate. The underlying technologies are becoming more sophisticated due to advances in artificial intelligence, creating the potential for new kinds of misinformation with serious consequences, such as damage to an individual’s reputation, the objectification of women, or interference in the political process.
An example of this sophistication was reported by Digital Journal (“Samsung brings Mona Lisa ‘to life’ with deepfake AI”), where Leonardo da Vinci’s painting Mona Lisa was animated through a new artificial intelligence initiative undertaken by the technology firm Samsung. While this in itself was a relatively amusing exercise, it demonstrated the sophistication of generative adversarial networks and their huge potential for misuse.
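To give a sense of the adversarial idea behind these systems, the sketch below is a deliberately minimal, illustrative generative adversarial network written for this article: a one-parameter-pair “generator” learns to imitate a simple number distribution while a logistic “discriminator” learns to tell real samples from fakes. It is a toy assumption-laden demonstration of the training dynamic only, and bears no resemblance to the large image models used in real deepfake software.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real" data the generator must imitate: samples from a normal distribution.
def real_samples(n):
    return rng.normal(4.0, 1.25, size=n)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator: an affine map g(z) = w*z + b applied to noise z ~ N(0, 1).
gen_w, gen_b = 1.0, 0.0
# Discriminator: logistic regression d(x) = sigmoid(u*x + c),
# estimating the probability that a sample x is real rather than generated.
disc_u, disc_c = 0.1, 0.0

lr = 0.01
for step in range(2000):
    z = rng.normal(size=64)
    fake = gen_w * z + gen_b
    real = real_samples(64)

    # Discriminator update: push d(real) toward 1 and d(fake) toward 0
    # (gradient ascent on log d(real) + log(1 - d(fake))).
    d_real = sigmoid(disc_u * real + disc_c)
    d_fake = sigmoid(disc_u * fake + disc_c)
    disc_u += lr * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    disc_c += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator update: push d(fake) toward 1, i.e. fool the discriminator
    # (gradient ascent on log d(fake) with respect to w and b).
    fake = gen_w * z + gen_b
    d_fake = sigmoid(disc_u * fake + disc_c)
    gen_w += lr * np.mean((1 - d_fake) * disc_u * z)
    gen_b += lr * np.mean((1 - d_fake) * disc_u)

# After training, the generator's samples have drifted toward the real data.
samples = gen_w * rng.normal(size=10000) + gen_b
```

The two networks improve in tandem: as the discriminator gets better at spotting fakes, the generator is forced to produce more convincing ones. Scaled up from two parameters to millions, and from numbers to pixels, this is the arms race that makes deepfake imagery increasingly hard to detect.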