Websites and apps that use artificial intelligence to digitally undress women in photographs are growing in popularity, researchers have found.
Data from Graphika, a social network analysis firm, shows that these undressing websites drew 24 million visitors in September alone.
According to Graphika, numerous “nudify” or undressing services are actively promoting themselves on popular social networks. Since the start of the year, there has been a more than 2,400% surge in social media advertisements for undressing apps, notably on platforms like X and Reddit, as reported by the researchers. These services employ AI to generate nude versions of images, predominantly targeting women.
The trend is alarming because advances in artificial intelligence have made non-consensual pornography easier to create and distribute. Known as deepfake pornography, this fabricated media is often built from images taken from social media and circulated without the subjects’ consent, control, or knowledge. Its proliferation poses significant legal and ethical challenges.
Graphika reports that the surge in the popularity of these apps coincides with the release of several advanced open-source diffusion models. These AI models can generate images of significantly higher quality than those produced a few years ago. Being open-source, these models are freely available for app developers to use.
Santiago Lakatos, an analyst at Graphika, points out the realism of these creations, noting that earlier deepfakes were often of lower quality and blurry.
An advertisement for an undressing app posted on X used language implying that users could create nude images of someone and then send them to the person depicted, potentially encouraging harassment. Meanwhile, one of these apps has invested in sponsored content on Google’s YouTube, appearing at the top of search results for the term “nudify.”
A spokesperson from Google stated that the company prohibits ads featuring sexually explicit content and is removing those that breach its policies after reviewing the ads in question. Both X and Reddit did not respond to requests for comments.
The services, some charging $9.99 per month, are experiencing a surge in traffic and claim on their websites to be attracting numerous customers. “They are conducting a lot of business,” remarked Lakatos. He further noted that one of the undressing apps claims on its website to have over a thousand users daily.
The internet has long struggled with non-consensual pornography of public figures, but privacy experts such as Eva Galperin, the Director of Cybersecurity at the Electronic Frontier Foundation, are increasingly worried. Advances in AI are making deepfake software more accessible and effective, and Galperin observes that ordinary people are increasingly using it, with incidents reported among high school and college students.
Galperin points out that many victims remain unaware of these images, but even those who discover them often face challenges in getting law enforcement to investigate or in securing funds for legal action.
While there is no federal law in the U.S. that bans the creation of deepfake pornography in general, producing such images of minors is illegal. In a landmark case, a North Carolina child psychiatrist received a 40-year prison sentence in November for using undressing apps on photos of his patients. This case marked the first prosecution under the law that bans the generation of deepfake child sexual abuse material.
TikTok has blocked the keyword “undress,” commonly used to search for these services, and warns users that the term may be associated with content or behavior that violates its guidelines. A TikTok representative declined to provide further details. Meta Platforms Inc. likewise began blocking keywords related to searches for undressing apps after being contacted; its spokesperson also declined to comment further.