'Nudify' Alert: AI Powers Apps That 'Undress' Any Photo

What happens when a technology becomes a weapon of harassment and digital violence? What if the pillars of the internet—Amazon, Google, or Cloudflare—not only fail to prevent it, but indirectly facilitate it?
In the last two years, websites and apps that let people create fake nude images from real photos, without the consent of the people pictured, have proliferated. The promise of apps that act as "x-rays" is not new, but it has seen a resurgence thanks to AI.
New research has shed light on the scope, business model, and actors that make this hidden industry possible. What's most alarming is not only the existence of these platforms, but also their business sophistication and their dependence on the technological infrastructure we use every day.
The shady business of nudifiers: from fringe games to a multi-million-dollar industry

The "nudify" industry didn't emerge out of nowhere. Since the first explicit deepfakes began circulating in 2017, the technology has become more refined and democratized. Today, a single photo uploaded to one of these sites is enough to create a fake sexual image in just a few clicks. What began as a disturbing technical curiosity has transformed into an industry of non-consensual sexual content that generates tens of millions of dollars annually.
According to Indicator's analysis, 18 of these sites earned between $2.6 million and $18.4 million in the last six months alone. They do this by selling subscriptions or credits, as with any other digital entertainment platform. Many mimic the business model of OnlyFans or Twitch, with affiliate programs, promotional videos, and even deals with porn actresses.
But their social impact is anything but trivial. Most victims are women, often teenagers, who never gave permission for their photos to be manipulated. Once created, the images circulate on social media, forums, Telegram channels, and other platforms that are difficult to track. The emotional and reputational damage is incalculable.
The key role of Google, Amazon, and Cloudflare in keeping the ecosystem alive

One of the most surprising aspects of the research is not only the volume of visits or money generated, but also the technological infrastructure that supports the system. Of the 85 sites analyzed:
- 62 use Amazon Web Services or Cloudflare for content hosting and distribution.
- 54 use the Google login system.
- Several have integrated payment methods that rely on legitimate commercial gateways.
The problem isn't just that these large platforms are being used, but that they're being used repeatedly and massively, without effective oversight or systematic responses to dismantle these spaces. According to Alexios Mantzarlis, co-founder of Indicator, "Silicon Valley has taken a laissez-faire approach to generative AI," which has given these toxic businesses a lifeline.
Google and Amazon have claimed to take action when they detect violations of their policies, but in practice their monitoring systems are deficient. In many cases, the creators of these websites use intermediary sites to camouflage their true intentions, thus evading automated controls.
Invisible victims: when the image becomes a weapon

The rise of these platforms has brought with it a new form of digital sexual harassment and violence. Victims rarely know their images have been manipulated. By the time they find out, the damage has already been done. And removing such content from the digital ecosystem is nearly impossible.
Among the most alarming cases are those involving adolescents. In several countries, boys have used photos of their classmates to create deepfakes, which are then circulated in WhatsApp groups or on social media. This form of cyberbullying leaves no physical trace, but it does leave deep wounds.
The problem is compounded by the absence of clear laws or their slow implementation. Although several countries are beginning to criminalize the creation and dissemination of fake sexual images, the legal framework has not yet caught up with the technology. The platforms, for their part, claim that they are only infrastructure providers, not content providers.
Between pornography and digital fraud: a growing gray area

The nudifier economy is becoming more sophisticated. Some sites even offer "premium services" promising higher-quality output. Others advertise on adult video platforms or use affiliate marketing techniques. They are beginning to blend into the legal porn industry, seeking legitimacy, or at least anonymity, within the chaos of online adult content.
This phenomenon occupies a digital gray area, where sexual abuse merges with e-commerce business models. What's more, these operations run through seemingly legitimate channels: payment gateways, loyalty systems, SEO traffic, promoted videos.
For experts, the key is cutting off access to the resources that make these platforms viable. If big tech companies block cloud services, registration systems, or hosting, many of these sites would collapse. The goal isn't to eradicate them—something impossible—but to make their lives so difficult that they lose visibility, users, and revenue.
Although the problem has been brewing for years, some recent actions indicate a shift in focus. Meta has sued a company responsible for a nudifier that advertised on its platform. Microsoft has tracked down developers of celebrity deepfakes. And the US administration has passed the controversial "Take It Down Act," which requires tech companies to act quickly on reports of non-consensual sexual content.
In the United Kingdom, the creation of sexual deepfakes has already been criminalized. And some cities, such as San Francisco, have filed lawsuits against companies that market these tools. However, measures remain partial and reactive. A common, comprehensive, and decisive strategy is lacking.
The risk, according to analysts, is that these platforms will seek refuge in even less regulated digital spaces, such as the dark web or closed social networks. But even in those environments, their impact will be smaller if they lose the logistical and commercial support they currently receive from large technology companies.
Source: eleconomista