In recent years, a disturbing corner of the internet has quietly grown into a profitable industry. So-called "nudify" and "undress" websites, which use artificial intelligence to generate fake nude images, often without the consent of the people depicted, have become increasingly widespread. These tools have enabled the creation of nonconsensual sexual imagery, including content that may involve minors, despite growing efforts by lawmakers and tech platforms to curb such abuse.
A recent investigation by Indicator, a research initiative focused on digital deception, analyzed 85 websites offering nudification tools. The findings paint a troubling picture: these sites collectively receive an average of 18.5 million visitors per month and could be generating as much as $36 million annually.
These services often depend, whether the companies know it or not, on infrastructure from major tech providers. The study found that 62 of the websites rely on hosting or content delivery from Amazon and Cloudflare, while 54 use Google's login systems. Many of the platforms also use common third-party services such as payment processors and webmaster tools, which help keep them operational and accessible.
Alexios Mantzarlis, cofounder of Indicator and an expert in online safety, describes the current state of the nudify ecosystem as a “profitable machine fueled by lax oversight in the generative AI space.” He argues that large tech companies should have taken stronger action when it became clear these tools were being used primarily for harassment and exploitation.
Amazon Web Services said it enforces terms requiring customers to adhere to applicable laws and that it promptly reviews reports of misuse. Google said it is working to remove offending services and improve its protections, noting that its sign-in features are governed by policies prohibiting abuse and illegal content.
