Key Takeaways
- The growing availability of AI tools and means of distribution has made it increasingly difficult for regulators to combat the spread of nonconsensual sexual and intimate images.
- The use of AI to harm women has only just begun, with many experts warning that the situation will only get worse.
- Some AI tools, such as Grok, have been used to generate deepfake images of women without their consent, while others, like Claude, have stricter safeguards in place.
- The route for misogynistic content to reach the mainstream has grown broader, with communities on Reddit and Telegram discussing how to bypass guardrails to make LLMs produce pornography.
- Researchers have found dozens of nudification apps and websites, which collectively received nearly 21 million visitors in May 2025.
Introduction to the Problem
The discovery of Grok, Elon Musk's AI chatbot, has fueled growing concern about the use of artificial intelligence to generate nonconsensual sexual and intimate images. As one enthusiast wrote on Reddit, "Since discovering Grok AI, regular porn doesn’t do it for me anymore, it just sounds absurd now." Another user agreed that Grok allows the creation of highly specific, personalized content: "If I want a really specific person, yes." This points to a much wider problem: the growing availability of tools and means of distribution presents regulators worldwide with what many view as an impossible task.
The Limitations of Safeguards
While some AI tools have strict safeguards in place, others impose far fewer limits. Claude, Anthropic's large language model, refuses when asked to edit a photograph of a woman so that she appears in a bikini: "I can’t do that. I’m not able to edit images to change clothing or create manipulated photos of people." Grok, by contrast, has been used to generate deepfake images of women without their consent, and users have shared tips on how to produce the most hardcore pornographic images possible from pictures of real women. As Anne Craanen, a researcher at the Institute for Strategic Dialogue (ISD), noted, "There is a very fruitful ground there for misogyny to thrive."
The Ecosystem of Misogynistic Content
Beyond large language models and major platforms, there is a whole ecosystem of websites, forums, and apps devoted to nudification and the humiliation of women. These communities are increasingly finding pipelines to the mainstream, with threads on X amplifying information about nudification apps and how to use them. Research from the ISD last summer found dozens of nudification apps and websites, which collectively received nearly 21 million visitors in May 2025. As Nina Jankowicz, a disinformation expert, said, "There are hundreds of apps hosted on mainstream app stores like Apple and Google that make this possible. Much of the infrastructure of deepfake sexual abuse is supported by companies that we all use on a daily basis."
The Impact on Women and Girls
The use of AI to harm women has only just begun, and many experts warn that the situation will get worse. Clare McGlynn, a law professor and expert in violence against women and girls, said she feared as much, citing OpenAI's announcement that it would allow "erotica" in ChatGPT. "Women and girls are far more reluctant to use AI. This should be no surprise to any of us," she said. "Women don’t see this as exciting new technology, but as simply new ways to harass and abuse us and try and push us offline." Jess Asato, Labour MP for Lowestoft, who has campaigned on the issue, said she has been subjected to explicit imagery and harassment even since the restrictions on Grok.
The Performance of Misogyny
The point of creating deepfake nudes is often not the erotic imagery itself but the spectacle of it, said Craanen: "It’s the actual back and forth of it, [trying] to shut someone down by saying, ‘Grok, put her in a bikini.’" This performance of misogyny has a cascading effect on democratic norms and on women’s role in society, underscoring the urgency of combating the spread of nonconsensual sexual and intimate images. "The performance of it is really important there, and really shows the misogynistic undertones of it, trying to punish or silence women," Craanen said.
https://www.theguardian.com/technology/2026/jan/14/use-of-ai-to-harm-women-has-only-just-begun-experts-warn

