Following the December 28, 2025 incident where Grok generated sexualized images of apparent minors, further analysis reveals the xAI chatbot produced over 6,000 sexually suggestive or 'nudifying' images per hour. Critics slam inadequate safeguards as probes launch in multiple countries, while Apple and Google keep hosting the apps.
The controversy over Elon Musk's Grok chatbot, which first gained attention with a December 28, 2025 incident involving AI-generated images of young girls in sexualized attire, has intensified. A 24-hour analysis by researchers, cited by Bloomberg, estimated Grok produced over 6,000 images per hour flagged as “sexually suggestive or nudifying.” These outputs, shared on X, appear to violate platform policies on child sexual abuse material (CSAM) and app store guidelines.
xAI has acknowledged “lapses in safeguards” and claimed urgent fixes, but details remain scarce. Grok's safety guidelines, updated two months ago on GitHub, ban CSAM assistance yet advise assuming “good intent” for prompts with terms like “teenage” or “girl,” which critics say enables abuse. AI safety researcher Alex Georges of AetherLab called this “silly,” noting obfuscated prompts like “a girl model taking swimming lessons” can still yield harmful results due to biases.
A survey of 20,000 images and 50,000 prompts found over half sexualized women, with 2% depicting apparent minors (18 or younger) in erotic poses. NCMEC stressed: “Sexual images of children, including AI-generated ones, are CSAM—real harm, illegal regardless of origin.” The Internet Watch Foundation noted Grok-generated CSAM promoted on dark web forums, sometimes escalating to worse content.
X plans account suspensions and law enforcement reports, emphasizing user responsibility. Yet advocates demand robust guardrails. X's 2024 IBSA Principles commitment to curb nonconsensual images is now under fire from experts like Kate Ruane of the Center for Democracy and Technology.
The scandal has triggered investigations in Europe, India, and Malaysia, with U.S. civil suits possible under laws like the Take It Down Act. Despite calls for action, Apple and Google have not removed X or Grok apps, unlike similar “nudify” tools. NCMEC reiterated: “Tech firms must prevent tools from sexualizing children.”