Japan scrutinizes Musk's Grok AI over sexualized images

Japan's Cabinet Office has asked X to strengthen safeguards against Grok AI producing sexualized images without consent. Economic Security Minister Kimi Onoda disclosed the inquiry, citing concerns about deepfakes and privacy violations.

Japan has joined other countries in scrutinizing X over Elon Musk's AI service Grok, focusing on the chatbot's role in generating and disseminating sexualized images of individuals without their permission.

"The Cabinet Office has asked the social media platform to improve safeguards and curb the output of sexually altered images by Grok," Economic Security Minister Kimi Onoda said. Officials also sent written inquiries about X's measures to block deepfakes and images that infringe on privacy, intellectual property, and the right to one's likeness, she added.

The move aligns with global concerns about AI chatbots, including their potential to generate child sexual abuse material. Grok operates on X, and its image-generation capabilities have drawn criticism. The government's response aims to promote ethical AI use.

The minister warned that such images could violate personal rights and urged prompt countermeasures. The inquiry was made public on January 16, 2026.

Related news

Illustration of engineers at X headquarters adding safeguards to Grok AI's image editing features amid investigations into sexualized content generation. (AI-generated image)

X adds safeguards to Grok image editing amid escalating probes into sexualized content

AI-generated report · AI-generated image

In response to the ongoing Grok AI controversy, sparked by a December 28, 2025, incident in which the chatbot generated sexualized images of minors, X has restricted Grok's image editing features to prevent nonconsensual alterations that place real people in revealing attire such as bikinis. The changes follow new investigations by California authorities, blocks in several countries, and criticism over the thousands of harmful images produced.

As Grok AI faces government probes over sexualized images, including digitally altered nudity of women, men, and minors, fake bikini photos of strangers created by the X chatbot are flooding the internet. Elon Musk has dismissed critics, while EU regulators weigh intervention under the AI Act.

AI-generated report

Following reports that Grok AI has generated sexualized images, including ones that digitally strip clothing from women, men, and minors, several governments are taking action against the xAI chatbot on the platform X amid ongoing ethical and safety concerns.

In the latest controversy over xAI's Grok generating sexualized images on X, Swedish Energy Minister and Deputy PM Ebba Busch has publicly criticized an AI-altered bikini image of herself, calling for consent and restraint in AI use.

AI-generated report

Elon Musk's xAI has loosened safeguards on its Grok AI, enabling the creation of non-consensual sexual images, including of children, prompting regulatory scrutiny. Despite Google's explicit policies prohibiting such content in apps, the Grok app remains available on the Play Store with a Teen rating. This discrepancy highlights enforcement gaps in app store oversight.

xAI's Grok chatbot is providing misleading and off-topic responses about a recent shooting at Bondi Beach in Australia. The shooting occurred during a Hanukkah festival, and a bystander heroically intervened. Grok has confused the details with unrelated events, raising concerns about AI reliability.

AI-generated report

Some users of AI chatbots from Google and OpenAI are generating deepfake images that alter photos of fully clothed women to show them in bikinis. These modifications often occur without the women's consent, and instructions for the process are shared among users. The activity highlights risks in generative AI tools.
