Elon Musk’s social media platform X has moved to curb misuse of its AI chatbot Grok. The company has confirmed that Grok will no longer create or edit sexualised images of real people.
The move comes after strong criticism from users and regulators, with women and children reportedly being targeted with AI-generated images. X says the new restrictions are meant to close this loophole and prevent further harm.
According to the platform, Grok is now blocked from editing images of real individuals to depict them in revealing clothing. The rule applies to everyone, including paid subscribers.
Why X restricted Grok’s image generation features
X says it has added technical safeguards to stop image misuse at the source. The company also limited image generation and editing tools to paid users only. This, it claims, makes it easier to track misuse and hold offenders accountable.
Earlier, X had already restricted Grok’s explicit image generation to subscribers, after multiple women complained that their photos were being altered without consent. The backlash continued, pushing the company to take stronger action.
In India alone, X removed nearly 3,500 obscene or sexually explicit images created using Grok. Around 600 accounts were also blocked for violating platform rules.
Government pressure and safety concerns around Grok
The changes followed intervention from India’s Ministry of Electronics and Information Technology. The ministry asked X to immediately remove all illegal content created using AI tools. It also directed the platform to preserve evidence and enforce stricter penalties.
Officials warned that accounts misusing Grok to target women and children should be suspended or terminated. X’s response was shared as part of an official compliance report submitted to the government.
The case highlights growing concerns around AI safety and misuse. As generative tools become more powerful, platforms are under pressure to act faster and more responsibly.
For X, tightening Grok’s image controls appears to be a step toward damage control. Whether it is enough remains to be seen.