
The Grok AI tool on Elon Musk’s X will no longer be able to digitally undress images of real people, the company has announced.
“We have implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing such as bikinis,” said a statement.
“This restriction applies to all users, including paid subscribers.”
It comes amid mounting condemnation in the UK and US of the chatbot’s image editing capabilities, with British government ministers threatening the platform with action.
Sir Keir Starmer has described non-consensual sexualised images produced by Grok as “disgusting” and “shameful”, and the media regulator Ofcom has launched an investigation.
The statement from X came hours after California announced its own state-level probe into the spread of sexualised images created by Grok, including of children.
