12th January, updated 15th January 2026
The Equality Party has demanded faster action on Grok, the artificial intelligence system integrated into the social media platform X, which has generated and displayed sexualised images of women and children to users.
The images depict real women and children who have been digitally undressed, some placed in sexual positions, some subjected to violence, and have been shared widely on X. The problem has been raised by the media, safeguarding organisations, and the public. Sharing this content is unlawful in the United Kingdom.
X is a UK-accessible service and is therefore within scope of the Online Safety Act 2023. The Act places duties on platform providers to prevent the dissemination of illegal content and to mitigate risks arising from their systems, including algorithmic and generative technologies. Ofcom has powers to investigate, require remedial action, impose substantial fines, and, in extreme cases, pursue service restriction measures.
Ofcom has opened an investigation and asked X for a response, promising to expedite the process and noting that legal responsibility sits with platforms to ensure they do not host illegal content.
X has removed the capability in ‘those countries where sharing such images is illegal’, but the technology remains active and accessible to users elsewhere.
In recent weeks, researchers estimated that Grok was producing more than 6,700 such images a minute.
Leader of the Equality Party, Cllr Kay Wesley, said:
“If an AI system accessible in the UK is producing sexualised images of women and children, that is not a hypothetical risk — it is a live safeguarding failure and a crime taking place in real time. The government’s and Ofcom’s response has been inadequate. The technology had been available for months, and no action was taken until the media ran the story.
“Some public figures, including Nigel Farage, have sought to frame opposition to banning Grok as a matter of ‘free speech’, and Elon Musk has similarly attacked regulatory efforts as censorship.
“Free speech does not permit sexual abuse or child abuse, and these arguments show a total disregard for the safety of women and children. Even now the technology remains available in other countries, which shows that only enforcement made X act; the company has no real intention of keeping the people who use its platform safe.
“UK law is clear on this point, and enforcement should happen immediately where image abuse or child sexual exploitation is taking place. If people distributed sexualised images of women and children in the high street, they would be arrested. Anyone helping to facilitate this would also be brought to justice. Why is this not happening online?
“Where a platform cannot demonstrate that it can operate such technology safely and lawfully, it should be required to remove it. If it refuses, prosecution or exclusion from the market must be the next step.”
ENDS
Context
The Online Safety Act 2023 is the UK’s main law regulating online platforms accessible to UK users, including social media services and systems using algorithms or generative technology.
The Act requires platforms to prevent and remove illegal content, including child sexual exploitation and abuse material, and to assess and reduce risks created by their systems. Ofcom is the independent regulator responsible for enforcement, with powers to require changes, issue fines of up to £18 million or 10% of global turnover (whichever is greater), and seek court orders to restrict services in the UK. The law applies to companies based outside the UK where their services are available to UK users.
References
6700 images per minute https://www.theguardian.com/technology/2026/jan/08/grok-x-nonconsensual-images
Ofcom investigation https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/ofcom-launches-investigation-into-x-over-grok-sexualised-imagery
