Elon Musk’s company xAI should be held accountable for allowing its AI models to generate abusive sexual images of recognizable minors, three anonymous plaintiffs claim in a lawsuit filed Monday in California federal court.
The three plaintiffs are seeking class action status on behalf of people who had actual images of themselves as minors altered by Grok to contain sexual content. The lawsuit claims xAI does not take basic precautions, adopted by other frontier AI labs, to prevent its image models from producing pornography depicting real people or minors.
The lawsuit by Jane Doe 1, Jane Doe 2, a minor, and Jane Doe 3, a minor, against x.AI Corp. and x.AI LLC was filed in the United States District Court for the Northern District of California.
Other deep learning image generators employ various techniques to prevent child pornography from being created from regular photos. The lawsuit alleges that these standards have not been adopted by xAI.
Without such safeguards, it is virtually impossible to prevent the generation of sexual content featuring children, especially when models are allowed to produce nudity or erotic content from real images. Musk's public promotion of Grok's ability to create sexual images and depict real people in skimpy costumes features heavily in the lawsuit.
The company did not respond to TechCrunch’s request for comment.
One of the plaintiffs, Jane Doe 1, had her high school homecoming and yearbook photos altered by Grok to depict her without clothes. An anonymous tipster who contacted her on Instagram told her the photos were circulating online and sent her a link to a Discord server that posted sexual images of her and other minors she recognized from her school.
A second plaintiff, Jane Doe 2, was informed by criminal investigators of altered sexual images of her created by a third-party mobile app that relied on the Grok model. A third, Jane Doe 3, was also tipped off by criminal investigators, who discovered altered pornographic images of her on an arrested suspect's phone. Plaintiffs' lawyers argue that the company should be held responsible because third-party apps still depend on xAI's code and servers.
All three plaintiffs, two of whom are still minors, say they are extremely distressed by the distribution of these images and the effect the images have had on their reputations and social lives. They are seeking civil penalties under a series of laws aimed at protecting exploited children and preventing corporate negligence.
