Three Tennessee teenagers — two of them minors — filed a class-action lawsuit Monday against Elon Musk's xAI, alleging the company's Grok image generation technology was used to create sexually explicit nonconsensual images of them. It is the first lawsuit against xAI brought by underage individuals depicted in AI-generated child sexual abuse material (CSAM).

What Happened

The suit, filed in California, where xAI is headquartered, alleges that an acquaintance used a third-party app powered by xAI's AI model to produce deepfake nude images and videos from photos of the plaintiffs — including one taken at a school homecoming event. Law enforcement later arrested a suspect and found alleged CSAM on his phone produced using xAI technology.

Crucially, the alleged perpetrator did not use Grok or X directly. Instead, he used an unnamed third-party app that licensed xAI's underlying model. The plaintiffs allege that xAI deliberately licensed its technology abroad to distance itself from liability.

Part of a Larger Pattern

The lawsuit adds to a growing wave of legal and regulatory scrutiny of xAI. The EU launched a formal inquiry into the company in January after researchers estimated Grok had produced roughly 3 million sexualized images in under two weeks — including approximately 23,000 depicting children. Influencer Ashley St. Clair, who has a child with Musk, separately sued the company earlier this year.

Musk has previously denied that Grok generated illegal images, claiming in January: "Literally zero." xAI did not respond to press requests for comment on the latest suit.

"xAI chose to profit off the sexual predation of real people, including children," said plaintiff attorney Vanessa Baehr-Jones.