
Nonprofits call for an immediate federal suspension of xAI’s Grok, warning that the closed-source chatbot’s explicit content failures and black-box design make it unsafe for defense and government use.
A coalition of nonprofits is urging the U.S. government to immediately suspend deployment of Elon Musk’s xAI chatbot Grok across federal agencies, including the Department of Defense, arguing that reliance on a closed, proprietary AI system poses serious safety and national security risks.
Signatories including Public Citizen, the Center for AI and Digital Policy, and the Consumer Federation of America cite reports that Grok generated thousands of nonconsensual explicit images per hour, including sexualized images of real women and, in some cases, children, which were widely shared on X. The letter states: “It is deeply concerning that the federal government would continue to deploy an AI product with system-level failures resulting in generation of nonconsensual sexual imagery and child sexual abuse material.”
The call comes despite xAI having secured a General Services Administration sales agreement and a Department of Defense contract worth up to $200 million, with Grok set to operate inside Pentagon networks handling both classified and unclassified documents.
“Our primary concern is that Grok has pretty consistently shown to be an unsafe large language model,” said JB Branch, Big Tech accountability advocate at Public Citizen. “If you know that a large language model is or has been declared unsafe by AI safety experts, why in the world would you want that handling the most sensitive data we have?”
Andrew Christianson, founder of Gobii AI and a former NSA contractor, added: “Closed weights means you can’t see inside the model, you can’t audit how it makes decisions… Open source gives you that. Proprietary cloud AI doesn’t.”
Citing Office of Management and Budget guidance requiring agencies to discontinue high-risk AI systems, the coalition is demanding an immediate pause and a formal safety investigation.