Trump Moves to Ban Anthropic From the US Government
#Trump AI ban #Anthropic Pentagon #Military AI restrictions #Claude Gov #Silicon Valley defense #Autonomous weapons #AI ethics military
📌 Key Takeaways
- Trump ordered immediate halt to federal use of Anthropic's AI tools
- The conflict stems from Anthropic resisting Pentagon pressure on military AI restrictions
- Anthropic raised concerns about AI being used for lethal weapons and surveillance
- Anthropic was the first major AI lab to work with the US military through a $200 million deal
📖 Full Retelling
US President Donald Trump announced on Friday, February 27, 2026, that he was instructing every federal agency to immediately halt the use of Anthropic's AI tools, following weeks of conflict between the AI company and the Department of Defense over restrictions on military applications of artificial intelligence. In a post on Truth Social, Trump referred to Anthropic officials as "Leftwing nut jobs" who made a "DISASTROUS MISTAKE trying to STRONG-ARM the Department of War."

The Pentagon has been seeking to modify a deal signed with Anthropic and other AI companies last July, removing the restrictions it originally placed on how the technology could be deployed and instead permitting "all lawful use" of AI. Anthropic objected to this change, expressing concern that it could enable the development of lethal autonomous weapons or mass surveillance of US citizens.

The company was the first major AI laboratory to partner with the US military, through a $200 million agreement under which it created specialized Claude Gov models with fewer restrictions than its standard offerings. These models are currently used primarily for routine tasks such as report writing and document summarization, but they also support intelligence analysis and military planning through platforms provided by Palantir and Amazon's cloud services for classified military work.
🏷️ Themes
Government regulation of AI, Military technology ethics, Civilian-military tech partnerships
Original Source
Will Knight Business Feb 27, 2026 4:36 PM

Trump Moves to Ban Anthropic From the US Government

President Donald Trump's sudden order comes after the Defense Department pressured Anthropic to drop restrictions on how its AI can be used by the military. Photograph: Prakash Singh/Getty Images

US President Donald Trump announced Friday that he was instructing every federal agency to halt the use of Anthropic's AI tools effective immediately. The move comes after Anthropic and top Pentagon officials clashed for weeks over military applications of artificial intelligence.

"The Leftwing nut jobs at Anthropic have made a DISASTROUS MISTAKE trying to STRONG-ARM the Department of War," Trump said in a post on Truth Social. The Pentagon and Anthropic did not immediately respond to requests for comment.

The Department of Defense has sought to change the terms of a deal struck with Anthropic and other AI companies last July to eliminate restrictions on how AI can be deployed and instead permit "all lawful use" of the technology. Anthropic objected to the change, claiming that it could allow AI to be used to fully control lethal autonomous weapons or to conduct mass surveillance on US citizens. The Pentagon does not currently use AI in these ways, and has said it has no plans to do so. However, top Trump administration officials have voiced opposition to the idea of a civilian tech company dictating military use of such an important technology.

Anthropic was the first major AI lab to work with the US military, through a $200 million deal signed with the Pentagon last year. It created several custom models, known as Claude Gov, that have fewer restrictions than its regular ones. Google, OpenAI, and xAI signed similar deals around the same time, but Anthropic is the only AI company currently working with classified systems. Anthropic's model is available through platforms provided by Palantir and Amazon's cloud platform for classified military work.