AI Firm Barred from Pentagon Over Drone Swarm Fears
3 Mar
Summary
- Anthropic proposed AI for Pentagon drone swarm challenge.
- Pentagon barred contractors from commercial ties with Anthropic.
- Concerns over autonomous weapons use sparked the dispute.

Earlier this year, Anthropic PBC submitted a proposal for a $100 million Pentagon prize challenge focused on developing voice-controlled, autonomous drone-swarming technology. The submission came amid tense negotiations with the Defense Department over the limits Anthropic sets on military uses of its technology. On Friday, Defense Secretary Pete Hegseth ordered a ban barring contractors and their partners from any commercial activity with Anthropic.
The dispute centers on the role of artificial intelligence in autonomous weapons capable of selecting and firing on targets without human intervention. Anthropic executives have consistently supported lawful uses of AI in combat, drawing the line only at mass domestic surveillance and fully autonomous weapons. The company believed its contribution to the drone swarm effort would not violate those principles, since human oversight would remain in place, allowing for monitoring and intervention.
Anthropic's proposal emphasized using its Claude AI tool to translate commander intent into digital instructions and coordinate drone fleets, rather than for autonomous targeting or weapons decisions. The company also sought to establish a joint research program with the Pentagon for the safe development and evaluation of autonomous weapons. The prize challenge, a multi-phase research and development effort, is intended to progress from software development to live testing.
Despite its submission, Anthropic was not among the companies selected. The exact reason for its exclusion remains unclear, but the Pentagon's move to sideline the company followed OpenAI's announcement of a new agreement with the Defense Department to run its AI tools on classified cloud systems, with a stipulation that humans remain responsible for any use of force.