Anthropic has until Friday evening to provide the United States military with unrestricted access to its artificial intelligence models or face severe regulatory and legal repercussions. According to recent reporting, the dispute has escalated rapidly following direct communications between the company and high-ranking defense officials.
During a meeting held on Tuesday, Defense Secretary Pete Hegseth reportedly issued an ultimatum to Anthropic CEO Dario Amodei. The Pentagon has indicated it is prepared to either designate the AI firm as a "supply chain risk"—a classification historically used to flag foreign adversaries—or invoke the Defense Production Act (DPA). The latter option would legally compel the company to modify its models to suit specific military requirements, effectively bypassing the firm's internal safety protocols.
Expanding Executive Authority
The Defense Production Act grants the president broad authority to require private corporations to prioritize and expand production for national defense purposes. The statute gained prominence during the COVID-19 pandemic, when it was invoked to compel manufacturers such as General Motors and 3M to produce essential medical supplies, including ventilators and protective masks.
However, applying the DPA to override software usage policies represents a novel interpretation of the law. Anthropic has maintained a consistent stance regarding the ethical application of its technology, explicitly stating that its models must not be used for the mass surveillance of American citizens or the operation of fully autonomous weapons systems. The company has refused to waive these restrictions for the Department of Defense.
In contrast, Pentagon officials argue that military applications of technology should be subject to United States law and constitutional boundaries rather than the terms of service drafted by private contractors.
Dean Ball, a senior fellow at the Foundation for American Innovation and a former senior policy advisor on AI, suggests that using the DPA to dismantle AI guardrails would signal a significant shift in how the law is applied. He argues that this move reflects a growing instability within the executive branch.
"It would basically be the government saying, 'If you disagree with us politically, we're going to try to put you out of business,'" Ball stated regarding the potential designation.
Ideological Friction and Market Stability
The standoff is occurring amidst a broader ideological conflict. Administration figures, including AI official David Sacks, have publicly criticized Anthropic's safety protocols, characterizing them as politically motivated or "woke."
Ball warned that such aggressive posturing could damage the United States' reputation as a reliable environment for commerce. He noted that responsible investors and corporate managers might view these actions as an attack on the stability and predictability of the American legal system, which has traditionally been a cornerstone of the nation's economic strength.
A Single Point of Failure
Despite the pressure, reports indicate that Anthropic does not intend to relax its usage restrictions. The company appears to hold significant leverage in the negotiation, as it is currently the only frontier AI laboratory with access to classified Department of Defense systems.
While the Pentagon has reportedly secured a deal to utilize xAI's Grok for classified systems, that integration is not yet a viable backup. This lack of immediate redundancy may explain the Defense Department's aggressive tactics.
If Anthropic were to terminate its contract, the Department of Defense would face an immediate capability gap. Ball highlighted that the agency appears to be operating in violation of a National Security Memorandum issued during the previous administration, which directed federal agencies to avoid reliance on a single vendor for classified frontier AI systems.
"The DOD has no backups. This is a single-vendor situation here," Ball explained, noting that the military cannot rectify such a structural dependency overnight.