Pentagon Designates Anthropic as Supply-Chain Risk in Unprecedented Move
The Defense Department has declared Anthropic PBC a supply-chain risk, marking the first time such a designation has been applied to an American company. This rare penalty, previously reserved for firms from adversary nations like China's Huawei Technologies, threatens to disrupt Anthropic's $200 million contract to provide classified AI tools to the Pentagon and could bar the company from future defense partnerships.
Legal Challenge and Innovation Concerns
Anthropic CEO Dario Amodei has vowed to fight the designation in court, stating, "We do not believe this action is legally sound." The decision has raised alarms among national security and contracting specialists who warn it could set a dangerous precedent for American technology companies operating at the forefront of innovation.
"Using this tool against a domestic AI firm sends a troubling signal that could chill innovation and weaken the very technology ecosystem the United States needs to stay competitive," said Morgan Plummer, Vice President of Policy at Americans for Responsible Innovation. "These authorities were designed to keep foreign adversaries out of our supply chains, not to punish American companies for building safeguards into their technology."
Contract Implications and Broader Impact
The supply-chain risk label threatens to unravel Anthropic's substantial Pentagon contract and cast a shadow over the company, whose AI tools have gained significant traction in corporate environments. While the $200 million defense contract represents a fraction of Anthropic's projected $20 billion revenue for 2026, the designation could have far-reaching consequences for the company's government business and reputation.
The decision follows weeks of tense negotiations between Anthropic and the Pentagon regarding access to the company's technology. Talks collapsed last week after Anthropic demanded assurances that its AI would not be used for mass surveillance of American citizens or autonomous weapons deployment. This stance prompted President Donald Trump to order U.S. agencies to cease work with Anthropic and led Defense Secretary Pete Hegseth to threaten the rarely invoked supply-chain exclusion.
Legal Basis and Congressional Notification
The Pentagon is implementing its finding under Section 3252 of Title 10 of the U.S. Code, which governs the armed forces and permits the Defense Department to exclude a company as a contractor if it is found to jeopardize the supply chain. The provision defines risk as the potential that "an adversary may sabotage, maliciously introduce unwanted function, or otherwise subvert" the technology or service being provided.
Defense Secretary Hegseth has formally notified Congress of his decision through letters to the top Republicans and Democrats on the House and Senate committees for armed services, appropriations, and intelligence. In his correspondence, Hegseth stated, "This determination is based in part on a risk analysis by the DoW and input from senior DoW personnel that the Covered Entity's restrictions on the use of its products and services introduces national security risks to the DoW's supply chain." The "Covered Entity" refers to Anthropic; DoW stands for Department of War, the name Hegseth now prefers for the Department of Defense.
Procedural Requirements and Broader Implications
The legal provision requires the defense secretary to demonstrate the supply-chain risk and show that less intrusive measures were unavailable. This unprecedented application of supply-chain risk designation to a domestic company specializing in artificial intelligence—a technology the government has declared a national priority—threatens to slow broader Pentagon efforts to accelerate AI adoption across the U.S. military.
The dispute marks a direct collision between national security authority and technological innovation, with ramifications for how American companies develop and deploy advanced technologies for government use. As Anthropic prepares its legal challenge, the outcome could set lasting precedents for the relationship between private-sector AI developers and national defense agencies.
