Anthropic Tells Judge Billions at Stake If US Shuns AI Tool (3)

March 11, 2026, 12:05 PM UTC

Anthropic PBC told a judge it could lose billions of dollars in revenue this year and urged quick action on its request to block the Trump administration’s designation of the company as a US supply-chain risk after a blowup with the Pentagon over artificial intelligence safety issues.

The startup made its case for urgency to US District Judge Rita F. Lin at a hearing in San Francisco a day after Anthropic sued the Defense Department over the supply-risk designation. The dispute stems from the startup’s demand for assurances that its AI wouldn’t be used for mass surveillance of Americans or deployed in autonomous weapons.

Microsoft Corp., which owns stakes in OpenAI and Anthropic, meanwhile filed its own brief urging the judge to temporarily block the government’s moves because they have the potential to delay all ongoing Defense Department “contracting for IT products and services.”

Michael Mongan, an attorney for Anthropic, argued Tuesday that the federal government’s actions have led to more than 100 enterprise customers contacting the company to express doubt about continuing their work with Anthropic.

He also said that a financial services company paused its negotiations with Anthropic regarding a $50 million contract, a pharmaceutical firm asked to shorten the duration of its contract by 10 months, and a financial technology company “explicitly tied” reducing its $10 million contract to $5 million to Anthropic’s issues with the federal government. In all, Mongan said that Anthropic’s chief financial officer has estimated harm to its 2026 revenue could range from hundreds of millions of dollars to billions of dollars.

Read More: Pentagon Official Sees Little Chance to Revive Anthropic AI Deal

A hearing on Anthropic’s request had been set for April 3. The judge moved it up to March 24.

Mongan asked for a commitment from the federal government that it would not take any retaliatory actions against Anthropic before the next hearing, such as issuing an executive order affecting the AI startup.

“I’m not prepared to offer any commitments on that issue,” said James Harlow, a lawyer for the Justice Department.

Anthropic wants the judge to remove the supply-chain risk designation and require US agencies to withdraw directives related to it. The company claims it is being shut out for disagreeing with the administration and argues the legal principles at stake affect every federal contractor whose views the government dislikes.

Last week, the Pentagon formally notified Anthropic of its determination. Chief Executive Officer Dario Amodei then issued a statement saying the government’s actions were not “legally sound” and had left the company with “no choice but to challenge it in court.”

Besides Microsoft, Anthropic has drawn support from other tech-industry players.

In a joint letter to the judge, dozens of AI scientists and researchers from OpenAI and Google — competitors and, in Google’s case, also an investor — expressed support for Anthropic. They said existing AI systems can’t “safely or reliably handle fully autonomous lethal targeting, and should not be available for domestic mass surveillance of the American people.”

In its brief, Microsoft also warned that government suppliers would face significant costs to remove Anthropic software and that the uniqueness of Anthropic’s products may leave some with no alternatives.

The case is Anthropic v. US Department of War, 26-cv-01996, US District Court, Northern District of California (San Francisco).

(Updates to highlight Microsoft’s amicus brief.)

To contact the reporter on this story:
Rachel Metz in San Francisco at rmetz17@bloomberg.net

To contact the editors responsible for this story:
Seth Fiegerman at sfiegerman@bloomberg.net

Peter Blumberg, Steve Stroth

© 2026 Bloomberg L.P. All rights reserved. Used with permission.
