
An artificial intelligence company is taking the Pentagon to court over a decision that could cost billions in revenue, and legal scholars believe the firm has solid grounds for its challenge.
Anthropic filed a federal lawsuit Monday contesting the Defense Department’s move to blacklist the company from military contracts by labeling it a supply chain security threat. The AI lab argues this action violated its constitutional rights to free speech and due process, claiming it was retaliation for the company’s stance on AI safety in military applications.
Company leadership revealed Tuesday that this blacklisting could slash Anthropic’s projected 2026 earnings by several billion dollars while damaging its reputation in the marketplace.
The Pentagon relied on an uncommon statute that permits blocking companies from specific contracts when they might expose military computer systems to hostile infiltration. According to a Reuters examination of legal records, this law has never faced court scrutiny or been applied against an American corporation.
While courts typically show deference to executive branch decisions on national security matters, five legal experts specializing in national security law told Reuters the Pentagon may have exceeded its authority.
“It’s not at all clear that the statute can even apply to an American company where there’s no foreign entanglement,” stated Alan Rozenshtein, a professor at University of Minnesota Law School.
The Defense Department declined to comment on the ongoing litigation.
Anthropic, which operates as a U.S.-incorporated company with American headquarters, maintains it doesn’t qualify as an “adversary” under Trump administration definitions that include China, Russia, Iran, North Korea, Cuba and Venezuela, according to court documents.
The company highlighted that Defense Secretary Pete Hegseth offered no rationale for how Anthropic’s Claude AI system posed supply chain dangers, despite ongoing military use of the technology. The lawsuit references Hegseth’s February 24 meeting where he described Claude as “exquisite” technology the Defense Department would “love” to partner with.
The military used Claude as recently as last month during operations targeting Iran, Reuters has reported.
Hegseth classified Anthropic as a national security supply chain threat on March 3 following the company’s refusal to remove Claude’s built-in restrictions preventing military use for autonomous weapons or domestic surveillance operations.
In a February 27 social media statement announcing the designation, Hegseth criticized Anthropic for hiding behind the “sanctimonious rhetoric of ‘effective altruism’” to “strong-arm the United States military into submission.”
Anthropic maintains that AI technology lacks sufficient reliability for autonomous weapons systems and opposes domestic surveillance on ethical grounds. Pentagon officials counter that Anthropic’s limitations could jeopardize American military personnel.
Federal law defines supply chain risks as threats where adversaries might sabotage, infiltrate or disrupt government information technology infrastructure.
The statute invoked by the Pentagon, Section 3252, permits the defense secretary to exclude companies from certain contracts to prevent “adversaries” from sabotaging, introducing malicious functions, or otherwise compromising military information systems to “surveil, deny, disrupt, or otherwise degrade” their operations.
The Pentagon also designated Anthropic under separate legislation that could expand contract exclusions across civilian government agencies. Anthropic submitted an additional legal challenge to that designation Monday.
Section 3252 permits excluding a company only as a last resort, and other defense contractors are not required to completely cease collaborating with designated firms.
Reuters couldn’t locate other companies publicly designated as supply chain risks under Section 3252, though this specialized procurement statute doesn’t mandate public disclosure of such designations.
Amos Toh, a national security law specialist at the Brennan Center for Justice, said Claude’s usage policies don’t appear to create foreign sabotage or subversion threats.
“These are basically safety protocols. You can debate whether these protocols are acceptable or not, but they run directly counter to the risk that the law is designed to regulate,” Toh explained.
Anthropic’s legal filing argues the supply chain risk designation punishes the company for its AI safety positions, in violation of First Amendment protections for free speech and expression.
Legal scholars suggested Trump and Hegseth’s public criticism of Anthropic, including Trump’s social media post calling it a “RADICAL LEFT WOKE COMPANY,” could strengthen this constitutional argument.
“A lot of things Hegseth has said and the Pentagon has done undermine their case and suggest there was personal animus and bad blood between the parties, and that the Pentagon had it out for Anthropic,” said Joel Dodge, a legal expert at Vanderbilt University.
Anthropic also contends Hegseth’s supply chain risk order violated Fifth Amendment due process protections by imposing “draconian punishments” without “meaningful process,” factual determinations, or opportunities for the company to contest the decision.
Courts generally hesitate to challenge federal agency determinations but show particular deference to executive branch national security judgments.
This deference would likely form the core of the government’s legal strategy, according to legal experts who said Justice Department attorneys could reference numerous cases where courts determined judges shouldn’t second-guess presidential and military defense decisions.
The government might argue that the president and cabinet secretaries possess broad supplier selection authority and that the military cannot depend on vendors whose usage policies restrict military operations.
The Justice Department could also invoke legal precedent establishing that contract decisions don’t constitute First Amendment violations when supported by legitimate policy or operational justifications.
Eric Crusius, an attorney and government contract specialist not involved in the case, said the government is attempting to impose the “death penalty” on Anthropic and must demonstrate “there was no alternative and that they meticulously considered other options prior to pulling the trigger.”
Anthropic’s lawsuit claims Hegseth’s decision violated the Administrative Procedure Act, legislation allowing courts to overturn actions deemed “arbitrary, capricious, an abuse of discretion, or otherwise not in accordance with law.”
Legal experts identified apparent contradictions in the government’s position as strong evidence that Hegseth’s decision was arbitrary.
“The government was simultaneously threatening to use the (Defense Production Act) to force Anthropic to sell its services, using its services in active military operations, and saying it’s too dangerous to use them in government contracts,” Rozenshtein said.
“Not all of these things can be true,” he concluded.