Pentagon Might Label Anthropic “Supply Chain Risk”
The dispute arises from conflicting policies over Claude’s use. Pentagon officials want the AI available for all lawful military purposes, while Anthropic’s rules prohibit deploying its technology to “facilitate violence, develop weapons, or conduct surveillance.” An unnamed Pentagon official described the potential separation as complicated, warning that Anthropic would “pay a price for forcing our hand.”
Claude is currently the only AI model deployed on the military’s classified systems via a partnership with Palantir Technologies. A “supply chain risk” designation would require Pentagon contractors to prove they do not use Anthropic technology or face potential loss of contracts.
A Pentagon spokesperson confirmed the relationship with Anthropic is under review, stating that all partners must be prepared to fully support military operations. Meanwhile, Anthropic has described discussions with the department as “productive,” despite unresolved disagreements.
The tension follows reports that Claude was used during the early-January operation to abduct Nicolás Maduro, with the AI allegedly involved in both planning and execution, though its exact role remains unclear. The reports have intensified scrutiny of how military applications of AI square with Anthropic’s ethical safeguards.
Legal Disclaimer:
MENAFN provides the information “as is” without warranty of any kind. We do not accept any responsibility or liability for the accuracy, content, images, videos, licenses, completeness, legality, or reliability of the information contained in this article. If you have any complaints or copyright issues related to this article, kindly contact the provider above.