Anthropic’s artificial intelligence model, Claude, was reportedly used in the US military operation that led to the capture of former Venezuelan President Nicolas Maduro, according to a report by The Wall Street Journal citing people familiar with the matter.

The operation, which allegedly involved airstrikes on multiple sites in Caracas last month, culminated in an early January raid during which US forces apprehended Maduro and transported him to New York to face drug trafficking charges.

Role of Claude and Palantir

The report said Claude’s deployment occurred through Anthropic’s partnership with Palantir Technologies, whose software platforms are widely used by the US Department of Defense and federal law enforcement agencies.

Reuters said it could not independently verify the claims. The US Defense Department and the White House did not immediately respond to requests for comment, while Palantir declined to comment.

Anthropic stated it could not discuss operational specifics. “We cannot comment on whether Claude, or any other AI model, was used for any specific operation, classified or otherwise,” a spokesperson said. The company added that any deployment of Claude must comply with its usage policies. “We work closely with our partners to ensure compliance,” the spokesperson said.

According to the Journal, the Defense Department declined to comment on its report.

Policy questions and military AI use

Anthropic’s usage policies prohibit Claude from being used to facilitate violence, develop weapons or conduct surveillance. If the report is accurate, the model’s involvement in an operation that included airstrikes raises fresh questions about how commercial AI systems are being integrated into military contexts.

The Wall Street Journal previously reported that concerns within Anthropic about potential military applications of Claude had prompted administration officials to consider cancelling a contract worth up to $200 million.

According to sources cited by the newspaper, Anthropic became the first AI developer whose model the Pentagon used in classified operations. It remains unclear whether other AI tools played unclassified or supporting roles in the Venezuela mission.

Earlier this week, Reuters reported that the Pentagon has been encouraging leading AI developers, including OpenAI and Anthropic, to make their models available on classified networks with fewer restrictions than those applied to commercial users.

While many AI firms are building customised systems for the US military, most of those systems run only on unclassified networks used for administrative or analytical purposes. Anthropic is currently the only major AI developer whose model is accessible in classified environments through approved third-party partnerships, though government users remain bound by its usage policies.

Anthropic, founded in 2021 by former OpenAI executives, including Dario Amodei, now its CEO, positions itself as a safety-focused AI company. Amodei has repeatedly called for stronger regulation and guardrails around advanced AI systems.

At a January event announcing Pentagon collaboration with xAI, Defense Secretary Pete Hegseth reportedly said the department would not “employ AI models that won’t allow you to fight wars,” referring to discussions with Anthropic.

What is Anthropic’s Claude?

Claude is a large language model and chatbot developed by Anthropic. Designed for text generation, reasoning, coding and data analysis, it competes with systems such as OpenAI’s ChatGPT and Google’s Gemini.

Claude can summarise documents, answer complex questions, generate reports and assist with programming tasks. Anthropic markets it as a safety-oriented AI system with explicit restrictions against supporting violence, weapon development or surveillance.

The reported use of Claude in the Maduro operation underscores the growing intersection between commercial AI development and national security operations — a space where debates over ethics, oversight and regulatory safeguards continue to intensify.