Last updated: 4 May 2026
The DeepSeek Anthropic API lets developers call DeepSeek models through an Anthropic-compatible API format. In practice, this means you can point Anthropic-style tooling, including the Anthropic SDK and Claude Code, to DeepSeek’s Anthropic-compatible base URL: https://api.deepseek.com/anthropic. It does not mean DeepSeek is Anthropic, and it does not mean Anthropic hosts DeepSeek models. It means DeepSeek provides a compatibility layer for teams already using Anthropic-style workflows.
This guide explains the DeepSeek Anthropic API endpoint, SDK setup, Claude Code configuration, supported fields, unsupported fields, model choices, pricing notes, and troubleshooting steps.
Quick answer
| Item | Answer |
|---|---|
| Endpoint | https://api.deepseek.com/anthropic |
| SDK | Anthropic Python SDK or TypeScript/JavaScript SDK |
| API key | Use your DeepSeek API key, not an Anthropic key |
| Recommended models | deepseek-v4-flash, deepseek-v4-pro |
| Best use case | Testing DeepSeek in Anthropic-style apps, Claude Code, agents, and migration workflows |
| Key limitation | Compatibility is not identical to Anthropic Claude API; images, documents, and several Claude-native content blocks are not supported in DeepSeek’s compatibility endpoint |
What Is the DeepSeek Anthropic API?
The DeepSeek Anthropic API is DeepSeek’s Anthropic-compatible API format. Instead of rewriting an application that already uses client.messages.create() from the Anthropic SDK, you can change the base URL, use a DeepSeek API key, and call DeepSeek models with an Anthropic-style request shape.
DeepSeek’s official docs state that its API supports OpenAI/Anthropic-compatible formats and lists the Anthropic base URL as https://api.deepseek.com/anthropic. The same docs list the OpenAI-compatible base URL as https://api.deepseek.com.
Compatibility is useful when you want to:
- use DeepSeek with Anthropic SDK code;
- test DeepSeek as a backend for Claude Code;
- reduce migration work for existing Anthropic-style apps;
- compare DeepSeek models against Claude models in controlled workflows;
- keep a similar request structure while switching providers.
However, compatibility does not mean every Claude-native feature works. DeepSeek documents specific supported, ignored, and unsupported fields. For example, text content and tool use are supported, while image, document, search_result, server tool, web search tool, MCP tool, and container upload content blocks are not supported in the Anthropic-compatible endpoint.
DeepSeek Anthropic API Endpoint and Basic Setup
| Field | Value | Notes |
|---|---|---|
| Anthropic-compatible base URL | https://api.deepseek.com/anthropic | Use this with the Anthropic SDK or Anthropic-style tools |
| OpenAI-compatible base URL | https://api.deepseek.com | Use this with OpenAI-compatible Chat Completions clients |
| API key | DeepSeek API key | Create it from the DeepSeek Platform |
| Model examples | deepseek-v4-flash, deepseek-v4-pro | Current model names listed in DeepSeek docs |
| SDK | anthropic Python SDK or @anthropic-ai/sdk | Anthropic SDKs provide message, streaming, and tool interfaces |
| Streaming | Supported | DeepSeek marks stream as fully supported |
| Tool calls | Supported | Tool name, input schema, and description are fully supported |
| JSON output | Supported on current DeepSeek V4 models | Verify current docs before production use |
DeepSeek’s current model table lists deepseek-v4-flash and deepseek-v4-pro. It also says deepseek-chat and deepseek-reasoner are legacy names scheduled for deprecation on July 24, 2026, and currently map to non-thinking and thinking modes of deepseek-v4-flash.
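Before wiring up an SDK, you can sanity-check the endpoint and your key directly with curl. This is a sketch only: it assumes the compatibility layer exposes the standard Anthropic Messages path (`/v1/messages`) under the base URL and accepts the `x-api-key` header, as the compatibility table later in this guide indicates; substitute your own DeepSeek API key.

```shell
# Sketch: assumes the standard Anthropic Messages path under DeepSeek's
# Anthropic-compatible base URL. Set DEEPSEEK_API_KEY to your real key first.
curl https://api.deepseek.com/anthropic/v1/messages \
  -H "x-api-key: $DEEPSEEK_API_KEY" \
  -H "content-type: application/json" \
  -d '{
    "model": "deepseek-v4-flash",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Say hello."}]
  }'
```

If this returns a JSON message object rather than an authentication or routing error, the base URL and key are configured correctly.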
How to Call DeepSeek with the Anthropic Python SDK
Install the Anthropic Python SDK:
```shell
python -m pip install anthropic
```
Set environment variables:
```shell
export ANTHROPIC_BASE_URL="https://api.deepseek.com/anthropic"
export ANTHROPIC_API_KEY="your_deepseek_api_key"
```
Use this Python example:
```python
import os

from anthropic import Anthropic

DEEPSEEK_ANTHROPIC_BASE_URL = os.environ.get(
    "ANTHROPIC_BASE_URL",
    "https://api.deepseek.com/anthropic",
)

api_key = os.environ.get("ANTHROPIC_API_KEY")
if not api_key:
    raise RuntimeError(
        "Missing ANTHROPIC_API_KEY. Set it to your DeepSeek API key before running."
    )

client = Anthropic(
    api_key=api_key,
    base_url=DEEPSEEK_ANTHROPIC_BASE_URL,
)

try:
    message = client.messages.create(
        model="deepseek-v4-pro",
        max_tokens=1000,
        system="You are a helpful technical assistant.",
        messages=[
            {
                "role": "user",
                "content": "Explain the DeepSeek Anthropic API in one paragraph.",
            }
        ],
    )
    for block in message.content:
        if getattr(block, "type", None) == "text":
            print(block.text)
except Exception as exc:
    print(f"DeepSeek Anthropic-compatible request failed: {exc}")
```
DeepSeek’s own Anthropic API guide shows the same core pattern: install anthropic, set ANTHROPIC_BASE_URL, set an API key, create an anthropic.Anthropic() client, and call client.messages.create() with a DeepSeek model such as deepseek-v4-pro.
Anthropic’s Python SDK documentation confirms the package name is anthropic, supports synchronous and asynchronous usage, and uses the Messages API pattern with client.messages.create().
Node.js Example for the DeepSeek Anthropic-Compatible API
Install the Anthropic JavaScript SDK:
```shell
npm install @anthropic-ai/sdk
```
Set environment variables:
```shell
export ANTHROPIC_BASE_URL="https://api.deepseek.com/anthropic"
export ANTHROPIC_API_KEY="your_deepseek_api_key"
```
Create deepseek-anthropic-example.mjs:
```javascript
import Anthropic from "@anthropic-ai/sdk";

const apiKey = process.env.ANTHROPIC_API_KEY;
const baseURL =
  process.env.ANTHROPIC_BASE_URL || "https://api.deepseek.com/anthropic";

if (!apiKey) {
  throw new Error("Missing ANTHROPIC_API_KEY. Set it to your DeepSeek API key.");
}

const client = new Anthropic({
  apiKey,
  baseURL,
});

try {
  const message = await client.messages.create({
    model: "deepseek-v4-pro",
    max_tokens: 1000,
    system: "You are a practical software engineering assistant.",
    messages: [
      {
        role: "user",
        content: "Give me a short checklist for using DeepSeek with the Anthropic SDK.",
      },
    ],
  });

  for (const block of message.content) {
    if (block.type === "text") {
      console.log(block.text);
    }
  }
} catch (error) {
  console.error("DeepSeek Anthropic-compatible request failed:", error);
}
```
Run it:
```shell
node deepseek-anthropic-example.mjs
```
Anthropic’s TypeScript SDK docs list npm install @anthropic-ai/sdk, server-side JavaScript usage, streaming support, and client.messages.create() as the standard request method.
How to Use DeepSeek in Claude Code
Claude Code is an AI coding assistant that runs in the terminal. DeepSeek documents an official integration path that points Claude Code to the DeepSeek Anthropic API endpoint.
For macOS or Linux:
```shell
export ANTHROPIC_BASE_URL=https://api.deepseek.com/anthropic
export ANTHROPIC_AUTH_TOKEN=<your DeepSeek API Key>
export ANTHROPIC_MODEL=deepseek-v4-pro[1m]
export ANTHROPIC_DEFAULT_OPUS_MODEL=deepseek-v4-pro[1m]
export ANTHROPIC_DEFAULT_SONNET_MODEL=deepseek-v4-pro[1m]
export ANTHROPIC_DEFAULT_HAIKU_MODEL=deepseek-v4-flash
export CLAUDE_CODE_SUBAGENT_MODEL=deepseek-v4-flash
export CLAUDE_CODE_EFFORT_LEVEL=max
```
For Windows PowerShell:
```powershell
$env:ANTHROPIC_BASE_URL="https://api.deepseek.com/anthropic"
$env:ANTHROPIC_AUTH_TOKEN="<your DeepSeek API Key>"
$env:ANTHROPIC_MODEL="deepseek-v4-pro[1m]"
$env:ANTHROPIC_DEFAULT_OPUS_MODEL="deepseek-v4-pro[1m]"
$env:ANTHROPIC_DEFAULT_SONNET_MODEL="deepseek-v4-pro[1m]"
$env:ANTHROPIC_DEFAULT_HAIKU_MODEL="deepseek-v4-flash"
$env:CLAUDE_CODE_SUBAGENT_MODEL="deepseek-v4-flash"
$env:CLAUDE_CODE_EFFORT_LEVEL="max"
```
Then run:
```shell
cd /path/to/my-project
claude
```
Do not expose your DeepSeek API key in client-side code, screenshots, Git repositories, CI logs, or shared shell history.
DeepSeek Anthropic API Compatibility: Supported and Unsupported Fields
| Area | Field or content type | Status |
|---|---|---|
| HTTP header | x-api-key | Fully supported |
| HTTP header | anthropic-beta, anthropic-version | Ignored |
| Simple field | model | Use DeepSeek model names |
| Simple field | max_tokens | Fully supported |
| Simple field | stop_sequences | Fully supported |
| Simple field | stream | Fully supported |
| Simple field | system | Fully supported |
| Simple field | temperature | Fully supported, range 0.0 to 2.0 |
| Simple field | top_p | Fully supported |
| Simple field | thinking | Supported, but budget_tokens is ignored |
| Simple field | output_config | Only effort is supported |
| Simple field | container, mcp_servers, metadata, service_tier, top_k | Ignored |
| Tool field | tools.name | Fully supported |
| Tool field | tools.input_schema | Fully supported |
| Tool field | tools.description | Fully supported |
| Tool field | tools.cache_control | Ignored |
| Tool choice | none, auto, any, tool | Supported; disable_parallel_tool_use is ignored |
| Message content | string content | Fully supported |
| Message content | text block | Fully supported |
| Message content | image block | Not supported |
| Message content | document block | Not supported |
| Message content | search result block | Not supported |
| Message content | thinking block | Supported |
| Message content | redacted thinking block | Not supported |
| Message content | tool use / tool result | Supported |
| Message content | server tool, web search tool, code execution tool, MCP tool, container upload | Not supported |
This matters when migrating Claude-native applications. A text-only chatbot may need very few changes. A Claude-native app that depends on image inputs, document blocks, web search tool results, server tools, or MCP content blocks will need compatibility testing and possibly a different integration path.
DeepSeek Models for Anthropic-Compatible Requests
DeepSeek currently lists two primary V4 model names for API use:
| Model | Best for | Notes |
|---|---|---|
| deepseek-v4-flash | Lower-cost, high-volume, general-purpose usage | Also used for automatic mapping of unsupported model names in the Anthropic-compatible endpoint |
| deepseek-v4-pro | More complex coding, reasoning, and agentic workflows | Used in DeepSeek’s Anthropic SDK example and Claude Code configuration |
DeepSeek’s pricing/model page lists both V4 models with support for thinking and non-thinking modes, 1M context length, maximum output up to 384K, JSON output, tool calls, and Chat Prefix Completion. FIM Completion is listed as non-thinking mode only. Check the official pricing/docs before production use because model features and prices may change.
DeepSeek’s Anthropic API guide also notes that unsupported model names passed to the DeepSeek Anthropic API are automatically mapped to deepseek-v4-flash. For production systems, treat this as a safety fallback, not as a substitute for explicit model selection and logging.
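One way to keep that fallback visible rather than silent is to resolve model names against an allowlist on your side and log when the documented fallback would kick in. This is an illustrative helper, not part of any SDK; the model set reflects the names listed above and should be updated as DeepSeek's docs change.

```python
import logging

logger = logging.getLogger("deepseek_client")

# Model names currently documented by DeepSeek; update as the docs change.
KNOWN_MODELS = {"deepseek-v4-flash", "deepseek-v4-pro"}
FALLBACK_MODEL = "deepseek-v4-flash"  # documented automatic mapping target


def resolve_model(requested: str) -> str:
    """Return the requested model if known; otherwise log a warning and
    return the name the endpoint would silently fall back to anyway."""
    if requested in KNOWN_MODELS:
        return requested
    logger.warning(
        "Unknown model %r requested; DeepSeek's Anthropic endpoint maps it to %s",
        requested,
        FALLBACK_MODEL,
    )
    return FALLBACK_MODEL
```

Calling `resolve_model()` before each request gives you an audit trail for the model you actually asked for, which the silent server-side mapping does not.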
DeepSeek API Pricing Notes
DeepSeek lists pricing in units of 1M tokens and says billing is based on total input and output tokens. As checked in May 2026, the pricing page lists:
| Model | 1M input tokens, cache hit | 1M input tokens, cache miss | 1M output tokens |
|---|---|---|---|
| deepseek-v4-flash | $0.0028 | $0.14 | $0.28 |
| deepseek-v4-pro | $0.003625 with listed discount | $0.435 with listed discount | $0.87 with listed discount |
DeepSeek’s page says the deepseek-v4-pro discount is extended until May 31, 2026, 15:59 UTC, and also says product prices may vary, so you should regularly check the official pricing page before production use.
A simple cost formula is:
```
cost = (cache_hit_input_tokens  / 1,000,000 × cache_hit_input_price)
     + (cache_miss_input_tokens / 1,000,000 × cache_miss_input_price)
     + (output_tokens           / 1,000,000 × output_price)
```
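The formula above translates directly into a small helper. Prices here are passed in as parameters rather than hard-coded, since the listed rates may change; the example numbers below use the deepseek-v4-flash prices from the table purely for illustration.

```python
def estimate_cost_usd(
    cache_hit_input_tokens: int,
    cache_miss_input_tokens: int,
    output_tokens: int,
    cache_hit_price: float,
    cache_miss_price: float,
    output_price: float,
) -> float:
    """Estimate request cost in USD; all prices are per 1M tokens."""
    per_million = 1_000_000
    return (
        cache_hit_input_tokens / per_million * cache_hit_price
        + cache_miss_input_tokens / per_million * cache_miss_price
        + output_tokens / per_million * output_price
    )


# Illustration using the deepseek-v4-flash rates listed above (subject to change):
flash_cost = estimate_cost_usd(
    cache_hit_input_tokens=200_000,
    cache_miss_input_tokens=800_000,
    output_tokens=100_000,
    cache_hit_price=0.0028,
    cache_miss_price=0.14,
    output_price=0.28,
)
```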
For production, log token usage, cache-hit/cache-miss behavior, selected model name, and request type.
DeepSeek Anthropic API vs Anthropic Claude API
| Category | DeepSeek Anthropic-compatible API | Anthropic Claude API |
|---|---|---|
| Provider | DeepSeek | Anthropic |
| Models | DeepSeek models such as deepseek-v4-flash and deepseek-v4-pro | Claude models |
| Endpoint | https://api.deepseek.com/anthropic | Anthropic’s native Claude API endpoints |
| SDK compatibility | Works with Anthropic-style SDK calls by changing configuration | Native SDK support |
| Native feature support | Compatibility layer with documented limitations | Full Claude-native feature access |
| Vision/documents | Not supported in DeepSeek’s Anthropic-compatible message content blocks | Supported in Claude-native workflows where available |
| Tool use | Supported for documented tool fields | Native Claude tool-use support |
| Pricing model | DeepSeek pricing | Anthropic pricing |
| Best for | Testing DeepSeek in Anthropic-style workflows, Claude Code, migration experiments | Claude-native applications and full Anthropic feature access |
| Migration risk | Medium if your app uses advanced Claude-native blocks | Lower for Claude-native apps |
Anthropic’s own API docs describe official SDKs that handle authentication, request formatting, retries, streaming, timeouts, and error handling. They also emphasize that the direct Claude API is best for full Claude feature access.
Common Errors and Troubleshooting
Wrong base URL
Use:
https://api.deepseek.com/anthropic
Do not use the OpenAI-compatible base URL when calling through the Anthropic SDK.
Using Anthropic model names
Use DeepSeek model names such as:
deepseek-v4-flash
deepseek-v4-pro
Do not use Claude model names unless you are calling Anthropic directly.
Missing API key
For SDK examples, set:
```shell
export ANTHROPIC_API_KEY="your_deepseek_api_key"
```
For Claude Code, DeepSeek’s docs use:
```shell
export ANTHROPIC_AUTH_TOKEN="<your DeepSeek API Key>"
```
Unsupported image or document content
Remove image and document blocks before sending the request through DeepSeek’s Anthropic-compatible endpoint. DeepSeek documents image and document message content blocks as unsupported.
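A defensive pre-flight filter can enforce this automatically. The supported-type set below is derived from the compatibility table in this guide; this is an illustrative helper, not part of the Anthropic SDK, and it assumes messages are plain dicts in Anthropic request shape.

```python
# Content block types DeepSeek documents as supported on the
# Anthropic-compatible endpoint (per the compatibility table above).
SUPPORTED_BLOCK_TYPES = {"text", "thinking", "tool_use", "tool_result"}


def strip_unsupported_blocks(messages: list[dict]) -> list[dict]:
    """Return a copy of messages with unsupported content blocks removed.

    Plain-string content passes through unchanged; list content is filtered
    down to the supported block types. The input list is not mutated.
    """
    cleaned = []
    for msg in messages:
        content = msg.get("content")
        if isinstance(content, list):
            content = [b for b in content if b.get("type") in SUPPORTED_BLOCK_TYPES]
        cleaned.append({**msg, "content": content})
    return cleaned
```

Running requests through a filter like this before sending makes migration failures visible in your own logs instead of surfacing as opaque API errors.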
max_tokens errors
max_tokens is fully supported, but the generated output plus input must fit within the model’s limits. Cap max_tokens to a practical value for your application.
429 or rate limit errors
Add retry logic with exponential backoff, queue high-volume jobs, and monitor usage. Do not blindly retry all requests forever.
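A minimal backoff wrapper might look like the following sketch. It retries only the exception types you declare transient, doubles the delay each attempt, and adds jitter to avoid synchronized retries; in real use you would also respect any Retry-After header the server returns.

```python
import random
import time


def retry_with_backoff(fn, max_attempts=5, base_delay=0.5, max_delay=30.0,
                       retryable=(ConnectionError, TimeoutError)):
    """Call fn(), retrying transient failures with exponential backoff + jitter."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except retryable:
            if attempt == max_attempts - 1:
                raise  # out of attempts; surface the error to the caller
            delay = min(max_delay, base_delay * (2 ** attempt))
            time.sleep(delay * random.uniform(0.5, 1.0))
```

You would wrap the SDK call itself, e.g. `retry_with_backoff(lambda: client.messages.create(...))`, and map the SDK's rate-limit error type into `retryable`.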
Streaming issues
Confirm stream is enabled only when your client code can consume server-sent events. Anthropic’s SDK docs show streaming support, and DeepSeek marks stream as fully supported in its compatibility table.
Unexpected automatic model mapping
If you pass an unsupported model name, DeepSeek says the Anthropic API backend automatically maps it to deepseek-v4-flash. Log the model you requested and the model behavior you observe, especially in production.
Production Best Practices
Keep API keys server-side. Never expose DeepSeek keys in browser bundles, public repositories, or mobile app binaries.
Use environment variables for all secrets and base URLs. This makes it easy to switch between DeepSeek, Anthropic, staging, and production environments.
Validate model outputs before using them in user-visible or business-critical workflows.
Validate tool-call arguments against your own schema before executing tools.
Add retry logic with exponential backoff for transient failures.
Log model name, latency, token usage, cache-hit/cache-miss data, and request outcome.
Cap max_tokens to control cost and reduce runaway outputs.
Use streaming for chat UX when appropriate.
Monitor official model and pricing changes.
Add fallback behavior for critical systems.
Test compatibility before migration, especially if your existing Claude workflow uses images, documents, citations, web search, server tools, code execution tool results, MCP tool calls, or advanced content blocks.
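For the tool-argument validation point above, even a minimal schema check catches most malformed tool calls before they reach your execution layer. This is an illustrative sketch using a simple name-to-type mapping; production systems may prefer a full JSON Schema validator.

```python
def validate_tool_args(args: dict, required: dict) -> list[str]:
    """Check tool-call arguments against a minimal schema before executing a tool.

    `required` maps argument name -> expected Python type. Returns a list of
    human-readable problems; an empty list means the arguments passed.
    """
    problems = []
    for name, expected_type in required.items():
        if name not in args:
            problems.append(f"missing argument: {name}")
        elif not isinstance(args[name], expected_type):
            problems.append(
                f"argument {name!r} should be {expected_type.__name__}, "
                f"got {type(args[name]).__name__}"
            )
    return problems
```

Refuse to execute the tool (and return the problems to the model as a tool result) whenever the list is non-empty.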
FAQ
What is the DeepSeek Anthropic API?
The DeepSeek Anthropic API is DeepSeek’s Anthropic-compatible API format. It allows developers to use Anthropic-style SDK requests with DeepSeek models by changing the base URL and using a DeepSeek API key.
Is DeepSeek compatible with the Anthropic SDK?
Yes, for documented Anthropic-compatible requests. DeepSeek shows an official Python example using the Anthropic SDK and client.messages.create().
What is the DeepSeek Anthropic API endpoint?
The endpoint is: https://api.deepseek.com/anthropic
Can I use DeepSeek with Claude Code?
Yes. DeepSeek provides official Claude Code environment variable examples using ANTHROPIC_BASE_URL, ANTHROPIC_AUTH_TOKEN, ANTHROPIC_MODEL, default model variables, and CLAUDE_CODE_EFFORT_LEVEL.
Is DeepSeek the same as Anthropic?
No. DeepSeek and Anthropic are different providers. The DeepSeek Anthropic API is a compatibility layer, not Anthropic hosting DeepSeek.
Which model should I use: deepseek-v4-flash or deepseek-v4-pro?
Use deepseek-v4-flash for cost-efficient, high-volume, general usage. Use deepseek-v4-pro for more complex coding, reasoning, or agentic workflows. Check official model docs before production use.
Does the DeepSeek Anthropic API support images?
No. DeepSeek’s Anthropic compatibility table lists image content blocks as not supported.
Does it support tool calls?
Yes. DeepSeek lists tool name, input_schema, and description as fully supported, while cache_control is ignored.
Why is my model name mapped to deepseek-v4-flash?
DeepSeek says unsupported model names passed to its Anthropic API are automatically mapped to deepseek-v4-flash. Use explicit DeepSeek model names to avoid surprises.
Is the DeepSeek Anthropic API production-ready?
It can be used in production only after compatibility testing against your actual workload. Text, streaming, system prompts, sampling fields, and tool calls are documented as supported, but several Claude-native content blocks are not supported.
Conclusion
The practical takeaway is simple: use the DeepSeek Anthropic API when you want DeepSeek models inside Anthropic-style workflows without rewriting your entire application. Set the base URL to https://api.deepseek.com/anthropic, use a DeepSeek API key, select deepseek-v4-flash or deepseek-v4-pro, and test every feature your app depends on.