
API Documentation

Quick Start

The DeepSeek API uses an API format compatible with OpenAI's. By adjusting your configuration, you can access the DeepSeek API with the OpenAI SDK, or with any software that is compatible with the OpenAI API.

Basic Configuration

Parameter    Value
base_url     http://ai.sankotrade.com
api_key      Apply for an API key

Tip

For compatibility with OpenAI, you can also set base_url to http://ai.sankotrade.com/v1. Note, however, that the v1 here is unrelated to the model version.

Available Models

  • deepseek-chat: the non-thinking mode of DeepSeek-V3.2-Exp
  • deepseek-reasoner: the thinking mode of DeepSeek-V3.2-Exp
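Both model IDs are served through the same Chat Completions interface; only the model name changes. A minimal sketch of switching between them (`choose_model` is a hypothetical convenience helper, not part of the API):

```python
def choose_model(thinking: bool) -> str:
    """Return the model ID for thinking or non-thinking mode."""
    return "deepseek-reasoner" if thinking else "deepseek-chat"

# Usage with the client shown in the examples below:
# client.chat.completions.create(model=choose_model(thinking=True), ...)
```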

Quick Examples

cURL Example

bash
curl http://ai.sankotrade.com/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "model": "deepseek-chat",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Hello!"}
    ],
    "stream": false
  }'

Python Example

python
import openai

client = openai.OpenAI(
    api_key="YOUR_API_KEY",
    base_url="http://ai.sankotrade.com"
)

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ],
    stream=False
)

print(response.choices[0].message.content)

Node.js Example

javascript
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: 'YOUR_API_KEY',
  baseURL: 'http://ai.sankotrade.com'
});

async function main() {
  const completion = await openai.chat.completions.create({
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: "Hello!" }
    ],
    model: "deepseek-chat",
  });

  console.log(completion.choices[0].message.content);
}

main();

Streaming Output

Set stream: true to enable streaming output:

python
import openai

client = openai.OpenAI(
    api_key="YOUR_API_KEY",
    base_url="http://ai.sankotrade.com"
)

stream = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "user", "content": "写一首关于春天的诗"}
    ],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="")

Request Parameters

Chat Completions

Parameter          Type          Required  Default  Description
model              string        Yes       -        ID of the model to use
messages           array         Yes       -        List of conversation messages
temperature        number        No        1.0      Randomness of the output (0-2)
max_tokens         integer       No        -        Maximum number of tokens to generate
top_p              number        No        1.0      Nucleus sampling parameter (0-1)
frequency_penalty  number        No        0        Frequency penalty (-2.0 to 2.0)
presence_penalty   number        No        0        Presence penalty (-2.0 to 2.0)
stream             boolean       No        false    Whether to stream the response
stop               string/array  No        null     Stop sequences
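A request sketch combining several of the parameters above; the values chosen here are illustrative, not recommendations:

```python
request = {
    "model": "deepseek-chat",
    "messages": [{"role": "user", "content": "List three colors."}],
    "temperature": 0.7,        # lower = more deterministic (range 0-2)
    "max_tokens": 256,         # cap on generated tokens
    "top_p": 0.9,              # nucleus sampling (range 0-1)
    "frequency_penalty": 0.5,  # discourage repeated tokens
    "stop": ["\n\n"],          # stop generating at a blank line
    "stream": False,
}
# response = client.chat.completions.create(**request)
```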

Messages Format

json
{
  "messages": [
    {
      "role": "system",
      "content": "You are a helpful assistant."
    },
    {
      "role": "user", 
      "content": "Hello!"
    },
    {
      "role": "assistant",
      "content": "Hello! How can I help you today?"
    }
  ]
}

Supported roles:

  • system: system message, used to set the assistant's behavior
  • user: user message
  • assistant: assistant reply
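The API is stateless, so multi-turn chat works by resending the whole message history with each request. A minimal sketch (`append_turn` is a hypothetical helper, not part of the API):

```python
def append_turn(history, role, content):
    """Append one message to the conversation history and return it."""
    history.append({"role": role, "content": content})
    return history

history = [{"role": "system", "content": "You are a helpful assistant."}]
append_turn(history, "user", "Hello!")
# reply = client.chat.completions.create(model="deepseek-chat", messages=history)
# append_turn(history, "assistant", reply.choices[0].message.content)
```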

Response Format

Non-Streaming Response

json
{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "created": 1677652288,
  "model": "deepseek-chat",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hello! How can I help you today?"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 9,
    "completion_tokens": 12,
    "total_tokens": 21
  }
}
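The usage block of the response can be read directly for token accounting. A sketch over the example payload above (here as a plain dict rather than an SDK response object):

```python
response = {
    "usage": {"prompt_tokens": 9, "completion_tokens": 12, "total_tokens": 21}
}
usage = response["usage"]
# total_tokens is the sum of prompt and completion tokens
assert usage["prompt_tokens"] + usage["completion_tokens"] == usage["total_tokens"]
print(f"consumed {usage['total_tokens']} tokens")
```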

Streaming Response

json
{"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"deepseek-chat","choices":[{"index":0,"delta":{"content":"Hello"},"finish_reason":null}]}
{"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"deepseek-chat","choices":[{"index":0,"delta":{"content":"!"},"finish_reason":null}]}
{"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"deepseek-chat","choices":[{"index":0,"delta":{},"finish_reason":"stop"}]}

Usage Limits

  • Rate limits: depend on your subscription plan
  • Context length: up to 128K tokens
  • Concurrent requests: depend on your subscription plan
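Long conversations eventually exceed the context limit. The exact limit is in tokens and the model's tokenizer is not exposed here, so the sketch below (a hypothetical helper, not part of the API) trims the oldest non-system messages against a rough character budget as a crude proxy:

```python
def trim_history(messages, max_chars=400_000):
    """Drop the oldest non-system messages until the history fits a rough
    character budget (a crude proxy; the real limit is 128K tokens)."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    while rest and sum(len(m["content"]) for m in system + rest) > max_chars:
        rest.pop(0)  # drop the oldest turn first
    return system + rest
```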

Best Practices

1. Optimize Prompts

python
# Example of a well-structured prompt
messages = [
    {
        "role": "system",
        "content": "You are a professional Python programming assistant. Provide clear, runnable code examples."
    },
    {
        "role": "user",
        "content": "Write a function that computes the n-th term of the Fibonacci sequence"
    }
]

2. Handle Errors

python
try:
    response = client.chat.completions.create(
        model="deepseek-chat",
        messages=messages
    )
except openai.RateLimitError as e:
    # Catch RateLimitError before its parent class APIError,
    # otherwise this handler would never run.
    print(f"Rate limit: {e}")
except openai.APIError as e:
    print(f"API error: {e}")
except Exception as e:
    print(f"Other error: {e}")

3. Stream Processing

python
def stream_response(messages):
    stream = client.chat.completions.create(
        model="deepseek-chat",
        messages=messages,
        stream=True
    )
    
    for chunk in stream:
        if chunk.choices[0].delta.content:
            yield chunk.choices[0].delta.content
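Because stream_response above is a generator, the caller can both render pieces as they arrive and keep the full reply, e.g. for appending to the conversation history (`consume` is a hypothetical helper):

```python
def consume(gen):
    """Print streamed pieces as they arrive and return the joined text."""
    full = []
    for piece in gen:
        print(piece, end="", flush=True)  # render incrementally
        full.append(piece)
    return "".join(full)

# reply_text = consume(stream_response(messages))
```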

Powered by DeepSeek AI large language models