# SDKs and Tools

DeepSeek provides SDKs and developer tools for a range of programming languages to help you quickly integrate AI capabilities into your applications.
## Official SDKs

### Python SDK

The most popular Python client library, compatible with the OpenAI API format.
#### Installation

```bash
pip install openai
```
#### Basic usage

```python
from openai import OpenAI

client = OpenAI(
    api_key="your-api-key",
    base_url="https://api.deepseek.com"
)

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "user", "content": "Hello, world!"}
    ]
)

print(response.choices[0].message.content)
```
#### Advanced features

```python
import asyncio

from openai import AsyncOpenAI

# Streaming responses
def stream_chat(messages):
    stream = client.chat.completions.create(
        model="deepseek-chat",
        messages=messages,
        stream=True
    )
    for chunk in stream:
        if chunk.choices[0].delta.content is not None:
            yield chunk.choices[0].delta.content

# Async support
async_client = AsyncOpenAI(
    api_key="your-api-key",
    base_url="https://api.deepseek.com"
)

async def async_chat():
    response = await async_client.chat.completions.create(
        model="deepseek-chat",
        messages=[{"role": "user", "content": "Hello!"}]
    )
    return response.choices[0].message.content

# Batch processing (sequential; use the async client above for concurrency)
def batch_process(message_lists):
    results = []
    for messages in message_lists:
        response = client.chat.completions.create(
            model="deepseek-chat",
            messages=messages
        )
        results.append(response.choices[0].message.content)
    return results
```
### Node.js SDK

A JavaScript client for Node.js and browser environments.

#### Installation

```bash
npm install openai
```
#### Basic usage

```javascript
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: 'your-api-key',
  baseURL: 'https://api.deepseek.com'
});

async function chat() {
  const completion = await openai.chat.completions.create({
    model: 'deepseek-chat',
    messages: [
      { role: 'user', content: 'Hello, world!' }
    ]
  });
  console.log(completion.choices[0].message.content);
}

chat();
```
#### Advanced features

```javascript
// Streaming responses
async function streamChat(messages) {
  const stream = await openai.chat.completions.create({
    model: 'deepseek-chat',
    messages: messages,
    stream: true
  });
  for await (const chunk of stream) {
    if (chunk.choices[0]?.delta?.content) {
      process.stdout.write(chunk.choices[0].delta.content);
    }
  }
}

// Error handling
async function robustChat(messages) {
  try {
    const completion = await openai.chat.completions.create({
      model: 'deepseek-chat',
      messages: messages
    });
    return completion.choices[0].message.content;
  } catch (error) {
    // Check RateLimitError first: it is a subclass of APIError, so the
    // more general check would otherwise shadow it.
    if (error instanceof OpenAI.RateLimitError) {
      console.error('Rate limit exceeded');
    } else if (error instanceof OpenAI.APIError) {
      console.error('API Error:', error.message);
    } else {
      console.error('Unexpected error:', error);
    }
    throw error;
  }
}

// Concurrent requests
async function concurrentChat(messageLists) {
  const promises = messageLists.map(messages =>
    openai.chat.completions.create({
      model: 'deepseek-chat',
      messages: messages
    })
  );
  const results = await Promise.all(promises);
  return results.map(result => result.choices[0].message.content);
}
```
### Go SDK

A high-performance Go client library.

#### Installation

```bash
go get github.com/sashabaranov/go-openai
```
#### Basic usage

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/sashabaranov/go-openai"
)

func main() {
	config := openai.DefaultConfig("your-api-key")
	config.BaseURL = "https://api.deepseek.com/v1"
	client := openai.NewClientWithConfig(config)

	resp, err := client.CreateChatCompletion(
		context.Background(),
		openai.ChatCompletionRequest{
			Model: "deepseek-chat",
			Messages: []openai.ChatCompletionMessage{
				{
					Role:    openai.ChatMessageRoleUser,
					Content: "Hello, world!",
				},
			},
		},
	)
	if err != nil {
		log.Fatal(err)
	}

	fmt.Println(resp.Choices[0].Message.Content)
}
```
#### Advanced features

```go
// These helpers additionally require "errors", "io", and "sync" in the
// import block above.

// Streaming responses
func streamChat(client *openai.Client, messages []openai.ChatCompletionMessage) {
	req := openai.ChatCompletionRequest{
		Model:    "deepseek-chat",
		Messages: messages,
		Stream:   true,
	}
	stream, err := client.CreateChatCompletionStream(context.Background(), req)
	if err != nil {
		log.Fatal(err)
	}
	defer stream.Close()

	for {
		response, err := stream.Recv()
		if errors.Is(err, io.EOF) {
			break
		}
		if err != nil {
			log.Fatal(err)
		}
		fmt.Print(response.Choices[0].Delta.Content)
	}
}

// Concurrent requests
func concurrentChat(client *openai.Client, messageLists [][]openai.ChatCompletionMessage) []string {
	var wg sync.WaitGroup
	results := make([]string, len(messageLists))

	for i, messages := range messageLists {
		wg.Add(1)
		go func(index int, msgs []openai.ChatCompletionMessage) {
			defer wg.Done()
			resp, err := client.CreateChatCompletion(
				context.Background(),
				openai.ChatCompletionRequest{
					Model:    "deepseek-chat",
					Messages: msgs,
				},
			)
			if err != nil {
				log.Printf("Error in goroutine %d: %v", index, err)
				return
			}
			results[index] = resp.Choices[0].Message.Content
		}(i, messages)
	}

	wg.Wait()
	return results
}
```
### Java SDK

A solid choice for enterprise Java applications.

#### Installation

Maven:

```xml
<dependency>
    <groupId>com.theokanning.openai-gpt3-java</groupId>
    <artifactId>service</artifactId>
    <version>0.18.2</version>
</dependency>
```

Gradle:

```gradle
implementation 'com.theokanning.openai-gpt3-java:service:0.18.2'
```
#### Basic usage

Note that `new OpenAiService("key")` targets api.openai.com; to reach DeepSeek, build the service around a custom base URL (see the library's documentation on customizing `OpenAiService`):

```java
import java.time.Duration;
import java.util.List;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.theokanning.openai.OpenAiApi;
import com.theokanning.openai.completion.chat.ChatCompletionRequest;
import com.theokanning.openai.completion.chat.ChatMessage;
import com.theokanning.openai.service.OpenAiService;

import okhttp3.OkHttpClient;
import retrofit2.Retrofit;

public class DeepSeekExample {
    public static void main(String[] args) {
        // Build a service that points at DeepSeek instead of the default
        // OpenAI endpoint.
        ObjectMapper mapper = OpenAiService.defaultObjectMapper();
        OkHttpClient httpClient =
                OpenAiService.defaultClient("your-api-key", Duration.ofSeconds(30));
        Retrofit retrofit = OpenAiService.defaultRetrofit(httpClient, mapper)
                .newBuilder()
                .baseUrl("https://api.deepseek.com/")
                .build();
        OpenAiApi api = retrofit.create(OpenAiApi.class);
        OpenAiService service = new OpenAiService(api);

        ChatCompletionRequest request = ChatCompletionRequest.builder()
                .model("deepseek-chat")
                .messages(List.of(
                        new ChatMessage("user", "Hello, world!")
                ))
                .build();

        String response = service.createChatCompletion(request)
                .getChoices().get(0).getMessage().getContent();
        System.out.println(response);
    }
}
```
### C# SDK

A client library for the .NET ecosystem.

#### Installation

```bash
dotnet add package OpenAI
```
#### Basic usage

```csharp
using System.ClientModel;

using OpenAI;
using OpenAI.Chat;

// Constructing the client with only an API key targets api.openai.com;
// set OpenAIClientOptions.Endpoint to reach DeepSeek instead.
var client = new OpenAIClient(
    new ApiKeyCredential("your-api-key"),
    new OpenAIClientOptions { Endpoint = new Uri("https://api.deepseek.com") }
);
var chatClient = client.GetChatClient("deepseek-chat");

var completion = await chatClient.CompleteChatAsync(
    new ChatMessage[]
    {
        new UserChatMessage("Hello, world!")
    }
);

Console.WriteLine(completion.Value.Content[0].Text);
```
## Community SDKs

### Ruby

```ruby
# Gemfile
gem 'ruby-openai'

# Usage example
require 'openai'

client = OpenAI::Client.new(
  access_token: 'your-api-key',
  uri_base: 'https://api.deepseek.com'
)

response = client.chat(
  parameters: {
    model: 'deepseek-chat',
    messages: [
      { role: 'user', content: 'Hello, world!' }
    ]
  }
)

puts response.dig('choices', 0, 'message', 'content')
```
### PHP

```php
<?php

require_once 'vendor/autoload.php';

// OpenAI::client('key') targets api.openai.com; use the factory to set
// DeepSeek's base URI instead.
$client = OpenAI::factory()
    ->withApiKey('your-api-key')
    ->withBaseUri('https://api.deepseek.com/v1')
    ->make();

$response = $client->chat()->create([
    'model' => 'deepseek-chat',
    'messages' => [
        ['role' => 'user', 'content' => 'Hello, world!'],
    ],
]);

echo $response->choices[0]->message->content;
```
### Rust

```toml
[dependencies]
async-openai = "0.17"
tokio = { version = "1.0", features = ["full"] }
```

```rust
use async_openai::{config::OpenAIConfig, types::*, Client};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // In recent async-openai versions the API key and base URL are set on
    // OpenAIConfig, not on the Client itself.
    let config = OpenAIConfig::new()
        .with_api_key("your-api-key")
        .with_api_base("https://api.deepseek.com/v1");
    let client = Client::with_config(config);

    let request = CreateChatCompletionRequestArgs::default()
        .model("deepseek-chat")
        .messages([
            ChatCompletionRequestUserMessageArgs::default()
                .content("Hello, world!")
                .build()?
                .into(),
        ])
        .build()?;

    let response = client.chat().create(request).await?;
    println!("{}", response.choices[0].message.content.as_ref().unwrap());
    Ok(())
}
```
## Developer Tools

### CLI

Install the DeepSeek CLI:

```bash
npm install -g @deepseek/cli
```
Basic usage:

```bash
# Configure the API key
deepseek config set api-key your-api-key

# Send a chat request
deepseek chat "Hello, world!"

# Streaming output
deepseek chat --stream "Write an article about AI"

# Use a different model
deepseek chat --model deepseek-coder "Write a quicksort implementation"

# Batch processing
deepseek batch process input.jsonl output.jsonl
```
### VS Code Extension

To install the DeepSeek VS Code extension:

- Open VS Code
- Search for the "DeepSeek" extension
- Install it and configure your API key

Features:

- Code completion
- Code explanation
- Error fixing
- Code refactoring
- Documentation generation
### Postman Collection

Import the DeepSeek API Postman collection:

```json
{
  "info": {
    "name": "DeepSeek API",
    "schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"
  },
  "auth": {
    "type": "bearer",
    "bearer": [
      {
        "key": "token",
        "value": "{{api_key}}",
        "type": "string"
      }
    ]
  },
  "variable": [
    {
      "key": "base_url",
      "value": "https://api.deepseek.com"
    },
    {
      "key": "api_key",
      "value": "your-api-key"
    }
  ]
}
```
## Framework Integrations

### LangChain

```python
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

# Chat model
chat = ChatOpenAI(
    model_name="deepseek-chat",
    openai_api_key="your-api-key",
    openai_api_base="https://api.deepseek.com/v1"
)

# Usage example
response = chat([HumanMessage(content="Hello, world!")])
print(response.content)
```
### LlamaIndex

```python
from llama_index.llms import OpenAI

llm = OpenAI(
    model="deepseek-chat",
    api_key="your-api-key",
    api_base="https://api.deepseek.com/v1"
)

response = llm.complete("Hello, world!")
print(response.text)
```
### Haystack

```python
from haystack.nodes import OpenAIAnswerGenerator

generator = OpenAIAnswerGenerator(
    api_key="your-api-key",
    api_base="https://api.deepseek.com/v1",
    model="deepseek-chat"
)
```
## Best Practices

### 1. Choose the right SDK

- Python: data science and machine learning projects
- Node.js: web applications and frontend integration
- Go: high-performance backend services
- Java: enterprise applications
- C#: the .NET ecosystem
### 2. Optimize performance

- Use connection pooling
- Cache repeated requests
- Set sensible timeouts
- Use asynchronous programming
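As a sketch of the caching point: identical requests can be answered from an in-process cache. `make_cached` and its `fetch` callback are illustrative names, not part of any SDK; in real use `fetch` would wrap `client.chat.completions.create`, and caching is only safe for deterministic requests (e.g. `temperature=0`).

```python
import hashlib
import json

def make_cached(fetch):
    """Wrap a fetch(model, messages) -> str callable with an in-process cache.

    Only worth doing for deterministic requests; sampled responses differ
    between calls and should not be reused.
    """
    cache = {}

    def cached(model, messages):
        # Hash the full request so equal inputs map to the same cache slot.
        key = hashlib.sha256(
            json.dumps([model, messages], sort_keys=True).encode()
        ).hexdigest()
        if key not in cache:
            cache[key] = fetch(model, messages)
        return cache[key]

    return cached

# In real use, fetch would call the API with a timeout, e.g.:
# fetch = lambda model, messages: client.chat.completions.create(
#     model=model, messages=messages, timeout=30
# ).choices[0].message.content
```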
### 3. Handle errors

- Implement a retry mechanism
- Handle rate limits
- Log requests in detail
- Provide graceful fallbacks
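A minimal sketch of retry with exponential backoff. The `RateLimitError` class here is a stand-in so the example is self-contained; in practice you would catch the SDK's own exception (e.g. `openai.RateLimitError`) around `client.chat.completions.create`.

```python
import random
import time

class RateLimitError(Exception):
    """Stand-in for the SDK's rate-limit exception."""

def with_retries(fn, max_attempts=4, base_delay=0.01):
    """Call fn(), retrying on RateLimitError with exponential backoff plus jitter."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except RateLimitError:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the error to the caller
            # back off base_delay * 2^attempt, with a little random jitter
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
```

Used as `with_retries(lambda: client.chat.completions.create(...))`, this keeps retry policy out of the request code itself.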
### 4. Security considerations

- Store API keys securely
- Use HTTPS connections
- Validate input data
- Enforce access control
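For the first point, read the key from the environment rather than hard-coding it in source; `DEEPSEEK_API_KEY` here is just a conventional variable name, not one any SDK requires.

```python
import os

def load_api_key(env_var="DEEPSEEK_API_KEY"):
    """Fetch the API key from the environment; fail loudly if it is missing."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"{env_var} is not set; refusing to run without a key")
    return key

# The loaded key is then passed to the client instead of a literal, e.g.:
# client = OpenAI(api_key=load_api_key(), base_url="https://api.deepseek.com")
```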
## Getting Support

If you run into problems while using the SDKs:
## Contributing

Community contributions are welcome! To contribute to an SDK:

- Fork the corresponding GitHub repository
- Create a feature branch
- Submit a pull request
- Take part in code review

Thank you for choosing DeepSeek! 🚀