# DeepSeek Integration Guide
DeepSeek integration follows similar principles to OpenAI. For general requirements and minimum specifications, refer to:
📖 ChatGPT Translator Documentation
## Two Integration Methods
### OpenAI ChatGPT Plugin Method
This method uses the existing “ChatGPT for Translator++” plugin interface. All buttons, settings locations, prompts, and parameters are shared with ChatGPT.
Key Configuration Changes:
- Target URL
  - Replace with → https://api.deepseek.com/v1/chat/completions
- API Key
  - Obtain from: DeepSeek Platform
- Model
  - Use: deepseek-chat (currently the only supported model)
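Before entering these values in the plugin, it can help to confirm the endpoint, key, and model respond correctly outside Translator++. A minimal sketch using the openai Python SDK is shown below; the DEEPSEEK_API_KEY environment variable and the sample prompt are illustrative assumptions, not part of the plugin itself.

```python
import os
from openai import OpenAI  # OpenAI-compatible client; DeepSeek speaks the same protocol

# Point the client at DeepSeek instead of OpenAI.
# DEEPSEEK_API_KEY is an assumed environment variable holding the key
# obtained from the DeepSeek Platform.
client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],
    base_url="https://api.deepseek.com/v1",  # same host as the target URL above
)

# deepseek-chat is the model name that goes into the plugin settings.
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a translation assistant."},
        {"role": "user", "content": "Translate to English: こんにちは"},
    ],
)
print(response.choices[0].message.content)
```

If this call succeeds, the same URL, API key, and model name can be entered in the ChatGPT for Translator++ plugin fields.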
💡 Tip: Preset configuration templates for switching between OpenAI and DeepSeek are available in the plugin settings.
### LiteLLM Proxy Method
LiteLLM acts as a dedicated LLM proxy, supporting any model listed in its library.
Key Features:
- Automatic compatibility with DeepSeek, provided the model appears in LiteLLM's supported list
- Unified interface for multiple LLM providers
- 📚 LiteLLM Add-On Documentation
Implementation:
- Follow the standard LiteLLM setup process
- Verify DeepSeek model availability in LiteLLM’s supported model list
- Configure using DeepSeek-specific API credentials
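As a quick check that LiteLLM can reach DeepSeek with your credentials, the sketch below uses LiteLLM's Python interface. The deepseek/deepseek-chat model string follows LiteLLM's provider-prefix convention and DEEPSEEK_API_KEY is the environment variable LiteLLM is expected to read; confirm both against LiteLLM's supported model list before relying on them.

```python
import os
from litellm import completion  # unified call interface across LLM providers

# LiteLLM routes by provider prefix: "deepseek/deepseek-chat" selects DeepSeek.
# The API key is read from the DEEPSEEK_API_KEY environment variable,
# assumed to be exported before running this sketch.
assert "DEEPSEEK_API_KEY" in os.environ, "export DEEPSEEK_API_KEY first"

response = completion(
    model="deepseek/deepseek-chat",
    messages=[{"role": "user", "content": "Translate to English: お疲れ様です"}],
)
print(response.choices[0].message.content)
```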
## Operational Notes
- Both methods share the same translation workflow once configured
- API rate limits and pricing vary – consult DeepSeek’s official documentation
- For mixed-provider environments, LiteLLM provides better management capabilities
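Neither method handles rate limiting for you, so if DeepSeek returns HTTP 429 during large batch translations, retrying with exponential backoff is one common approach. The helper below is a sketch, not part of either plugin; it assumes the openai-style client configured earlier and the chat_with_backoff name is hypothetical.

```python
import time
import openai  # provides the RateLimitError exception type


def chat_with_backoff(client, max_retries=5, **kwargs):
    """Retry a chat completion with exponential backoff on HTTP 429 responses."""
    for attempt in range(max_retries):
        try:
            return client.chat.completions.create(**kwargs)
        except openai.RateLimitError:
            # Wait 1s, 2s, 4s, ... before the next attempt.
            time.sleep(2 ** attempt)
    raise RuntimeError("Rate limit still exceeded after retries")
```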