# Bug Report: DeepSeek Provider Issues
## Summary
DeepSeek model provider is broken in Letta due to undefined type references and missing streaming support.
## Environment
- Letta version: Latest (as of October 11, 2025)
- DeepSeek models affected: All (deepseek-chat, deepseek-reasoner, etc.)
- Deployment: Docker (dev-compose.yaml)
## Issue 1: `NameError: name '_Message' is not defined`
### Description
When attempting to use DeepSeek as a model provider, the application crashes with a `NameError` when loading the DeepSeek client module.
### Steps to Reproduce
1. Configure an agent with DeepSeek as the LLM provider
2. Attempt to send a message to the agent
3. Server crashes with `NameError`
### Error Traceback
```
  File "/app/letta/llm_api/llm_client.py", line 90, in create
    from letta.llm_api.deepseek_client import DeepseekClient
  File "/app/letta/llm_api/deepseek_client.py", line 60, in <module>
    def map_messages_to_deepseek_format(messages: List[ChatMessage]) -> List[_Message]:
                                                                              ^^^^^^^^
NameError: name '_Message' is not defined
```
### Root Cause
The `deepseek_client.py` file references an undefined type `_Message` in two locations:

- Line 60: function return type annotation
- Line 104: function parameter type annotation

The type `_Message` was never imported or defined in the module.
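One plausible fix is to define the missing alias before it is first referenced. The sketch below is illustrative only: the stand-in `ChatMessage` type and the alias target (a plain dict in OpenAI message shape) are assumptions, not Letta's actual types.

```python
from typing import Any, Dict, List

# Stand-ins for illustration: Letta's real ChatMessage is richer, and
# _Message may be intended as a typed structure rather than a plain dict.
ChatMessage = Dict[str, Any]
_Message = Dict[str, Any]  # defining the alias resolves the NameError

def map_messages_to_deepseek_format(messages: List[ChatMessage]) -> List[_Message]:
    # Pass role/content through in the OpenAI-compatible shape DeepSeek expects.
    return [{"role": m["role"], "content": m["content"]} for m in messages]

print(map_messages_to_deepseek_format([{"role": "user", "content": "hello"}]))
```

Alternatively, the annotations at lines 60 and 104 could reuse whatever message type the module already imports; either way the module then imports cleanly.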
## Issue 2: Streaming Not Supported
### Description
When attempting to use streaming with DeepSeek models, the application fails with a "Streaming not supported" error, even though DeepSeek's API is OpenAI-compatible and the `DeepseekClient` already has streaming implemented.
### Steps to Reproduce
1. Configure an agent with DeepSeek as the LLM provider
2. Attempt to send a streaming message to the agent (e.g., via `/v1/agents/{agent_id}/messages/stream`)
3. Request fails with `ValueError`
### Error Message
```
ValueError: Streaming not supported for provider deepseek
```
### Root Cause
The streaming adapters do not include `ProviderType.deepseek` in their provider checks, even though:

- DeepSeek uses an OpenAI-compatible API
- The `DeepseekClient` extends `OpenAIClient` and has `stream_async()` implemented
- The streaming infrastructure is already in place
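The shape of the fix can be sketched as an allow-list check. All names below (`STREAMING_PROVIDERS`, `check_streaming_supported`, and the non-DeepSeek enum members) are illustrative, not Letta's actual symbols; the point is that admitting DeepSeek is a one-line change to the adapters' provider check.

```python
from enum import Enum

class ProviderType(str, Enum):
    # Membership here is illustrative; only "deepseek" is taken from the report.
    openai = "openai"
    deepseek = "deepseek"
    google_ai = "google_ai"

# Adding ProviderType.deepseek to the adapters' allow-list is the whole fix.
STREAMING_PROVIDERS = {ProviderType.openai, ProviderType.deepseek}

def check_streaming_supported(provider: ProviderType) -> None:
    # Mirrors the ValueError raised by the streaming adapters.
    if provider not in STREAMING_PROVIDERS:
        raise ValueError(f"Streaming not supported for provider {provider.value}")

check_streaming_supported(ProviderType.deepseek)  # no longer raises
```

Because `DeepseekClient` inherits its transport from `OpenAIClient`, no new streaming code path is required beyond this gate.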
## Impact
- Severity: High - Complete provider unavailability
- Affected Users: Anyone attempting to use DeepSeek models
- Workaround: None available without code changes
## Additional Notes
The `DeepseekClient` already has proper OpenAI-compatible streaming support implemented via the `stream_async()` method. The streaming adapters simply needed to recognize DeepSeek as a valid streaming provider.