Describe the bug
The `/v1/models/` endpoint seems to ignore custom BYOK Ollama providers. Their models don't appear in the model dropdown in the app.letta.com UI.
Steps to Reproduce:
- Create a custom Ollama provider (`provider_type: "ollama"`, `base_url: "http://localhost:11434/v1"`); a request sketch follows these steps
- Pull the model via Ollama, e.g. `ollama pull phi3:mini`
- Query the `/v1/models/` endpoint and inspect its output
- Check the logs to verify which providers were queried (in my case only `GET https://api.openai.com/v1/models` appears)
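For reference, this is roughly how I register the provider and query the endpoint. A minimal sketch using `requests`: the `POST /v1/providers/` route, the `name` field, and the server address are my assumptions, not verified API details.

```python
import requests

BASE = "http://localhost:8283"  # assumption: default self-hosted Letta server address

# Register the custom BYOK Ollama provider.
# Assumption: providers are created via POST /v1/providers/ with these
# fields; the provider name is hypothetical.
requests.post(
    f"{BASE}/v1/providers/",
    json={
        "name": "my-ollama",
        "provider_type": "ollama",
        "base_url": "http://localhost:11434/v1",
    },
).raise_for_status()

# List every model the server exposes. The Ollama models (e.g. phi3:mini)
# should show up here; in my case they do not.
print(requests.get(f"{BASE}/v1/models/").json())
```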
Expected behavior
The `/v1/models/` endpoint should query all configured providers. For Ollama providers, a request to `{base_url}/api/tags` should appear in the logs.
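To illustrate what that check would see, the snippet below hits Ollama's standard tags endpoint directly; the script itself is just for illustration.

```python
import requests

# Ollama's native model-listing endpoint (note /api/tags, not the
# OpenAI-compatible /v1 prefix). This is the request that should show
# up in the logs for each configured Ollama provider.
resp = requests.get("http://localhost:11434/api/tags")
resp.raise_for_status()

for model in resp.json()["models"]:
    print(model["name"])  # e.g. llama3.1:8b, phi3:mini, mxbai-embed-large:latest
```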
Please describe your setup
- pip (legacy)
- Ubuntu 22.04, Letta v0.12.1, Python 3.11
Screenshots
N/A
Additional context
My current workaround is to manually specify the model in agent creation API calls.
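Roughly like this. This is only a sketch: the `POST /v1/agents/` route, the `ollama/...` model-handle format, and the field names are my assumptions about the agent-creation API, so treat them as illustrative.

```python
import requests

BASE = "http://localhost:8283"  # assumption: default self-hosted Letta server address

# Create an agent with the model spelled out explicitly instead of
# picking it from the (empty) dropdown.
# Assumptions: POST /v1/agents/ is the creation route and it accepts
# provider-prefixed model handles; field names are illustrative.
resp = requests.post(
    f"{BASE}/v1/agents/",
    json={
        "name": "ollama-agent",                          # hypothetical agent name
        "model": "ollama/llama3.1:8b",                   # explicit LLM handle
        "embedding": "ollama/mxbai-embed-large:latest",  # explicit embedding handle
    },
)
resp.raise_for_status()
print(resp.json().get("id"))
```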
Local LLM details
- The exact models I'm trying to use:
  - LLM models: `llama3.1:8b`, `phi3:mini`
  - Embedding model: `mxbai-embed-large:latest`
- The local LLM backend: Ollama (v0.12.5)
- Hardware: Self-hosted Ubuntu server