Custom BYOK Ollama providers not discoverable via /v1/models/ endpoint #3037

@joaosa

Description

Describe the bug

The /v1/models/ endpoint seems to ignore custom BYOK Ollama providers: their models don't appear in the app.letta.com UI dropdown.

Steps to Reproduce:

  1. Create a custom Ollama provider (provider_type: "ollama", base_url: "http://localhost:11434/v1")
  2. Pull the model via Ollama: ollama pull phi3:mini (for example)
  3. Query the /v1/models/ endpoint (a reproduction sketch follows this list)
  4. Check the logs to verify which provider endpoints were actually queried (in my case, the only outbound request is GET https://api.openai.com/v1/models)
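For reference, a minimal reproduction sketch of steps 1 and 3 against a self-hosted Letta server (Python + requests). Only the /v1/models/ path and the provider_type/base_url values come from the steps above; the /v1/providers/ path, the "name" field, and the default port 8283 are assumptions and may need adjusting:

    import requests

    LETTA = "http://localhost:8283"  # assumption: default port of a self-hosted Letta server

    # Step 1: register the custom BYOK Ollama provider
    # (the /v1/providers/ path and the "name" field are assumptions;
    #  provider_type and base_url are the values from step 1)
    requests.post(
        f"{LETTA}/v1/providers/",
        json={
            "name": "local-ollama",
            "provider_type": "ollama",
            "base_url": "http://localhost:11434/v1",
        },
    ).raise_for_status()

    # Step 3: list models -- the Ollama models never show up in this response
    resp = requests.get(f"{LETTA}/v1/models/")
    resp.raise_for_status()
    print(resp.json())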

The /v1/models/ endpoint should query all configured providers; for Ollama providers, a request to {base_url}/api/tags should appear in the logs.
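As a sanity check, hitting Ollama's /api/tags directly confirms the models are being served (a quick sketch; /api/tags is Ollama's native model-list endpoint and sits at the server root rather than under the /v1 OpenAI-compatible path):

    import requests

    # Query Ollama's native model listing directly
    resp = requests.get("http://localhost:11434/api/tags")
    resp.raise_for_status()
    print([m["name"] for m in resp.json()["models"]])
    # expected to include: ['llama3.1:8b', 'phi3:mini', 'mxbai-embed-large:latest']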

Please describe your setup

  • pip (legacy)
  • Ubuntu 22.04, Letta v0.12.1, Python 3.11

Screenshots

N/A

Additional context

My current workaround is to manually specify the model in agent creation API calls.
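Roughly, that looks like the following (a sketch only; the /v1/agents/ path, the payload field names, and the "ollama/<model>" handle format are assumptions and may differ between Letta versions):

    import requests

    LETTA = "http://localhost:8283"  # assumption: default self-hosted Letta port

    # Workaround: pin the model explicitly at agent creation instead of
    # selecting it from the (incomplete) /v1/models/ listing.
    resp = requests.post(
        f"{LETTA}/v1/agents/",
        json={
            "name": "ollama-agent",                          # hypothetical agent name
            "model": "ollama/llama3.1:8b",                   # handle format is an assumption
            "embedding": "ollama/mxbai-embed-large:latest",  # handle format is an assumption
        },
    )
    resp.raise_for_status()
    print(resp.json())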


Local LLM details

  • The exact models I'm trying to use:
    • LLM Models: llama3.1:8b, phi3:mini
    • Embedding Model: mxbai-embed-large:latest
  • The local LLM backend: Ollama (v0.12.5)
  • Hardware: Self-hosted Ubuntu server
