Letta ignores the "OLLAMA_BASE_URL" env variable, and tries to use OpenAI instead #2388
Comments
Thanks for reporting this, we are looking into it now!
Hi @NeilSCGH - thank you so much for trying Letta and for your bug report! Can you confirm what version of Letta you're on? You should see a version message printed at the top of the server logs. I'm on the latest version and am not able to reproduce your bug. For reference, this is the output of
To grab the latest version:

```
docker pull letta/letta:latest
```

Next, I run this:

```
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e OLLAMA_BASE_URL="http://host.docker.internal:11434" \
  letta/letta:latest
```

My server logs look like this:
And when I create an agent in the ADE and click on the model dropdown, I can see all my Ollama models (as well as the free endpoint that we host):
Hello,

Letta version: when running Docker, it uses the latest version of the Letta server: v0.6.15.
Problem reaching the Ollama server

I noticed something: when loading https://app.letta.com/, I have this in the logs:
So Letta tries to use Ollama, but it fails to resolve 'host.docker.internal'. When searching for this issue on Google, it seems that I've then tried to change
At least a 'Connection refused' error means that Letta can reach Ollama. It solved the problem for my phone, but Letta is still getting the 'Connection refused' error, so I'm still stuck here at the moment.

Letta falls back to OpenAI, even if no OpenAI token is given

Regarding the OpenAI error, I still have this error when trying to talk to the agent:
So it seems that Letta tries to use OpenAI if the Ollama connection fails. I think it would be better not to fall back to OpenAI: if I haven't provided an OpenAI token when running Letta, trying OpenAI is useless since it cannot work, and the resulting error is confusing.
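Before debugging the fallback, it can help to confirm whether Ollama is reachable from inside the Letta container at all. The sketch below is a generic Docker check, not something from this thread: the container name `letta` is an assumption (use whatever `docker ps` shows), `--add-host=host.docker.internal:host-gateway` is Docker's documented way to make that hostname resolve on Linux, and `/api/tags` is Ollama's model-list endpoint.

```shell
# On Linux, host.docker.internal does not resolve by default;
# --add-host maps it to the host gateway (Docker 20.10+).
docker run \
  --add-host=host.docker.internal:host-gateway \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e OLLAMA_BASE_URL="http://host.docker.internal:11434" \
  letta/letta:latest

# From inside the running container (name assumed to be "letta",
# and assuming curl is available in the image), query Ollama's
# model-list endpoint. A JSON list of models means the URL is
# reachable; a DNS error or "Connection refused" reproduces the
# problem independently of Letta.
docker exec -it letta curl -s http://host.docker.internal:11434/api/tags
```

If the `curl` succeeds but Letta still cannot reach Ollama, the problem is in how Letta uses the URL rather than in container networking.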
Edit:
And now I can see the list of my models in the Letta ADE page :) The good solution is probably to make

Now I have a new problem, haha: when talking to the agent, it makes my Ollama server crash. Here are the Letta logs:
Here are the Ollama logs:
The llama3.2 model works perfectly from the terminal and with Open WebUI.
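To isolate whether the crash comes from Letta's requests or from Ollama itself, the same model can be exercised directly over Ollama's HTTP API, bypassing Letta entirely. This is a generic check (the prompt is illustrative; `/api/generate` is Ollama's standard generation endpoint):

```shell
# Call Ollama's generate endpoint directly. If this also crashes
# the Ollama server, the problem is in Ollama or the model; if it
# succeeds, the crash is specific to the requests Letta sends
# (e.g. context length or request shape).
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Say hello in one short sentence.",
  "stream": false
}'
```

Comparing the request Letta sends (visible in the Letta logs) against this minimal one can narrow down which parameter triggers the crash.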
Describe the bug
Letta ignores the OLLAMA_BASE_URL environment variable, and tries to use OpenAI. The only model available in the new ADE is "letta-free", despite the fact that I pass -e OLLAMA_BASE_URL="http://host.docker.internal:11434" when running Letta with Docker. Is it possible to run Letta with Ollama only, without an OpenAI API key?
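For reference, the maintainer's repro command earlier in this thread sets only OLLAMA_BASE_URL and no OpenAI key, which suggests an Ollama-only setup is intended to work. A minimal sketch, with the volume path and port taken from that command:

```shell
# Ollama-only invocation: no OPENAI_API_KEY is set, so only the
# Ollama models (plus the hosted free endpoint) should appear in
# the ADE model dropdown.
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e OLLAMA_BASE_URL="http://host.docker.internal:11434" \
  letta/letta:latest
```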
Please describe your setup
Logs
No interesting logs appear when I start the server, but when I try to talk to the chatbot after creating an agent with the "letta-free" model, I get these errors:
Additional context
When I run
With a dummy OpenAI token, I now have the following error in the logs:
So Letta can see my environment variables, but the OLLAMA_BASE_URL one is still ignored.
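A generic way to double-check what the container process actually sees (the container name `letta` is an assumption; substitute the ID from `docker ps`):

```shell
# Print the variable as seen by the running Letta container.
# If the output is empty, the -e flag never reached the process;
# if it prints the URL, the variable is set and the problem is
# in how Letta consumes it.
docker exec letta printenv OLLAMA_BASE_URL
```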
Letta Config
Default initial config; ran Letta for the first time.
Local LLM details