AI: Support Base URL and other models via Env Var #2351
Replies: 3 comments
-
I was looking into this since I wanted to run the AI assistant with Ollama. The code changes for custom URI support are minimal, but I ran into one main issue: the application uses the new OpenAI Responses endpoint (https://platform.openai.com/docs/guides/responses-vs-chat-completions), and Ollama has an open PR to support that endpoint (ollama/ollama#10316), so I'll need to check that out.

Here are the minimal code changes needed to add custom URI support to the OpenAI client:
app/models/setting.rb
app/models/provider/registry.rb
app/models/provider/openai.rb
This would allow users to set the OPENAI_URI environment variable to point to any OpenAI-compatible API endpoint. Once Ollama supports the Responses endpoint, this should work seamlessly with local models.
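The env-var lookup described above could be as simple as the following sketch. This is illustrative only: `OPENAI_URI` is the proposed variable name from this thread, and the helper name and default value are assumptions, not the app's actual code.

```ruby
# Minimal sketch (names are illustrative, not the project's real code):
# resolve the OpenAI base URI from an env var, falling back to the default.
DEFAULT_OPENAI_URI = "https://api.openai.com/v1"

def openai_uri_base(env = ENV)
  # Any OpenAI-compatible endpoint can be substituted here,
  # e.g. a local Ollama instance at http://localhost:11434/v1.
  env.fetch("OPENAI_URI", DEFAULT_OPENAI_URI)
end
```

The resolved value would then be handed to the OpenAI client at construction time, so every request is routed to the configured endpoint.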
-
100% agree that multi-model support and additional configuration would be a good addition. That said, we've intentionally kept this to a single model with zero config to pilot the feature and reduce the footprint of responses, so we can optimize for debugging our core implementation.

As @acidtib mentioned, we're hitting the OpenAI Responses endpoint, and I think the long-term solution is to add new providers entirely, which we've built extension points for in the codebase. For example, we have a generic concept: https://github.com/maybe-finance/maybe/blob/main/app/models/provider/llm_concept.rb

I'm going to move this to a feature request, but want to be clear: we do agree!! Working hard to make the AI tooling more powerful.
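The extension-point pattern described above can be sketched roughly as follows. This is a hypothetical illustration: the method name, module shape, and `OllamaProvider` class are invented for the example and do not reflect the actual `Provider::LlmConcept` interface.

```ruby
# Hypothetical sketch of a provider extension point (names are
# illustrative, not the real Provider::LlmConcept interface).
module LlmConcept
  # Each provider must supply its own chat_response implementation.
  def chat_response(prompt, model:)
    raise NotImplementedError, "#{self.class} must implement chat_response"
  end
end

# A new provider would include the concept and implement its methods.
class OllamaProvider
  include LlmConcept

  def initialize(uri_base: "http://localhost:11434/v1")
    @uri_base = uri_base
  end

  def chat_response(prompt, model:)
    # A real implementation would POST to "#{@uri_base}/chat/completions";
    # this stub just echoes its inputs so the sketch stays runnable.
    { model: model, uri: @uri_base, prompt: prompt }
  end
end
```

The appeal of this design is that local-model support becomes a new provider class rather than a special case threaded through the OpenAI client.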
-
Boosting this since it would be great to have. I don't want to send my data to OpenAI, so I want to use my local LLM models. Having the
-
Amazing feature, having OpenAI integration.
It would be great to be able to pass extra configuration to the OpenAI library via env vars.
For example, a base URL, which would allow the use of other OpenAI-compatible APIs, and the ability to specify a different model.
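Concretely, the configuration requested above might look like this. Note that `OPENAI_URI` and `OPENAI_MODEL` are proposed names from this discussion, not settings the app currently reads; the Ollama URL shown is its default local endpoint.

```shell
# Hypothetical config (variable names are proposals, not existing settings):
# point the app at a local Ollama instance and pick a local model.
export OPENAI_URI="http://localhost:11434/v1"
export OPENAI_MODEL="llama3"
```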