Allow configuration of base uri #72
Conversation
This is very useful for using drop-in replacements of the OpenAI API with local models.
Hi @grafst, thank you for your PR. It looks like this introduces a breaking change, because it will fail for all existing users who do not have the new config entry. And in my opinion there is no need to validate that the base URI is a string; maybe we can just enforce a string cast.
Yes, you are totally right; it should just be ignored if the config is not present. Is this better now?
Hi,
So do we still not have any option to pass a custom URL for a self-hosted LLM? Or do we just have to pass it as an HTTP request?
Yes I will look into it soon.
(Replying to Markus Hilsenbeck's comment above, February 17, 2024.)
I would also love to see this added. I'm hoping to use this along with openrouter.ai to get access to many different models from one interface.
FYI - This allowed me to get Ollama working locally with the lib, although I needed openai-php/client#375 as well.
This is much needed! For now, you can add this code to your AppServiceProvider register() method to override the client used.
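A minimal sketch of such an override, assuming the openai-php/client `Factory` API and a hypothetical `openai.base_uri` config key (names for illustration, not from the PR):

```php
<?php

namespace App\Providers;

use Illuminate\Support\ServiceProvider;
use OpenAI;

class AppServiceProvider extends ServiceProvider
{
    public function register(): void
    {
        // Rebind the client so every resolution of OpenAI\Client
        // points at the custom base URI (e.g. a local Ollama server).
        $this->app->singleton(\OpenAI\Client::class, function () {
            return OpenAI::factory()
                ->withApiKey(config('openai.api_key'))
                // 'openai.base_uri' is a hypothetical config key used
                // here for illustration; default to the official API.
                ->withBaseUri(config('openai.base_uri', 'api.openai.com/v1'))
                ->make();
        });
    }
}
```

Because the binding is a singleton, any code that type-hints the client (or resolves it from the container) picks up the custom base URI without further changes.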