Labels: bug (Something isn't working)
Description
- This is actually a bug report.
- I am not getting good LLM Results
- I have tried asking for help in the community on discord or discussions and have not received a response.
- I have tried searching the documentation and have not found an answer.
What Model are you using?
- gpt-3.5-turbo
- gpt-4-turbo
- gpt-4
- Other (please specify)
Describe the bug
The documentation shows that the litellm provider can be used by calling `instructor.from_provider`. However, that method does not support litellm.
To Reproduce
Run the simple example from the documentation. You will get:
```
Traceback (most recent call last):
  File ".../test.py", line 6, in <module>
    client = instructor.from_provider("litellm/gpt-3.5-turbo")
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../.venv/lib/python3.12/site-packages/instructor/auto_client.py", line 475, in from_provider
    raise ConfigurationError(
instructor.exceptions.ConfigurationError: Unsupported provider: litellm. Supported providers are: ['openai', 'azure_openai', 'anthropic', 'google', 'mistral', 'cohere', 'perplexity', 'groq', 'writer', 'bedrock', 'cerebras', 'fireworks', 'vertexai', 'generative-ai', 'ollama', 'xai']
```
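For context, the error message suggests `from_provider` validates the provider prefix of the model string against an allow-list before doing anything else. The following is a minimal sketch of that check (an assumption for illustration, not instructor's actual code; `check_provider` is a hypothetical helper):

```python
# Providers listed in the ConfigurationError message above.
SUPPORTED = [
    'openai', 'azure_openai', 'anthropic', 'google', 'mistral', 'cohere',
    'perplexity', 'groq', 'writer', 'bedrock', 'cerebras', 'fireworks',
    'vertexai', 'generative-ai', 'ollama', 'xai',
]

def check_provider(model_string: str) -> str:
    """Split 'provider/model' and reject providers outside the allow-list.

    Hypothetical helper mirroring the validation implied by the traceback.
    """
    provider, _, _model = model_string.partition("/")
    if provider not in SUPPORTED:
        raise ValueError(f"Unsupported provider: {provider}")
    return provider
```

Since `litellm` is not in the list, any `"litellm/..."` model string fails this check before a client is ever constructed, which matches the traceback.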
Expected behavior
`instructor.from_provider("litellm/gpt-3.5-turbo")` should return a working client backed by litellm, as the documentation describes.