
from_provider doesn't support Litellm as described in documentation #1710

@kouk

Description

  • This is actually a bug report.
  • I am not getting good LLM results.
  • I have tried asking for help in the community on Discord or in Discussions and have not received a response.
  • I have tried searching the documentation and have not found an answer.

What Model are you using?

  • gpt-3.5-turbo
  • gpt-4-turbo
  • gpt-4
  • Other (please specify)

Describe the bug

The documentation shows that the LiteLLM provider can be used by calling instructor.from_provider. However, that method does not support litellm.

To Reproduce

Run the simple example from the documentation; the call fails immediately.
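A minimal reproduction sketch (an assumption of what the documented example boils down to; the model string is taken from the traceback below, and the error is raised by from_provider itself, before any request is made):

import instructor

# Fails at client construction, before any request is sent
client = instructor.from_provider("litellm/gpt-3.5-turbo")

Running this produces: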

Traceback (most recent call last):
  File ".../test.py", line 6, in <module>
    client = instructor.from_provider("litellm/gpt-3.5-turbo")
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../.venv/lib/python3.12/site-packages/instructor/auto_client.py", line 475, in from_provider
    raise ConfigurationError(
instructor.exceptions.ConfigurationError: Unsupported provider: litellm. Supported providers are: ['openai', 'azure_openai', 'anthropic', 'google', 'mistral', 'cohere', 'perplexity', 'groq', 'writer', 'bedrock', 'cerebras', 'fireworks', 'vertexai', 'generative-ai', 'ollama', 'xai']

Expected behavior

instructor.from_provider("litellm/gpt-3.5-turbo") returns a working LiteLLM-backed client, as the documentation describes, instead of raising ConfigurationError.
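For reference, a minimal sketch of the usage the documentation implies, assuming instructor's usual response_model pattern; the Pydantic model and message below are illustrative only, not taken from the docs:

import instructor
from pydantic import BaseModel

class UserInfo(BaseModel):
    name: str
    age: int

# Expected: a LiteLLM-backed client is returned and structured extraction works
client = instructor.from_provider("litellm/gpt-3.5-turbo")
user = client.chat.completions.create(
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John is 25 years old."}],
)
print(user)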

