Conversation

@drajnic commented Sep 13, 2025

  • Allow users to configure a litellm API base and key to use a litellm instance as a model provider.
  • Implement model discovery from the /models and /model_group/info endpoints of the litellm instance.
  • Use the discovered model information, including pricing, to enable cost calculation for litellm models.
  • Improve the cost display to show token usage even when the cost is zero.
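
A minimal sketch of how discovery and cost calculation against a LiteLLM proxy might look. The endpoint path `/model_group/info` and the pricing fields `input_cost_per_token` / `output_cost_per_token` follow LiteLLM's conventions but may differ by version; the helper names and response handling here are illustrative, not the PR's actual implementation:

```python
import json
import urllib.request

def discover_models(api_base: str, api_key: str) -> dict:
    """Fetch model metadata (including pricing) from a LiteLLM proxy.

    Assumes the proxy exposes /model_group/info and that each entry
    carries per-token pricing; field names may vary by LiteLLM version.
    """
    req = urllib.request.Request(
        f"{api_base}/model_group/info",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return {
        m["model_group"]: {
            "input_cost_per_token": m.get("input_cost_per_token") or 0.0,
            "output_cost_per_token": m.get("output_cost_per_token") or 0.0,
        }
        for m in data.get("data", [])
    }

def usage_line(model_info: dict, prompt_tokens: int, completion_tokens: int) -> str:
    """Format a usage line: token counts are shown even when cost is zero."""
    cost = (prompt_tokens * model_info["input_cost_per_token"]
            + completion_tokens * model_info["output_cost_per_token"])
    return f"{prompt_tokens}+{completion_tokens} tokens (${cost:.6f})"
```

Keeping the token counts in the display even at zero cost means free or unpriced models still surface their usage, which is the behavior the last bullet describes.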

@CLAassistant commented Sep 13, 2025

CLA assistant check
All committers have signed the CLA.

@drajnic force-pushed the feature/custom-litellm branch 2 times, most recently from 2bd81bf to 91ce11a on September 13, 2025 06:31
@drajnic force-pushed the feature/custom-litellm branch 2 times, most recently from cf294c4 to dbd8ccc on September 30, 2025 08:42
@drajnic force-pushed the feature/custom-litellm branch from dbd8ccc to 511683d on October 8, 2025 14:30
@ant31 commented Oct 10, 2025

It's already possible to use LiteLLM models and a LiteLLM proxy.
