Currently it is not possible to add the model from Nvidia. It would be great to add this capability, as the API costs are quite low.
Does it not work if you enable Advanced Options and set it as the model? Here are the litellm docs: https://docs.litellm.ai/docs/providers/nvidia_nim
Unless it's not supported by litellm
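If litellm does support it, the usual way to target a specific provider is litellm's provider-prefixed model string, as described in the nvidia_nim docs linked above. A minimal sketch (the helper name and the model id are illustrative, not from this thread):

```python
# Sketch of litellm's provider-prefixed model naming convention.
# litellm routes requests based on a "provider/model" string, e.g.
# "nvidia_nim/<model>" for Nvidia NIM (see the linked litellm docs).
def nim_model_id(model: str) -> str:
    """Prefix a model name with litellm's nvidia_nim provider id."""
    return f"nvidia_nim/{model}"

# The resulting string is what you would paste into an Advanced
# Options "model" field that passes through to litellm.
print(nim_model_id("meta/llama3-70b-instruct"))
# nvidia_nim/meta/llama3-70b-instruct
```

An actual call would then go through `litellm.completion(model=..., messages=...)` with an Nvidia API key configured in the environment.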
Yes @David-Sola, you could access the model via any hosting API that supports it. For instance, OpenRouter should work.