Fallback model and provider #38323
JohnPreston started this conversation in LLMs and Zed Agent
Replies: 0 comments
Hello team,
I have several different LLMs configured, and it would be great to be able to define a chain of models and providers to use.
For example, let's say I have OpenRouter and my local LLM configured; I could use OpenRouter as the primary and fall back to the local model whenever it fails.
The idea is inspired by n8n's AI Agent: have a series of fallback LLMs, so that if any error occurs with the top-tier LLM, the agent moves on to the next one in the chain, and so on.
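To make the intent concrete, here is a minimal sketch of the fallback behaviour I have in mind. This is purely illustrative and not Zed's actual API or configuration; the provider names and the `complete` callables are hypothetical placeholders.

```python
# Illustrative sketch of a fallback chain: try each provider in order and
# return the first successful completion. Not Zed's API; all names are placeholders.
from typing import Callable, Sequence


def complete_with_fallback(
    prompt: str,
    providers: Sequence[tuple[str, Callable[[str], str]]],
) -> str:
    """Try each (name, complete) pair in order; return the first success."""
    last_error: Exception | None = None
    for name, complete in providers:
        try:
            return complete(prompt)
        except Exception as err:  # e.g. rate limit, network error, provider outage
            last_error = err
            continue  # move on to the next provider in the chain
    raise RuntimeError("all providers in the fallback chain failed") from last_error
```

In Zed this would presumably be driven by settings (an ordered list of model/provider entries) rather than user code, but the ordering and "on error, try the next one" semantics are the core of the request.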
Thanks for your consideration.
// I did my best to search in the issues and discussions and did not find a similar one