Open router support #28930
Replies: 6 comments 6 replies
-
This is not only affecting OpenAI models; I get the same error using […].

Relates to #24067, #30174, #30188, #16576
-
I have the same error.
-
This is happening for me now too with OpenAI models like o4-mini (high). I've verified that this is not an issue with OpenRouter by manually chatting with the same model in the OpenRouter UI. In case it's helpful, this is my config:

```json
"language_models": {
  "openai": {
    "api_url": "https://openrouter.ai/api/v1",
    "available_models": [
      {
        "name": "openai/o4-mini",
        "display_name": "o4-mini (OpenRouter)",
        "max_tokens": 200000
      },
      {
        "name": "openai/o4-mini-high",
        "display_name": "o4-mini-high (OpenRouter)",
        "max_tokens": 200000
      },
      {
        "name": "openai/o3",
        "display_name": "o3 (OpenRouter)",
        "max_tokens": 200000
      },
      {
        "name": "openai/gpt-4.5-preview",
        "display_name": "GPT-4.5 (Preview) (OpenRouter)",
        "max_tokens": 128000
      },
      {
        "name": "google/gemini-2.5-pro-preview",
        "display_name": "Gemini 2.5 Pro Preview (OpenRouter)",
        "max_tokens": 1048576
      },
      {
        "name": "anthropic/claude-sonnet-4",
        "display_name": "Claude Sonnet 4 (OpenRouter)",
        "max_tokens": 200000
      },
      {
        "name": "anthropic/claude-sonnet-4:thinking",
        "display_name": "Claude Sonnet 4 Thinking (OpenRouter)",
        "max_tokens": 200000
      },
      {
        "name": "anthropic/claude-opus-4",
        "display_name": "Claude Opus 4 (OpenRouter)",
        "max_tokens": 200000
      }
    ],
    "version": "1"
  }
}
```
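One way to rule out OpenRouter itself (as done above via the UI) is to hit the same endpoint directly, outside Zed. A minimal sketch that just builds the request; the `OPENROUTER_API_KEY` variable name is my choice, not something from this thread:

```python
import json
import os
import urllib.request

# Build a chat-completions request against OpenRouter's
# OpenAI-compatible endpoint, using one of the model ids
# from the config above.
payload = {
    "model": "openai/o4-mini",
    "messages": [{"role": "user", "content": "Say hello."}],
}
req = urllib.request.Request(
    "https://openrouter.ai/api/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {os.environ.get('OPENROUTER_API_KEY', '')}",
        "Content-Type": "application/json",
    },
)

# Actually sending it is left to the reader:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
print(req.full_url)
```

If this request succeeds from the command line but the same model fails inside Zed, that points at Zed's response parsing rather than OpenRouter.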
-
It is now in the latest preview build!
-
I wonder how one could configure it to use "middle-out" compression/transforms as per OpenRouter's docs: https://openrouter.ai/docs/features/message-transforms
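Per those OpenRouter docs, middle-out is requested via a `transforms` field in the request body; it doesn't appear to be something Zed's settings expose, so until then you'd have to call the API yourself. A minimal sketch of the payload (the `transforms` field name comes from OpenRouter's message-transforms documentation, not from Zed):

```python
import json

# OpenRouter accepts a "transforms" array alongside the standard
# OpenAI-style chat-completion fields; "middle-out" compresses the
# middle of prompts that exceed the model's context window.
payload = {
    "model": "anthropic/claude-sonnet-4",
    "messages": [{"role": "user", "content": "Summarize this long document..."}],
    "transforms": ["middle-out"],
}
print(json.dumps(payload, indent=2))
```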
-
Hi, is it possible to select the model provider from Zed?
-
Currently, I have been using the OpenRouter API as a custom endpoint for the OpenAI provider, which gives access to a much larger set of models from different providers through the OpenAI-compatible API.
I was wondering whether OpenRouter could become a separate model provider in the future, because as of right now only custom-specified non-OpenAI models work in the assistant panel. When I try to use an OpenAI model through OpenRouter, I get this error message: "Error interacting with language model: data did not match any variant of untagged enum ResponseStreamResult", which seems to be a Zed-side issue.
If this request is out of scope for the current project, please close this discussion. Thank you for considering!
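For what it's worth, "did not match any variant of untagged enum" is serde's standard error when a JSON payload fits none of an untagged Rust enum's variants; here it suggests OpenRouter returns a stream-event shape that Zed's `ResponseStreamResult` doesn't cover. A rough Python analogy of how untagged parsing fails (the variant shapes and function name are illustrative, not Zed's actual types):

```python
import json

# Mimic serde's untagged-enum behavior: try each known "variant"
# shape in order; if none matches, fail with a catch-all error.
def parse_stream_result(raw: str) -> dict:
    data = json.loads(raw)
    variants = [
        lambda d: {"kind": "chunk", "delta": d["choices"][0]["delta"]},
        lambda d: {"kind": "error", "message": d["error"]["message"]},
    ]
    for variant in variants:
        try:
            return variant(data)
        except (KeyError, IndexError, TypeError):
            continue  # this variant didn't match; try the next one
    raise ValueError("data did not match any variant of untagged enum")

# A payload shaped like neither variant triggers the error:
try:
    parse_stream_result('{"unexpected": true}')
except ValueError as e:
    print(e)  # data did not match any variant of untagged enum
```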