Z.AI GLM model integration #39257
Replies: 3 comments 3 replies
-
I'm using their coding plan package and it works excellently with Zed just by adding:

```json
"language_models": {
  "openai_compatible": {
    "Z.ai": {
      "api_url": "https://api.z.ai/api/coding/paas/v4",
      "available_models": [
        {
          "name": "glm-4.6",
          "display_name": "GLM-4.6",
          "max_tokens": 200000,
          "max_output_tokens": 128000,
          "max_completion_tokens": 128000,
          "capabilities": {
            "tools": true,
            "images": false,
            "parallel_tool_calls": true,
            "prompt_cache_key": true
          }
        }
      ]
    }
  }
}
```
-
I agree. GLM's coding plan is very affordable and works quite well; it's worthy of being a built-in provider option.
-
Now you can also use Ollama Cloud for GLM-4.6 via Zed's Ollama integration. On Ollama (free account) I get ~30-45 t/s, so not stable, but it's quite OK considering I don't pay for it ;) Does anybody have an Ollama subscription (20 USD) to check whether it's faster? Does anybody know the speed of Z.ai (ideally on the Pro plan; Lite is slow) and BigModel?
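For anyone who wants to try the Ollama route, here is a minimal sketch of what the settings block might look like, mirroring the `openai_compatible` example above. The `api_url`, model tag (`glm-4.6:cloud`), and field names are assumptions on my part — check Zed's Ollama provider docs for the actual schema before copying this:

```json
"language_models": {
  "ollama": {
    "api_url": "https://ollama.com",
    "available_models": [
      {
        "name": "glm-4.6:cloud",
        "display_name": "GLM-4.6 (Ollama Cloud)",
        "max_tokens": 200000
      }
    ]
  }
}
```

You'd also need to be signed in to your Ollama account for the cloud models to resolve.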
-
The recent release of GLM-4.6 shows outstanding agentic performance at very attractive prices.