Groq as LLM provider #34682
moneywaters started this conversation in LLMs and Zed Agent
Replies: 2 comments
-
I was able to hook it up, but tool calling isn't working... settings:
-
Which is strange, since it does work through OpenRouter routing to Groq... EDIT: some of the models work for me.
-
Please add Groq: its chips run the Kimi K2 model fast enough to be comparable with Claude Sonnet, at about 80% of the price. Please! I tried adding it via the OpenAI provider, since Groq uses the same API format, but when asked which LLM model it is, Zed replies:
> I don't actually have direct access to information about what model or version I'm running as. The system that hosts me will select the appropriate OpenAI model (likely GPT-4-turbo or GPT-3.5-turbo) based on availability and the context of our conversation.
> So while I can't tell you the exact model version I'm currently running under, you can effectively treat me as a GPT-class large language model with a training cutoff in mid-2023.

(Asking a model to identify itself is unreliable; the reply above just reflects the model's training data, not the endpoint actually being called.)
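For anyone trying the same workaround: Zed's OpenAI provider can be pointed at an OpenAI-compatible endpoint via `api_url` in `settings.json`. A minimal sketch of what such a fragment might look like, assuming Groq's OpenAI-compatible base URL (`https://api.groq.com/openai/v1`); the model id, display name, and `max_tokens` value here are placeholders to check against Groq's model list, and the API key still has to be supplied through the provider's key field or environment variable:

```json
{
  "language_models": {
    "openai": {
      "api_url": "https://api.groq.com/openai/v1",
      "available_models": [
        {
          "name": "moonshotai/kimi-k2-instruct",
          "display_name": "Kimi K2 (Groq)",
          "max_tokens": 131072
        }
      ]
    }
  }
}
```

This only makes the endpoint reachable; whether tool calling works then depends on how the provider's tool-call payloads line up with what Zed's agent expects, which matches the mixed results reported above.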