Is it possible to use the /fetch function with a locally-hosted LLM? #34981
Unanswered
bcostaaa01 asked this question in Help and General Q&A
Replies: 0
Hi all,
I am trying to use LLMs hosted locally with Ollama, but I also want to use the `/fetch` functionality in the chat. So far I have been getting some weird prompts like this one:

[screenshot]

Do you know if this is not supported by Zed at all, or if it is a problem with the model I am using? I was using `llama3.2:latest`.
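For context, here is roughly how my Zed settings.json is configured for Ollama. This is a minimal sketch assuming the default local Ollama endpoint; the exact key names may vary between Zed versions:

```json
{
  // Point Zed's Ollama provider at the local server
  // (11434 is Ollama's default port; adjust if yours differs).
  "language_models": {
    "ollama": {
      "api_url": "http://localhost:11434"
    }
  },
  // Use the locally hosted model as the assistant's default.
  "assistant": {
    "default_model": {
      "provider": "ollama",
      "model": "llama3.2:latest"
    }
  }
}
```

With that in place, I invoke the command in the assistant panel as, e.g., `/fetch https://example.com`, expecting the fetched page content to be inserted into the context before the prompt is sent to the model.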