how to use ollama provider as AI Assistant #24848
Closed
annilq started this conversation in LLMs and Zed Assistant
Replies: 2 comments 1 reply
- My Ollama works fine with the following (default) settings:
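For reference, a minimal sketch of what an Ollama provider entry in Zed's `settings.json` can look like, assuming the `language_models.ollama` keys documented for Zed versions around this one; the `api_url` is Ollama's default address and the model entry is only a placeholder for whatever model is pulled locally:

```jsonc
{
  "language_models": {
    "ollama": {
      // Ollama's default listen address; change only if the server runs elsewhere.
      "api_url": "http://localhost:11434",
      "available_models": [
        {
          // Placeholder: any model already pulled locally (check with `ollama list`).
          "name": "llama3.1:latest",
          "display_name": "Llama 3.1 8B",
          "max_tokens": 32768
        }
      ]
    }
  }
}
```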
1 reply
- Zed can work with Ollama when I reboot the computer.
- Could someone figure out why I can't connect to my local Ollama server with this configuration?
  OS: macOS 14.2 (23C64)
  ollama: 0.5.7
  zed: 0.173.8
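Assuming the assistant settings schema Zed used around version 0.17x, selecting the Ollama provider as the assistant's default model would look roughly like this (the model name is a placeholder and must match a model already pulled with `ollama pull`):

```jsonc
{
  "assistant": {
    // Schema version of the assistant settings (an assumption for Zed ~0.17x).
    "version": "2",
    "default_model": {
      "provider": "ollama",
      // Placeholder: must match a locally available model, e.g. after `ollama pull llama3.1`.
      "model": "llama3.1:latest"
    }
  }
}
```

If the provider still shows as unreachable, it is worth confirming that the Ollama server is actually listening on http://localhost:11434 before digging further into Zed's settings.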