LAN-local LLM support & configs #36048
serialhex started this conversation in LLMs and Zed Agent · Replies: 1 comment
-
Honestly, it's frustrating. The OpenAI API option can be pointed at a LAN-local LM Studio instance, but it never actually gets used, because it requires an API key, which the local server doesn't need.
-
I would like the ability to connect more easily to a local, LAN-based LLM service. I can currently hack something together using the OpenAI method, but it doesn't work as well as it should. If that's enough context, awesome! If not, here's some more info.
I have a tiny netbook I use for all my programming and writing. It is more than sufficient for editing text and compiling programs. Unfortunately, it is woefully underpowered for running something like Ollama: with an Intel Celeron N4020 and just 2GB of RAM (not VRAM, regular RAM), it's not going to run even the smallest of models. I also have a decently powerful computer on my LAN: a 12th-gen i7, 32GB RAM, and an older AMD graphics card with 8GB VRAM. I run Koboldcpp on it and can easily run all my LLM goodness there, with all the cool models I can get from wherever.
At the moment, using the OpenAI option (the only one that works), Zed doesn't detect which model I'm running, and it requires a key (a dummy key works, but that's a clumsy workaround). I would like a "local" or "LAN" LLM option, using the OpenAI API, the Ollama API, the Koboldcpp API, or whatever, and have it be configurable. Ideally it would support multiple configs, since I have a similar setup at work, but the URL and some of the other details are different.
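For reference, here is roughly what the current workaround looks like in Zed's `settings.json`. The host, port, model name, and token limit below are specific to my setup, and I'm not certain this is the exact schema, so treat it as a sketch of the idea rather than a verified config:

```json
{
  "language_models": {
    "openai": {
      // Point the OpenAI provider at a LAN machine running an
      // OpenAI-compatible server (Koboldcpp in my case).
      // "192.168.1.50:5001" is just an example address.
      "api_url": "http://192.168.1.50:5001/v1",
      // The model isn't auto-detected, so it has to be listed manually.
      "available_models": [
        {
          "name": "local-model",
          "max_tokens": 8192
        }
      ]
    }
  }
}
```

On top of this, a dummy API key still has to be entered in the provider settings, which is exactly the friction a dedicated "local/LAN" option could remove.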
All of this would help make Zed a truly local-first editor with LLM support. I'm pretty certain there are people working in organizations that have a LAN-based LLM but can't reach out to internet LLM services, for whatever reason.
Thank you for your time.