Description
I set up a llama server on a machine with a GPU card and deployed llama.cpp on it.
Now I want to connect to the service from another computer, so I modified the llama.vim configuration in my vimrc:
let g:llama_config.endpoint = "http://192.168.1.10:8012/infill"
With the configuration above, Vim throws this error message:
"Job failed with exit code: 7"
Need your help.
Thanks
Activity
m18coppola commented on Mar 7, 2025
You should be able to. Exit code 7 means curl couldn't connect to the host. What happens when you enter
curl 192.168.1.10:8012/models
in the terminal? I suspect that you need to add
--host 192.168.1.10
to your ./llama-server command-line args in order to allow it to accept connections from outside of localhost.
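For example, the server launch could look something like this (a sketch; the model path is a placeholder, while --host and --port are real llama-server flags):

# Bind to the LAN address instead of the default 127.0.0.1 so that other
# machines can reach the server; --host 0.0.0.0 would listen on all interfaces.
# The model path below is illustrative; substitute your own .gguf file.
./llama-server -m models/your-model.gguf --host 192.168.1.10 --port 8012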
eastmoutain commented on Mar 8, 2025

@m18coppola
curl 192.168.1.10:8012/models
from the client to the server works fine. Adding
--host 192.168.1.10
to the llama-server command line instead of --host 127.0.0.1 solves the issue. Thanks a lot, thumbs up!
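For anyone hitting the same problem, a quick way to confirm the fix from the client is to check curl's exit code (assuming llama-server's /health endpoint and the same IP/port as above):

# A successful connection prints a small JSON response and exits with 0;
# exit code 7 would mean the connection itself still fails
# (wrong --host binding or a firewall in the way).
curl http://192.168.1.10:8012/health
echo "curl exit code: $?"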