
Can I use a remote endpoint? #50

Open

Description

@eastmoutain

I set up a llama server on a machine with a GPU card and deployed llama.cpp on that server.
Now I want to connect to the service from another computer, so I modified the llama.vim configuration in my vimrc:

let g:llama_config.endpoint = "http://192.168.1.10:8012/infill"

With this configuration, vim throws the following error message:

"Job failed with exit code: 7"

Need your help.

Thanks

Activity


m18coppola commented on Mar 7, 2025

Contributor

You should be able to. Exit code 7 means curl couldn't connect to the host. What happens when you run curl 192.168.1.10:8012/models in a terminal on the client machine?

I suspect that you need to add --host 192.168.1.10 to your ./llama-server command-line args so that it listens for connections from outside localhost.
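To illustrate the diagnosis, here is a quick connectivity check to run from the client machine (the IP and port follow the configuration quoted above; curl's exit code 7 specifically means "failed to connect to host"):

```shell
# From the client machine: can we reach the server at all?
curl http://192.168.1.10:8012/models

# Inspect curl's exit status; 7 means it failed to connect to the host,
# which usually indicates the server is only listening on loopback
# or a firewall is blocking the port.
echo "curl exit code: $?"
```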


eastmoutain commented on Mar 8, 2025

Author

@m18coppola

Running curl 192.168.1.10:8012/models from the client to the server works fine.

Adding --host 192.168.1.10 to the llama-server command line instead of --host 127.0.0.1 solved the issue.
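For anyone landing here with the same symptom, a minimal sketch of the working server invocation (the model path is an example, not from this thread; --host and --port are standard llama-server options):

```shell
# On the GPU machine: bind llama-server to the LAN interface instead of
# the default loopback address, so remote clients can connect.
./llama-server -m model.gguf --host 192.168.1.10 --port 8012

# Alternatively, bind to all interfaces:
# ./llama-server -m model.gguf --host 0.0.0.0 --port 8012
```

Binding to a specific LAN address limits exposure to that interface, while 0.0.0.0 accepts connections on every interface; either way the port should not be exposed to untrusted networks.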

Thanks a lot, thumbs up!



          Can I use remote endpoint ? · Issue #50 · ggml-org/llama.vim