The README for Crush includes instructions for manually enabling a local LLM via LM Studio. This works quite well 🎉

However, it would be nice if LM Studio could also be shown as a provider in the TUI. A couple of observations regarding LM Studio:
- It runs on a default URL: `http://127.0.0.1:1234`
- It is possible to get the list of models it has downloaded with a GET request:

  ```sh
  curl http://127.0.0.1:1234/v1/models
  ```

  Output:

  ```json
  {
    "data": [
      {
        "id": "meta-llama-3.1-8b-instruct",
        "object": "model",
        "owned_by": "organization_owner"
      },
      {
        "id": "liquid/lfm2-1.2b",
        "object": "model",
        "owned_by": "organization_owner"
      },
      {
        "id": "text-embedding-nomic-embed-text-v1.5",
        "object": "model",
        "owned_by": "organization_owner"
      }
    ],
    "object": "list"
  }
  ```
I believe it would be useful to run this on startup of Crush to generate its crush.json, so that as a user I wouldn't have to set this up manually. Once the file is generated, the user can easily go in and modify it on demand.
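The discovery step described above could be sketched roughly as follows. This is only an illustration, not Crush's actual implementation: the config keys (`providers`, `lmstudio`, `base_url`, `models`) are hypothetical placeholders, since I haven't checked crush.json's real schema; only the `/v1/models` endpoint and its response shape are taken from the output shown above.

```python
import json
import urllib.request

# LM Studio's default local address (from the observations above).
LMSTUDIO_URL = "http://127.0.0.1:1234"

def fetch_model_ids(base_url=LMSTUDIO_URL):
    """GET /v1/models and return the IDs of the downloaded models."""
    with urllib.request.urlopen(f"{base_url}/v1/models") as resp:
        payload = json.load(resp)
    return [model["id"] for model in payload["data"]]

def build_provider_config(model_ids, base_url=LMSTUDIO_URL):
    """Build a provider block from the discovered models.

    NOTE: the key names here are hypothetical, not Crush's real schema.
    """
    return {
        "providers": {
            "lmstudio": {
                "base_url": f"{base_url}/v1",
                "models": [{"id": mid} for mid in model_ids],
            }
        }
    }

# Usage sketch (requires LM Studio running locally):
#   config = build_provider_config(fetch_model_ids())
#   print(json.dumps(config, indent=2))
```

On startup, Crush could attempt this fetch with a short timeout and silently skip the provider if nothing is listening on the port.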