Typescript version: 5.1.1
Package version: [email protected] (happens on 2.1.1-alpha and 2.1.0-alpha too)
Hello,
I have a problem getting the embedding models to work with the TypeScript bindings for gpt4all.
As in the title, when trying to load the model ggml-all-MiniLM-L6-v2-f16, it throws an error:
Should I point the library path somewhere? I thought the libraries would be picked up from node_modules.
I am not sure what I am doing wrong, so any help would be appreciated.
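In case it helps anyone debugging the same "Do you have runtime libraries installed?" / null llmodel error, here is a small self-contained check (Node stdlib only) for whether any native backend libraries actually landed inside the installed package. The `node_modules/gpt4all` location is my assumption about where npm put the package; I don't know the package's actual internal layout, so this just scans the whole directory tree:

```typescript
import * as fs from "node:fs";
import * as path from "node:path";

// Recursively collect native library files (.so/.dylib/.dll) under a directory.
// Returns an empty array if the directory does not exist.
function findNativeLibs(dir: string): string[] {
  if (!fs.existsSync(dir)) return [];
  const found: string[] = [];
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      found.push(...findNativeLibs(full));
    } else if (/\.(so|dylib|dll)$/.test(entry.name)) {
      found.push(full);
    }
  }
  return found;
}

// Assumed install location of the bindings package.
const pkgDir = path.join(process.cwd(), "node_modules", "gpt4all");
const libs = findNativeLibs(pkgDir);
console.log(
  libs.length > 0
    ? libs
    : "no native libraries found — that would explain the null llmodel",
);
```

If this prints nothing found, the prebuilt runtime libraries never made it into the install, which would point at a packaging problem rather than a usage problem.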
cebtenzzre changed the title from "Issue: null inference when creating embedding LLModel" to "typescript: \"Do you have runtime libraries installed?\", null llmodel" on Oct 11, 2023
Problem with: gpt4all backend/bindings