
typescript: "Do you have runtime libraries installed?", null llmodel #1497

Closed
michalusio opened this issue Oct 11, 2023 · 2 comments
Labels: bindings, gpt4all-binding issues, typescript-bindings

Comments

michalusio commented Oct 11, 2023

Problem with: gpt4all backend/bindings

TypeScript version: 5.1.1
Package version: [email protected] (happens on 2.1.1-alpha and 2.1.0-alpha too)

Hello,

I have a problem getting the embedding models working using the TypeScript bindings for gpt4all.

As in the title, when trying to load the model ggml-all-MiniLM-L6-v2-f16, it throws an error:

[screenshot: error output ending in "Do you have runtime libraries installed?", with a null llmodel]

Should I point the library path somewhere explicitly? I thought the runtime libraries would be picked up from node_modules.
I am not sure what I am doing wrong, so any help would be appreciated.
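
For reference, this is roughly what I am running (a minimal sketch; the librariesPath option name and the library location are my assumptions, since I could not find where the runtimes are expected to live):

// sketch.mjs -- minimal reproduction of what I'm attempting (paths are placeholders)
import { loadModel, createEmbedding } from 'gpt4all'

// Model name as given above; I am not sure whether a file extension is required
const embedder = await loadModel('ggml-all-MiniLM-L6-v2-f16', {
  verbose: true,
  type: 'embedding',
  // Assumption: explicitly pointing the bindings at the native runtime libraries,
  // instead of relying on them being found inside node_modules automatically
  librariesPath: '/path/to/gpt4all/runtimes'
})

console.log(createEmbedding(embedder, 'hello world'))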

@cebtenzzre added the bindings, gpt4all-binding issues, and typescript-bindings labels on Oct 11, 2023
@cebtenzzre changed the title from "Issue: null inference when creating embedding LLModel" to typescript: "Do you have runtime libraries installed?", null llmodel on Oct 11, 2023
iimez (Collaborator) commented Oct 31, 2023

Tried reproducing this using #1390:

// node spec/embed.mjs
import { loadModel, createEmbedding, LLModel, DEFAULT_DIRECTORY, DEFAULT_LIBRARIES_DIRECTORY } from '../src/gpt4all.js'

// High-level path: load the GGUF embedding model and embed a string
const embedder = await loadModel("all-MiniLM-L6-v2-f16.gguf", { verbose: true, type: 'embedding'})
console.log(createEmbedding(embedder, "Accept your current situation"))

// Low-level path: construct the LLModel directly with explicit model and library paths
const llm = new LLModel({
  model_name: 'all-MiniLM-L6-v2-f16.gguf',
  model_path: DEFAULT_DIRECTORY,
  library_path: DEFAULT_LIBRARIES_DIRECTORY,
  device: 'cpu'
})
console.log(llm.embed("Embrace the present"))

Both calls behave as expected.
So this was most likely caused by model format changes, and the issue can be closed once the update is merged.
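
If it helps anyone else who lands here: once the GGUF update is merged, the high-level call should only need the new model file name, as in the reproduction above (a sketch reusing that snippet against the published package instead of ../src; everything else left at its defaults):

import { loadModel, createEmbedding } from 'gpt4all'

// Same high-level call as the reproduction above, using the GGUF model file name
const embedder = await loadModel('all-MiniLM-L6-v2-f16.gguf', { verbose: true, type: 'embedding' })
console.log(createEmbedding(embedder, 'Accept your current situation'))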

jacoobes (Collaborator) commented

stale

@cebtenzzre closed this as not planned on Dec 28, 2023