It looks like node-llama-cpp needs to be bumped from "^2.8.0" to "3.1.1" for the "Llama 3.2 3B" model to work. I tried updating, but there are breaking changes between the two node-llama-cpp major versions.
When the model is being initialized:
anythingllm | [backend] error: TypeError: Cannot destructure property '_llama' of 'undefined' as it is undefined.
anythingllm | at new LlamaModel (file:///app/server/node_modules/node-llama-cpp/dist/evaluator/LlamaModel/LlamaModel.js:42:144)
anythingllm | at createLlamaModel (file:///app/server/node_modules/@langchain/community/dist/utils/llama_cpp.js:10:12)
anythingllm | at new ChatLlamaCpp (file:///app/server/node_modules/@langchain/community/dist/chat_models/llama_cpp.js:94:23)
anythingllm | at /app/server/utils/AiProviders/native/index.js:12:27
anythingllm | at async #initializeLlamaModel (/app/server/utils/AiProviders/native/index.js:45:33)
anythingllm | at async #llamaClient (/app/server/utils/AiProviders/native/index.js:57:5)
anythingllm | at async NativeLLM.streamGetChatCompletion (/app/server/utils/AiProviders/native/index.js:134:19)
anythingllm | at async streamChatWithWorkspace (/app/server/utils/chats/stream.js:241:20)
anythingllm | at async /app/server/endpoints/chat.js:86:9
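For context, the `Cannot destructure property '_llama' of 'undefined'` error is consistent with the breaking API change in node-llama-cpp 3.x: models are no longer constructed directly with `new LlamaModel(...)` (the v2.x pattern that `@langchain/community`'s llama_cpp wrapper still uses), but are loaded through a `Llama` instance obtained from `getLlama()`. A minimal sketch of the difference, assuming node-llama-cpp 3.1.1 and a placeholder model path:

```typescript
import {getLlama, LlamaChatSession} from "node-llama-cpp";

// v2.x style (what the @langchain/community wrapper calls, which now throws):
//   const model = new LlamaModel({modelPath});
//   const context = new LlamaContext({model});
//   const session = new LlamaChatSession({context});

// v3.x style: create a Llama runtime instance first, then load the model from it.
const llama = await getLlama();
const model = await llama.loadModel({
    modelPath: "/path/to/Llama-3.2-3B.gguf" // placeholder path
});
const context = await model.createContext();
const session = new LlamaChatSession({
    contextSequence: context.getSequence()
});

const reply = await session.prompt("Hello");
console.log(reply);
```

So bumping the dependency alone is not enough: the `@langchain/community` llama_cpp integration (or the native provider in `/app/server/utils/AiProviders/native/index.js`) also has to be updated to the new initialization flow.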
timothycarambat changed the title from "[BUG]: Can't run Native model in Docker" to "[Chore]: bump node-llama-cpp to 3.1.1 to support newer models" on Oct 9, 2024
How are you running AnythingLLM?
Docker (local)
What happened?
Docker sees my models, but when I start chatting in my workspace I get a "Failed to load model" error.
Are there known steps to reproduce?
No response