## 3.8.1 (2025-05-19)
### Bug Fixes

* **getLlamaGpuTypes:** edge case (#463) (1799127)
* remove prompt completion from the cached context window (#463) (1799127)
---

Shipped with `llama.cpp` release `b5415`

> To use the latest `llama.cpp` release available, run `npx -n node-llama-cpp source download --release latest`. (learn more)