Releases: withcatai/node-llama-cpp
v2.8.12
v3.0.0-beta.32 (2024-06-18)
Bug Fixes
Shipped with llama.cpp release b3166. To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v3.0.0-beta.31 (2024-06-17)
Bug Fixes
- remove CUDA binary compression for Windows (#243) (0b85800)
- improve `inspect gpu` command output (#243) (0b85800)
Shipped with llama.cpp release b3166. To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v3.0.0-beta.30 (2024-06-17)
Bug Fixes
- avoid duplicate context shifts (#241) (1e7c5d0)
- `onProgress` on `ModelDownloader` (#241) (1e7c5d0)
- re-enable CUDA binary compression (#241) (1e7c5d0)
- more thorough tests before loading a binary (#241) (1e7c5d0)
- increase compatibility of prebuilt binaries (#241) (1e7c5d0)
Shipped with llama.cpp release b3166. To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v3.0.0-beta.29 (2024-06-16)
Bug Fixes
Shipped with llama.cpp release b3153. To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v3.0.0-beta.28 (2024-06-15)
Features
- compress CUDA prebuilt binaries (#236) (b89ad2d)
- automatically solve more CUDA compilation errors (#236) (b89ad2d)
Shipped with llama.cpp release b3153. To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v3.0.0-beta.27 (2024-06-12)
Features
Shipped with llama.cpp release b3135. To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v3.0.0-beta.26 (2024-06-11)
Bug Fixes
Shipped with llama.cpp release b3135. To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v3.0.0-beta.25 (2024-06-10)
Bug Fixes
Shipped with llama.cpp release b3091. To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v3.0.0-beta.24 (2024-06-09)
Bug Fixes
Shipped with llama.cpp release b3091. To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`. (learn more)