Releases: withcatai/node-llama-cpp
v2.8.0
2.8.0 (2023-11-06)
Features
v2.7.4
2.7.4 (2023-10-25)
Bug Fixes
- do not download redundant node headers (#80) (ff1644d)
- improve cmake custom options handling (#80) (ff1644d)
- do not set `CMAKE_GENERATOR_TOOLSET` for CUDA (#80) (ff1644d)
- do not fetch information from GitHub when using a local git bundle (#80) (ff1644d)
- GBNF JSON schema string const formatting (#80) (ff1644d)
Features
- adapt to the latest `llama.cpp` interface (#80) (ff1644d)
- print helpful information to help resolve issues when they happen (#80) (ff1644d)
- make portable cmake on Windows more stable (#80) (ff1644d)
- update `CMakeLists.txt` to match `llama.cpp` better (#80) (ff1644d)
v2.7.2
2.7.2 (2023-10-12)
Features
- minor: save and load history in the `chat` command (#71) (dc88531)
v2.7.1
2.7.1 (2023-10-11)
Bug Fixes
- `GeneralChatPromptWrapper` output (#70) (4ff8189)
- improve JSON schema validation error messages (#69) (c41da09)
v2.7.0
2.7.0 (2023-10-11)
Features
- add JSON schema grammar support (#68) (8ceac05)
- add `promptWithMeta` function to `LlamaChatSession` (#68) (8ceac05)
v2.6.3
2.6.3 (2023-10-10)
Bug Fixes
v2.6.2
2.6.2 (2023-10-09)
Bug Fixes
- add documentation to Google Search (#65) (eb61383)
v2.6.1
2.6.1 (2023-10-09)
Bug Fixes