Does the library support only GGUF format models?
Answered by giladgd on Jan 21, 2024
Replies: 2 comments
-
I think so. It's a wrapper over llama.cpp, which itself only supports GGUF, as far as I'm aware.
0 replies
Answer selected by giladgd
node-llama-cpp supports the model file formats supported by llama.cpp. AFAIK, right now the only format it supports is GGUF, which is a format that was invented specifically for llama.cpp.
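As a side note, a quick way to check whether a file is a GGUF model (for example, before trying to load it) is to inspect its magic bytes: GGUF files begin with the ASCII string `GGUF`. This is a minimal sketch, not part of node-llama-cpp's API; the `looksLikeGguf` helper name is made up for illustration.

```javascript
// Sketch: detect a GGUF file by its four-byte magic.
// GGUF files start with the ASCII bytes "GGUF" (0x47 0x47 0x55 0x46).
function looksLikeGguf(buffer) {
  return (
    buffer.length >= 4 &&
    buffer[0] === 0x47 && // 'G'
    buffer[1] === 0x47 && // 'G'
    buffer[2] === 0x55 && // 'U'
    buffer[3] === 0x46    // 'F'
  );
}

// Usage: read the first bytes of a file and check them.
// const fd = require("fs").openSync("model.gguf", "r");
// const head = Buffer.alloc(4);
// require("fs").readSync(fd, head, 0, 4, 0);
// console.log(looksLikeGguf(head));
```

Older ggml-era formats (e.g. files starting with the `ggml` magic, or `.bin` files converted for earlier llama.cpp versions) will fail this check and would need to be re-converted to GGUF to work with current llama.cpp and, by extension, node-llama-cpp.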