Custom stop token #193
hi,
Answered by giladgd on Apr 10, 2024
In version 3 beta you can generate a completion using `LlamaCompletion` and configure `stopGenerationTriggers`. For example:

```javascript
import {fileURLToPath} from "url";
import path from "path";
import {getLlama, LlamaCompletion} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

const llama = await getLlama();
const model = await llama.loadModel({
    modelPath: path.join(__dirname, "models", "stable-code-3b.Q5_K_M.gguf")
});
const context = await model.createContext();
const completion = new LlamaCompletion({
    contextSequence: context.getSequence(),
    stopGenerationTriggers: [
        ["];"]
    ]
});

const input = "const arrayFromOneToTwenty = [1, 2, 3,";
console.log("Input: " + input);

const res = await completion.generateCompletion(input);
console.log("Completion: " + res);
```
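Conceptually, a stop-generation trigger just watches the emitted text for a matching sequence and cuts generation off before it. Here is a minimal self-contained sketch of that idea (illustrative only, not node-llama-cpp internals; the function name and token stream are made up for the example):

```javascript
// Sketch of stop-trigger behavior: generation halts as soon as the
// accumulated output contains one of the trigger strings, and the
// trigger itself is excluded from the returned completion.
// This is a conceptual illustration, not node-llama-cpp's implementation.
function completeWithStopTriggers(tokens, stopTriggers) {
    let output = "";
    for (const token of tokens) {
        output += token;
        for (const trigger of stopTriggers) {
            const index = output.indexOf(trigger);
            if (index !== -1)
                return output.slice(0, index); // stop before the trigger
        }
    }
    return output; // no trigger matched; return everything
}

// Example: completing an array literal, stopping at "];"
const generatedTokens = [" 4", ",", " 5", "]", ";", " console", ".log"];
console.log(completeWithStopTriggers(generatedTokens, ["];"]));
// → " 4, 5"
```

Note that a trigger can span multiple generated tokens (here `"]"` and `";"` arrive separately), which is why matching is done on the accumulated text rather than on individual tokens.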