node-llama-cpp
Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
JavaScript
/**
 * Like `JSON.stringify`, but formats the result the way Python's `json.dumps(value)` does.
 *
 * We format results this way because it matches what many models saw in their training data,
 * so it is what they expect to find in their context state.
*/
export function jsonDumps(value) {
    // Stringify with an indent of 1 space so every key/value pair lands on its own line,
    // then collapse those lines back into a single line, keeping exactly one space
    // after each separating comma (Python's default `json.dumps` style).
    return JSON.stringify(value, null, 1)
        .split("\n")
        .map((line) => {
            line = line.trim();

            if (line.endsWith(","))
                line += " ";

            return line;
        })
        .join("");
}
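For illustration, here is a quick sketch of how the output differs from plain `JSON.stringify`. The sample value is made up; the function is repeated so the snippet runs standalone:

```javascript
// Copy of jsonDumps (from the module above) so this snippet is self-contained.
function jsonDumps(value) {
    return JSON.stringify(value, null, 1)
        .split("\n")
        .map((line) => {
            line = line.trim();
            if (line.endsWith(","))
                line += " ";
            return line;
        })
        .join("");
}

const value = {name: "llama", tags: ["local", "ai"], size: 7};

console.log(JSON.stringify(value));
// → {"name":"llama","tags":["local","ai"],"size":7}

console.log(jsonDumps(value));
// → {"name": "llama", "tags": ["local", "ai"], "size": 7}
```

The second output matches what Python's `json.dumps(value)` produces: a space after each `:` and `,`, but no newlines or indentation.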