node-llama-cpp
Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
/**
 * Like `JSON.stringify`, but produces output formatted the way Python formats it when using `json.dumps(value)`.
*
* We need to format results this way since this is what many models use in their training data,
* so this is what many models expect to have in their context state.
*/
export declare function jsonDumps(value: any): string;
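The main visible difference is whitespace: Python's `json.dumps` inserts a space after each `:` and `,` by default, while `JSON.stringify` emits none. A minimal sketch of that behavior (an illustration under that assumption, not the library's actual implementation, which may also handle additional cases such as non-finite numbers):

```javascript
// Sketch of Python json.dumps-style serialization: same values as
// JSON.stringify, but with a space after every ":" and ",".
// This is a hypothetical helper for illustration, not node-llama-cpp's code.
function jsonDumpsSketch(value) {
    if (value === null || typeof value !== "object")
        // Primitives serialize identically in both formats
        return JSON.stringify(value);

    if (Array.isArray(value))
        // Python joins array items with ", " rather than ","
        return "[" + value.map(jsonDumpsSketch).join(", ") + "]";

    // Python joins object entries with ", " and separates key/value with ": "
    return "{" + Object.entries(value)
        .map(([key, item]) => JSON.stringify(key) + ": " + jsonDumpsSketch(item))
        .join(", ") + "}";
}

console.log(jsonDumpsSketch({a: 1, items: [true, null]}));
// → {"a": 1, "items": [true, null]}
```

Matching the formatting a model saw in its training data keeps the prompt closer to the distribution the model was trained on, which tends to make JSON-related generations more reliable.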