
node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model's output at the generation level.

import process from "process";

export declare function testCmakeBinary(cmakeBinaryPath?: string, { cwd, env }?: {
    cwd?: string;
    env?: typeof process.env;
}): Promise<boolean>;
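
A minimal usage sketch based only on the signature above; the relative import path is an assumption for illustration, not the package's documented public API:

// Hypothetical usage sketch; the import path is assumed, not taken from node-llama-cpp's docs.
import process from "process";
import { testCmakeBinary } from "./testCmakeBinary.js";

// Check whether a cmake binary at the given path can run in the current environment.
// Omitting cmakeBinaryPath presumably falls back to whatever default the implementation uses.
const works: boolean = await testCmakeBinary("/usr/bin/cmake", {
    cwd: process.cwd(),
    env: process.env
});

console.log(works ? "cmake binary is usable" : "cmake binary failed to run");

The returned Promise resolves to a boolean, so callers can branch on whether a working cmake is available (for example, before attempting to build llama.cpp from source) rather than handling a thrown error.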