node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
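A minimal usage sketch in TypeScript, assuming the package's documented getLlama / LlamaChatSession API. The model path is a placeholder, and the createGrammarForJsonSchema call for JSON-schema-constrained output reflects my reading of the package's feature set rather than a verbatim excerpt from its docs:

import {getLlama, LlamaChatSession} from "node-llama-cpp";

// Load the native llama.cpp bindings.
const llama = await getLlama();

// "models/model.gguf" is a placeholder path to a locally downloaded GGUF model.
const model = await llama.loadModel({modelPath: "models/model.gguf"});
const context = await model.createContext();
const session = new LlamaChatSession({contextSequence: context.getSequence()});

// Plain chat completion.
const answer = await session.prompt("Summarize what llama.cpp does in one sentence.");
console.log(answer);

// Constrain the next response to a JSON schema at the generation level
// (assumed API: createGrammarForJsonSchema returns a grammar accepted by prompt()).
const grammar = await llama.createGrammarForJsonSchema({
    type: "object",
    properties: {summary: {type: "string"}}
});
const json = await session.prompt("Return the summary as JSON.", {grammar});
console.log(JSON.parse(json));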

import { LlamaOptions } from "../getLlama.js";

export declare function getExampleUsageCodeOfGetLlama(getLlamaOptions: LlamaOptions | "lastBuild" | undefined, prefix?: string, wrapWithSeparators?: boolean): string;
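A hedged sketch of how this internal CLI helper might be called, based only on the signature above; the relative import path is hypothetical, and the gpu option assumes LlamaOptions accepts one:

import {getExampleUsageCodeOfGetLlama} from "./getExampleUsageCodeOfGetLlama.js"; // hypothetical path

// Produce example getLlama() usage code for the given options,
// indented with four spaces and wrapped with separator lines.
const snippet = getExampleUsageCodeOfGetLlama({gpu: false}, "    ", true);
console.log(snippet);

// Passing "lastBuild" (or undefined) instead of an options object is also
// allowed by the declared signature.
const lastBuildSnippet = getExampleUsageCodeOfGetLlama("lastBuild");
console.log(lastBuildSnippet);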