node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.

import { BuildOptions } from "../types.js";

export declare function getBuildFolderNameForBuildOptions(buildOptions: BuildOptions): Promise<{
    withoutCustomCmakeOptions: string;
    withCustomCmakeOptions: string;
    binVariant: string;
}>;
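
The declaration above only exposes the function's shape: given a BuildOptions value, it resolves a folder name that ignores custom CMake options, a folder name that accounts for them, and a bin variant identifier. As a hedged usage sketch, here is how a caller might combine those names into an on-disk build path. The helper's import path, the resolveBuildFolderPath wrapper, and the buildsDir and useCustomCmakeOptions parameters are illustrative assumptions, not part of node-llama-cpp's actual API.

// Usage sketch (assumptions): the import path of the helper is hypothetical,
// and the real shape of BuildOptions is defined in "../types.js".
import { getBuildFolderNameForBuildOptions } from "./getBuildFolderNameForBuildOptions.js";
import type { BuildOptions } from "../types.js";
import path from "node:path";

export async function resolveBuildFolderPath(
    buildsDir: string,
    buildOptions: BuildOptions,
    useCustomCmakeOptions: boolean
): Promise<string> {
    const { withoutCustomCmakeOptions, withCustomCmakeOptions, binVariant } =
        await getBuildFolderNameForBuildOptions(buildOptions);

    // Pick the folder name variant that matches whether custom CMake options
    // are in play; binVariant is logged here only to show it is available.
    console.log("bin variant:", binVariant);

    const folderName = useCustomCmakeOptions
        ? withCustomCmakeOptions
        : withoutCustomCmakeOptions;

    return path.join(buildsDir, folderName);
}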