node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.

import { Llama } from "../Llama.js";

/**
 * This is used to access various methods in the addon side without actually using a backend
 */
export declare function getLlamaWithoutBackend(): Promise<Llama>;
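For illustration, a minimal sketch of how such an internal helper might be consumed from within the package source. The import path below is an assumption (inferred from the file's relative import of ../Llama.js), not part of this file; application code would normally go through the package's public getLlama() entry point instead.

// Hypothetical consumer inside the package source (import path assumed):
import { getLlamaWithoutBackend } from "./bindings/utils/getLlamaWithoutBackend.js";

async function inspectAddon() {
    // Resolves to a Llama instance that exposes addon-side methods
    // without initializing a compute backend.
    const llama = await getLlamaWithoutBackend();
    return llama;
}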