node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.

import { ChatModelFunctions } from "../../../types.js";
import { ChatWrapper } from "../../../ChatWrapper.js";

export declare class LlamaFunctionCallValidationError<const Functions extends ChatModelFunctions> extends Error {
    readonly functions: Functions;
    readonly chatWrapper: ChatWrapper;
    readonly callText: string;

    constructor(message: string, functions: Functions, chatWrapper: ChatWrapper, callText: string);
}
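The declaration above describes an error class that carries the offending function-call text alongside the function definitions and chat wrapper in use. As a rough sketch of how such an error could be constructed and caught, here is a self-contained version with the `ChatModelFunctions` and `ChatWrapper` types replaced by simplified stand-ins (the real types live inside node-llama-cpp; the `getWeather` function and the catch-site logic below are illustrative assumptions, not the library's actual behavior):

```typescript
// Simplified stand-ins for node-llama-cpp's internal types (assumptions,
// only the shape needed for this sketch).
type ChatModelFunctions = Record<string, {description?: string}>;
type ChatWrapper = {wrapperName: string};

// Mirrors the declared class: same fields, same constructor signature.
class LlamaFunctionCallValidationError<const Functions extends ChatModelFunctions> extends Error {
    public readonly functions: Functions;
    public readonly chatWrapper: ChatWrapper;
    public readonly callText: string;

    public constructor(message: string, functions: Functions, chatWrapper: ChatWrapper, callText: string) {
        super(message);
        this.functions = functions;
        this.chatWrapper = chatWrapper;
        this.callText = callText;
    }
}

// A caller might catch the error to inspect which call text failed validation:
const functions = {getWeather: {description: "Get the current weather"}};
const wrapper: ChatWrapper = {wrapperName: "ExampleWrapper"};

try {
    throw new LlamaFunctionCallValidationError(
        "Model called an undefined function",
        functions,
        wrapper,
        "getTime()"
    );
} catch (err) {
    if (err instanceof LlamaFunctionCallValidationError)
        console.log(`Invalid call: ${err.callText}`);
}
```

Carrying `functions` and `callText` on the error lets the catch site report exactly what the model emitted versus which functions were actually defined.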