node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.

export default function withOra<T>(
    message: string | {
        loading: string;
        success?: string;
        fail?: string;
        useStatusLogs?: boolean;
        noSuccessLiveStatus?: boolean;
    },
    callback: () => Promise<T>
): Promise<T>;
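Judging by the signature, `withOra` runs an async callback while displaying a status message (the name suggests the `ora` spinner package): a loading message while the promise is pending, then a success or failure message depending on the outcome, with the callback's result passed through. A minimal dependency-free sketch of that behavior, using `console.log` as a stand-in for the real spinner (the `withOraSketch` name and log format are illustrative, not the library's actual implementation):

```typescript
// Hypothetical sketch of a withOra-style wrapper; console.log stands in
// for the ora spinner so the example stays self-contained.
type OraMessage =
    | string
    | {
          loading: string;
          success?: string;
          fail?: string;
      };

async function withOraSketch<T>(
    message: OraMessage,
    callback: () => Promise<T>
): Promise<T> {
    const msg = typeof message === "string" ? { loading: message } : message;

    console.log(`... ${msg.loading}`); // spinner would start here
    try {
        const result = await callback();
        console.log(`ok: ${msg.success ?? msg.loading}`); // spinner success
        return result; // the callback's resolved value passes through
    } catch (err) {
        console.log(`failed: ${msg.fail ?? msg.loading}`); // spinner failure
        throw err;
    }
}

// Usage: wrap any async operation and keep its return value.
withOraSketch(
    { loading: "Loading model", success: "Model loaded" },
    async () => 42
).then((n) => console.log(`result: ${n}`));
```

The optional `useStatusLogs` and `noSuccessLiveStatus` flags in the declaration presumably tune how status lines are rendered; they are omitted here since their exact semantics are not documented in this file.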