node-llama-cpp
Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
TypeScript
/**
* Call the functions in the array one by one and return the result of the first one that doesn't throw an error.
*
 * If all functions throw an error, rethrow the last function's error.
*/
export declare function getFirstValidResult<const T extends (() => any)[]>(options: T): ReturnType<T[number]>;
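
The file above only declares the type; based solely on the JSDoc, a minimal sketch of the described behavior could look like this (the actual implementation inside node-llama-cpp may differ):

export function getFirstValidResult<const T extends (() => any)[]>(options: T): ReturnType<T[number]> {
    let lastError: unknown;

    // Try each function in order; the first one that returns without throwing wins
    for (const option of options) {
        try {
            return option();
        } catch (err) {
            lastError = err;
        }
    }

    // All of the functions threw, so rethrow the last error
    throw lastError;
}

A usage example, assuming the function is exported from the package root (the exact import path is an assumption):

import {getFirstValidResult} from "node-llama-cpp";

// The first function throws on invalid JSON, so the second function's result is returned
const value = getFirstValidResult([
    () => JSON.parse("{ not valid json") as string,
    () => "fallback"
]);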