node-llama-cpp

Run AI models locally on your machine with node.js bindings for llama.cpp. Enforce a JSON schema on the model output on the generation level

export declare function getPlatform(): "aix" | "freebsd" | "haiku" | "linux" | "openbsd" | "sunos" | "netbsd" | "win" | "mac";
export type BinaryPlatform = ReturnType<typeof getPlatform>;
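
A minimal sketch of how code consuming these declarations might use them, assuming the declarations above are in scope; the helper name pickBinaryFolder and the folder strings are hypothetical and not part of node-llama-cpp.

// Hypothetical consumer of the declarations above.
// Maps each BinaryPlatform value to a prebuilt-binary folder name;
// the folder names are illustrative only.
export function pickBinaryFolder(platform: BinaryPlatform): string {
    switch (platform) {
        case "win":
            return "win-x64";
        case "mac":
            return "mac-arm64";
        case "linux":
            return "linux-x64";
        default:
            // Remaining Unix-like platforms fall back to a generic build name.
            return `unix-${platform}`;
    }
}

// Example usage: choose a folder for the current platform.
const folder = pickBinaryFolder(getPlatform());

Because BinaryPlatform is derived with ReturnType<typeof getPlatform>, the union of platform strings stays in sync with the function's declared return type; narrowing it in a switch like the one above lets the compiler flag any newly added platform that a consumer has not handled.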