node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.

export class NoBinaryFoundError extends Error {
    /** @internal */
    constructor(message = "NoBinaryFoundError") {
        super(message);
    }
}
//# sourceMappingURL=NoBinaryFoundError.js.map