inference-server

Libraries and a server for building AI applications. Adapters to various native bindings enable local inference. Integrate it into your application, or run it as a microservice.

import {
    ChatHistoryItem,
    ChatModelFunctions,
    LlamaChatResponse,
    LlamaChatResponseFunctionCall,
} from 'node-llama-cpp'

export interface LlamaChatResult<T extends ChatModelFunctions = any> {
    responseText: string | null
    functionCalls?: LlamaChatResponseFunctionCall<T>[]
    stopReason: LlamaChatResponse['metadata']['stopReason']
}

export type ContextShiftStrategy =
    | ((options: { chatHistory: ChatHistoryItem[]; metadata: any }) => {
          chatHistory: ChatHistoryItem[]
          metadata: any
      })
    | null
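
The ContextShiftStrategy type lets a caller decide which parts of the chat history to keep when the context window fills up: it receives the current history plus strategy metadata and returns a trimmed history plus updated metadata. As a minimal sketch (the 'inference-server' import path and the fixed message budget are assumptions, not part of this file), a strategy might keep the first message and the most recent turns:

import { ChatHistoryItem } from 'node-llama-cpp'
// Hypothetical import path; the actual export location within the package may differ.
import type { ContextShiftStrategy } from 'inference-server'

// Keep the first history item (typically the system prompt) and the most
// recent messages, dropping older turns from the middle of the conversation.
const keepRecentMessages: ContextShiftStrategy = ({ chatHistory, metadata }) => {
    const maxItems = 20 // assumed budget; real code would derive this from the context size
    if (chatHistory.length <= maxItems) {
        return { chatHistory, metadata }
    }
    const [first, ...rest] = chatHistory
    const trimmed: ChatHistoryItem[] = [
        first,
        ...rest.slice(rest.length - (maxItems - 1)),
    ]
    return { chatHistory: trimmed, metadata }
}

Because the type also allows null, a caller can pass null to opt out of custom shifting and fall back to whatever default behavior the consuming code provides.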