@promptbook/google

import type { LlmExecutionTools } from '../../../../execution/LlmExecutionTools';
import type { LlmExecutionToolsWithTotalUsage } from './LlmExecutionToolsWithTotalUsage';

/**
 * Intercepts LLM tools and counts the total usage of the tools.
 *
 * This function wraps the provided `LlmExecutionTools` with a proxy that tracks the cumulative
 * usage (tokens, cost, etc.) across all model calls. It provides a way to monitor spending
 * in real time through an observable.
 *
 * @param llmTools - The LLM tools to be intercepted and tracked
 * @returns Full proxy of the tools with added usage-tracking capabilities
 * @public exported from `@promptbook/core`
 */
export declare function countUsage<TLlmTools extends LlmExecutionTools>(llmTools: TLlmTools): TLlmTools & LlmExecutionToolsWithTotalUsage;

/**
 * TODO: [🧠][💸] Maybe make some common abstraction `interceptLlmTools` and use it here (or use a JavaScript Proxy?)
 * TODO: [🧠] Is there some meaningful way to test this util?
 * TODO: [🧠][🌯] Maybe a way to hide the ability to `get totalUsage`
 * > const [llmToolsWithUsage, getUsage] = countTotalUsage(llmTools);
 * TODO: [👷‍♂️] Write a comprehensive manual explaining the construction and usage of LLM tools in the Promptbook ecosystem
 */
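The declaration above only describes the type surface. As an illustrative sketch of the proxy-based counting technique the doc comment (and the `interceptLlmTools` TODO) alludes to, here is a minimal, self-contained version. The `MinimalLlmTools` interface, `callChatModel` method, and `countUsageSketch` helper are all hypothetical simplifications, not the actual `@promptbook/core` implementation:

```typescript
// Hypothetical, simplified stand-ins for the real Promptbook types.
type Usage = { calls: number; tokens: number };

interface MinimalLlmTools {
    callChatModel(prompt: string): { content: string; tokens: number };
}

// Wraps the tools in a Proxy that accumulates usage across all model calls,
// exposing the running total as a `totalUsage` property.
function countUsageSketch<T extends MinimalLlmTools>(tools: T): T & { totalUsage: Usage } {
    const totalUsage: Usage = { calls: 0, tokens: 0 };
    const proxy = new Proxy(tools, {
        get(target, prop, receiver) {
            if (prop === 'totalUsage') {
                return totalUsage;
            }
            if (prop === 'callChatModel') {
                // Intercept the model call to record usage before returning the result.
                return (prompt: string) => {
                    const result = target.callChatModel(prompt);
                    totalUsage.calls += 1;
                    totalUsage.tokens += result.tokens;
                    return result;
                };
            }
            return Reflect.get(target, prop, receiver);
        },
    });
    return proxy as T & { totalUsage: Usage };
}

// Usage with a toy "model" that counts characters as tokens:
const tools = countUsageSketch({
    callChatModel: (prompt: string) => ({ content: `echo: ${prompt}`, tokens: prompt.length }),
});
tools.callChatModel('hello');
console.log(tools.totalUsage); // { calls: 1, tokens: 5 }
```

The real `countUsage` returns the intersection type `TLlmTools & LlmExecutionToolsWithTotalUsage`, so the wrapped tools remain drop-in compatible wherever the original tools were accepted, while also exposing the usage-tracking surface.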