llama-cpp-capacitor
Version:
A native Capacitor plugin that embeds llama.cpp directly into mobile apps, enabling offline AI inference with a chat-first API design. Supports both simple text generation and advanced chat conversations with system prompts, as well as multimodal processing, TTS, and LoRA.
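A chat-first API of this kind is typically driven by an array of role-tagged messages. The sketch below illustrates that pattern in TypeScript; the interface and method names (`LlamaContext`, `completion`, the `messages` parameter) are illustrative assumptions, not the plugin's confirmed surface, and a stub stands in for the native bridge so the example is self-contained.

```typescript
// Hypothetical message shape for a chat-first completion API.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

interface CompletionResult {
  text: string;
}

// Assumed context object; in the real plugin this would wrap the
// native llama.cpp instance loaded from a local model file.
interface LlamaContext {
  completion(params: {
    messages: ChatMessage[];
    n_predict?: number; // assumed llama.cpp-style token limit
  }): Promise<CompletionResult>;
}

// Stub implementation standing in for the native bridge,
// so this sketch runs without a device or model.
const stubContext: LlamaContext = {
  async completion({ messages }) {
    const lastUser = messages.filter((m) => m.role === 'user').pop();
    return { text: `echo: ${lastUser?.content ?? ''}` };
  },
};

// A system prompt plus a user turn is the minimal chat request.
async function chat(ctx: LlamaContext, prompt: string): Promise<string> {
  const { text } = await ctx.completion({
    messages: [
      { role: 'system', content: 'You are a concise assistant.' },
      { role: 'user', content: prompt },
    ],
    n_predict: 128,
  });
  return text;
}
```

With the real plugin, `stubContext` would be replaced by a context initialized from an on-device GGUF model; the calling code stays the same.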
| Filename | Content Type | Size |
|---|---|---|
| ../ | | |
| | | 70.8 kB |
| | | 0 B |
| | | 84 B |
| | | 4.55 kB |
| | | 4.58 kB |
| | | 4.55 kB |
| | | 4.65 kB |
| | | 56.2 kB |