llama-cpp-capacitor
Version:
A native Capacitor plugin that embeds llama.cpp directly into mobile apps, enabling offline, on-device AI inference with a chat-first API design. Complete iOS and Android support: text generation, chat, multimodal input, TTS, LoRA adapters, embeddings, and more.