llama-cpp-capacitor
A native Capacitor plugin that embeds llama.cpp directly into mobile apps, enabling offline AI inference with a chat-first API design. Full iOS and Android support for text generation, chat, multimodal input, TTS, LoRA adapters, embeddings, and more.