llama-cpp-capacitor
A native Capacitor plugin that embeds llama.cpp directly into mobile apps, enabling offline AI inference with chat-first API design. Complete iOS and Android support: text generation, chat, multimodal, TTS, LoRA, embeddings, and more.