Frontend · arusatech/llama-cpp
llama-cpp-capacitor
A native Capacitor plugin that embeds llama.cpp directly into mobile apps, enabling offline AI inference with a chat-first API design. Complete iOS and Android support: text generation, chat, multimodal input, TTS, LoRA adapters, embeddings, and more.
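To give a sense of what a chat-first call through a Capacitor plugin like this might look like, here is a minimal TypeScript sketch. Only `registerPlugin` from `@capacitor/core` is the real Capacitor API; the plugin name, the method names (`initContext`, `completion`, `releaseContext`), the option fields, and the model path are illustrative assumptions, not this plugin's documented surface. Consult the linked repository for the actual API.

```typescript
// Hypothetical usage sketch. `registerPlugin` is genuine Capacitor API;
// everything about the plugin's interface below is an assumption made
// for illustration -- check the repo for the real method names.
import { registerPlugin } from '@capacitor/core';

// Assumed interface: chat-first completion against a local GGUF model.
interface LlamaCppPlugin {
  initContext(options: { modelPath: string; nCtx?: number }): Promise<{ contextId: number }>;
  completion(options: {
    contextId: number;
    messages: { role: 'system' | 'user' | 'assistant'; content: string }[];
    nPredict?: number;
  }): Promise<{ text: string }>;
  releaseContext(options: { contextId: number }): Promise<void>;
}

const LlamaCpp = registerPlugin<LlamaCppPlugin>('LlamaCpp');

async function demo(): Promise<void> {
  // Load a GGUF model shipped with the app (path is illustrative).
  const { contextId } = await LlamaCpp.initContext({
    modelPath: 'models/llama-3.2-1b-q4.gguf',
    nCtx: 2048,
  });

  // Chat-first: pass structured messages rather than a raw prompt string.
  const { text } = await LlamaCpp.completion({
    contextId,
    messages: [
      { role: 'system', content: 'You are a concise assistant.' },
      { role: 'user', content: 'Summarize llama.cpp in one sentence.' },
    ],
    nPredict: 128,
  });
  console.log(text);

  // Free the native context when done; inference runs fully on-device.
  await LlamaCpp.releaseContext({ contextId });
}
```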
Suggested install command:

`npx skills add arusatech/llama-cpp/llama-cpp-capacitor`

Always inspect the linked repository and skill instructions before running commands. Skills are instructions; permissions and execution still matter.
Compatibility
Agent support matrix
3 supported
| Agent | Status |
|---|---|
| Claude Code | Supported |
| OpenCode | Not listed |
| Cursor | Supported |
| MCP | Not listed |
| GitHub Copilot | Not listed |
| Windsurf | Supported |