GijungKim/expo-local-llm
expo-local-llm
Expo module for on-device LLM inference via Apple Foundation Models (iOS) and Gemini Nano (Android)
Suggested install command
`npx skills add GijungKim/expo-local-llm/expo-local-llm`

Always inspect the linked repository and skill instructions before running commands. Skills are instructions; permissions and execution still matter.
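This page does not document the module's JavaScript API, so the sketch below is hypothetical: the `LocalLLM` module name and the `isAvailable`/`generateText` signatures are assumptions, not the actual expo-local-llm interface. It only illustrates the usual shape of an on-device inference call, where availability must be checked first because Apple Foundation Models and Gemini Nano are absent on older OS versions and unsupported hardware. A mock stands in for the native module so the sketch is self-contained.

```typescript
// Hypothetical interface — the real expo-local-llm API may differ.
interface LocalLLMModule {
  // False on devices without Apple Intelligence (iOS) or AICore (Android).
  isAvailable(): Promise<boolean>;
  generateText(prompt: string): Promise<string>;
}

// Mock implementation standing in for the native module in this sketch.
const LocalLLM: LocalLLMModule = {
  isAvailable: async () => true,
  generateText: async (prompt: string) => `echo: ${prompt}`,
};

async function summarize(note: string): Promise<string> {
  // On-device models can be unavailable at runtime, so check before inferring.
  if (!(await LocalLLM.isAvailable())) {
    throw new Error("On-device model unavailable on this device");
  }
  return LocalLLM.generateText(`Summarize: ${note}`);
}

summarize("Expo modules bridge native code to JS.").then(console.log);
```

Wrapping the availability check in the call site (rather than assuming the model exists) is the important pattern here, whatever the real method names turn out to be.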
Compatibility
Agent support matrix
3 supported
| Agent | Status |
|---|---|
| Claude Code | Supported |
| OpenCode | Not listed |
| Cursor | Not listed |
| GitHub Copilot | Not listed |
| Windsurf | Not listed |
| Codex | |