Docs & Writing · daveinchy/llama.native.js
llama.native.js
Use `npm i --save llama.native.js` to run llama.cpp models on your local machine. Features a socket.io server and client that can run inference against the host of the model.
Suggested install command
npm view llama.native.js
Always inspect the linked repository and skill instructions before running commands. Skills are instructions; permissions and execution still matter.
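Both npm commands come from this listing; a reasonable preflight is to inspect the package's published metadata before adding it to a project:

```shell
# Inspect the published metadata (version, repository, dependencies) first
npm view llama.native.js

# Then install into the current project and record it in package.json
npm i --save llama.native.js
```

Note that `npm view` only queries the registry; nothing is downloaded or executed until the install step.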
Compatibility
Agent support matrix
3 supported
| Agent | Status |
|---|---|
| Claude Code | Supported |
| OpenCode | Not listed |
| Cursor | Supported |
| MCP | Not listed |
| GitHub Copilot | Not listed |
| Windsurf | Supported |