AI & Automation · dinakars777 · @dinakars777/cache-llm
@dinakars777/cache-llm
A blazing fast local proxy server that caches LLM API calls to save costs during development.
Suggested install command
`npm view @dinakars777/cache-llm`

Always inspect the linked repository and skill instructions before running commands. Skills are instructions; permissions and execution still matter.
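This page doesn't include usage details, but a caching proxy of this kind is typically used by pointing an OpenAI-compatible client's base URL at the local server, so repeated identical requests during development are served from cache instead of the paid API. Below is a minimal sketch of that client-side setup; the port (3000) and the `/v1` path are assumptions for illustration, not documented values for @dinakars777/cache-llm, so check the package's README for the address it actually listens on.

```typescript
// Sketch: route an OpenAI-compatible client through a local caching proxy.
// The proxy address below is an assumption, not this package's documented default.
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  // Point the SDK at the local proxy instead of api.openai.com so
  // repeated requests during development can be answered from cache.
  baseURL: "http://localhost:3000/v1",
});

async function main() {
  const completion = await client.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: "Hello from the cache proxy sketch" }],
  });
  console.log(completion.choices[0].message.content);
}

main().catch(console.error);
```

Because the proxy sits between the client and the provider, no application code changes beyond the base URL are needed; cache hits simply return faster and without incurring API cost.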
Compatibility
Agent support matrix
3 supported
| Agent | Status |
|---|---|
| Claude Code | Supported |
| OpenCode | Not listed |
| Cursor | Supported |
| MCP | Not listed |
| GitHub Copilot | Not listed |
| Windsurf | Supported |