@robustonian/ollama-ai-provider-fork
Category: Cloud & DevOps
Vercel AI Provider for running LLMs locally using Ollama
Suggested install command
npx skills add robustonian/ollama-ai-provider-fork/@robustonian/ollama-ai-provider-fork

Always inspect the linked repository and skill instructions before running commands. Skills are instructions; permissions and execution still matter.
Compatibility
Agent support matrix
3 supported
| Agent | Status |
|---|---|
| Claude Code | Supported |
| OpenCode | Not listed |
| Cursor | Supported |
| MCP | Not listed |
| GitHub Copilot | Not listed |
| Windsurf | Supported |