Build with Vaner directly.
Vaner ships as a desktop app for everyday users — that's covered on the main page. This page points to the developer-facing docs: CLI install, MCP wiring, self-hosted backends, source builds, and the full API reference. Vaner is a predictive context engine — MCP is one of several surfaces it exposes, alongside a local daemon, a cockpit UI, and per-client plugins.
Install the CLI
One-line installer for Linux/macOS, plus pipx / uv paths and the from-source build.
Connect your AI client
Claude Code, Cursor, Codex CLI, Cline, Continue, Zed, Windsurf, Roo, VS Code, Claude Desktop. One command per client.
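Most of these clients register a stdio MCP server through an "mcpServers" config entry. The sketch below generates one such entry; the binary name `vaner` and the `mcp serve` subcommand are assumptions for illustration, not documented invocations — the per-client page has the real one-liners.

```python
import json

# Hypothetical sketch: "vaner" as the binary name and "mcp serve" as the
# subcommand are assumptions, not documented Vaner invocations. The
# "mcpServers" shape itself is the common format clients such as
# Claude Desktop and Cursor read.
entry = {
    "mcpServers": {
        "vaner": {
            "command": "vaner",        # assumed CLI binary name
            "args": ["mcp", "serve"],  # assumed subcommand
        }
    }
}

print(json.dumps(entry, indent=2))
```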
Supported hardware
Vaner runs everywhere Ollama runs. Per-accelerator VRAM tiers and the model sizes that fit each tier.
MCP tools
The vaner.* tool family — resolve, search, expand, suggest, feedback, predictions, goals, artefacts, and more.
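On the wire, an MCP tool call is a JSON-RPC 2.0 request with the method `tools/call`. The sketch below builds one for `vaner.search`; the argument names (`query`, `limit`) are assumptions for illustration — the tools page documents each tool's actual schema.

```python
import json

# Illustrative only: the JSON-RPC envelope and "tools/call" method come
# from the MCP specification; the arguments to vaner.search are assumed,
# not taken from Vaner's documented schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "vaner.search",
        "arguments": {"query": "retry logic in the gateway", "limit": 5},
    },
}

wire = json.dumps(request)  # what the client sends over stdio or HTTP
```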
Configuration
Backends, compute presets, retention, gateway. Everything you'd put in .vaner/config.toml.
Self-hosted
Docker image with SSO/OIDC for team deployments. Per-user workspaces, audit logs, policy controls.
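A team deployment might look roughly like the compose fragment below. The image path, port, environment variable names, and volume layout are all assumptions for illustration — the Self-hosted page has the real values.

```yaml
# Hypothetical compose sketch — image name, port, and variable names
# are assumptions, not the documented deployment.
services:
  vaner:
    image: ghcr.io/vaner/vaner:latest        # assumed image path
    ports:
      - "8080:8080"
    environment:
      VANER_OIDC_ISSUER: "https://sso.example.com"  # assumed var names
      VANER_OIDC_CLIENT_ID: "vaner"
    volumes:
      - vaner-data:/var/lib/vaner            # per-user workspaces, audit logs

volumes:
  vaner-data:
```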
Architecture
How the engine works inside — signals, scenarios, prepared work, the unified event bus.
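To give a feel for the event-bus idea, here is a toy publish/subscribe bus. It is a conceptual sketch only — the class, topic names, and payloads are invented, and the real engine's internals are documented on the Architecture page.

```python
from collections import defaultdict
from typing import Any, Callable

# Toy pub/sub bus illustrating the "unified event bus" concept.
# All names here (EventBus, "signal.file_edited") are invented.
class EventBus:
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, payload: Any) -> None:
        # Deliver the payload to every handler registered for the topic.
        for handler in self._subscribers[topic]:
            handler(payload)

bus = EventBus()
seen: list[Any] = []
bus.subscribe("signal.file_edited", seen.append)   # hypothetical topic
bus.publish("signal.file_edited", {"path": "src/main.rs"})
```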
Source on GitHub
Apache-2.0. File issues, send PRs, read CONTRIBUTING.md.