A transparent proxy that intercepts every MCP tool call, resource read, and response — streaming it all to a real-time dashboard. Sentry for AI agents.
$ git clone https://github.com/candordotcodes/candor.git

From real-time interception to cost tracking, Candor gives you complete visibility into every agentic workflow.
Zero-config transparent proxy. Point your MCP client through localhost:3100 — every JSON-RPC message is captured without touching your agent code.
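The pass-through model can be sketched in a few lines. This is an illustrative sketch, not Candor's actual internals: each JSON-RPC message is parsed, recorded, and forwarded byte-for-byte, so the agent never notices the middle layer.

```python
import json
import time

def intercept(raw: str, log: list) -> str:
    """Record a JSON-RPC message, then forward it unchanged.

    A transparent proxy never alters the payload; it only observes.
    """
    msg = json.loads(raw)
    log.append({
        "ts": time.time(),
        "method": msg.get("method"),  # e.g. "tools/call"
        "id": msg.get("id"),
        "payload": msg,
    })
    return raw  # forwarded byte-for-byte: the agent sees no difference

log: list = []
request = '{"jsonrpc": "2.0", "id": 1, "method": "tools/call", "params": {"name": "search"}}'
forwarded = intercept(request, log)
print(log[0]["method"])  # → tools/call
```

Because interception is observation-only, removing the proxy later requires no changes to the agent.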
Real-time WebSocket feed of every tool call, resource read, and response. Filter by agent, tool, status, or time range as events stream in.
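The filter semantics can be sketched as a pure function over a stream of captured events. The event shape here (`agent`, `tool`, `status`, `ts` keys) is an assumption for illustration:

```python
def filter_events(events, agent=None, tool=None, status=None, since=None):
    """Keep only events matching every criterion that was given."""
    def keep(e):
        return ((agent is None or e["agent"] == agent)
                and (tool is None or e["tool"] == tool)
                and (status is None or e["status"] == status)
                and (since is None or e["ts"] >= since))
    return [e for e in events if keep(e)]

events = [
    {"agent": "planner", "tool": "search", "status": "ok", "ts": 100},
    {"agent": "planner", "tool": "fetch", "status": "error", "ts": 110},
    {"agent": "coder", "tool": "search", "status": "ok", "ts": 120},
]
# Only the planner's successful calls survive the combined filter.
matches = filter_events(events, agent="planner", status="ok")
```

Criteria left as `None` are simply ignored, so filters compose freely as events stream in.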
Browse historical agent sessions with full event chains. Expand any request/response payload. See aggregate stats per session at a glance.
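The per-session aggregates might be computed along these lines; the session/status event shape is an illustrative assumption, not Candor's schema:

```python
from collections import defaultdict

def session_stats(events):
    """Roll a flat event list up into per-session call and error counts."""
    stats = defaultdict(lambda: {"calls": 0, "errors": 0})
    for e in events:
        s = stats[e["session"]]
        s["calls"] += 1
        if e["status"] == "error":
            s["errors"] += 1
    return dict(stats)

events = [
    {"session": "s1", "status": "ok"},
    {"session": "s1", "status": "error"},
    {"session": "s2", "status": "ok"},
]
stats = session_stats(events)  # one summary row per session
```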
Track estimated costs per session, per tool, and per time period. Set monthly budget alerts. Configurable rate tables for any LLM provider.
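A configurable rate table reduces cost estimation to a lookup and a multiply. The prices below are placeholder numbers, not any provider's actual rates:

```python
RATES = {  # hypothetical per-million-token prices in USD; fully configurable
    "model-a": {"input": 2.50, "output": 10.00},
    "model-b": {"input": 3.00, "output": 15.00},
}

def estimate_cost(model, input_tokens, output_tokens, rates=RATES):
    """Estimate the USD cost of one call from a per-provider rate table."""
    r = rates[model]
    return (input_tokens * r["input"] + output_tokens * r["output"]) / 1_000_000

cost = estimate_cost("model-a", 1_000_000, 200_000)  # 2.50 + 2.00 = 4.50 USD
```

Because the table is plain data, adding a provider means adding a row, not changing code.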
Define custom rules: latency thresholds, cost limits, tool allowlists. Alerts surface in the dashboard and optionally fire webhooks.
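Rule evaluation can be sketched as matching each event against a list of declarative rule dicts. The rule and event field names here are illustrative assumptions:

```python
def evaluate_rules(event, rules):
    """Return the names of every rule this event violates."""
    alerts = []
    for rule in rules:
        if "max_latency_ms" in rule and event["latency_ms"] > rule["max_latency_ms"]:
            alerts.append(rule["name"])
        elif "max_cost_usd" in rule and event["cost_usd"] > rule["max_cost_usd"]:
            alerts.append(rule["name"])
        elif "tool_allowlist" in rule and event["tool"] not in rule["tool_allowlist"]:
            alerts.append(rule["name"])
    return alerts

rules = [
    {"name": "slow-call", "max_latency_ms": 2000},
    {"name": "pricey-call", "max_cost_usd": 0.50},
    {"name": "safe-tools", "tool_allowlist": ["search", "fetch"]},
]
event = {"latency_ms": 2500, "cost_usd": 0.01, "tool": "shell"}
fired = evaluate_rules(event, rules)  # triggers slow-call and safe-tools
```

Each fired rule name maps to a dashboard alert and, optionally, an outgoing webhook.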
git clone https://github.com/candordotcodes/candor.git

Clone the repo and start locally. Zero dependencies on your agent codebase.
Route MCP through localhost:3100

Swap one line in your MCP client config. Candor becomes the transparent middle layer.
candor start

The proxy launches alongside the dashboard. Your agents work exactly as before.
Open localhost:3200

Watch every tool call, resource read, and response stream in real time.
This is a live simulation. Every event, session, and cost metric you see is generated in real time.