BitRouter - Open Intelligence Router for LLM Agents
The agentic proxy for modern agent runtimes. Smart, safe, agent-controlled routing across LLMs, tools, and agents.
Overview
As LLM agents grow more autonomous, humans can no longer hand-pick the best model, tool, or sub-agent for every runtime decision. BitRouter is a proxy layer purpose-built for LLM agents (OpenClaw, OpenCode, etc.): it lets agents discover and route to LLMs, tools, and other agents autonomously, with agent-native control, guardrails, and observability via a CLI and TUI, backed by a high-performance Rust proxy that optimizes for latency, cost, and safety at runtime.
Features
- Multi-provider routing — unified access to OpenAI, Anthropic, Google, and custom providers with cost/performance-aware routing (`core` · `providers` · `config`)
- Streaming & non-streaming — first-class support for both modes across all providers
- Agent firewall — inspect, warn, redact, or block risky content at the proxy layer (`guardrails`)
- MCP gateway — proxy for MCP servers; agents discover and call tools across hosts (`mcp`)
- A2A gateway — agent identity, discovery, and task dispatch via A2A v0.3.0 (`a2a`)
- Skills registry — track and expose agent skills following the agentskills.io standard (`skills`)
- Agentic payments — 402/MPP payment handling for LLMs, tools, and APIs (`api` · `accounts`)
- Observability — per-request spend tracking, metrics, and cost calculation (`observe`)
- CLI + TUI — monitor and control agent sessions in real time (`cli`)
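To make cost/performance-aware routing more concrete, a provider configuration could be sketched as below. This is a hypothetical illustration only: the field names, sections, and routing strategy values are assumptions, not BitRouter's actual schema. The config generated by the setup wizard is the authoritative reference.

```toml
# Hypothetical sketch: field names are illustrative, not BitRouter's real schema.

[providers.openai]
api_key_env = "OPENAI_API_KEY"      # read the key from the environment
models = ["gpt-4o", "gpt-4o-mini"]

[providers.anthropic]
api_key_env = "ANTHROPIC_API_KEY"
models = ["claude-sonnet-4-5"]

[routing]
# Prefer the cheapest provider that stays within a latency budget.
strategy = "cost"
max_latency_ms = 2000
```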
Documentation
- DEVELOPMENT.md — workspace architecture and server composition details
- CONTRIBUTING.md — contribution workflow, issue reporting, and provider updates
- CLAUDE.md — guidance for AI coding agents working in this repository
Quick Start
Install
```shell
cargo install bitrouter
```
Default (setup wizard)
```shell
bitrouter
```
On first launch, BitRouter runs an interactive setup wizard with two modes:
- Cloud — connect to BitRouter Cloud with x402/Solana wallet payments
- BYOK — bring your own API keys for OpenAI, Anthropic, Google, or custom providers
After setup, the TUI and API server start at `http://localhost:8787`.
You can re-run the wizard at any time with `bitrouter init`.
BYOK (bring your own keys)
If you already have provider API keys in your environment, BitRouter auto-detects them — no config file needed:
```shell
export OPENAI_API_KEY=sk-...
bitrouter
# Routes to "openai:gpt-4o" at http://localhost:8787
```
For a foreground server without the TUI, use `bitrouter serve`.
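Since the proxy speaks the OpenAI wire format, any OpenAI-compatible client can talk to it. The snippet below builds such a request with only the Python standard library. Note the assumptions: the `/v1/chat/completions` path and the `provider:model` naming ("openai:gpt-4o") are inferred from the Quick Start above, not verified endpoint documentation.

```python
import json
import urllib.request

# Assumption: BitRouter exposes an OpenAI-compatible Chat Completions
# endpoint at /v1/chat/completions on its default port 8787, and model
# strings follow the "provider:model" convention from the Quick Start.
payload = {
    "model": "openai:gpt-4o",
    "messages": [{"role": "user", "content": "Hello from BitRouter!"}],
}
req = urllib.request.Request(
    "http://localhost:8787/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# With the proxy running: resp = urllib.request.urlopen(req)
print(req.full_url)
print(json.loads(req.data)["model"])
```

The same request works unchanged against any provider BitRouter has configured; only the `model` string changes.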
Agent Skills
Install Agent Skills to give your AI agent the knowledge to register on the BitRouter network, configure services, and start serving requests:
```shell
# Any agent (Claude Code, Copilot, Cursor, Codex, etc.)
npx skills add BitRouterAI/agent-skills
```
Supported Providers
| Provider | Status | Notes |
|---|---|---|
| OpenAI | ✅ | Chat Completions + Responses API |
| Anthropic | ✅ | Messages API |
| Google | ✅ | Generative AI API |
| OpenRouter | ✅ | Chat Completions + Responses API |
Want to see another provider supported? Open an issue or submit a PR — contributions are welcome. If you're a provider interested in first-party integration, reach out on Discord.
Supported Agent Runtimes
BitRouter works as a drop-in proxy for agent runtimes that support custom API base URLs. Point your runtime at `http://localhost:8787` and route to any configured provider.
| Runtime | Integration |
|---|---|
| OpenClaw | Native plugin |
| Claude Code | CLI + Skills |
| ZeroClaw | CLI + Skills |
| Codex CLI | CLI + Skills |
| OpenCode | CLI + Skills |
| Kilo Code | CLI + Skills |
Any agent runtime that can target a custom OpenAI or Anthropic base URL works with BitRouter out of the box. Building an agent runtime or framework? We partner with teams to build native BitRouter integrations — reach out on Discord or open an issue.
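Concretely, for runtimes that honor the common base-URL environment variables, the override can be this small. The variable names below (`OPENAI_BASE_URL`, `ANTHROPIC_BASE_URL`) are conventions many runtimes follow, not something BitRouter mandates; check your runtime's docs for its exact setting.

```shell
# Assumption: your runtime reads the widely used base-URL environment
# variables. OpenAI-style clients usually expect the /v1 suffix.
export OPENAI_BASE_URL="http://localhost:8787/v1"
export ANTHROPIC_BASE_URL="http://localhost:8787"
```

After this, the runtime's normal model calls flow through the local proxy and pick up BitRouter's routing, guardrails, and cost tracking.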
Comparison
| | BitRouter | OpenRouter | LiteLLM |
|---|---|---|---|
| Architecture | Local-first proxy + optional cloud | Cloud-only SaaS | Local proxy (Python) |
| Language | Rust | Closed-source | Python |
| Self-hosted | Yes | No | Yes |
| Agent-native | Yes — built for autonomous agent runtimes | No — human-facing API gateway | Partial — SDK-oriented |
| Agent protocols | MCP + A2A + Skills | No | MCP + A2A |
| Agent firewall | Built-in guardrails (inspect, redact, block) | Yes | Yes |
| Cross-protocol routing | Yes (e.g. OpenAI format → Anthropic provider) | Provider-specific | Yes (unified interface) |
| Agentic payments | Stablecoin (402/MPP) + Fiat | Credit-based billing | No |
| Observability | CLI + TUI + per-request cost tracking | Web dashboard | Logging + callbacks + WebUI |
| Extensibility | Trait-based SDK — import and compose crates | API only | Python middleware |
| Performance | ~10ms | ~30ms (cloud) | ~500ms |
| License | Apache 2.0 | Proprietary | Apache 2.0 |
TL;DR — OpenRouter is a cloud API marketplace for humans picking models. LiteLLM is a Python proxy for unifying provider SDKs. BitRouter is a Rust-native proxy purpose-built for autonomous agents — with agent protocols (MCP, A2A, Skills), guardrails, and agentic payments out of the box.
Roadmap
- Core routing engine and provider abstractions
- OpenAI, Anthropic, and Google adapters
- Interactive setup wizard (`bitrouter init`) with auto-detection
- Custom provider support (OpenAI-compatible / Anthropic-compatible)
- Cross-protocol routing (e.g. OpenAI format → Anthropic provider)
- MCP & A2A & Skills protocol support
- TUI observability dashboard
- Telemetry and usage analytics
- Provider & model routing policy customization
Star History
License
Licensed under the Apache License 2.0.