nanobot

mcp
Security Audit
Warning
Health Warning
  • License — NOASSERTION
  • Description — Repository has a description
  • Active repo — Last push 0 days ago
  • Low visibility — Only 6 GitHub stars
Code Passed
  • Code scan — Scanned 12 files during light audit, no dangerous patterns found
Permissions Passed
  • Permissions — No dangerous permissions requested
Purpose
This platform is a self-hostable, multi-channel AI assistant written in Rust. It ships as a single binary designed to connect to multiple language models and deploy across various communication channels like Discord, Slack, and Telegram.

Security Assessment
Overall Risk: Medium. The automated code scan checked 12 files and found no dangerous patterns, hardcoded secrets, or requests for explicitly dangerous permissions. However, the tool's core purpose inherently requires making external network requests to various LLM providers, communication platforms, and external APIs. Because it is an AI agent platform, it likely processes, stores, and routes sensitive data, including user prompts and chat histories. While the light audit is clean, a comprehensive manual review is recommended to ensure secure handling of API keys and user data.

Quality Assessment
The project is actively maintained, with its most recent code push occurring today. The repository includes continuous integration and deployment workflows, which are strong indicators of professional development practices. While the documentation claims an MIT license, the automated scanner flagged the license as "NOASSERTION," meaning the license file may be missing or misconfigured in the repository. Additionally, community trust and visibility are currently very low, at only 6 GitHub stars, so the tool has not yet been widely vetted by a large developer community.

Verdict
Use with caution — the light code scan is clean and the project is highly active, but extremely low community adoption and unverified licensing mean it lacks widespread trust and requires further manual review before handling sensitive data.
SUMMARY

AI Agent Platform built in Rust — Multi-model, MCP tools, 14+ channel integrations. Self-host or use teai.io

README.md

nanobot

A production-grade AI agent platform written in pure Rust.

One binary. Six channels. Fifty tools. Zero cold-start drama.


Live Demo · API Docs · Report Bug


nanobot is a self-hostable, multi-channel AI assistant that ships as a single Rust binary. It connects to 8+ LLM providers with automatic failover, exposes 50+ agentic tools, and deploys to AWS Lambda for pennies. It powers chatweb.ai and teai.io in production today.

Why nanobot?

|  | nanobot | Typical agent frameworks |
|---|---|---|
| Language | Rust (axum) | Python / TypeScript |
| Cold start | < 50 ms on Lambda ARM64 | 3-10 s |
| Binary | ~9 MB stripped | Hundreds of MB + runtime |
| Channels | Web, LINE, Telegram, Discord, Slack, Facebook | Usually 1-2 |
| LLM failover | Automatic round-robin + circuit breaker | Manual config |
| Voice | Built-in STT + TTS | External service required |
| Self-host | Single binary, zero dependencies | Docker + DB + queue + ... |
| License | MIT | Varies |

Architecture

                         +------------------+
                         |   Your users     |
                         +--------+---------+
                                  |
            +----------+----------+----------+----------+
            |          |          |          |          |
          Web       LINE    Telegram    Discord    Slack ...
            |          |          |          |          |
            +----------+----------+----------+----------+
                                  |
                       +----------v----------+
                       |   API Gateway /     |
                       |   Reverse Proxy     |
                       +----------+----------+
                                  |
                       +----------v----------+
                       |     nanobot         |
                       |  (single binary)    |
                       |                     |
                       |  +-- Auth & Credits |
                       |  +-- Agentic Loop   |
                       |  +-- Tool Runtime   |
                       |  +-- STT / TTS      |
                       |  +-- Memory Engine  |
                       +----+------+----+----+
                            |      |    |
               +------------+   +--+    +------------+
               |                |                    |
         +-------v------+  +-----v------+  +---------v---------+
        | LLM Providers|  |  DynamoDB  |  |   External APIs   |
        | (8+ w/ fail- |  | (sessions, |  | (Brave, Jina,     |
        |  over)       |  |  memory,   |  |  OpenAI TTS, ...) |
        +--------------+  |  credits)  |  +-------------------+
                          +------------+

Features

Multi-LLM with Automatic Failover

nanobot doesn't lock you into a single provider. Configure multiple API keys and it handles the rest -- round-robin load balancing, circuit breakers, and transparent failover across providers.

| Provider | Models | Notes |
|---|---|---|
| OpenRouter | 100+ models | Aggregator -- single key, all models |
| Anthropic | Claude Opus / Sonnet / Haiku | Recommended for reasoning |
| OpenAI | GPT-4o, o4-mini | Broad tool support |
| Google | Gemini 2.5 Pro / Flash | Free tier available |
| DeepSeek | DeepSeek-V3 | Strong at code |
| Moonshot | Kimi-K2.5 | Long context |
| Qwen | Qwen-Max, Qwen-Plus | Alibaba Cloud |
| MiniMax | MiniMax-M2.5 | Fast inference |

Tiered model selection (economy / normal / powerful) lets you balance cost and quality per request.
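The round-robin and circuit-breaker behavior described above can be sketched in a few dozen lines of std-only Rust. This is an illustrative model, not nanobot's actual implementation; the `Provider` and `Router` types, the 3-failure threshold, and the 30-second cooldown are all assumptions.

```rust
use std::time::{Duration, Instant};

// Hypothetical sketch of round-robin provider selection with a simple
// circuit breaker; the real nanobot routing logic may differ.
struct Provider {
    name: &'static str,
    failures: u32,
    open_until: Option<Instant>, // circuit open: skip until this time
}

struct Router {
    providers: Vec<Provider>,
    next: usize, // round-robin cursor
}

impl Router {
    fn new(names: &[&'static str]) -> Self {
        Router {
            providers: names
                .iter()
                .map(|&n| Provider { name: n, failures: 0, open_until: None })
                .collect(),
            next: 0,
        }
    }

    /// Pick the next healthy provider in round-robin order.
    fn pick(&mut self) -> Option<&'static str> {
        let n = self.providers.len();
        for _ in 0..n {
            let i = self.next % n;
            self.next += 1;
            let open = self.providers[i]
                .open_until
                .map_or(false, |t| Instant::now() < t);
            if !open {
                return Some(self.providers[i].name);
            }
        }
        None // every circuit is open
    }

    /// Record a failure; open the circuit after 3 consecutive failures.
    fn report_failure(&mut self, name: &str) {
        if let Some(p) = self.providers.iter_mut().find(|p| p.name == name) {
            p.failures += 1;
            if p.failures >= 3 {
                p.open_until = Some(Instant::now() + Duration::from_secs(30));
            }
        }
    }
}

fn main() {
    let mut router = Router::new(&["anthropic", "openai", "openrouter"]);
    assert_eq!(router.pick(), Some("anthropic"));
    assert_eq!(router.pick(), Some("openai"));
    for _ in 0..3 {
        router.report_failure("openrouter");
    }
    // openrouter's circuit is open, so the cursor skips it.
    assert_eq!(router.pick(), Some("anthropic"));
}
```

The key property is that a failed provider is removed from rotation transparently: callers keep calling `pick()` and never see the outage.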

50+ Built-in Tools

Agentic mode executes multi-step tool chains automatically. Free users get 1 iteration; Pro users get up to 5 with parallel tool execution.
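The iteration cap can be pictured as a loop with a fixed tool-round budget. The sketch below is hypothetical (the `Step` enum and `agentic_loop` function are not from the nanobot codebase); only the 1-vs-5 iteration limits come from the README.

```rust
// Illustrative sketch of an iteration-capped agentic loop; type and
// function names here are invented, not taken from nanobot.
enum Step {
    ToolCall(String), // the model wants to run a tool
    Answer(String),   // the model produced a final answer
}

// Stand-in for one LLM turn: answer once no tool calls remain.
fn llm_turn(pending_tools: &mut Vec<String>) -> Step {
    match pending_tools.pop() {
        Some(t) => Step::ToolCall(t),
        None => Step::Answer("done".to_string()),
    }
}

/// Run the loop, allowing at most `max_iterations` rounds
/// (1 for free users, up to 5 for Pro, per the README).
fn agentic_loop(mut pending_tools: Vec<String>, max_iterations: usize) -> String {
    for _ in 0..max_iterations {
        match llm_turn(&mut pending_tools) {
            Step::ToolCall(tool) => {
                // A real implementation would execute the tool and feed
                // its output back into the next model turn.
                let _ = tool;
            }
            Step::Answer(text) => return text,
        }
    }
    // Iteration budget exhausted: force a final answer.
    "partial answer (iteration cap reached)".to_string()
}

fn main() {
    // Two tool calls but only one allowed iteration: the cap kicks in.
    let free = agentic_loop(vec!["web_search".into(), "calculator".into()], 1);
    assert_eq!(free, "partial answer (iteration cap reached)");

    // A Pro budget of 5 rounds is enough to finish.
    let pro = agentic_loop(vec!["web_search".into()], 5);
    assert_eq!(pro, "done");
}
```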

Core (always available)

| Tool | Description |
|---|---|
| web_search | Brave / Bing / Jina 3-tier fallback |
| web_fetch | Jina Reader for JS-heavy pages |
| browser | CSS selector queries, screenshots, forms |
| code_execute | Sandboxed shell execution |
| calculator | Arbitrary math expressions |
| weather | Global weather data |
| wikipedia | Encyclopedia lookup |
| translation | Multi-language translation |
| datetime | Time zones, date math |
| qr_code | QR code generation |
| file_read / file_write / file_list | Workspace file operations |
| filesystem | Glob find + regex grep |
| csv_analysis | Summary, filter, aggregate |
| image_generate | DALL-E image generation |
| music_generate | Suno API |
| video_generate | Kling API |

Integrations (API key required)

| Tool | Description |
|---|---|
| github | Read/write files, create PRs |
| gmail | Send and search email |
| google_calendar | Event management |
| slack | Post and search messages |
| discord | Channel messaging |
| notion | Page and database queries |
| spotify | Playback control, search |
| postgresql | Direct SQL queries |
| youtube_transcript | Video transcript extraction |
| arxiv_search | Academic paper search |
| news_search | News aggregation |
| webhook | Trigger arbitrary webhooks |
| phone_call | Amazon Connect integration |
| web_deploy | One-click static site deploy |

Developer tools (CLI / workspace mode)

| Tool | Description |
|---|---|
| git_status | Working tree status |
| git_diff | Staged/unstaged diffs |
| git_commit | Commit with message |
| run_linter | Clippy / ESLint / etc. |
| run_tests | Run project test suite |

Skill Marketplace

Users can publish and install custom skills:

  • Prompt skills -- inject system prompts for specialized personas or domain knowledge
  • Tool skills -- expose any HTTPS endpoint as an LLM-callable tool via webhook

Skills are stored in DynamoDB and loaded at chat time. No redeploy required.
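One way to picture the two skill kinds is a small enum with a loader that merges them into a chat request. This is a hypothetical model; nanobot's real skill schema in DynamoDB is not published in the README, so the `Skill` enum and `load_skills` function below are assumptions.

```rust
// Hypothetical model of the two skill kinds described above.
enum Skill {
    /// Injects extra system-prompt text for a persona or domain.
    Prompt { text: String },
    /// Exposes an HTTPS endpoint the model may call as a tool.
    Tool { name: String, endpoint: String },
}

/// Build the system prompt and tool list for one chat request.
fn load_skills(base_prompt: &str, skills: &[Skill]) -> (String, Vec<String>) {
    let mut prompt = base_prompt.to_string();
    let mut tools = Vec::new();
    for skill in skills {
        match skill {
            Skill::Prompt { text } => {
                prompt.push('\n');
                prompt.push_str(text);
            }
            Skill::Tool { name, endpoint } => {
                // A real runtime would register a webhook-backed tool here.
                tools.push(format!("{} -> {}", name, endpoint));
            }
        }
    }
    (prompt, tools)
}

fn main() {
    let skills = vec![
        Skill::Prompt { text: "You are a sommelier.".into() },
        Skill::Tool {
            name: "lookup_wine".into(),
            endpoint: "https://example.com/wine".into(), // hypothetical endpoint
        },
    ];
    let (prompt, tools) = load_skills("You are a helpful assistant.", &skills);
    assert!(prompt.ends_with("You are a sommelier."));
    assert_eq!(tools, vec!["lookup_wine -> https://example.com/wine"]);
}
```

Because skills are assembled at chat time like this, adding one is a data write, not a redeploy.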

Multi-Channel

One codebase serves all channels. Conversations sync across them.

| Channel | Status | Optimizations |
|---|---|---|
| Web (SPA) | Production | Voice-first UI, SSE streaming, auto-TTS |
| LINE | Production | 200-char responses, emoji, bullet points |
| Telegram | Production | 300-char responses, Markdown formatting |
| Facebook Messenger | Production | 300-char concise replies |
| Discord | Production | Webhook integration |
| Slack | Production | Bot token integration |

Voice-First

  • STT: Web Speech API (browser-side, zero server cost)
  • TTS: OpenAI tts-1 with response caching
  • Auto-TTS: Voice input triggers automatic voice output
  • Push-to-talk UI with visual feedback

Long-Term Memory

Two-layer auto-consolidation inspired by OpenClaw:

Session context (20 messages)
        |
        v
Daily log (auto-appended after each conversation)
        |
        v
Long-term memory (consolidated summaries)

Memory persists across channels and sessions via DynamoDB.
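A minimal sketch of that two-layer flow, assuming an in-memory stand-in for DynamoDB and a trivial "digest" in place of real LLM summarization (the `Memory` type and its method names are invented for illustration):

```rust
// Simplified sketch of the two-layer consolidation flow described above;
// nanobot persists these layers to DynamoDB and summarizes with an LLM.
struct Memory {
    session: Vec<String>,   // rolling window of recent messages
    daily_log: Vec<String>, // appended after each conversation
    long_term: Vec<String>, // consolidated summaries
}

impl Memory {
    const SESSION_LIMIT: usize = 20;

    fn new() -> Self {
        Memory { session: Vec::new(), daily_log: Vec::new(), long_term: Vec::new() }
    }

    /// Keep only the newest SESSION_LIMIT messages in the session window.
    fn push_message(&mut self, msg: &str) {
        self.session.push(msg.to_string());
        if self.session.len() > Self::SESSION_LIMIT {
            self.session.remove(0);
        }
    }

    /// After a conversation, append a one-line digest to the daily log.
    fn end_conversation(&mut self) {
        let digest = format!("{} messages exchanged", self.session.len());
        self.daily_log.push(digest);
        self.session.clear();
    }

    /// Periodically fold the daily log into one long-term summary.
    fn consolidate(&mut self) {
        if !self.daily_log.is_empty() {
            self.long_term.push(self.daily_log.join("; "));
            self.daily_log.clear();
        }
    }
}

fn main() {
    let mut mem = Memory::new();
    for i in 0..25 {
        mem.push_message(&format!("msg {}", i));
    }
    assert_eq!(mem.session.len(), 20); // window capped at 20 messages
    mem.end_conversation();
    mem.consolidate();
    assert_eq!(mem.long_term.len(), 1);
    assert!(mem.session.is_empty());
}
```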

A/B Testing Framework

Built-in CRO experimentation:

  • Deterministic variant assignment (hash(uid + testId) % N)
  • Event tracking via POST /api/v1/ab/event
  • Aggregated stats with 90-day TTL
  • No external analytics dependency
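The deterministic assignment formula above is straightforward to reproduce. The README does not specify which hash function nanobot uses, so `DefaultHasher` (deterministic within one process run) stands in here:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Sketch of deterministic variant assignment, hash(uid + testId) % N.
// DefaultHasher is a stand-in; nanobot's actual hash is unspecified.
fn assign_variant(uid: &str, test_id: &str, n_variants: u64) -> u64 {
    let mut hasher = DefaultHasher::new();
    format!("{}{}", uid, test_id).hash(&mut hasher);
    hasher.finish() % n_variants
}

fn main() {
    let v1 = assign_variant("user-42", "cta-copy", 3);
    let v2 = assign_variant("user-42", "cta-copy", 3);
    assert_eq!(v1, v2); // same user + test always lands in the same bucket
    assert!(v1 < 3);    // variant index stays within range
}
```

Because the bucket is a pure function of `(uid, testId)`, no assignment table needs to be stored: the same user sees the same variant on every request.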

Quick Start

Try the hosted API (no setup)

curl -X POST https://chatweb.ai/api/v1/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "What can you do?", "session_id": "demo"}'

Run locally

git clone https://github.com/yukihamada/nanobot.git
cd nanobot

# Set at least one provider key
export ANTHROPIC_API_KEY=sk-ant-...
# or: export OPENAI_API_KEY=sk-...
# or: export OPENROUTER_API_KEY=sk-or-...

# Build and run the web gateway
cargo build --bin chatweb
./target/debug/chatweb gateway --http --http-port 3000
# Open http://localhost:3000

Docker

docker run -p 3000:3000 \
  -e OPENAI_API_KEY=sk-... \
  ghcr.io/yukihamada/nanobot

CLI usage

# Interactive agent mode (uses your local API keys directly)
./target/debug/chatweb agent

# Single-shot message
./target/debug/chatweb agent -m "Summarize today's tech news"

# Check configuration
./target/debug/chatweb status

# Install globally
cargo install --path .
chatweb agent

Deploy to AWS Lambda

# Prerequisites
brew install zig && cargo install cargo-zigbuild
rustup target add aarch64-unknown-linux-musl

# Build for Lambda ARM64 (must use musl, not gnu)
cargo zigbuild --manifest-path crates/nanobot-lambda/Cargo.toml \
  --release --target aarch64-unknown-linux-musl

# Or use the deploy script
LAMBDA_FUNCTION_NAME=nanobot-prod ./infra/deploy-fast.sh

Environment Variables

| Variable | Required | Description |
|---|---|---|
| ANTHROPIC_API_KEY | One of these | Claude models |
| OPENAI_API_KEY | One of these | GPT-4o and TTS |
| OPENROUTER_API_KEY | One of these | 100+ models via single key |
| GOOGLE_API_KEY | One of these | Gemini models |
| DEEPSEEK_API_KEY | One of these | DeepSeek-V3 |
| LINE_CHANNEL_SECRET | For LINE | LINE Messaging API |
| TELEGRAM_BOT_TOKEN | For Telegram | Telegram Bot API |
| STRIPE_SECRET_KEY | For billing | Stripe integration |
| NANOBOT_WORKSPACE | No | Workspace directory (default: ~/.nanobot/workspace) |
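The "one of these" requirement amounts to scanning the provider keys in priority order and taking the first one that is set. The helper below is a sketch of that check (the env var names are real, the selection logic and function name are assumptions); it takes a lookup closure so it can be exercised without touching the process environment.

```rust
// Illustrative helper: pick the first provider whose API key is set.
// The env var names come from the table above; the priority order and
// function are assumptions, not nanobot's actual startup logic.
fn first_configured_provider(get: impl Fn(&str) -> Option<String>) -> Option<&'static str> {
    const PROVIDERS: &[(&str, &str)] = &[
        ("ANTHROPIC_API_KEY", "anthropic"),
        ("OPENAI_API_KEY", "openai"),
        ("OPENROUTER_API_KEY", "openrouter"),
        ("GOOGLE_API_KEY", "google"),
        ("DEEPSEEK_API_KEY", "deepseek"),
    ];
    PROVIDERS
        .iter()
        .find(|&&(var, _)| get(var).map_or(false, |v| !v.is_empty()))
        .map(|&(_, name)| name)
}

fn main() {
    // Real usage would pass: |k| std::env::var(k).ok()
    // Here we simulate a config where only an OpenAI key is present.
    let fake = |k: &str| {
        if k == "OPENAI_API_KEY" { Some("sk-test".to_string()) } else { None }
    };
    assert_eq!(first_configured_provider(fake), Some("openai"));

    // No keys at all: startup should fail with a clear error.
    assert_eq!(first_configured_provider(|_| None), None);
}
```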

API Endpoints

| Method | Path | Description |
|---|---|---|
| POST | /api/v1/chat | Send a message, get a response |
| POST | /api/v1/chat/stream | SSE streaming response |
| POST | /api/v1/chat/race | Multi-model race (economy/normal/powerful) |
| POST | /api/v1/chat/explore | Parallel execution across all models |
| POST | /api/v1/speech/synthesize | Text-to-speech |
| GET | /api/v1/auth/me | Current user info |
| GET | /api/v1/skills | Browse skill marketplace |
| POST | /api/v1/skills/publish | Publish a custom skill |
| POST | /api/v1/coupon/redeem | Apply coupon code |
| POST | /webhooks/line | LINE webhook |
| POST | /webhooks/telegram | Telegram webhook |
| POST | /webhooks/stripe | Stripe webhook |

System Requirements

|  | Minimum | Recommended |
|---|---|---|
| CPU | 1 core | 2+ cores |
| RAM | 128 MB | 512 MB |
| Disk | 20 MB | 100 MB |

Platforms: Linux (x86_64, ARM64), macOS (Apple Silicon, Intel), Windows (WSL2), AWS Lambda (ARM64)


Security

  • Sandboxed execution -- tool code runs in isolated /tmp/sandbox/{session_id}/
  • HMAC-SHA256 password hashing with configurable keys
  • Rate limiting -- 5 login attempts/min, 3 registrations/min
  • Webhook signature verification -- Telegram, Facebook, Stripe
  • Audit logging -- 90-day TTL in DynamoDB
  • CORS whitelist -- only configured origins allowed
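The per-minute limits above correspond to a fixed-window counter. The sketch below is a simplified model (the `RateLimiter` type is invented, and the caller supplies the current minute rather than reading a clock); nanobot's actual implementation may differ.

```rust
use std::collections::HashMap;

// Sketch of a fixed-window rate limit like the 5 login attempts/min
// described above. Window bookkeeping is simplified: the caller passes
// the current minute index instead of reading the system clock.
struct RateLimiter {
    limit: u32,
    counts: HashMap<(String, u64), u32>, // (client, minute) -> attempts
}

impl RateLimiter {
    fn new(limit: u32) -> Self {
        RateLimiter { limit, counts: HashMap::new() }
    }

    /// Returns true if this attempt is allowed within the minute's window.
    fn allow(&mut self, client: &str, minute: u64) -> bool {
        let count = self.counts.entry((client.to_string(), minute)).or_insert(0);
        if *count < self.limit {
            *count += 1;
            true
        } else {
            false
        }
    }
}

fn main() {
    let mut limiter = RateLimiter::new(5);
    for _ in 0..5 {
        assert!(limiter.allow("10.0.0.1", 0)); // first five attempts pass
    }
    assert!(!limiter.allow("10.0.0.1", 0)); // sixth in the same minute blocked
    assert!(limiter.allow("10.0.0.1", 1));  // a new window resets the count
}
```

Fixed windows are the simplest scheme; a sliding window or token bucket would smooth the burst allowed at each window boundary.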

See SECURITY.md for vulnerability reporting.


Roadmap

  • [x] Multi-model failover with circuit breakers
  • [x] Voice-first UI (STT + TTS)
  • [x] 6 channel integrations (Web, LINE, Telegram, Discord, Slack, Facebook)
  • [x] 50+ built-in tools with agentic loop
  • [x] Skill marketplace (publish and install custom tools)
  • [x] A/B testing framework
  • [x] Stripe billing integration
  • [x] Long-term memory engine
  • [x] SSE streaming
  • [ ] WebSocket transport (Q2 2026)
  • [ ] Multi-agent orchestration (Q2 2026)
  • [ ] On-device LLM inference via GGUF (Q3 2026)

Project Structure

nanobot/
  crates/
    nanobot-core/         Core library: handlers, tools, providers, memory
    nanobot-lambda/       AWS Lambda entrypoint
  nanobot-cli/            CLI binary
  web/
    index.html            Web SPA (embedded into binary via include_str!)
    skill.html            Skill marketplace UI
    pricing.html          Pricing page
  infra/
    deploy-fast.sh        One-command Lambda deploy
    template.yaml         SAM template

Contributing

git clone https://github.com/YOUR_USERNAME/nanobot.git
cd nanobot
cargo test --all
cargo clippy --all-targets

See CONTRIBUTING.md for guidelines.


License

MIT -- Copyright (c) 2025-2026 nanobot contributors


Acknowledgments

  • HKUDS/nanobot -- original Python nanobot (this project is a complete Rust rewrite)
  • axum, tokio, serde -- the Rust ecosystem that makes this possible
  • Anthropic, OpenAI, Google -- LLM providers

chatweb.ai -- voice-first AI assistant · teai.io -- developer API

Both powered by nanobot.

