wechat-ai-bridge
Full AI Coding Agents. In WeChat. Private Chat.
Run actual Claude Code / Codex / Gemini from WeChat — not an API wrapper — with session management, tool approval, and file relay.
A self-hosted WeChat bridge that connects to AI coding agents via the official iLink Bot API. No VPN needed. Zero ban risk.
English | 简体中文
How is this different from cc-weixin / wechat-acp?
They connect one AI backend with basic message forwarding. This project adds session management (/new, /resume, /sessions), multi-backend switching (Claude + Codex + Gemini), tool-approval interaction, and bidirectional file relay — the same workflow you get from telegram-ai-bridge, now in WeChat.
What This Unlocks
Full AI Agent in Your Pocket — No VPN
Send a message in WeChat. Get a full Claude Code response — with Bash, Read, Write, Edit, Glob, Grep, WebFetch, and all native tools. No terminal. No VPN. Works anywhere WeChat works.
Session Management
You: /new            ← start a fresh session
You: /sessions       ← list recent sessions
You: /resume 3       ← pick up where you left off
You: /backend codex  ← switch backend
You: /model          ← switch models (reply with number)
Sessions persist in SQLite across restarts. Pick up after a reboot, a network drop, or a flight. /sessions shows all CC sessions — including ones created from terminal CLI or other bridges — with the last user message as the title. Resume by number: /resume 3.
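The resume-by-number lookup described above can be sketched as follows. This is an illustrative in-memory version — the real bridge persists sessions in SQLite, and the function and field names here are assumptions, not the project's actual API:

```javascript
// Sketch: /sessions lists sessions newest-first with the last user message
// as the title; /resume accepts either a sequence number or a session ID.
function listSessions(sessions) {
  return [...sessions]
    .sort((a, b) => b.updatedAt - a.updatedAt)
    .map((s, i) => ({ n: i + 1, id: s.id, title: s.lastUserMessage }));
}

function resolveResume(sessions, arg) {
  const listed = listSessions(sessions);
  // "/resume 3" → third newest entry; "/resume <id>" → exact ID match.
  const byNumber = listed.find((s) => String(s.n) === arg);
  return byNumber ?? listed.find((s) => s.id === arg) ?? null;
}
```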
Multi-Backend Support
| Backend | SDK | Status |
|---|---|---|
| claude | Claude Code (via Agent SDK) | Recommended |
| codex | Codex CLI (via Codex SDK) | Recommended |
| gemini | Gemini Code Assist API | Experimental |
Switch backends per chat with /backend.
Tool Approval via Chat
When the AI needs permission to run a tool:
🔒 Tool Approval
Tool: Bash
git push origin main
Reply with a number:
1. Allow
2. Deny
3. Always allow "Bash"
4. YOLO (allow everything)
Reply 1, 2, 3, or 4. No buttons needed — just text.
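The four numbered replies map to simple approval decisions. A minimal sketch, assuming a handler shape of my own invention (the real logic lives inside the bridge's message loop):

```javascript
// Sketch of the 1/2/3/4 approval reply, including an "always allow" set
// so option 3 skips future prompts for the same tool. Illustrative only.
function handleApprovalReply(reply, toolName, alwaysAllowed = new Set()) {
  switch (reply.trim()) {
    case "1": return { allow: true };              // allow once
    case "2": return { allow: false };             // deny
    case "3":
      alwaysAllowed.add(toolName);                 // remember this tool
      return { allow: true };
    case "4": return { allow: true, yolo: true };  // allow everything
    default:  return null;                         // not an approval reply
  }
}
```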
Bidirectional File Relay
- Send photos/files to AI: WeChat media is downloaded, decrypted (AES-128-ECB), and injected into the prompt
- Receive files from AI: AI-generated files and screenshots are encrypted, uploaded to CDN, and sent back to your WeChat chat
- Long code output: >2000 chars with >60% code → sent as file attachment with preview
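The "long code → file attachment" rule above can be approximated with a line-classification heuristic. The thresholds (2000 chars, 60% code lines) come from the README; the classifier itself is my own sketch, not the bridge's actual detector:

```javascript
// Decide whether a reply should go out as a file attachment instead of a
// chat message: longer than 2000 chars AND more than 60% code-like lines.
function shouldSendAsFile(text) {
  if (text.length <= 2000) return false;
  const lines = text.split("\n").filter((l) => l.trim() !== "");
  const codeLike = lines.filter((l) =>
    /^\s{2,}/.test(l) ||                                   // indented line
    /[{}();=]|=>|\b(function|const|def|import)\b/.test(l)  // code tokens
  );
  return codeLike.length / lines.length > 0.6;
}
```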
Built-in Resilience
- Rate limiting: Per-user sliding window
- Idle monitoring: Watchdog timer for hung tasks
- Message batching: FlushGate merges rapid consecutive messages (800ms window)
- Send retry: Exponential backoff with error classification
- File reference protection: Prevents auto-linking of .md, .go, .py filenames
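As one concrete example from the list above, a per-user sliding-window rate limiter can be sketched like this. The limits (10 requests / 60s) match the config shown later in this README; the class itself is illustrative, not the bridge's code:

```javascript
// Per-user sliding window: keep recent request timestamps per user and
// reject once the window already holds maxRequests entries.
class RateLimiter {
  constructor(maxRequests = 10, windowMs = 60000) {
    this.maxRequests = maxRequests;
    this.windowMs = windowMs;
    this.hits = new Map(); // userId → array of timestamps
  }
  allow(userId, now = Date.now()) {
    const recent = (this.hits.get(userId) ?? [])
      .filter((t) => now - t < this.windowMs); // drop expired entries
    if (recent.length >= this.maxRequests) {
      this.hits.set(userId, recent);
      return false; // over the limit for this window
    }
    recent.push(now);
    this.hits.set(userId, recent);
    return true;
  }
}
```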
Quick Start
Prerequisites: Bun runtime, WeChat version >= 2026.3.20, and at least one backend CLI: Claude Code, Codex, or Gemini CLI.
git clone https://github.com/AliceLJY/wechat-ai-bridge.git
cd wechat-ai-bridge
npm install # or: bun install
bun run bootstrap --backend claude
bun run check --backend claude
bun run start --backend claude
On first launch, a QR code appears in your terminal. Scan it with WeChat to authenticate. Token is saved to ~/.wechat-ai-bridge/token.json for subsequent launches.
WeChat Commands
All commands are plain text — just type and send:
| Command | Description |
|---|---|
| /help | Show all commands |
| /new | Start a new session |
| /cancel | Abort the running task |
| /sessions | List recent sessions |
| /resume <#\|id> | Resume by sequence number or session ID |
| /backend [name] | Switch backend (claude/codex/gemini) |
| /model [name] | Pick a model (reply with number) |
| /effort [level] | Set thinking depth |
| /status | Show backend, model, cwd, session |
| /dir [path] | Switch working directory |
| /verbose 0\|1\|2 | Change progress verbosity |
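Because every command is plain text, dispatch reduces to a small parse step. A minimal sketch (the real dispatcher lives in bridge.js; the return shape here is my own):

```javascript
// Split "/resume 3" into { cmd: "resume", arg: "3" }; non-commands return
// null and would be forwarded to the AI backend as a prompt.
function parseCommand(text) {
  const m = text.trim().match(/^\/(\w+)(?:\s+(.*))?$/);
  if (!m) return null;
  return { cmd: m[1].toLowerCase(), arg: m[2]?.trim() ?? null };
}
```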
How It Works
WeChat App ←→ iLink Server (ilinkai.weixin.qq.com) ←→ wechat-ai-bridge ←→ AI Backend
                                                              │
                                                              ├── weixin/     (iLink connection)
                                                              ├── adapters/   (Claude/Codex/Gemini)
                                                              ├── sessions.js (SQLite persistence)
                                                              └── bridge.js   (core message loop)
The bridge uses WeChat's official iLink Bot API — the same protocol behind OpenClaw's WeChat integration. All communication is standard HTTP/JSON with long-polling (getupdates), similar to Telegram's Bot API. Media files are encrypted with AES-128-ECB on CDN.
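The long-polling loop can be sketched as below. The transport is injected so the shape is testable; the endpoint behaviour (~35s hold) comes from this README, while the update/offset field names are assumptions rather than the exact iLink API:

```javascript
// Minimal long-poll loop: fetch a batch of updates, acknowledge them by
// advancing the offset, hand each one to the message handler, repeat.
async function pollLoop(getUpdates, onMessage, { maxRounds = Infinity } = {}) {
  let offset = 0;
  for (let round = 0; round < maxRounds; round++) {
    // The server holds this request (~35s) until updates arrive or it times out.
    const updates = await getUpdates(offset);
    for (const u of updates) {
      offset = Math.max(offset, u.id + 1); // ack so it isn't redelivered
      await onMessage(u);
    }
  }
  return offset;
}
```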
This is an official Tencent API. Zero ban risk. Backed by WeChat ClawBot Terms of Use.
vs. Existing Projects
| Feature | cc-weixin | wechat-acp | This project |
|---|---|---|---|
| AI backends | Claude only | 6 via ACP | Claude + Codex + Gemini (native SDK) |
| Session management | None | None | /new /resume /sessions /backend |
| Tool approval | Auto-allow | Auto-allow | Interactive (1/2/3/4 choice) |
| Model switching | Hardcoded | Per-preset | /model with numbered selection |
| File relay (in) | Text only | Images+files | Images + files (AES-128-ECB decrypted) |
| File relay (out) | None | None | Auto-detect file paths + CDN upload |
| Rate limiting | None | None | Per-user sliding window |
| Idle monitoring | None | None | Watchdog + auto-reset |
| Message batching | None | None | FlushGate (800ms merge) |
| Cross-platform sessions | None | None | See all CC sessions (CLI + other bridges) |
| Code as file | None | None | >60% code → file attachment |
Configuration
bun run bootstrap --backend claude generates a starter config.json.
{
"shared": {
"cwd": "/Users/you",
"defaultVerboseLevel": 1,
"executor": "direct",
"rateLimitMaxRequests": 10,
"rateLimitWindowMs": 60000,
"idleTimeoutMs": 1800000
},
"backends": {
"claude": {
"enabled": true,
"sessionsDb": "sessions.db",
"model": "claude-sonnet-4-6",
"permissionMode": "default"
}
}
}
Project Structure
- start.js — CLI entry with QR login flow
- config.js — Config loader and setup wizard
- bridge.js — Core message loop
- progress.js — Typing indicator + processing status
- sessions.js — SQLite session persistence
- weixin/api.js — iLink HTTP client
- weixin/auth.js — QR code login + token persistence
- weixin/monitor.js — Long-polling message listener
- weixin/send.js — Message sending + text chunking
- weixin/media.js — CDN upload/download + AES-128-ECB
- weixin/types.js — iLink protocol constants
- adapters/ — AI backend integrations (Claude/Codex/Gemini)
- executor/ — Execution modes
- Authentication: QR code scan → bot_token (persisted to ~/.wechat-ai-bridge/token.json)
- Message flow: getupdates long-poll (35s hold) → process → sendmessage with context_token
- Media: AES-128-ECB encrypted CDN (novac2c.cdn.weixin.qq.com)
- Limitation: 1 WeChat account = 1 bot (1:1 binding)
- No group chat: iLink does not support group messaging
For protocol details, see research.md.
Adding Custom Backends
The adapter interface is designed for easy extension. To add a new AI backend (e.g. OpenCode, Crush, or any CLI agent):
- Create adapters/yourbackend.js exporting createAdapter(config):
export function createAdapter(config = {}) {
return {
name: "yourbackend",
async *streamQuery({ prompt, sessionId, model, cwd, abortController }) {
yield { type: "session_init", sessionId: "..." };
yield { type: "text", text: "response chunk" };
yield { type: "result", success: true, text: "final answer" };
},
statusInfo() {
return { backend: "yourbackend", model: "...", session: "..." };
}
};
}
- Register it in adapters/interface.js and add the name to AVAILABLE_BACKENDS in config.js.
Event types: session_init | progress (tool use indicators) | text (streaming chunks) | result (final).
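Consuming that event stream on the bridge side looks roughly like this. The toy adapter stands in for a real backend, and `runQuery` is my own sketch of the consumer, not the project's code:

```javascript
// Drain an adapter's streamQuery async generator and assemble the reply.
async function runQuery(adapter, prompt) {
  let sessionId = null;
  const chunks = [];
  let result = null;
  for await (const ev of adapter.streamQuery({ prompt })) {
    if (ev.type === "session_init") sessionId = ev.sessionId;
    else if (ev.type === "text") chunks.push(ev.text); // streaming chunks
    else if (ev.type === "result") result = ev;        // final answer
    // "progress" events would drive the typing indicator
  }
  return { sessionId, text: result?.text ?? chunks.join("") };
}

// Toy adapter emitting the four event types listed above.
const toyAdapter = {
  name: "toy",
  async *streamQuery() {
    yield { type: "session_init", sessionId: "s-1" };
    yield { type: "text", text: "hello " };
    yield { type: "text", text: "world" };
    yield { type: "result", success: true, text: "hello world" };
  },
};
```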
Community interest: OpenCode / Crush support has been requested. OpenCode offers a JS/TS SDK (@opencode-ai/sdk); Crush provides a Unix socket REST API with SSE streaming. PRs welcome!
Ecosystem
Part of the 小试AI open-source AI workflow:
| Project | Description |
|---|---|
| telegram-ai-bridge | Same architecture, Telegram interface |
| recallnest | MCP memory workbench |
| content-alchemy | 5-stage AI writing pipeline |
| openclaw-tunnel | Docker ↔ host CLI bridge |
License
MIT