kincode
Health: Warn
- License: MIT
- Description: repository has a description
- Active repo: last push 0 days ago
- Low visibility: only 5 GitHub stars

Code: Warn
- Code scan incomplete: no supported source files were scanned during the light audit

Permissions: Pass
- No dangerous permissions requested
This is a lightweight, open-source AI coding assistant for the terminal. It acts as a unified interface for various AI models (Anthropic, OpenAI, Ollama) and allows users to execute code, edit files, search the web, and manage custom AI personas via "Soul files."
Security Assessment
Overall Risk: Medium. The tool executes shell commands and reads/writes files by design, which poses inherent risks. It includes a permission system to prompt the user before acting, though this can be bypassed entirely using the `-yolo` flag. Network requests are actively made to AI provider APIs, and it includes built-in web fetching and searching capabilities. It does not request dangerous system permissions. However, the automated code scan was incomplete (Go is not supported by the light audit), meaning the underlying source code was not independently verified for hardcoded secrets or hidden malicious logic.
Quality Assessment
The project is licensed under the permissive MIT license and appears to be actively maintained, with recent repository updates. However, it currently has extremely low visibility within the developer community, evidenced by only 5 GitHub stars. Because of this low adoption and the lack of a completed source code scan, community trust and established reliability remain unproven.
Verdict
Use with caution: the tool has standard AI agent risks (like shell execution), is very new with minimal community validation, and lacks an automated code verification audit.
AI coding assistant in Go. 9MB single binary, zero dependencies. MCP support, Soul files for custom AI personas. 10 built-in tools + sub-agents + web search + memory. The only Claude Code alternative with both MCP and Soul files.
kincode
A lightweight AI coding assistant for your terminal. Written in Go. Single binary. Zero dependencies.
Like Claude Code, but open-source and 10x lighter.
Features
- Single binary (~10MB), zero runtime dependencies
- Multi-provider: Anthropic, OpenAI, Ollama (any OpenAI-compatible endpoint)
- 10 built-in tools: bash, file read/write/edit, glob, grep, web_fetch, web_search, memory, agent_spawn
- Permission system with tool call confirmation (or `-yolo` to skip)
- Soul files with `brain:` config — switch persona AND provider/model per soul, kinclaw-kernel-compatible format
- `-serve` mode — HTTP+SSE server for desktop shell integration (paired with KinClaw Mac Code mode)
- Streaming responses with markdown rendering
- Context compaction: auto-summarizes when context gets large
- Sub-agents: spawn parallel tasks with agent_spawn
- MCP support: connect any MCP-compatible tool server
- Persistent memory across sessions
- Web tools: fetch URLs and search the web (DuckDuckGo, no API key)
- Skill templates: reusable prompt patterns (/skill)
- Extended thinking: deep reasoning mode for complex problems
- Session persistence: auto-save/restore conversations across restarts
- Fast: Go concurrency, minimal memory footprint
Quick Start
```shell
# Install
go install github.com/LocalKinAI/kincode/cmd/kincode@latest

# Or download binary
curl -fsSL https://github.com/LocalKinAI/kincode/releases/latest/download/kincode-$(uname -s)-$(uname -m) -o kincode
chmod +x kincode

# Run with Anthropic
export ANTHROPIC_API_KEY=sk-ant-...
kincode

# Run with Ollama (free, local)
kincode -provider ollama -model qwen3:8b

# Run with OpenAI
OPENAI_API_KEY=sk-... kincode -provider openai -model gpt-4o

# Run with a soul file
kincode -soul coder.soul.md

# One-shot mode (non-interactive)
kincode "explain this codebase"

# YOLO mode (auto-approve all tool calls)
kincode -yolo "fix the failing tests"
```
Claude Login (No API Key Needed)
Use your Claude account directly — works with Free, Pro, and Max:
```shell
# First time: login via browser
kincode -login

# Then just use it (defaults to Haiku 4.5)
kincode

# Or specify a different model
kincode -model claude-sonnet-4-6
```
Your session auto-refreshes. No API key needed.
Soul Files
Define custom personas with `.soul.md` files. The format is compatible with the kinclaw kernel, so the same file drives either tool:
```
---
name: "kincode"
brain:
  provider: "ollama"        # anthropic | openai | ollama
  model: "kimi-k2.6:cloud"  # picks the brain when no -provider/-model on CLI
  temperature: 0.3
  context_length: 131072
rules:
  - "Read before you write"
  - "Stdlib first, deps last resort"
  - "Don't apologize. Don't hedge."
---

You are kincode, a senior coding agent. Ship clean, correct, minimal code...
```
Resolution order: CLI flags (`-provider` / `-model`) > soul `brain:` > legacy top-level `model:` > hardcoded default. Pass `-soul` to load one:

```shell
kincode -soul ~/Documents/Workspace/kincode/souls/coder.soul.md
```
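That precedence chain can be sketched as a simple first-non-empty lookup; the function name and the default model string here are illustrative, not kincode's actual code:

```go
package main

import "fmt"

// resolveModel applies the documented precedence:
// CLI flag > soul brain: > legacy top-level model: > hardcoded default.
// An empty string means "not set at that layer".
func resolveModel(cliFlag, soulBrain, legacyTop string) string {
	for _, v := range []string{cliFlag, soulBrain, legacyTop} {
		if v != "" {
			return v
		}
	}
	return "default-model" // placeholder for the hardcoded default
}

func main() {
	// No CLI flag, so the soul's brain: model wins over the legacy field.
	fmt.Println(resolveModel("", "kimi-k2.6:cloud", "gpt-4o")) // kimi-k2.6:cloud
}
```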
Server Mode
Run kincode as a daemon for desktop shell integration:
```shell
kincode -serve -port 5002 -soul souls/coder.soul.md
```
Exposes:
| Route | Method | Purpose |
|---|---|---|
| `/api/health` | GET | readiness probe |
| `/api/state` | GET | `{repo, model, provider, message_count}` |
| `/api/repo` | POST | chdir agent into a repo: `{"path": "..."}` |
| `/api/chat` | POST | kick off a turn: `{"message": "..."}` (returns 202; output via SSE) |
| `/api/chat` | DELETE | interrupt the in-flight turn |
| `/api/events` | GET | SSE stream of `{type, ...}` events |
Event types: `user_message`, `text_delta`, `tool_call`, `tool_result`, `turn_done`, `error`, `usage`.
KinClaw Mac's Code mode drives kincode through this surface. Server mode forces -yolo (no permission loop over HTTP) and falls back to ollama / kimi-k2.6:cloud if no Anthropic creds are configured — same Ollama install kinclaw uses, no extra setup.
Extended Thinking
Enable deep reasoning for complex problems:
```
---
name: "Architect"
thinking: true
thinking_budget: 15000
---
```
When enabled, the model will show its reasoning process in dim text before the final response. Useful for complex debugging, architecture decisions, and multi-step analysis.
Slash Commands
| Command | Description |
|---|---|
| `/help` | Show all commands |
| `/clear` | Clear conversation history |
| `/compact` | Manually compress context |
| `/model <name>` | Switch model mid-session |
| `/provider <name>` | Switch provider |
| `/soul <file>` | Load a soul file |
| `/memory` | Show persistent memory |
| `/save <file>` | Save conversation to file |
| `/load <file>` | Load conversation from file |
| `/tokens` | Show estimated token usage |
| `/diff` | Show last file edit as colored diff |
| `/mcp` | List connected MCP servers and tools |
| `/skill` | List available skill templates |
| `/skill <name>` | Load a skill template for next message |
| `/skill create <name>` | Create a new skill interactively |
| `/version` | Show version |
| `/quit` | Exit |
Architecture
```
kincode (9MB single binary)
├── cmd/kincode/        # CLI entry point, flag parsing, soul loading
├── pkg/
│   ├── agent/          # Core loop: message → LLM → tool calls → loop
│   │                   # Context compaction (auto-summarize at 80%)
│   ├── provider/       # LLM providers (raw HTTP, no SDKs)
│   │   ├── anthropic   # Anthropic Messages API + SSE streaming
│   │   └── openai      # OpenAI-compatible (OpenAI/Ollama/DeepSeek/Gemini/...)
│   ├── tools/          # 10 built-in tools
│   │   ├── bash        # Shell execution (30s timeout, blocklist)
│   │   ├── file_*      # Read, write, edit (with diff visualization)
│   │   ├── glob/grep   # File and content search
│   │   ├── web_*       # Fetch URLs, DuckDuckGo search
│   │   ├── memory      # Persistent key-value store
│   │   └── agent_spawn # Sub-agent for parallel tasks
│   ├── repl/           # Interactive terminal
│   │                   # Readline, markdown rendering, 14 slash commands
│   └── permission/     # Tool call approval (yolo / confirm)
└── internal/
    └── mcp/            # MCP protocol client (JSON-RPC 2.0 over stdio)
```
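The "auto-summarize at 80%" rule noted under `pkg/agent/` amounts to a threshold check against the context window. A minimal sketch, where only the 80% figure comes from the tree above and everything else (names, integer token estimates) is illustrative:

```go
package main

import "fmt"

// shouldCompact reports whether the conversation should be summarized:
// compact once estimated tokens reach 80% of the context window.
// Written with integer math to avoid float comparison.
func shouldCompact(usedTokens, contextLength int) bool {
	return usedTokens*10 >= contextLength*8 // used >= 0.8 * contextLength
}

func main() {
	// 110k tokens against a 131072-token window is past the 80% mark (~104857).
	fmt.Println(shouldCompact(110000, 131072)) // true
}
```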
Comparison
| | Claude Code | claw-code (Rust) | nano-claude-code (Python) | kincode (Go) |
|---|---|---|---|---|
| Binary size | ~100MB | ~15MB | N/A (needs Python) | 9MB |
| Memory usage | ~150MB | ~30MB | ~80MB | ~20MB |
| Dependencies | Node.js + npm | Rust toolchain | Python 3.10+ | zero |
| Install | npm install | cargo build | pip install | download & run |
| Providers | Anthropic | Multi | 10+ | any OpenAI-compatible |
| Built-in tools | 40+ | ~20 | 13 | 10 + MCP |
| Sub-agents | ✅ | ❌ | ✅ | ✅ |
| Memory | ✅ | ❌ | ✅ | ✅ |
| Context compaction | ✅ | ✅ | ✅ | ✅ |
| MCP protocol | ✅ | ✅ | ❌ | ✅ |
| Markdown rendering | ✅ | ✅ | ❌ | ✅ |
| Diff visualization | ❌ | ❌ | ✅ | ✅ |
| Web search | ❌ | ❌ | ✅ | ✅ |
| Session persistence | ✅ | ✅ | ✅ | ✅ |
| Soul files | ❌ | ❌ | ❌ | ✅ unique |
| Open source | ❌ | ✅ | ✅ | ✅ |
MCP Support
Connect to any MCP-compatible tool server:
```shell
# Create mcp.json
cat > mcp.json << 'EOF'
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": ["GITHUB_TOKEN=ghp_xxx"]
    }
  }
}
EOF

# Run with MCP servers
kincode -mcp mcp.json

# List connected servers and tools
> /mcp
```
MCP tools are automatically registered with an `mcp_` prefix (e.g., `mcp_read_file`, `mcp_search_repositories`). The LLM can call them like any built-in tool.
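The prefixing step is essentially a namespacing pass over each server's tool list, so MCP tools can never collide with built-ins. A sketch under that assumption, with an illustrative map layout rather than kincode's internal representation:

```go
package main

import "fmt"

// registerMCP adds a server's tools to the registry under the mcp_
// prefix, mapping each prefixed tool name back to its server.
func registerMCP(registry map[string]string, server string, tools []string) {
	for _, t := range tools {
		registry["mcp_"+t] = server
	}
}

func main() {
	registry := map[string]string{}
	registerMCP(registry, "filesystem", []string{"read_file", "list_directory"})
	fmt.Println(registry["mcp_read_file"]) // filesystem
}
```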
Build from Source
```shell
git clone https://github.com/LocalKinAI/kincode.git
cd kincode
go build -o kincode ./cmd/kincode/
```
License
MIT
Built by the team behind LocalKin -- a self-evolving AI agent swarm with 78 specialized agents.