openchronicle-mcp
Health Warning
- License — AGPL-3.0
- Description — Repository has a description
- Active repo — Last push 0 days ago
- Low visibility — Only 8 GitHub stars
Code Passed
- Code scan — Scanned 12 files during light audit, no dangerous patterns found
Permissions Passed
- Permissions — No dangerous permissions requested
This tool acts as a self-hostable orchestration engine that provides persistent memory, context management, and auditable decision trails for Large Language Models across multiple sessions and interfaces.
Security Assessment
The overall risk is rated as Medium. By design, the application processes and stores your conversation data and handles sensitive API keys for various LLM providers. The code scan (12 files) found no hardcoded secrets, dangerous code patterns, or requests for overly broad system permissions. However, it actively makes external network requests to AI providers and relies on a local SQLite database for storing your persistent data. While it includes a built-in privacy gate for PII detection, users must still trust the host environment, especially if utilizing the public-facing HTTP API or Discord bot features.
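The privacy gate mentioned above reportedly uses Luhn validation to confirm that a digit string is a plausible payment-card number before treating it as PII. As a minimal sketch of the standard Luhn checksum (not OpenChronicle's actual implementation):

```python
def luhn_valid(number: str) -> bool:
    """Standard Luhn checksum: returns True if the digit string passes,
    i.e. it is a plausible payment-card number."""
    digits = [int(c) for c in number if c.isdigit()]
    if len(digits) < 13:  # card numbers are 13-19 digits
        return False
    checksum = 0
    # Double every second digit from the right; subtract 9 if the result > 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0
```

A checksum pass only means the number is *plausible*, so a real PII gate would combine it with pattern matching and context.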
Quality Assessment
The project is highly active, with its most recent push made today. It is open-source under the AGPL-3.0 license and uses modern containerization with an automated CI pipeline. A light code audit came back clean, indicating decent baseline development practices. The primary concern is low community visibility: with only 8 GitHub stars, the codebase has not seen widespread peer review or extensive real-world battle-testing.
Verdict
Use with caution. The code itself appears safe and well-structured, but its limited community adoption means you should perform your own review before integrating it into sensitive environments.
OpenChronicle is an open-source, self-hosted interaction engine for LLMs that makes conversations durable. It adds persistent memory, deterministic tasking, and auditable decision trails so work doesn’t reset each session. Provider-agnostic by design, it supports multiple interfaces (CLI, Discord, MCP) with explicit, privacy-aware routing.
OpenChronicle
Persistent memory and context for LLM conversations.
Chat context dies between sessions. OpenChronicle fixes that — it's an
orchestration core that gives any LLM durable memory, explainable routing,
and auditable decision history across conversations, sessions, and tools.
Features
- Persistent memory — full-text search (FTS5) with deterministic fallback, pinning, tagging; conversations resume where you left off
- Multi-provider routing — OpenAI, Anthropic, Groq, Gemini, Ollama with pool-based selection and automatic fallback
- Mixture-of-Experts — consensus answers from multiple models via the --moe flag
- Streaming responses — token-by-token output with --no-stream opt-out
- Hash-chained event log — tamper-evident audit trail for every decision
- MCP server — 20 tools exposing memory, conversation, and context to any MCP-compatible client (Claude Code, Goose, VS Code)
- HTTP API — 20 REST endpoints mirroring MCP tools, API key auth, rate limiting, auto-starts with oc serve
- Discord bot — slash commands, session mapping, multi-user isolation
- Scheduler — tick-driven job execution with atomic claim and drift prevention
- Asset management — file storage with SHA-256 dedup and generic linking
- Plugin system — extend with stateless task handlers
- Privacy gate — PII detection (6 categories, Luhn validation) before data leaves your machine
- Hexagonal architecture — enforced by tests, not convention
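The hash-chained event log listed above is a generic tamper-evidence technique: each event's hash covers its payload plus the previous event's hash, so editing any past entry invalidates everything after it. A minimal sketch of the pattern (field names are illustrative, not OpenChronicle's schema):

```python
import hashlib
import json

def append_event(log: list, payload: dict) -> dict:
    """Append an event whose hash covers the payload plus the previous
    event's hash, so any later modification breaks the chain."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    body = json.dumps({"payload": payload, "prev": prev_hash}, sort_keys=True)
    event = {"payload": payload, "prev": prev_hash,
             "hash": hashlib.sha256(body.encode()).hexdigest()}
    log.append(event)
    return event

def verify_chain(log: list) -> bool:
    """Recompute every hash in order; returns False if any event was altered."""
    prev_hash = "0" * 64
    for event in log:
        body = json.dumps({"payload": event["payload"], "prev": prev_hash},
                          sort_keys=True)
        if event["prev"] != prev_hash or \
           event["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev_hash = event["hash"]
    return True
```

Canonical JSON (`sort_keys=True`) keeps hashing deterministic; in a real system the chain would live in the SQLite database rather than a Python list.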
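The scheduler's "atomic claim" is likewise a well-known pattern: a conditional UPDATE that only succeeds while the job is still pending, so two workers can never claim the same job. A sketch over SQLite with a hypothetical `jobs` table (not OpenChronicle's actual schema):

```python
import sqlite3

def claim_job(conn: sqlite3.Connection, job_id: int, worker: str) -> bool:
    """Atomically claim a job: the UPDATE matches only while the row is
    still 'pending', so exactly one worker's claim can succeed."""
    cur = conn.execute(
        "UPDATE jobs SET status = 'claimed', worker = ? "
        "WHERE id = ? AND status = 'pending'",
        (worker, job_id),
    )
    conn.commit()
    # rowcount == 1 means this worker won the claim; 0 means someone beat us.
    return cur.rowcount == 1
```

Because the status check and the write happen in one statement, the database serializes competing claims without any external locking.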
Quick Start
```shell
pip install -e ".[openai]"
oc init
export OPENAI_API_KEY=your_key_here
oc chat
```
That's it. You're in a persistent conversation with memory, streaming, and
full audit trail.
Quick Start (Docker)
```shell
docker pull ghcr.io/carldog/openchronicle-mcp:latest
docker compose run --rm openchronicle chat
```
Persistent volumes: /data (SQLite DB), /config, /plugins, /output.
Interfaces
| Interface | Entry point | Use case |
|---|---|---|
| CLI | oc chat, oc convo ask | Interactive and scripted use |
| STDIO RPC | oc serve / oc rpc | Programmatic integration |
| HTTP API | Auto-starts with oc serve | REST clients, webhooks, web UIs |
| MCP Server | oc mcp serve | Agent interop (Goose, Claude Code) |
| Discord | oc discord start | Chat bot with slash commands |
Supported Providers
| Provider | Extra | Streaming | Tool Use |
|---|---|---|---|
| OpenAI | .[openai] | Yes | Yes |
| Anthropic | .[anthropic] | Yes | Yes |
| Groq | .[groq] | Yes | Yes |
| Gemini | .[gemini] | Yes | Yes |
| Ollama | .[ollama] | Yes | Yes |
| Stub | (built-in) | Yes | Yes |
Documentation
| Document | Description |
|---|---|
| Architecture | Hexagonal layers, event model, directory tree |
| CLI Commands | Full oc command reference |
| Environment Variables | All ~60 configuration knobs |
| MCP Server Spec | Tool list, transports, integration guide |
| Plugin Guide | Build and register task handlers |
| Design Decisions | Rationale for core subsystems |
| RPC Protocol | JSON-RPC stdio protocol spec |
| Backlog | Roadmap and feature backlog |
Contributing
See CONTRIBUTING.md.
Security
See SECURITY.md.
License
AGPL-3.0 — free to use, modify, and share. Network service use
requires publishing source under the same license.