dory
Health Warnings
- License — MIT
- Description — Repository has a description
- Active repo — Last push 0 days ago
- Low visibility — Only 5 GitHub stars
Code Warnings
- network request — Outbound network request in packages/openclaw-dory/dist/index.js
Permissions Passed
- Permissions — No dangerous permissions requested
This tool provides a local-first memory daemon and MCP server for AI agents. It uses markdown and SQLite to maintain a shared memory layer so that different AI tools and coding assistants can persistently remember project context and user preferences.
Security Assessment
Overall risk: Medium. The server runs locally and does not request dangerous system permissions. However, the automated scan flagged an outbound network request within a compiled JavaScript file (`packages/openclaw-dory/dist/index.js`). Because this file lives in a distribution (build output) folder, the underlying source is hard to review for data exfiltration risks. The documentation explicitly references passing through raw third-party API keys (`GEMINI_API_KEY`, `OPENROUTER_API_KEY`) and binding to network ports, which introduces standard local network security considerations. No hardcoded secrets were detected.
Quality Assessment
The project is MIT-licensed and was actively updated recently (last push was today). However, community trust and visibility are currently very low. It has only 5 GitHub stars, indicating that the codebase has not been broadly tested or vetted by a large audience. Additionally, the project's status badge explicitly warns that it is still in the "building" phase, meaning it should be considered early-stage or experimental software.
Verdict
Use with caution — the tool is actively maintained and open source, but its early development stage, low community adoption, and unverified outbound network requests warrant careful review before integrating it into sensitive workflows.
🐟 Dory
One memory layer. Every agent. Local by default.
Your agent forgot who you are. Again. Dory fixes that.
The problem
Every AI agent you use keeps its own half-memory.
- Claude remembers one slice.
- Codex keeps another.
- opencode writes to yet another folder.
- OpenClaw and Hermes park sessions somewhere else entirely.
- The next model still asks what you're building, what you prefer, and what already happened.
You end up re-explaining yourself on loop. Decisions get lost. Project state goes stale. No memory actually follows you across tools.
What Dory is
A local-first memory daemon that gives every agent the same brain.
Markdown is the source of truth. SQLite is a disposable sidecar. Agents read and write through a narrow API — wake, search, get, memory-write, link — so Claude, Codex, opencode, OpenClaw, Hermes, and anything with HTTP or MCP share one memory substrate while keeping their own personality.
Dory isn't trying to make every agent identical. It's giving them the same memory so they can act like they share a brain.
Quickstart
git clone <dory-repo-url>
cd dory
uv sync --frozen
mkdir -p data/corpus
export DORY_CORPUS_ROOT="$PWD/data/corpus"
export DORY_INDEX_ROOT="$PWD/.dory/index"
export DORY_AUTH_TOKENS_PATH="$PWD/.dory/auth-tokens.json"
uv run dory init
Try it:
uv run dory memory-write "Atlas is the active focus this week." \
--subject atlas --kind decision --force-inbox
uv run dory search "active focus"
uv run dory wake --profile coding --budget 1200
Serve it over HTTP:
uv run dory-http --corpus-root data/corpus --index-root .dory/index \
--host 127.0.0.1 --port 8766
Or run it as a durable container:
cp .env.example .env
mkdir -p data/corpus
docker compose up -d --build
Docker binds HTTP to 127.0.0.1:8766 by default. Only set DORY_HTTP_BIND=0.0.0.0 behind a trusted LAN, VPN, reverse proxy, or firewall.
Compose builds with network: host so dependency installs use the host resolver on private DNS setups. If runtime containers cannot resolve external hosts, set DORY_DOCKER_DNS_SERVERS in .env. Raw GEMINI_API_KEY / OPENROUTER_API_KEY values are passed through as compatibility aliases for providers that expect those names.
Full walkthrough → docs/getting-started.md
The loop
wake → search → get → memory-write → link
- wake — bounded hot context at session start
- search — hybrid search across durable memory and session evidence
- get — exact markdown, with hashes and metadata
- memory-write — semantic writes (facts, preferences, decisions, project state)
- link — backlinks, neighbors, graph structure
Markdown stays editable by hand. Open it in Obsidian, diff it in git, inspect it in the browser wiki, or let agents update it through guarded write APIs. You always have a human-readable audit trail.
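The loop above maps onto the daemon's HTTP endpoints. As a minimal sketch of a client (the endpoint paths come from the table below, but payload field names like `query`, `profile`, and `budget` are assumptions, not the documented schema):

```python
# Hypothetical minimal client for Dory's HTTP surface. Endpoint paths match
# the README; request/response shapes are illustrative assumptions.
import json
import urllib.request


class DoryClient:
    def __init__(self, base_url="http://127.0.0.1:8766", token=None):
        self.base_url = base_url.rstrip("/")
        self.token = token

    def _headers(self):
        # Bearer auth per the auth notes; token is optional for local dev.
        headers = {"Content-Type": "application/json"}
        if self.token:
            headers["Authorization"] = f"Bearer {self.token}"
        return headers

    def _post(self, path, payload):
        # JSON POST to the daemon; returns the decoded JSON response.
        req = urllib.request.Request(
            self.base_url + path,
            data=json.dumps(payload).encode(),
            headers=self._headers(),
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    def wake(self, profile="coding", budget=1200):
        return self._post("/v1/wake", {"profile": profile, "budget": budget})

    def search(self, query):
        return self._post("/v1/search", {"query": query})

    def memory_write(self, text, subject, kind):
        return self._post(
            "/v1/memory-write",
            {"text": text, "subject": subject, "kind": kind},
        )
```

A session then reads naturally: `wake()` at start, `search()` before acting, `memory_write()` when something durable is learned.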
What's in the box
| Surface | What it does |
|---|---|
| CLI | uv run dory — init, search, memory-write, research, ops jobs, migrations |
| HTTP daemon | /v1/wake, /v1/active-memory, /v1/search, /v1/research, /v1/get, /v1/write, /v1/memory-write, /v1/purge, /v1/session-ingest, /v1/link, /v1/status, /v1/stream, /metrics, /wiki |
| Native MCP | uv run dory-mcp --mode stdio or --mode tcp |
| MCP bridge | HTTP-backed bridge for remote daemons |
| Hermes provider | plugins/hermes-dory/ |
| OpenClaw package | packages/openclaw-dory/ |
| Browser wiki | Read/edit the corpus from a browser (auth-gated) |
Deployment shapes
- Repo-local — development, experiments, throwaway corpora.
- Same-host daemon — one workstation, all local agents hit 127.0.0.1.
- Docker service — durable always-on daemon, bind-mounted markdown corpus.
- Private remote host — LAN box, VPN host, or VPS reachable over HTTP by multiple machines.
The corpus, index, auth tokens, public URL, and model provider keys are environment-specific. Keep them out of the public repo.
Stack
- Language — Python (uv + pyproject)
- Storage — Markdown source of truth · SQLite (FTS5, graph edges, embedding cache, chunk vectors, session evidence)
- Embeddings — Gemini Embedding 2 (Matryoshka 768); required for the HTTP/MCP/search/write runtime today
- Dreaming & maintenance LLM — Gemini 3.1 Flash via OpenRouter
- Active-memory LLM — optional · OpenRouter or any OpenAI-compatible local/LAN endpoint (Ollama, LM Studio, vLLM). The runtime default is OpenRouter when configured; `.env.example` sets it to `off` for deterministic retrieval-only installs
- Auth — bearer tokens via `.dory/auth-tokens.json`; `DORY_ALLOW_NO_AUTH=true` for local dev only
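For HTTP callers, the token ends up in an `Authorization` header. A minimal sketch of loading one, assuming `.dory/auth-tokens.json` is a name-to-token JSON map (an assumption about the file's shape, not the documented format):

```python
# Hypothetical helper: read a named bearer token from the auth-tokens file
# and build the header dict for HTTP requests. The JSON layout (a flat
# name -> token map) is an assumption.
import json
from pathlib import Path


def bearer_header(tokens_path: Path, name: str) -> dict:
    tokens = json.loads(tokens_path.read_text())
    return {"Authorization": f"Bearer {tokens[name]}"}
```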
Design influences
Dory is a composite of patterns that already worked:
- Karpathy's LLM Wiki — a persistent markdown layer that compounds instead of forcing rediscovery. Dory keeps that, but generates the wiki from a structured memory core.
- gbrain — human-readable canonical pages, source-backed evidence, entity resolution, backlinks, read-before-write discipline.
- Mem0 — scoped memory APIs, explicit add/update/delete semantics, memory ops as first-class tools instead of hidden chat history.
- MemPalace / memory palace systems — bounded wake-up context, local-first session storage, transcript mining, layered recall.
- Markdown + git — plain files, diffs, reviews, backups, human inspection.
The goal is practical: one memory layer for all agents, with enough structure to stay useful and enough plain text to stay debuggable.
Status
- Core — CLI, HTTP, MCP, search, and semantic writes are in-repo and covered by tests.
- Default runtime — local-first. Server and corpus live wherever you check out.
- Corpus — a fresh checkout ships without `core/user.md`, `core/soul.md`, `core/env.md`, `core/active.md`, or the `wiki/` tree. You populate those.
- Locked goals — (1) frozen wake-up block, (2) cross-agent shared memory.
- Public tree — current implementation, synthetic evals, and integration surfaces. Private planning notes stay private.
Docs
| Doc | What it covers |
|---|---|
| Getting started | Install, init, first wake |
| Agent integration | Wire up Claude, Codex, opencode, OpenClaw, Hermes |
| Contributing | Development setup, validation, commit rules, PR rules |
| Agent guide | Shared instructions for coding agents working in this repo |
| Codebase map | Where everything lives |
| Runtime & data flow | How requests move through the system |
| Surfaces & integrations | CLI, HTTP, MCP, providers |
| Operations & validation | Dream, maintain, reindex, migrate |
| Ops runbook | Day-to-day operation |
| Client runbook | For agent integrators |
| Evals | Benchmarks and coverage |
Contributing
Contributions are welcome, but the public repo has a hard privacy boundary. Use synthetic data in docs, tests, evals, examples, and fixtures. Do not commit private corpora, raw session logs, real personal memories, direct contact details, local absolute paths, private hostnames, tokens, or .env files.
Read CONTRIBUTING.md before opening a PR. The short version: use Conventional Commits, keep changes scoped, run the relevant uv checks, and run scripts/release/check-public-safety.py for public docs or artifacts.
Useful entrypoints
CLI & search
uv run dory # root command
uv run dory init # new corpus
uv run dory search "query" # hybrid search
uv run dory memory-write "..." --subject x --kind decision
uv run dory research "What are we working on?" --kind report
Set DORY_GEMINI_API_KEY or GOOGLE_API_KEY before starting HTTP/MCP or any command that embeds, searches, writes semantic memory, reindexes, or runs evals. LLM query planning, expansion, and reranking are opt-in via DORY_QUERY_PLANNER_ENABLED, DORY_QUERY_EXPANSION_ENABLED, DORY_QUERY_RERANKER_ENABLED.
uv run dory-http --corpus-root <corpus> --index-root <index>
uv run dory-mcp --mode stdio
uv run dory-mcp --mode tcp --host 127.0.0.1 --port 8765
Example MCP config: scripts/claude-code/mcp.example.json. HTTP bearer tokens: uv run dory auth new <name>. The browser wiki login also needs DORY_WEB_PASSWORD.
uv run dory ops dream-once # batch dream pass (+ recall-promotion distillation)
uv run dory ops daily-digest-once # summarize shipped sessions into digests/daily/
uv run dory ops maintain-once # maintenance pass
uv run dory ops wiki-refresh-once # rebuild compiled wiki
uv run dory ops eval-once # eval batch
uv run dory ops watch # foreground corpus watcher
Installers: scripts/ops/install-dory.sh, install-backup-cron.sh, install-ops-launchd.sh.
uv run dory --corpus-root <corpus> migrate <legacy-corpus>
uv run dory --corpus-root <corpus> migrate --estimate --sample 25 <legacy-corpus>
uv run dory --corpus-root <corpus> migrate --interactive <legacy-corpus>
Stages docs, normalizes to markdown evidence, classifies, extracts memory atoms, bootstraps canonical pages, writes a migration report, quarantines edge cases. Afterwards run ops wiki-refresh-once.
Session evidence is stored separately from durable memory. client and solo installs auto-discover local sessions via scripts/ops/client-session-shipper.py. The shipper keeps a local spool plus checkpoint state and polls known harness stores — no manual --source. Endpoint: POST /v1/session-ingest. search with mode="recall" reads the session evidence plane directly.
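The spool-plus-checkpoint pattern can be sketched as follows; the file layout and checkpoint fields here are illustrative assumptions, not the actual `scripts/ops/client-session-shipper.py` implementation:

```python
# Illustrative sketch of a spool + checkpoint shipper: the checkpoint
# remembers the mtime of the last shipped session file, so each poll only
# picks up session files newer than that. Field names are assumptions.
import json
from pathlib import Path


def load_checkpoint(path: Path) -> float:
    # No checkpoint yet means "ship everything".
    if path.exists():
        return json.loads(path.read_text())["last_mtime"]
    return 0.0


def pending_sessions(store: Path, checkpoint: float) -> list[Path]:
    # Only session files modified after the checkpoint are pending.
    return sorted(
        p for p in store.glob("*.jsonl") if p.stat().st_mtime > checkpoint
    )


def save_checkpoint(path: Path, shipped: list[Path]) -> None:
    # Advance the checkpoint to the newest shipped file's mtime.
    if shipped:
        last = max(p.stat().st_mtime for p in shipped)
        path.write_text(json.dumps({"last_mtime": last}))
```

Each poll would call `pending_sessions`, POST the results to `/v1/session-ingest`, then `save_checkpoint` so nothing is shipped twice.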
Preferred write surface is semantic, not path-first.
- CLI: uv run dory memory-write "Atlas prefers concise status notes." --subject atlas --kind preference
- HTTP: POST /v1/memory-write
- MCP: dory_memory_write
- OpenClaw: memory_write
- Hermes: memory_write(...)
Path-first write stays available for compatibility and debug flows.
License
MIT — see LICENSE.
A note on the name
Not affiliated with, endorsed by, or connected to Disney or Pixar. "Dory" is an affectionate nod to the fish who couldn't hold a thought — a fitting mascot for a memory daemon. The GIF is embedded from Giphy as fan reference under fair use. If any rights holder objects, open an issue and it's gone.
Just keep swimming. 🐟