ERINYS-mem
Health: Warn
- License — MIT
- Description — Repository has a description
- Active repo — Last push within the last day
- Low visibility — Only 5 GitHub stars
Code: Pass
- Code scan — Scanned 12 files during a light audit; no dangerous patterns found
Permissions: Pass
- Permissions — No dangerous permissions requested
This MCP server provides a reflexive memory system for AI agents. It uses local databases to store, retrieve, decay, and distill memories without relying on external LLM calls for the core retrieval pipeline.
Security Assessment
Overall risk: Low. The automated code scan reviewed 12 files and found no dangerous patterns, hardcoded secrets, or requests for dangerous permissions. The project relies heavily on local processing (specifically SQLite FTS5 and sqlite-vec) to achieve its benchmark speeds, which means it does not make external network requests for core memory retrieval. However, the documentation notes that higher-level features like the "Dream Cycle" and "Distillation" do require an LLM. Depending on how you configure that LLM, you should remain aware of where your data is being sent (e.g., local versus cloud APIs). There are no signs of shell command execution or unauthorized data collection.
Quality Assessment
The project is under the permissive and standard MIT license. It is actively maintained, with its most recent code push occurring today. The main concern is low visibility; with only 5 GitHub stars, the tool has not been widely tested or reviewed by the broader developer community. Despite this lack of community validation, the repository is clean, well-documented, and appears to be a genuine effort to solve AI memory decay.
Verdict
Safe to use, though you should verify the LLM provider configuration if you plan to use its higher-level distillation features.
Reflexive memory for AI agents — forgets, distills, and dreams. MCP server with 25 tools.
ERINYS — Reflexive Memory for AI Agents
100% Recall@5 on LongMemEval-S (_s split) · 94% on LoCoMo · 98% on ConvoMem — Zero LLM calls in the retrieval pipeline.
From memories that existed, it even creates memories that never did.
AI agent memory systems have always mimicked human memory. Short-term, long-term, episodic, semantic — textbook categories bolted straight onto implementations.
Something always felt off.
Humans forget. But existing memory systems don't. They grow endlessly, serving stale facts with the same weight as fresh ones. Humans notice "wait, didn't you say something different before?" But memory systems silently overwrite. Humans connect two unrelated experiences and think "oh, I can use that here." But memory systems just store and retrieve.
What needed to be mimicked wasn't the taxonomy of memory. It was the behavior.
That discomfort is what summoned ERINYS.
ERINYS is a guard dog. It remembers, forgets, questions, and bites.
Origin: ERINYS was built as the retrieval layer for HyperAION, an AI agent self-improvement framework. It is released as a standalone MCP server so any agent stack can use it independently.
Benchmarks
All results use the same mode (enhanced_v2_boost) with zero LLM calls in the retrieval pipeline. Note: higher-level features (Dream Cycle, Distillation) do use an LLM — see below.
| Benchmark | N | R@5 | R@10 | Avg Latency |
|---|---|---|---|---|
| LongMemEval-S | 500 | 100.0% | 100.0% | 10.3 ms |
| LoCoMo | 1,982 | 94.0% | 98.1% | 6.9 ms |
| ConvoMem | 250 | 97.6% | — | — |
Why this matters: No API keys. No network. No tokens burned for retrieval. ERINYS achieves these results with FTS5 + sqlite-vec + algorithmic boosting alone. Your agent's memory searches at the speed of SQLite.
LongMemEval evaluated on the longmemeval_s split (~20 sessions/question). Results on the harder _m split have not yet been evaluated. Full methodology, per-category breakdown, and reproduction commands → benchmarks/BENCHMARKS.md
The story of how we got to 100% → 🇯🇵 Japanese / 🇺🇸 English
What Makes ERINYS Different
Forgetting. Most memory systems only accumulate. ERINYS decays memories over time following the Ebbinghaus forgetting curve. Old noise sinks. Frequently accessed knowledge floats. Search results stay relevant without manual curation. Decay runs automatically — no LLM needed.
Distillation. A specific bugfix ("JWT httpOnly flag was missing") automatically generates three layers: the concrete fact → a reusable pattern ("new endpoints need a security checklist") → a universal principle ("security defaults should be safe without opt-in"). No other memory system does this. ⚠️ Distillation requires an LLM call to generate the abstract/meta layers.
Dream Cycle. Two memories are fed to an LLM: "is there a connection?" Candidate pairs are selected by semantic similarity — close enough to be related (cosine > 0.65), far enough to not be redundant (< 0.90). Currently triggered manually via erinys_dream. ⚠️ Dream Cycle requires LLM calls — it is not part of the zero-LLM retrieval pipeline.
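That "sweet spot" pair selection can be illustrated with a small sketch. The helper `dream_candidates` and the toy 2-D vectors below are hypothetical; ERINYS stores real embeddings in sqlite-vec.

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def dream_candidates(vectors, low=0.65, high=0.90):
    """Yield pairs that are related (sim > low) but not redundant (sim < high)."""
    ids = list(vectors)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            sim = cosine(vectors[a], vectors[b])
            if low < sim < high:
                yield (a, b, round(sim, 3))

memories = {"m1": (1.0, 0.0), "m2": (0.8, 0.6), "m3": (0.0, 1.0)}
print(list(dream_candidates(memories)))  # → [('m1', 'm2', 0.8)]
```

Only the m1/m2 pair survives: m1/m3 are orthogonal (similarity 0.0) and m2/m3 fall below the 0.65 floor, so neither is worth an LLM call.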
Design Philosophy
Memory has layers
Not all memory is equal. ERINYS organizes knowledge by abstraction level:
- Concrete — what happened. "The JWT httpOnly flag was missing on /api/auth."
- Abstract — patterns from facts. "New API endpoints need a security header checklist."
- Meta — principles from patterns. "Security defaults should be safe without manual opt-in."
A single bugfix generates all three through distillation. The meta layer accumulates principles that transfer across projects and tech stacks.
Forgetting is a feature
Every memory has a strength score that decays over time. A memory saved 6 months ago ranks lower than one saved yesterday. Memories accessed frequently resist decay — repeated retrieval reinforces them.
When strength drops below a threshold, the memory becomes a pruning candidate. The database stays lean. Search stays relevant.
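A minimal sketch of that mechanic, assuming a simple exponential half-life; the 30-day half-life and 0.05 threshold are illustrative constants, not ERINYS's actual values:

```python
import math

# Illustrative Ebbinghaus-style decay: strength halves every `half_life`
# days since the last access. Repeated retrieval resets the clock, which
# is how frequently used memories resist decay.
def strength(initial, days_since_access, half_life=30.0):
    return initial * math.exp(-math.log(2) * days_since_access / half_life)

fresh = strength(1.0, 1)      # accessed yesterday
stale = strength(1.0, 180)    # untouched for six months: 6 half-lives, ~0.016
assert fresh > stale

PRUNE_THRESHOLD = 0.05        # below this, the memory is a pruning candidate
print(stale < PRUNE_THRESHOLD)  # → True
```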
Facts change. History shouldn't disappear
When information updates — "we moved from AWS to GCP" — ERINYS doesn't overwrite. It creates a supersede chain: the old fact is marked as replaced but preserved. You can ask "what did we believe in March?" and get the answer that was true then.
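The AWS→GCP example can be sketched as an as-of lookup over a supersede chain. The field names (`value`, `valid_from`, `superseded_at`) are illustrative, not ERINYS's real schema:

```python
# Each fact records when it became true and when (if ever) it was replaced.
facts = [
    {"value": "AWS EC2",       "valid_from": "2026-02-15", "superseded_at": "2026-03-20"},
    {"value": "GCP Cloud Run", "valid_from": "2026-03-20", "superseded_at": None},
]

def as_of(facts, when):
    """Return the fact believed true at `when` (ISO dates compare lexically)."""
    for f in facts:
        if f["valid_from"] <= when and (f["superseded_at"] is None or when < f["superseded_at"]):
            return f["value"]
    return None

print(as_of(facts, "2026-03-01"))  # → AWS EC2
print(as_of(facts, "2026-04-01"))  # → GCP Cloud Run
```

Because superseded rows are kept rather than deleted, both answers remain recoverable from the same table.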
Contradictions should be caught
If memory contains both "use PostgreSQL" and "use SQLite", ERINYS detects the conflict. Instead of silently switching, the agent asks: "you previously chose PostgreSQL — has the requirement changed?"
Search finds meaning, not just keywords
Two searches run simultaneously and fuse results:
- Keyword search (FTS5) — exact term matching.
- Vector search (sqlite-vec) — semantic similarity. "authentication" finds "login", "JWT", "session tokens".
Results merge via Reciprocal Rank Fusion (RRF). High in both = highest score.
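A minimal sketch of the fusion step, assuming the conventional RRF constant k = 60; the `rrf` helper is illustrative, not ERINYS's actual code:

```python
def rrf(rankings, k=60):
    """Fuse ranked lists of memory IDs: score(d) = sum over lists of 1 / (k + rank(d))."""
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Highest fused score first
    return sorted(scores, key=scores.get, reverse=True)

keyword_hits = ["m3", "m1", "m7"]  # FTS5 order
vector_hits = ["m1", "m3", "m9"]   # sqlite-vec order
fused = rrf([keyword_hits, vector_hits])
# m1 and m3 appear in both lists, so they outrank the single-list hits
print(fused[:2])
```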
Everything stays local
Single SQLite file. No cloud APIs. No API keys. No subscriptions. Offline-capable. Your agent's memory never leaves your machine.
Use Cases
1. Cross-Session Memory for Coding Agents
```python
# Agent saves a learning after fixing a bug
erinys_save(
    title="Fixed JWT httpOnly flag missing",
    content="Cookie was accessible via JS. Added httpOnly: true, secure: true, sameSite: strict.",
    type="bugfix",
    project="my-app"
)

# Next week, similar task — agent searches memory
erinys_search(query="authentication cookie security", project="my-app")
# → Returns the JWT fix with relevance score
```
2. Contradiction Detection
```python
erinys_save(title="Database choice", content="Using SQLite for simplicity", project="my-app")
erinys_conflict_check(observation_id=42)
# → "⚠️ Conflicts with #18: 'Using PostgreSQL for production reliability'"
```
3. Dream Cycle — Overnight Knowledge Synthesis
```python
erinys_dream(max_collisions=10)
# Picks memory pairs in the "sweet spot" (cosine 0.65–0.90)
# Memory A: "RTK reduces token usage by 60-90%"
# Memory B: "Bootstrap Gate takes 3 seconds due to multiple script calls"
# → Insight: "Apply RTK prefix to Bootstrap Gate scripts to reduce overhead"
```
4. Temporal Queries
```python
erinys_timeline(query="deployment target", as_of="2026-03-01")
# → "AWS EC2 (decided 2026-02-15)"
erinys_timeline(query="deployment target", as_of="2026-04-01")
# → "GCP Cloud Run (superseded AWS on 2026-03-20)"
```
5. Knowledge Distillation
```python
erinys_save(title="Forgot CORS headers on new endpoint", type="bugfix", ...)
erinys_distill(observation_id=50, level="meta")
# → concrete: "CORS headers missing on /api/v2/users endpoint"
# → abstract: "New API endpoints need a CORS review checklist"
# → meta: "Security concerns should be opt-out, not opt-in"
```
6. Obsidian Export
```python
erinys_export(format="markdown")
# → Generates .md files with [[wikilinks]]
# Drop into Obsidian → instant knowledge graph
```
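As a rough sketch of what wikilink export produces, here is a hypothetical `export_markdown` helper (not the real `erinys_export` implementation): one `.md` file per observation, with `[[wikilinks]]` that Obsidian renders as graph edges.

```python
from pathlib import Path

# Hypothetical exporter sketch; field names and layout are assumptions.
def export_markdown(observations, out_dir="vault"):
    Path(out_dir).mkdir(exist_ok=True)
    for obs in observations:
        links = "".join(f"- [[{t}]]\n" for t in obs["links"])
        body = f"# {obs['title']}\n\n{obs['content']}\n\n## Links\n{links}"
        Path(out_dir, f"{obs['title']}.md").write_text(body)

export_markdown([{
    "title": "JWT httpOnly fix",
    "content": "Added httpOnly: true to the auth cookie.",
    "links": ["Security checklist", "my-app"],
}])
```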
Quick Start
```shell
python3 -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"
ollama pull gemma4:e4b

# Run as MCP server (stdio)
python -m erinys_memory.server

# Run tests
PYTHONPATH=src pytest tests/ -v
```
MCP Configuration
Claude Desktop / Claude Code
```json
{
  "mcpServers": {
    "erinys": {
      "command": "/path/to/ERINYS-mem/.venv/bin/python3",
      "args": ["-m", "erinys_memory.server"],
      "env": {
        "ERINYS_DB_PATH": "~/.erinys/memory.db"
      }
    }
  }
}
```
Gemini (Antigravity)
Add to ~/.gemini/antigravity/settings.json under mcpServers:
```json
{
  "erinys": {
    "command": "/path/to/ERINYS-mem/.venv/bin/python3",
    "args": ["-m", "erinys_memory.server"],
    "env": {
      "ERINYS_DB_PATH": "~/.erinys/memory.db"
    }
  }
}
```
Environment Variables
| Variable | Default | Description |
|---|---|---|
| `ERINYS_DB_PATH` | `~/.erinys/memory.db` | SQLite database path |
| `ERINYS_EMBEDDING_MODEL` | `BAAI/bge-small-en-v1.5` | fastembed model |
| `ERINYS_DISTILL_MODEL` | `gemma4:e4b` | Local Ollama model for auto-distillation |
| `ERINYS_DISTILL_ENDPOINT` | `http://localhost:11434/api/generate` | Local Ollama generate endpoint |
Tools (25)
Core
- `erinys_save` — Save observation (with topic_key upsert)
- `erinys_get` — Get by ID (full content, untruncated)
- `erinys_update` — Partial update
- `erinys_delete` — Delete with FK cascade
- `erinys_search` — RRF hybrid search (FTS5 + vector)
- `erinys_save_prompt` — Save user prompt
- `erinys_recall` — Recent observations
- `erinys_context` — Session context recall
- `erinys_export` — Obsidian-compatible markdown export
- `erinys_backup` — SQLite backup
- `erinys_stats` — Database statistics
Graph
- `erinys_link` — Create typed edge
- `erinys_traverse` — BFS graph traversal
- `erinys_prune` — Prune weak/decayed edges
Temporal
- `erinys_reinforce` — Boost observation strength
- `erinys_supersede` — Version an observation
- `erinys_timeline` — Query as-of timestamp
- `erinys_conflict_check` — Detect contradictions
Dream Cycle
- `erinys_collide` — Collide two observations via LLM
- `erinys_dream` — Batch collision cycle
Distillation
- `erinys_distill` — 3-granularity abstraction (concrete → abstract → meta)
Batch & Eval
- `erinys_batch_save` — Bulk save with auto-linking
- `erinys_eval` — LoCoMo-inspired quality metrics
Session
- `erinys_session_start` — Start session
- `erinys_session_end` — End session with summary
- `erinys_session_summary` — Save structured summary
Architecture
```
┌──────────────────────────┐
│ FastMCP Server │ 25 tools, unified envelope
├──────────────────────────┤
│ search.py │ graph.py │ RRF hybrid │ typed edges
│ decay.py │ session.py │ Ebbinghaus │ lifecycle
│ temporal.py│collider.py │ versioning │ cross-pollination
│ distill.py │ db.py │ abstraction│ SQLite + vec
├──────────────────────────┤
│ embedding.py │ fastembed (BAAI/bge-small-en-v1.5)
├──────────────────────────┤
│ SQLite + FTS5 + vec0 │ Local-first, no network at runtime
└──────────────────────────┘
```
Roadmap
- Auto-Distill on Save — Trigger 3-granularity distillation on every save
- Dream Daemon — Background auto-execution of Dream Cycle (currently manual trigger only)
- Auto-Prune — GC decayed observations when DB exceeds size threshold
- Cron-ready CLI — `erinys dream --max 10` for scheduled overnight synthesis
- PyPI package — `pip install erinys-memory`
- Multi-agent support — Scoped memory per agent identity
- LongMemEval _m split evaluation
- GitHub Release tags + CI badges
License
MIT