lore (MCP) Security Audit

Overall: Warn

Health: Warn
  • License — Apache-2.0
  • Description — Repository has a description
  • Active repo — Last push 0 days ago
  • Low visibility — Only 6 GitHub stars
Code: Pass
  • Code scan — Scanned 12 files during light audit, no dangerous patterns found
Permissions: Pass
  • Permissions — No dangerous permissions requested
Purpose
This tool is a local semantic search engine that indexes your software patterns and conventions from markdown files. It exposes these patterns to AI coding assistants via an MCP server, allowing the AI to consult your team's best practices before writing code.

Security Assessment
Overall risk: Low. The codebase is written in Rust and compiles into a single binary with no dangerous permissions requested. It runs entirely locally and does not access sensitive system data or execute arbitrary shell commands. However, it does require a local network connection to Ollama (running on port 11434) to generate vector embeddings. There are no hardcoded secrets, and the light code audit found no dangerous patterns. Users should note that prebuilt binaries are not Apple-notarized, requiring a manual Gatekeeper bypass on macOS.

Quality Assessment
The project is actively maintained, with its most recent push occurring today. It uses the permissive Apache-2.0 license and relies on robust, standard components like SQLite (FTS5 and sqlite-vec) for storage. The main drawback is its extremely low community visibility; it currently has only 6 GitHub stars. Consequently, the user base is small, meaning the tool has not undergone widespread peer review or battle-testing.

Verdict
Safe to use, though its experimental nature and low community adoption mean you should expect limited external support.
SUMMARY

Your engineering wisdom, always in context.

README.md

Lore

Lore is a local semantic search engine for your software patterns and conventions, exposed as an MCP
tool for Claude Code. Your knowledge lives as markdown files in a git repository. Lore indexes them
with hybrid full-text and vector search, then serves results over MCP so your AI coding agent
consults your patterns before writing code.

Single Rust binary. No external database. The only runtime dependency is
Ollama, for embeddings.

How It Works

Markdown files (git repo, source of truth)
        │
        ▼  ingest
┌───────────────┐     ┌──────────┐
│     lore      │────▶│  Ollama  │  (embed chunks)
│ (Rust binary) │◀────│  :11434  │
└───────┬───────┘     └──────────┘
        │
        ▼
┌───────────────┐
│  SQLite       │  FTS5 (lexical) + sqlite-vec (vector)
│  (single      │  both compiled into the binary
│   .db file)   │
└───────┬───────┘
        │
        ▼  MCP over stdio
┌───────────────┐
│  Claude Code  │
└───────────────┘

Quick Start

Prerequisites

  • Rust (latest stable, pinned via rust-toolchain.toml)
  • just — task runner (cargo install just)
  • Ollama — brew install ollama, or see other install options
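
A quick preflight can confirm these are on PATH before continuing. This is a convenience sketch, not part of lore's own tooling:

```shell
# Preflight sketch: confirm each prerequisite is installed and on PATH.
# Not part of lore; just a convenience before running the steps below.
need() { command -v "$1" >/dev/null 2>&1 || { echo "missing: $1" >&2; return 1; }; }

if need cargo && need just && need ollama; then
  echo "prerequisites OK"
fi
```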

Install

Prebuilt binaries are published with every tagged release on the
releases page, accompanied by a SHA256SUMS file for
integrity verification. Pick VERSION from the releases page and set TARGET to one of
x86_64-unknown-linux-gnu (most Linux), x86_64-unknown-linux-musl (Alpine and musl distros),
aarch64-apple-darwin (Apple Silicon), or x86_64-apple-darwin (Intel Mac):

VERSION=0.1.0-alpha.1
TARGET=x86_64-unknown-linux-gnu

curl -LO https://github.com/attila/lore/releases/download/v${VERSION}/lore-${VERSION}-${TARGET}.tar.gz
curl -LO https://github.com/attila/lore/releases/download/v${VERSION}/SHA256SUMS
sha256sum -c SHA256SUMS --ignore-missing   # macOS: shasum -a 256 -c SHA256SUMS --ignore-missing
tar xzf lore-${VERSION}-${TARGET}.tar.gz
sudo mv lore /usr/local/bin/
# (no sudo? mkdir -p ~/.local/bin && mv lore ~/.local/bin/, then ensure ~/.local/bin is on PATH)

macOS Gatekeeper note: tarballs downloaded via curl run without further intervention. If you
download via a browser, macOS may attach the com.apple.quarantine extended attribute and refuse
to launch the binary. Clear it with xattr -d com.apple.quarantine ./lore after extraction (or
right-click → Open the first time). The binary is not Apple-notarized — that requires a paid
Developer ID certificate, which the project does not currently hold.

Build from source

just install

This runs cargo install --path ., placing the lore binary in ~/.cargo/bin/ (which rustup adds
to PATH during Rust installation). To build without installing:

cargo build --release
# binary at ./target/release/lore

Initialize and Use

# Point lore at a directory of markdown files (git repository recommended)
lore init --repo ~/my-patterns

# Test a search
lore search "error handling"

# Check health
lore status

The init command verifies Ollama is running, pulls the embedding model (nomic-embed-text,
~270MB), creates lore.toml and the knowledge database, and runs the first ingestion.

Use with Claude Code

Install the lore plugin to get the MCP server, lifecycle hooks, and the /search and
/coverage-check skills:

claude --plugin-dir /path/to/lore/integrations/claude-code/

The plugin assumes lore is on PATH and uses the default config (~/.config/lore/lore.toml). If
you use a custom config path, either edit integrations/claude-code/mcp.json to add your --config
flag, or add the MCP server manually:

claude mcp add --scope user --transport stdio lore -- \
  lore serve --config /path/to/lore.toml

The manual approach gives only the MCP server. The plugin also includes hooks that inject relevant
patterns before edits, a /search skill for on-demand queries, and a /coverage-check skill that
audits a draft pattern's vocabulary coverage by simulating the PreToolUse hook's own query
extraction against synthetic tool calls. Patterns whose tags: frontmatter list contains
universal opt into an always-on tier — emitted in full at every SessionStart and re-injected on
every relevant tool call — for process-level conventions like push discipline that need continuous
reinforcement (see the "When to use the universal tag" section in the pattern authoring guide).
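
As a sketch of that opt-in, a process-level pattern carrying the universal tag might look like this; the filename and the second tag are made up for illustration:

```shell
# Sketch: a pattern that opts into the always-on tier via the `universal` tag.
# The filename and the git-workflow tag are illustrative, not prescribed by lore.
cat > push-discipline.md <<'EOF'
---
tags: [universal, git-workflow]
---

# Push Discipline

Never force-push shared branches; rebase local work before opening a PR.
EOF

grep -q 'universal' push-discipline.md && echo "opts into always-on tier"
```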

Commands

Command                      Purpose
lore init --repo <path>      First-time setup: provision Ollama, create config, ingest
lore ingest                  Re-index the knowledge base after editing markdown files
lore ingest --file <path>    Index a single file without requiring a git commit
lore serve                   Start the MCP server (stdio transport for Claude Code)
lore search <query>          Search from the command line
lore extract-queries         Simulate the hook's FTS5 query extraction for a tool call
lore status                  Check health of all components

MCP Tools

The server exposes five tools:

Tool               Purpose
search_patterns    Semantic + keyword search across all patterns
add_pattern        Create a new pattern file, index it, and commit if the base is a git repository
update_pattern     Replace an existing pattern's content, re-index, and commit if git is in use
append_to_pattern  Add a section to an existing pattern, re-index, and commit if git is in use
lore_status        Report knowledge base health: git status, indexed counts, last commit
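
For orientation, these tools are invoked with standard MCP JSON-RPC messages over stdio. The sketch below shows only the tools/call payload shape for search_patterns; a real session starts with MCP's initialize handshake (omitted), and the argument name "query" is an assumption:

```shell
# Shape of an MCP tools/call request for search_patterns (sketch).
# The "query" argument name is an assumption; a complete session also needs
# the MCP initialize handshake before any tools/call message.
req='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"search_patterns","arguments":{"query":"error handling"}}}'

# A full session would pipe the handshake plus this line into: lore serve
printf '%s\n' "$req"
```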

Knowledge Base Format

Your knowledge base is a directory of markdown files. Any structure works:

my-patterns/
├── error-handling.md
├── testing/
│   ├── unit-tests.md
│   └── integration-tests.md
├── api-design.md
└── code-style.md

Only files with a .md or .markdown extension are ingested. Other files (.txt, .mdx, .rst,
etc.) are silently skipped — they will not appear in search results.
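
The extension rule can be sanity-checked with a plain find; the directory and file names here are throwaway examples created for the demo:

```shell
# Demo of the extension rule: only .md and .markdown files are considered.
# Directory and file names are throwaway examples.
mkdir -p demo-patterns/testing
touch demo-patterns/api-design.md \
      demo-patterns/testing/unit-tests.markdown \
      demo-patterns/notes.txt

# notes.txt is skipped; the other two files match
find demo-patterns -type f \( -name '*.md' -o -name '*.markdown' \) | sort
```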

Git is recommended but not required. Lore works against a plain directory, but delta ingest, the
inbox branch workflow, and version history are all unavailable without a git repository. See
Configuration Reference → Git Integration for the full
picture.

Files are chunked by heading — each ## Section becomes a separate searchable unit. YAML
frontmatter tags are extracted and searchable.

To exclude non-pattern files such as README.md, CONTRIBUTING.md, or a drafts/ directory from
indexing, place a .loreignore file at the repository root. The syntax matches .gitignore and
supports negation patterns. See the Configuration Reference for
details.
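
A minimal .loreignore might look like the following; the entries are examples (adopted.md is made up), and the negation line shows the supported !-syntax:

```shell
# Sketch of a .loreignore at the repository root. Entries are examples; the
# syntax follows .gitignore, including "!" negation (adopted.md is made up).
cat > .loreignore <<'EOF'
README.md
CONTRIBUTING.md
drafts/*
!drafts/adopted.md
EOF
```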

Example pattern file:

---
tags: [error-handling, rust, result-types]
---

# Error Handling with Result Types

Always use Result<T, E> for fallible operations...

Search

  • Hybrid (default): Combines FTS5 lexical search and sqlite-vec vector similarity using
    Reciprocal Rank Fusion. Title and tag matches are weighted above body text, so domain-scoped
    queries return the right patterns first.
  • FTS-only: Set hybrid = false in lore.toml to skip Ollama at query time.
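
For intuition, Reciprocal Rank Fusion scores each document by summing 1/(k + rank) over the input rankings. This toy awk sketch uses the conventional k = 60 and made-up document names, and does not reproduce lore's actual k or its title/tag weighting:

```shell
# Toy Reciprocal Rank Fusion (RRF), the fusion scheme named above.
# Document names are made up; k=60 is the conventional constant, and
# lore's actual k and title/tag boosts are not reproduced here.
export LC_ALL=C

fts_ranked="error-handling api-design code-style"      # lexical (FTS5) order
vec_ranked="error-handling unit-tests api-design"      # vector order

rrf() {
  # Each document scores sum(1 / (k + rank)) over the lists that contain it.
  awk -v k=60 -v a="$1" -v b="$2" 'BEGIN {
    n = split(a, A, " "); for (i = 1; i <= n; i++) s[A[i]] += 1 / (k + i)
    n = split(b, B, " "); for (i = 1; i <= n; i++) s[B[i]] += 1 / (k + i)
    for (d in s) printf "%s %.6f\n", d, s[d]
  }'
}

rrf "$fts_ranked" "$vec_ranked" | sort -k2 -rn   # best-fused document first
```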

Documentation

Guide                               Description
Pattern Authoring Guide             How to write patterns that agents actually follow
Search Mechanics Reference          Full search pipeline internals for debugging discoverability
Hook Pipeline and Plugin Reference  Hook lifecycle, plugin setup, and injection tuning
Configuration Reference             lore.toml options, environment variables, CLI flags
Release Process                     Maintainer runbook for cutting releases

Development

Commands

just setup    # configure git hooks (run once after clone)
just ci       # run the full quality gate pipeline

Command         What it does
just setup      Configure git hooks (run once after clone)
just fmt        Check formatting
just fmt-fix    Fix formatting
just clippy     Run clippy lints
just test       Run tests (no Ollama needed)
just deny       Run dependency audits
just doc        Build documentation
just changelog  Regenerate CHANGELOG.md from git history
just ci         Run the full pipeline

License

Dual-licensed under MIT and Apache 2.0.
