qi
Overall: Pass

Health: Pass
- License — MIT
- Description — Repository has a description
- Active repo — Last push 0 days ago
- Community trust — 12 GitHub stars
Code: Pass
- Code scan — Scanned 4 files during light audit, no dangerous patterns found
Permissions: Pass
- Permissions — No dangerous permissions requested
Purpose
This CLI tool provides local-first knowledge search for macOS. It lets you index your documents and query them using BM25 search, vector embeddings, and LLM-powered Q&A entirely on your machine.
Security Assessment
Overall Risk: Low
The tool processes files locally and defaults to communicating exclusively with local services (like Ollama via localhost) for embeddings and text generation. It does not request dangerous system permissions, and a light code scan found no malicious patterns, hardcoded secrets, or hidden obfuscation. Since it indexes user-specified local documents, the main risk involves pointing the generation or embedding provider to an untrusted remote server via the config file. As long as users keep the default local endpoints, the data remains completely private.
Quality Assessment
The project is actively maintained, with its last push occurring today. It is released under the permissive and standard MIT license. With 12 GitHub stars, the community trust level is currently very low, which is typical for a new or niche personal project. It is written in Go, making it easy to compile into a standalone, dependency-free binary.
Verdict
Safe to use.
Query search engine for LLMs
README.md
qi
A local-first knowledge search CLI for macOS. Index your documents and search them using BM25 full-text search, vector embeddings, and LLM-powered Q&A — all running locally with no external dependencies.
Quickstart
```shell
# Install
go install github.com/itsmostafa/qi@latest

# Initialize config and database
qi init

# Edit config to point at your documents
$EDITOR ~/.config/qi/config.yaml

# Index your documents
qi index

# Search
qi search "my query"

# Hybrid search (BM25 + vector, requires embedding provider)
qi query "my query" --mode hybrid

# Ask a question (requires generation provider)
qi ask "how does X work?"

# Health check
qi doctor
```
Configuration
The config lives at `~/.config/qi/config.yaml`. See `config.example.yaml` for a fully annotated example.
```yaml
database_path: ~/.local/share/qi/qi.db

collections:
  - name: notes
    path: ~/notes
    extensions: [.md, .txt]

providers:
  embedding:
    base_url: http://localhost:11434 # Ollama
    model: nomic-embed-text
    dimension: 768
  generation:
    base_url: http://localhost:11434
    model: llama3.2
```
Commands
| Command | Description |
|---|---|
| `qi init` | Create config and database |
| `qi index [collection]` | Index all (or one) collection |
| `qi search <query>` | BM25 full-text search |
| `qi query <query>` | Hybrid search (BM25 + vector) |
| `qi ask <question>` | RAG-powered answer with citations |
| `qi get <id>` | Retrieve document by 6-char hash ID |
| `qi stats` | Show index statistics |
| `qi doctor` | Health check |
| `qi version` | Print version |
Search Modes
`qi query` supports three modes via `--mode`:

- `lexical`: BM25 full-text search only
- `hybrid` (default): BM25 + vector search fused with Reciprocal Rank Fusion (RRF)
- `deep`: hybrid + optional reranking
Use --explain to see scoring breakdown:
```shell
qi query "chunking algorithm" --mode hybrid --explain
```
Architecture
- Storage: SQLite with content-addressable blobs (SHA-256 keyed `content` table), FTS5 for BM25, BLOB-stored embeddings for vector KNN search
- Chunking: Break-point scoring (headings=100, code fences=80, blank lines=20) with distance decay from target chunk size
- Providers: OpenAI-compatible HTTP API adapters (Ollama, llama.cpp, etc.)
- Graceful degradation: Vector search and Q&A are optional — BM25 always works
Document IDs
Each document gets a short ID from the first 6 hex characters of its SHA-256 content hash:
```shell
qi get abc123
```
License
This project is licensed under the MIT License - see the LICENSE file for details.