daedra
Health: Warn
- License: MIT
- Description: Repository has a description
- Active repo: Last push today
- Low visibility: Only 9 GitHub stars

Code: Pass
- Code scan: Scanned 7 files during a light audit; no dangerous patterns found

Permissions: Pass
- Permissions: No dangerous permissions requested
This tool is a self-contained web search MCP server written in pure Rust. It provides an automatic fallback chain across multiple search engines and websites, allowing AI models and CLI tools to search the web from any IP address without requiring external dependencies or mandatory API keys.
Security Assessment
Overall Risk: Low. The tool's primary function is to make external network requests: it performs web searches and fetches webpage contents. It does not request dangerous local system permissions, and the automated code scan found no dangerous patterns, hardcoded secrets, or hidden shell execution. Users should be aware that search queries are sent to external third-party services (Wikipedia, GitHub, Bing, etc.), but this is expected behavior for a search tool.
Quality Assessment
The project is high quality. It is actively maintained, with its most recent push occurring today. It uses the standard, permissive MIT license and utilizes Continuous Integration (CI). The only notable warning is its low community visibility; with only 9 GitHub stars, the tool is very new and has not yet been widely adopted or heavily vetted by a large user base. However, the codebase is clean and written in a memory-safe language.
Verdict
Safe to use, though keep in mind its low community visibility means it has not yet been battle-tested by a wide audience.
Daedra
Self-contained web search MCP server. Rust. 7 backends. Works from any IP.
No API keys required. No Docker. No Python.
Daedra is a self-contained web search MCP server written in Rust. Multiple search backends with automatic fallback. Works from any IP — datacenter, VPS, residential. No API keys required for basic search.
Why Daedra?
Since 2025, every major search engine (Google, Bing, DuckDuckGo, Brave) has blocked datacenter/VPS IPs with CAPTCHAs. Daedra solves this with a multi-backend fallback chain that automatically finds a backend that works:
Serper (API) → Tavily (API) → Bing → Wikipedia → StackOverflow → GitHub → DuckDuckGo
No Docker. No Python. No SearXNG. Pure Rust. Daedra IS the search infrastructure.
Install
cargo install daedra
Search backends
| Backend | Type | API Key | Works from VPS? |
|---|---|---|---|
| Serper.dev | Google JSON API | SERPER_API_KEY | Yes |
| Tavily | AI-optimized API | TAVILY_API_KEY | Yes |
| Bing | HTML scraping | None | Sometimes (CAPTCHA risk) |
| Wikipedia | OpenSearch API | None | Always |
| StackExchange | Public API | None | Always |
| GitHub | Public API | None / GITHUB_TOKEN | Always |
| DuckDuckGo | HTML scraping | None | Rarely (blocked since mid-2025) |
Backends are tried in order. First one that returns results wins.
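The order-based fallback can be sketched as follows. This is an illustrative, simplified synchronous version, not Daedra's actual implementation (which is async); the `Backend` trait and the `Blocked`/`Working` types are hypothetical stand-ins for real engine adapters.

```rust
// Each search result carries a title and a URL.
struct SearchResult {
    title: String,
    url: String,
}

// Minimal interface every engine adapter implements.
trait Backend {
    fn name(&self) -> &'static str;
    // None means the backend is blocked, rate-limited, or empty-handed.
    fn search(&self, query: &str) -> Option<Vec<SearchResult>>;
}

struct Blocked; // simulates an engine behind a CAPTCHA
struct Working; // simulates an engine that answers

impl Backend for Blocked {
    fn name(&self) -> &'static str { "bing" }
    fn search(&self, _query: &str) -> Option<Vec<SearchResult>> { None }
}

impl Backend for Working {
    fn name(&self) -> &'static str { "wikipedia" }
    fn search(&self, query: &str) -> Option<Vec<SearchResult>> {
        Some(vec![SearchResult {
            title: format!("Result for {query}"),
            url: "https://en.wikipedia.org/wiki/Example".to_string(),
        }])
    }
}

// Try each backend in order; the first one returning results wins.
fn search_with_fallback(
    backends: &[Box<dyn Backend>],
    query: &str,
) -> Option<(&'static str, Vec<SearchResult>)> {
    for backend in backends {
        if let Some(results) = backend.search(query) {
            return Some((backend.name(), results));
        }
    }
    None
}

fn main() {
    let backends: Vec<Box<dyn Backend>> = vec![Box::new(Blocked), Box::new(Working)];
    if let Some((winner, results)) = search_with_fallback(&backends, "rust") {
        println!("{winner} answered with {} result(s)", results.len());
        for r in &results {
            println!("{} <{}>", r.title, r.url);
        }
    }
}
```

Keeping each engine behind one narrow interface means adding a new backend is a matter of implementing a single method.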
Usage
MCP Server (for Claude, Cursor, pawan, etc.)
```json
{
  "mcpServers": {
    "daedra": {
      "command": "daedra",
      "args": ["serve", "--transport", "stdio", "--quiet"]
    }
  }
}
```
CLI
```sh
# Search
daedra search "rust async runtime" --num-results 5

# Fetch a webpage as Markdown
daedra fetch https://rust-lang.org

# Check backend health
daedra check

# Server info
daedra info
```
As a Rust library
```rust
use daedra::tools::SearchProvider;
use daedra::types::SearchArgs;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let provider = SearchProvider::auto();
    let args = SearchArgs {
        query: "rust programming".to_string(),
        options: None,
    };
    let results = provider.search(&args).await?;
    for r in &results.data {
        println!("{} — {}", r.title, r.url);
    }
    Ok(())
}
```
MCP Tools
web_search
Search the web with automatic backend fallback.
```json
{
  "query": "search terms",
  "options": {
    "region": "wt-wt",
    "safe_search": "MODERATE",
    "num_results": 10,
    "time_range": "w"
  }
}
```
Aliases: search_duckduckgo (backward compat)
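Over the STDIO transport, a tool invocation like the one above travels inside a standard MCP JSON-RPC envelope. The sketch below builds such a request with plain string formatting to stay dependency-free; it is not taken from the Daedra source, and a real client would use a JSON library and escape the query string properly.

```rust
// Build a JSON-RPC "tools/call" request for the web_search tool.
// Illustrative only: real MCP clients serialize with a JSON library.
fn web_search_request(id: u64, query: &str, num_results: u32) -> String {
    format!(
        "{{\"jsonrpc\":\"2.0\",\"id\":{id},\"method\":\"tools/call\",\
         \"params\":{{\"name\":\"web_search\",\"arguments\":\
         {{\"query\":\"{query}\",\"options\":{{\"num_results\":{num_results}}}}}}}}}"
    )
}

fn main() {
    // The server reads one request per line on stdin and replies on stdout.
    println!("{}", web_search_request(1, "rust async runtime", 5));
}
```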
visit_page
Fetch and extract web page content as Markdown.
```json
{
  "url": "https://example.com",
  "selector": "article.main",
  "include_images": false
}
```
Architecture
```
Daedra
├── SearchProvider (fallback chain)
│   ├── SerperBackend (Google via API)
│   ├── TavilyBackend (AI-optimized API)
│   ├── BingBackend (HTML scraping)
│   ├── WikipediaBackend (OpenSearch API)
│   ├── StackExchangeBackend (Public API)
│   ├── GitHubBackend (Public API)
│   └── SearchClient (DuckDuckGo HTML)
├── FetchClient (HTML → Markdown)
├── SearchCache (moka async cache)
├── MCP Server
│   ├── STDIO transport (JSON-RPC)
│   └── SSE transport (Axum HTTP)
└── CLI (clap)
```
Configuration
```sh
# Optional API keys (improve result quality)
export SERPER_API_KEY=...   # Google results via Serper
export TAVILY_API_KEY=...   # AI-optimized search
export GITHUB_TOKEN=...     # Higher GitHub API rate limit

# Logging
export RUST_LOG=daedra=info
```
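One plausible way the optional keys above could gate the fallback chain is sketched here. This is not Daedra's actual code: the `enabled_backends` helper is hypothetical, though the environment variable names match this README.

```rust
use std::env;

// Decide which backends join the chain based on configured keys.
// API-backed engines join only when their key is present; keyless
// backends are always available.
fn enabled_backends() -> Vec<&'static str> {
    let mut chain = Vec::new();
    if env::var("SERPER_API_KEY").is_ok() {
        chain.push("serper");
    }
    if env::var("TAVILY_API_KEY").is_ok() {
        chain.push("tavily");
    }
    chain.extend(["bing", "wikipedia", "stackexchange", "github", "duckduckgo"]);
    chain
}

fn main() {
    println!("active chain: {:?}", enabled_backends());
}
```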
Ecosystem
| Project | What |
|---|---|
| pawan | CLI coding agent that uses daedra for web search via MCP |
| ares | Agentic retrieval-enhanced server |
| eruka | Context intelligence engine |
Built by DIRMACS.
License
MIT