hardware-probe
mcp
Warn
Health Warn
- License — Apache-2.0
- Description — Repository has a description
- Active repo — Last push 0 days ago
- Low visibility — Only 5 GitHub stars
Code Warn
- network request — Outbound network request in src/api.client.ts
- process.env — Environment variable access in src/config.ts
- network request — Outbound network request in src/integration.test.ts
Permissions Pass
- Permissions — No dangerous permissions requested
Expert hardware probe for AI, Gaming, and HPC. Analyzes GPU, VRAM, and Memory Bandwidth to optimize local LLM inference (Ollama) and provide actionable hardware upgrade recommendations.
README.md
Yamaru Hardware Probe (MCP)
Expert system hardware probe and performance diagnostic engine for AI, Gaming, and High-Performance workflows. This is a Model Context Protocol (MCP) server that provides deep system insights beyond simple specifications.
Key Features
- 🔍 Deep Hardware Inventory: Comprehensive analysis of CPU, RAM, GPU (VRAM/Bandwidth), Storage, and OS topology.
- ⚡ Real-time Performance Monitoring: Live tracking of system load and identification of resource-hogging processes.
- 🧊 Thermal & Power Diagnostics: Detects thermal throttling and frequency clipping to resolve unexpected slowness.
- 🤖 AI/LLM Optimization: Specialized tools for predicting LLM performance, calculating quantization fit, and optimizing runtimes (Ollama, CUDA, Metal).
- 🛡️ Privacy-First: Automatic anonymization of unique hardware identifiers before any remote transmission.
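The anonymization idea in the last bullet can be sketched as a one-way hash of unique identifiers. This is a minimal illustration, not the extension's actual code: the `anonymizeId` helper, the salt, and the report fields are all hypothetical.

```typescript
import { createHash } from "node:crypto";

// Hypothetical helper: replace a unique hardware identifier with a salted
// SHA-256 digest so the raw serial never leaves the machine.
function anonymizeId(rawId: string, salt: string): string {
  return createHash("sha256").update(salt + rawId).digest("hex");
}

// Model names are not unique to a machine, so they can stay readable;
// the serial number is hashed before any remote transmission.
const report = {
  gpu: "NVIDIA RTX 4090",
  serial: anonymizeId("0123-SERIAL-4567", "per-install-salt"),
};
console.log(report.serial);
```

A per-install salt (rather than a global one) prevents correlating the same hardware across different users' reports.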
Installation
For Gemini CLI Users
gemini extension install @yamaru-eu/hardware-probe
For Manual MCP Setup
Add this to your MCP settings file (e.g., npx-config.json or claude_desktop_config.json):
{
  "mcpServers": {
    "yamaru-probe": {
      "command": "npx",
      "args": ["-y", "@yamaru-eu/hardware-probe"]
    }
  }
}
Available Tools
- analyze_local_system: Full hardware inventory.
- analyze_performance: Real-time performance metrics and top processes.
- analyze_ram_pressure: Detailed memory pressure and RSS analysis for deep RAM troubleshooting.
- check_storage_health: Disk SMART health, firmware, and I/O bottleneck analysis.
- thermal_profile: Real-time CPU/GPU thermal states, fan speeds, and frequency throttling detection.
- diagnose_antivirus_impact: Detects EDR/Antivirus conflicts and exclusion coverage on dev paths.
- monitor_system_health: Statistical health report (CPU load, RAM usage, temperature) with min/max/avg over a configurable time window (up to 10 minutes).
- check_llm_compatibility (BETA): Predicts performance for a specific LLM model via remote API.
- get_llm_recommendations (BETA): Recommends the best local models via remote API.
- analyze_inference_config: Deep-dive into AI runtimes and environment variables.
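Any MCP client invokes these tools with a JSON-RPC `tools/call` request. The sketch below builds one for `monitor_system_health`; the request shape follows the Model Context Protocol specification, but the `window_minutes` argument name is a hypothetical illustration, not this server's documented schema.

```typescript
// Minimal sketch of an MCP "tools/call" request, assuming a hypothetical
// window_minutes parameter for the monitor_system_health tool.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "monitor_system_health",
    arguments: { window_minutes: 5 }, // hypothetical argument name
  },
};

// An MCP client would send this over stdio to the npx process configured above.
console.log(JSON.stringify(request, null, 2));
```

Consult each tool's input schema (e.g. via the MCP Inspector) for the real argument names before calling it.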
Skills Integration
When used with Gemini CLI, this extension provides the following expert skills:
- hardware-performance-expert: Global protocol for system health and troubleshooting.
- local-inference-optimizer: Specialized logic for fine-tuning local LLM runs.
Development
npm install # Install dependencies
npm run build # Compile TypeScript → dist/
npm run test # Run test suite
npm run inspector # Test tools in the MCP Inspector
License
Apache 2.0 - Part of the Yamaru Project.