gemini-llm-council
Health — Passed
- License — MIT
- Description — Repository has a description
- Active repo — Last push 0 days ago
- Community trust — 11 GitHub stars
Code — Warnings
- fs module — File system access in hooks/inject-context.js
- process.env — Environment variable access in hooks/secure-read.js
- fs module — File system access in hooks/secure-read.js
- network request — Outbound network request in package-lock.json
- network request — Outbound network request in package.json
- fs module — File system access in scripts/sync-version.js
Permissions — Passed
- Permissions — No dangerous permissions requested
This extension is a multi-LLM consensus tool for the Gemini CLI. It queries several top-tier models simultaneously via the OpenRouter API to provide peer-reviewed answers and autonomous codebase investigations for complex debugging tasks.
Security Assessment
The overall risk is Medium. The tool relies on outbound network requests to communicate with the OpenRouter API, which is its intended function. It does not execute arbitrary shell commands or request dangerous system permissions. However, it reads the local filesystem (accessing configuration files, READMEs, and package.json) and handles environment variables. While it requires an OpenRouter API key, it delegates the storage to the host Gemini CLI rather than hardcoding it. Developers should be mindful that local project metadata and prompts are actively transmitted over the network to third-party LLM providers.
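The assessment notes that the API key is delegated to the host CLI rather than hardcoded. A minimal sketch of what such key handling could look like is below; this is an assumption, not the actual contents of `hooks/secure-read.js`, and the `OPENROUTER_API_KEY` variable name is illustrative.

```javascript
// Hypothetical sketch of key handling (assumption: the real
// hooks/secure-read.js differs, and OPENROUTER_API_KEY is an assumed
// variable name). The hook reads the key from the environment the host
// Gemini CLI provides instead of storing it in the repository.
function readApiKey(env = process.env) {
  const key = env.OPENROUTER_API_KEY;
  if (!key) {
    // Fail fast with a clear message rather than sending unauthenticated requests.
    throw new Error("OpenRouter API key not configured");
  }
  return key;
}

module.exports = { readApiKey };
```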
Quality Assessment
The project is actively maintained, with its most recent update pushed today. It uses the permissive MIT license and is transparent in its operations. With 11 GitHub stars, community trust is still limited, which is typical for a niche CLI extension with a small user base. The documentation is comprehensive and clearly outlines setup, features, and prerequisites.
Verdict
Use with caution: the extension is actively maintained and open source, but be aware that your local codebase context and queries will be sent externally to third-party LLMs.
Multi-LLM consensus extension for Gemini CLI. Inspired by Andrej Karpathy's llm-council.
Gemini LLM Council Extension
Consult multiple top-tier LLMs simultaneously with automated peer review and synthesis. Leverage the "Wisdom of the Crowd" to get high-confidence answers for complex architectural and debugging tasks.
Inspired by Andrej Karpathy's llm-council.
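The core consultation mechanic, querying several models in parallel through OpenRouter, could be sketched roughly as follows. This is an illustrative assumption, not the extension's actual code: the function name, model IDs, and the injectable `fetchFn` parameter are all hypothetical; only the OpenRouter chat-completions endpoint is real.

```javascript
// Hypothetical sketch of the council "fan-out" step (the extension's real
// orchestration differs). Each council member answers the same prompt in
// parallel via OpenRouter's chat completions endpoint.
async function askCouncil(prompt, models, apiKey, fetchFn = fetch) {
  return Promise.all(
    models.map(async (model) => {
      const res = await fetchFn("https://openrouter.ai/api/v1/chat/completions", {
        method: "POST",
        headers: {
          Authorization: `Bearer ${apiKey}`,
          "Content-Type": "application/json",
        },
        body: JSON.stringify({
          model,
          messages: [{ role: "user", content: prompt }],
        }),
      });
      const data = await res.json();
      // Collect each member's answer alongside its model ID for later synthesis.
      return { model, answer: data.choices[0].message.content };
    })
  );
}

module.exports = { askCouncil };
```

A synthesis ("chairman") step would then review these answers against each other before producing the final response.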
✨ Advanced Features
- Autonomous Investigator: New `/council:investigate` command that uses a specialized subagent to autonomously explore your codebase and gather evidence before deliberating.
- Hierarchical Config: Save your council settings Globally (for all projects) or Project-specifically (checked into your repo).
- Specialized Personas: Run targeted reviews using built-in audit personas (e.g., `security`, `performance`).
- Automatic IQ: The council automatically detects if your query requires a specific persona and applies it to guide the review phase.
- Customizable: Define your own personas in `~/.gemini/extensions/gemini-llm-council/personas.json`.
- Ambient Grounding: System hooks automatically inject core project metadata (README, package.json) into council consultations.
- Deep Audit Trail: Offload massive raw deliberations to MCP Resources, accessible via `council://` URIs provided in the summary report.
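A custom persona entry in `personas.json` might look like the following. The schema shown here is an assumption for illustration; the field names (`description`, `systemPrompt`) are not documented in this README.

```json
{
  "database-reviewer": {
    "description": "Audits schema changes and query performance",
    "systemPrompt": "You are a database reliability reviewer. Focus on migrations, indexing, and query plans."
  }
}
```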
Prerequisites
- Gemini CLI installed.
- An OpenRouter API Key.
Setup
1. Link the extension: `gemini extensions link .`
2. Configure the API key: use the Gemini CLI to set your OpenRouter API key securely: `gemini extensions config gemini-llm-council "OpenRouter API Key"`
3. Build the extension: `npm install`, then `npm run build`
4. Configure council members: run `/council:setup` and choose between Global (All Projects) or Project (Current Folder) scope.
Commands
| Command | Description |
|---|---|
| `/council:setup` | Select models, reasoning depth, and configuration scope. |
| `/council:ask <query>` | One-shot consultation with automated project grounding and persona detection. |
| `/council:investigate <issue>` | Autonomous: a subagent handles the file-reading and deliberation loop. |
| `/council:persona <name> <query>` | Consult using a specific persona (e.g., `security`, `performance`, or your custom ones). |
| `/council:status` | Show active members, reasoning effort, and active configuration scope. |
Usage Examples
Autonomous Debugging
/council:investigate "The database connection keeps timing out in production environments."
The council investigator will autonomously find your config files, logs, and connection logic to provide a verified fix.
Security Audit (Automatic or Explicit)
/council:ask "Audit the new user registration flow for potential injection flaws."
The Chairman will automatically detect the "Security" domain and load the appropriate persona.
Global vs Project Config
- Global: Stored in `~/.gemini/extensions/gemini-llm-council/config.json`.
- Project: Stored in `.gemini/llm-council.json` (project config overrides global).
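The override behavior described above can be sketched as a simple key-by-key merge. This is a hypothetical illustration of the precedence rule, not the extension's actual implementation.

```javascript
// Hypothetical sketch of hierarchical config resolution (the extension's
// actual merge logic may differ): project keys override global keys, and
// anything the project file leaves unset falls back to the global config.
function resolveConfig(globalCfg, projectCfg) {
  // Object spread: later keys win, so project settings take precedence.
  return { ...globalCfg, ...(projectCfg || {}) };
}

module.exports = { resolveConfig };
```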
Architecture
- Autonomous Orchestration: Uses Gemini CLI Subagents to isolate the heavy lifting of multi-file investigations.
- Intelligent Grounding: Uses BeforeTool Hooks to automatically provide context about your tech stack.
- Clean UI: Moves raw multi-model critiques to MCP Resources, keeping your main chat readable.
Inspiration
This project was inspired by Andrej Karpathy's LLM council project, as shared in his Twitter (X) post.
License
MIT