kodit
Health: Passed
- License: Apache-2.0
- Description: Repository has a description
- Active repo: Last push 0 days ago
- Community trust: 117 GitHub stars
Code: Passed
- Code scan: Scanned 12 files during light audit, no dangerous patterns found
Permissions: Passed
- Permissions: No dangerous permissions requested
This tool is an MCP server that indexes external and local code repositories. It allows AI coding assistants to search for and retrieve relevant code snippets to improve code generation accuracy and reduce hallucinations.
Security Assessment
The overall risk is Low. The server operates as a specialized search engine, cloning external Git repositories and indexing local directories to expose relevant code to your AI assistant. While this inherently involves reading source code and making network requests to fetch repositories, the automated code scan found no hardcoded secrets or dangerous patterns. It explicitly respects `.gitignore` and `.noindex` files, ensuring it does not accidentally expose sensitive local files.
Quality Assessment
The project appears to be highly maintained and professional. It is written in Go, has clear and comprehensive documentation, and saw repository activity as recently as today. It is released under the permissive Apache-2.0 license. With over 100 GitHub stars, the tool demonstrates a solid baseline of community trust and active usage among developers.
Verdict
Safe to use.
👩‍💻 MCP server to index external repositories
Kodit: A Code Indexing MCP Server
Kodit connects your AI coding assistant to external codebases to provide accurate and up-to-date snippets of code.
:star: Help us reach more developers and grow the Helix community. Star this repo!
Helix Kodit is an MCP server that connects your AI coding assistant to external codebases. It can:
- Improve your AI-assisted code by providing canonical examples direct from the source
- Index local and public codebases
- Integrate with any AI coding assistant via MCP
- Search using keyword and semantic search
- Integrate with any OpenAI-compatible or custom API/model
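For orientation, connecting an MCP-capable assistant to a server like this typically uses the common `mcpServers` client-configuration convention. The endpoint below is a generic sketch; the actual host, port, and transport path depend on your Kodit deployment:

```json
{
  "mcpServers": {
    "kodit": {
      "url": "http://localhost:8080/sse"
    }
  }
}
```

Consult the project's documentation for the exact endpoint and transport your assistant should use.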
If you're an engineer working with AI-powered coding assistants, Kodit helps by
providing relevant, up-to-date examples for your task, so that LLMs make fewer mistakes
and hallucinate less often.
Features
Codebase Indexing
Kodit connects to a variety of local and remote codebases to build an index of your
code. This index is used to build a snippet library, ready for ingestion into an LLM.
- Index local directories and public Git repositories
- Build comprehensive snippet libraries for LLM ingestion
- Support for 20+ programming languages including Python, JavaScript/TypeScript, Java, Go, Rust, C/C++, C#, HTML/CSS, and more
- Advanced code analysis with dependency tracking and call graph generation
- Intelligent snippet extraction with context-aware dependencies
- Efficient indexing with selective reindexing (only processes modified files)
- Privacy first: respects .gitignore and .noindex files
- Auto-indexing configuration for shared server deployments
- Enhanced Git provider support including Azure DevOps
- Index private repositories via a PAT
- Improved progress monitoring and reporting during indexing
- Advanced code slicing infrastructure with Tree-sitter parsing
- Automatic periodic sync to keep indexes up-to-date
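The selective-reindexing idea above can be sketched with a content-hash comparison: only files whose fingerprints differ from the stored index are reprocessed. This is a minimal illustration under assumed names (`hashContent`, `filesToReindex`), not Kodit's actual implementation:

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// hashContent returns a stable fingerprint for a file's contents.
func hashContent(data []byte) string {
	sum := sha256.Sum256(data)
	return hex.EncodeToString(sum[:])
}

// filesToReindex compares current file hashes against the stored index
// and returns only the paths whose contents are new or have changed.
func filesToReindex(current map[string][]byte, indexed map[string]string) []string {
	var changed []string
	for path, data := range current {
		if indexed[path] != hashContent(data) {
			changed = append(changed, path)
		}
	}
	return changed
}

func main() {
	indexed := map[string]string{
		"main.go": hashContent([]byte("package main")),
	}
	current := map[string][]byte{
		"main.go":  []byte("package main"),    // unchanged: skipped
		"index.go": []byte("package indexer"), // new: reindexed
	}
	fmt.Println(filesToReindex(current, indexed)) // → [index.go]
}
```

A real indexer would also track deletions and file metadata, but the core saving is the same: unchanged files never reach the parser.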
MCP Server
Relevant snippets are exposed to an AI coding assistant via an MCP server. This allows
the assistant to request relevant snippets by providing keywords, code, and semantic
intent.
- Seamless integration with popular AI coding assistants: Kodit has been tested and verified with several, and any other MCP-capable assistant is likely to work (please contribute setup instructions!)
- Advanced search filters by source, language, author, date range, and file path
- Hybrid search combining BM25 keyword search with semantic search
- Enhanced MCP tools with rich context parameters and metadata
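Hybrid search of this kind is commonly implemented by fusing the two ranked result lists. The sketch below uses reciprocal rank fusion, a standard technique for merging keyword and semantic rankings; it is illustrative, not necessarily the exact fusion Kodit uses:

```go
package main

import (
	"fmt"
	"sort"
)

// rrfFuse merges ranked result lists with reciprocal rank fusion:
// each document scores 1/(k+rank) per list, summed across lists,
// so documents ranked well in both lists rise to the top.
func rrfFuse(k float64, lists ...[]string) []string {
	scores := map[string]float64{}
	for _, list := range lists {
		for rank, doc := range list {
			scores[doc] += 1.0 / (k + float64(rank+1))
		}
	}
	docs := make([]string, 0, len(scores))
	for doc := range scores {
		docs = append(docs, doc)
	}
	sort.Slice(docs, func(i, j int) bool { return scores[docs[i]] > scores[docs[j]] })
	return docs
}

func main() {
	bm25 := []string{"parse.go", "lexer.go", "ast.go"}     // keyword ranking
	semantic := []string{"ast.go", "parse.go", "token.go"} // embedding ranking
	// parse.go wins: it appears near the top of both lists.
	fmt.Println(rrfFuse(60, bm25, semantic)) // → [parse.go ast.go lexer.go token.go]
}
```

The constant k (60 is a conventional default) damps the influence of top ranks so that agreement between the two retrievers matters more than any single first-place result.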
Enterprise Ready
Out of the box, Kodit works with a local SQLite database and very small, local models.
But enterprises can scale out with performant databases and dedicated models. Everything
can even run securely and privately on on-premise LLM platforms like
Helix.
Supported databases:
- SQLite
- VectorChord
Supported providers:
- Local (which uses tiny CPU-only open-source models)
- OpenAI
- Secure, private LLM enclave with Helix.
- Any other OpenAI compatible API
Enhanced deployment options:
- Docker Compose configurations with VectorChord
- Kubernetes manifests for production deployments
Quick Start
Documentation
Roadmap
The roadmap is currently maintained as a GitHub Project.
💬 Support
For commercial support, please contact Helix.ML. To ask a question,
please open a discussion.
License
Apache-2.0
