Loghead
Health — Passed
- License — MIT
- Description — Repository has a description
- Active repo — Last push 0 days ago
- Community trust — 16 GitHub stars
Code — Warning
- Network request — Outbound network request in packages/browser/background.js
- Network request — Outbound network request in packages/browser/utils.js
- process.env — Environment variable access in packages/core/dist/api/server.js
- process.env — Environment variable access in packages/core/dist/db/client.js
Permissions — Passed
- Permissions — No dangerous permissions requested
This MCP server aggregates application logs from various sources (terminal, Docker, browser) into a local database, making them searchable for AI coding assistants.
Security Assessment
Overall risk: Low. The tool processes development logs, which can sometimes contain sensitive data, so be mindful of what you aggregate. It makes expected outbound network requests primarily within the browser package to collect logs, and relies on local environment variables to store a localhost API URL and authentication token. There are no hardcoded secrets, no dangerous permissions requested, and no evidence of arbitrary shell execution. The architecture relies on a local token (`LOGHEAD_TOKEN`) to secure the connection between the MCP client and the core server.
Quality Assessment
The project is active and recently updated, indicating current maintenance. It uses the permissive MIT license. With 16 GitHub stars, the community adoption and overall trust level are currently very low, which is typical for new or niche developer tools. The setup process is straightforward and well-documented for various popular AI coding environments.
Verdict
Safe to use, provided you are comfortable aggregating your local development logs and routing them through a local server.
Loghead is a tool that gives the LLMs in your AI coding tool access to your application logs, no matter where they come from.
Loghead
Loghead is a smart log aggregation tool and MCP server. It collects logs from various sources like your terminal, docker containers, or your browser, stores them in a database, and makes them searchable for AI assistants (like Claude, Cursor, or Windsurf).
Think of it as a "long-term memory" for your development logs that your AI coding agent can read.
Prerequisites
Before you start, make sure you have:
- Node.js (v18 or higher).
- Ollama: Download here.
  - Ensure it is running (ollama serve) and accessible at http://localhost:11434.
  - Pull the embedding model: ollama pull qwen3-embedding:0.6b (or similar).
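If you want to confirm the embedding endpoint is reachable before starting Loghead, Ollama exposes a simple HTTP API. The sketch below only builds the request; the endpoint path and body shape come from Ollama's public API, and the model name matches the one pulled above.

```javascript
// Sketch: build the request you could send to Ollama's embeddings endpoint
// to verify it responds. Endpoint and body shape are from Ollama's HTTP API.
const OLLAMA_URL = "http://localhost:11434/api/embeddings";

function buildEmbeddingRequest(model, prompt) {
  return {
    url: OLLAMA_URL,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, prompt }),
    },
  };
}

const req = buildEmbeddingRequest("qwen3-embedding:0.6b", "db connection timed out");
console.log(req.url);
console.log(JSON.parse(req.options.body).model);
```

Passing the built request to `fetch(req.url, req.options)` should return an embedding vector if Ollama is running.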
Setup Guide
1. Start the Core Server
The Core Server (@loghead/core) manages the database, API, and web interface. You must have this running in a background terminal for Loghead to work.
npx @loghead/core
# OR
npx @loghead/core start
This command will:
- Initialize the local SQLite database (loggerhead.db).
- Start the REST API server on port 4567.
- Print a Loghead Token.
- Launch the Terminal UI for viewing logs.
2. Connect Your AI Tool (MCP Server)
The MCP Server (@loghead/mcp) is a separate lightweight bridge that allows your AI assistant to talk to the Core Server. You configure your AI tool to run this MCP server automatically.
Use the Loghead Token printed in step 1.
Claude Desktop
Edit your claude_desktop_config.json (usually in ~/Library/Application Support/Claude/ on macOS):
{
"mcpServers": {
"loghead": {
"command": "npx",
"args": ["-y", "@loghead/mcp"],
"env": {
"LOGHEAD_API_URL": "http://localhost:4567",
"LOGHEAD_TOKEN": "<YOUR_MCP_TOKEN>"
}
}
}
}
Windsurf / Cascade
Add the MCP server in your Windsurf configuration:
{
"mcpServers": {
"loghead": {
"command": "npx",
"args": ["-y", "@loghead/mcp"],
"env": {
"LOGHEAD_API_URL": "http://localhost:4567",
"LOGHEAD_TOKEN": "<YOUR_MCP_TOKEN>"
}
}
}
}
Cursor
Go to Settings > MCP and add a new server:
- Name: loghead
- Type: stdio
- Command: npx -y @loghead/mcp
- Environment Variables:
  - LOGHEAD_API_URL: http://localhost:4567
  - LOGHEAD_TOKEN: <YOUR_MCP_TOKEN>
VS Code (MCP Extension)
To use Loghead as an MCP server in VS Code, add the following configuration to your mcp.json (usually found in your user settings directory):
{
  "servers": {
    "loghead": {
      "command": "npx",
      "args": ["-y", "@loghead/mcp"],
      "env": {
        "LOGHEAD_API_URL": "http://localhost:4567",
        "LOGHEAD_TOKEN": "<YOUR_MCP_TOKEN>"
      }
    }
  }
}
Replace <YOUR_MCP_TOKEN> with the token printed by the Loghead server.
This enables VS Code to connect to Loghead via MCP for log search and retrieval.
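All of the MCP configurations above share the same entry shape: an npx command, args containing @loghead/mcp, and an env block with LOGHEAD_API_URL and LOGHEAD_TOKEN. A quick sanity check before saving a config file can catch a missing value or a leftover placeholder; the helper below is illustrative and not part of Loghead.

```javascript
// Sketch: validate a Loghead MCP server entry before writing it to a config
// file. The required fields mirror the examples above; this helper itself
// is illustrative and not part of Loghead.
function validateLogheadEntry(entry) {
  const errors = [];
  if (entry.command !== "npx") errors.push("expected command to be npx");
  if (!Array.isArray(entry.args) || !entry.args.includes("@loghead/mcp"))
    errors.push("args should include @loghead/mcp");
  const env = entry.env || {};
  if (!env.LOGHEAD_API_URL) errors.push("missing LOGHEAD_API_URL");
  if (!env.LOGHEAD_TOKEN || env.LOGHEAD_TOKEN.startsWith("<"))
    errors.push("LOGHEAD_TOKEN still contains a placeholder");
  return errors;
}

const entry = {
  command: "npx",
  args: ["-y", "@loghead/mcp"],
  env: { LOGHEAD_API_URL: "http://localhost:4567", LOGHEAD_TOKEN: "<YOUR_MCP_TOKEN>" },
};
console.log(validateLogheadEntry(entry)); // flags the placeholder token
```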
3. Create a Project
You can manage projects via the CLI (in a separate terminal):
npx @loghead/core projects add "My Awesome App"
# Copy the Project ID returned
4. Add a Log Stream
Create a stream to pipe logs into.
For Terminal Output:
npx @loghead/core streams add terminal --project <PROJECT_ID> --name "Build Logs"
# Copy the Stream Token returned
For Docker Containers:
npx @loghead/core streams add docker --project <PROJECT_ID> --name "Backend API" --container my-api-container
# Copy the Stream Token returned
5. Ingest Logs
Now, feed logs into the stream using the ingestor tools.
Add directly to project:
# Add to the "scripts" section of your project's package.json
"dev:log": "<command-to-start-your-project> | npx @loghead/terminal --token <STREAM_TOKEN>"
Terminal Pipe:
# Pipe any command into loghead-terminal
npm run build | npx @loghead/terminal --token <STREAM_TOKEN>
# Use a custom base URL (the default is https://loghead.dev)
npm run build | npx @loghead/terminal --token <STREAM_TOKEN> --base-url https://your-loghead-instance.com
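Conceptually, the terminal ingestor collects piped lines and groups them into batches before sending them to the ingest API. The sketch below illustrates that batching step only; it is a simplified illustration, not the actual @loghead/terminal implementation.

```javascript
// Sketch of what a terminal ingestor conceptually does: group piped log
// lines into batches of ingest-ready records. Illustrative only — not the
// actual @loghead/terminal implementation.
function batchLogLines(lines, batchSize) {
  const batches = [];
  for (let i = 0; i < lines.length; i += batchSize) {
    batches.push(lines.slice(i, i + batchSize).map((content) => ({ content })));
  }
  return batches;
}

const lines = ["build started", "compiling...", "build finished"];
console.log(batchLogLines(lines, 2).length); // 2 batches
```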
Docker Logs:
# Attach to a running container
npx @loghead/docker --token <STREAM_TOKEN> --container my-api-container
OpenTelemetry (OTLP):
You can point any OpenTelemetry exporter to Loghead.
// Example: Node.js with @opentelemetry/exporter-logs-otlp-http
const { OTLPLogExporter } = require("@opentelemetry/exporter-logs-otlp-http");
const exporter = new OTLPLogExporter({
url: "http://localhost:4567/v1/logs",
headers: {
Authorization: "Bearer <STREAM_TOKEN>",
},
});
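If you prefer posting OTLP JSON directly (for example with fetch or curl) instead of wiring up an SDK exporter, the payload follows the standard OTLP/HTTP logs shape. A minimal sketch of constructing one log record — field names come from the OTLP specification, and the values here are illustrative:

```javascript
// Sketch: a minimal OTLP/HTTP JSON logs payload for POST /v1/logs.
// Field names follow the OTLP spec; values are illustrative.
function otlpLogPayload(message, severityText = "INFO") {
  return {
    resourceLogs: [
      {
        resource: { attributes: [] },
        scopeLogs: [
          {
            scope: { name: "example-app" },
            logRecords: [
              {
                timeUnixNano: String(Date.now() * 1e6),
                severityText,
                body: { stringValue: message },
              },
            ],
          },
        ],
      },
    ],
  };
}

const payload = otlpLogPayload("user logged in");
console.log(payload.resourceLogs[0].scopeLogs[0].logRecords[0].body.stringValue);
```

Send the result as the JSON body of a POST to http://localhost:4567/v1/logs with the Authorization: Bearer <STREAM_TOKEN> header, as described in the API reference below.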
How to Use with AI
Once connected, you can ask your AI assistant questions about your logs:
- "What errors appeared in the build logs recently?"
- "Find any database connection timeouts in the backend logs."
- "Why did the application crash?"
CLI Commands Reference
The @loghead/core package provides several commands to manage your log infrastructure.
General
- Start Server & UI:
npx @loghead/core
Projects
- List Projects: npx @loghead/core projects list
- Add Project: npx @loghead/core projects add "My Project Name"
- Delete Project: npx @loghead/core projects delete <PROJECT_ID>
Streams
- List Streams: npx @loghead/core streams list --project <PROJECT_ID>
- Add Stream:
  # Basic
  npx @loghead/core streams add <TYPE> <NAME> --project <PROJECT_ID>
  # Examples
  npx @loghead/core streams add terminal "My Terminal" --project <PROJECT_ID>
  npx @loghead/core streams add docker "My Container" --project <PROJECT_ID> --container <CONTAINER_NAME>
  npx @loghead/core streams add opentelemetry "My OTLP Stream" --project <PROJECT_ID>
- Get Stream Token: npx @loghead/core streams token <STREAM_ID>
- Delete Stream: npx @loghead/core streams delete <STREAM_ID>
Sample Calculator App
We provide a unified Calculator App in sample_apps/calculator_app that combines a Backend API, Frontend UI, and CLI capabilities to help you test all of Loghead's features in one place.
This app runs an Express.js server that performs calculations and logs them. It includes a web interface and can be containerized with Docker.
Scenario 1: Testing Terminal Ingest (CLI)
1. Create a Stream:
   npx @loghead/core projects add "Calculator Project"
   # Copy Project ID
   npx @loghead/core streams add terminal --project <PROJECT_ID> --name "Terminal Logs"
   # Copy Stream Token
2. Run & Pipe Logs: Run the server locally and pipe its output to Loghead.
   cd sample_apps/calculator_app
   npm install
   npm start | npx @loghead/terminal --token <STREAM_TOKEN>
3. Generate Traffic: Open http://localhost:3000 and perform calculations. The logs in your terminal will be sent to Loghead.
4. Ask AI: "What calculations were performed recently?"
Scenario 2: Testing Docker Ingest
1. Create a Stream:
   npx @loghead/core streams add docker --project <PROJECT_ID> --name "Docker Container" --container loghead-calc
   # Copy Stream Token
2. Run in Docker: Build and run the app as a container named loghead-calc.
   cd sample_apps/calculator_app
   docker build -t loghead-calc .
   docker run --name loghead-calc -p 3000:3000 -d loghead-calc
3. Attach Loghead:
   npx @loghead/docker --token <STREAM_TOKEN> --container loghead-calc
4. Generate Traffic & Ask AI: Perform actions in the browser (try dividing by zero or simulating a crash). Ask: "Did any errors occur in the docker container?"
Scenario 3: Testing Browser Ingest (Chrome Extension)
1. Create a Stream:
   npx @loghead/core streams add browser --project <PROJECT_ID> --name "Frontend Logs"
   # Copy Stream Token
2. Configure Extension: Install the Loghead Chrome Extension (if available) and set the Stream Token.
3. Use the App: Open http://localhost:3000 (or the Docker version). The app logs actions to console.log, which the extension will capture.
4. Ask AI: "What interactions did the user have in the browser?"
Development
To build from source:
- Clone the repo.
- Install dependencies: npm install
- Build all packages: npm run build
API Reference
The @loghead/core server exposes a REST API on port 4567 (by default).
Projects
- List Projects
GET /api/projects
- Create Project
POST /api/projects
- Body: { "name": "string" }
- Delete Project
DELETE /api/projects/:id
Streams
- List Streams
GET /api/streams?projectId=<PROJECT_ID>
- Create Stream
POST /api/streams
- Body: { "projectId": "string", "type": "string", "name": "string", "config": {} }
- Delete Stream
DELETE /api/streams/:id
Logs
Get Logs
GET /api/logs
- Query Params:
  - streamId: (Required) The Stream ID.
  - q: (Optional) Semantic search query.
  - page: (Optional) Page number (default 1).
  - pageSize: (Optional) Logs per page (default 100, max 1000).
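These query parameters compose into an ordinary URL. For example, a semantic-search request can be built with Node's standard URL API (parameter names come from the reference above; the stream ID is a placeholder):

```javascript
// Sketch: build a GET /api/logs request URL from the documented query
// params. "stream-123" is a placeholder stream ID.
const base = "http://localhost:4567";

function buildLogsUrl(streamId, { q, page, pageSize } = {}) {
  const url = new URL("/api/logs", base);
  url.searchParams.set("streamId", streamId); // required
  if (q) url.searchParams.set("q", q); // optional semantic search
  if (page) url.searchParams.set("page", String(page));
  if (pageSize) url.searchParams.set("pageSize", String(pageSize));
  return url.toString();
}

console.log(buildLogsUrl("stream-123", { q: "timeout", pageSize: 50 }));
// → http://localhost:4567/api/logs?streamId=stream-123&q=timeout&pageSize=50
```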
Ingest Logs
POST /api/ingest
- Headers: Authorization: Bearer <STREAM_TOKEN>
- Body: { "streamId": "string", "logs": [ { "content": "log message", "metadata": { "level": "info" } } ] }
- Note: logs can also be a single string or object.
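Putting the header and body together, a direct ingest request can be assembled like this (the token and stream ID are placeholders; the endpoint and body shape come from the reference above):

```javascript
// Sketch: construct a POST /api/ingest request from the documented shape.
// The token and streamId values below are placeholders.
function buildIngestRequest(streamToken, streamId, logs) {
  return {
    url: "http://localhost:4567/api/ingest",
    options: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${streamToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ streamId, logs }),
    },
  };
}

const req = buildIngestRequest("my-stream-token", "stream-123", [
  { content: "db connection timed out", metadata: { level: "error" } },
]);
console.log(JSON.parse(req.options.body).logs[0].metadata.level); // error
```

The result can be sent with `fetch(req.url, req.options)` while the Core Server is running.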
Ingest OTLP Logs
POST /v1/logs
- Headers: Authorization: Bearer <STREAM_TOKEN>
- Body: Standard OTLP JSON payload.
Contributing
We welcome contributions to Loghead! Whether you're fixing bugs, adding features, or improving documentation, your help is appreciated.
For detailed contribution guidelines, please see CONTRIBUTING.md. This includes information on:
- How to report bugs and suggest features
- Setting up your development environment
- Code style and commit message conventions
- Testing and documentation expectations
- Pull request process
Thank you for helping make Loghead better!
License
This project is licensed under the MIT License - see the LICENSE file for details.