korelate
Health: Pass
- License — Apache-2.0
- Description — Repository has a description
- Active repo — Last push today
- Community trust — 16 GitHub stars
Code: Fail
- process.env — Environment variable access in boot/config.js
- fs module — File system access in boot/config.js
- exec() — Shell command execution in boot/database.js
- process.env — Environment variable access in config/opcua/opcua-server.js
- fs module — File system access in connectors/connectorManager.js
- fs module — File system access in connectors/file/index.js
Permissions: Pass
- Permissions — No dangerous permissions requested
This tool is an open-source, AI-powered Unified Namespace (UNS) Operations Center. It allows users to visualize MQTT topic trees, build dynamic SVG HMIs, transform data via ETL, and leverage an embedded autonomous AI agent for alerting and historical time-series analysis.
Security Assessment
Overall risk: Medium. The server interacts heavily with the local environment and files, accessing environment variables and the file system to manage its configuration and connectors. However, there is a critical flag regarding shell command execution (`exec()`) found within the boot database setup. While no hardcoded secrets or dangerous explicit permissions were detected, executing raw shell commands is a high-risk vector. If an attacker can manipulate the database inputs or environment, it could potentially lead to arbitrary command execution.
Quality Assessment
The project appears to be actively maintained, with repository activity as recent as today. It uses the permissive Apache-2.0 license and provides comprehensive documentation, including a live demo and an architecture guide. However, community trust and adoption are currently quite low, indicated by only 16 GitHub stars.
Verdict
Use with caution: the project is actively maintained and well-documented, but the presence of raw shell command execution and low community adoption require a thorough manual code review before deploying in any production environment.
The open-source, AI-powered Unified Namespace (UNS) Operations Center. Visualize MQTT topic trees, build dynamic SVG HMIs, transform data via ETL, and leverage an embedded Autonomous AI Agent for alerting and historical time-series analysis.
Korelate
The Open-Source Unified Namespace Explorer for the AI Era
Live Demo • Architecture • Installation • User Manual / Wiki • Developer Guide • API
📺 Watch the Demo
📖 The Vision: Why UNS? Why Now?
The Unified Namespace (UNS) Concept
The Unified Namespace is the single source of truth for your industrial data. It creates a semantic hierarchy (e.g., Enterprise/Site/Area/Line/Cell) where every smart device, software, and sensor publishes its state in real-time.
- Single Source of Truth: No more point-to-point spaghetti integrations.
- Event-Driven: Real-time data flows instead of batch processing.
- Open Architecture: Based on lightweight, open standards (MQTT, Sparkplug B).
The AI Revolution & Gradual Adoption
In the age of Generative AI and Large Language Models (LLMs), context is king. An AI cannot optimize a factory if the data is locked in silos with obscure names like PLC_1_Tag_404.
Korelate facilitates Gradual Adoption:
- Connect to your existing messy brokers.
- Visualize the chaos.
- Structure it using the built-in Mapper (ETL) to normalize data into a clean UNS structure without changing the PLC code.
- Analyze with the Autonomous AI Agent which monitors alerts, investigates root causes using available tools, and generates reports automatically.
🏗 Architecture & Design
This application is designed for Edge Deployment (on-premise servers, industrial PCs). It prioritizes low latency, low footprint, high versatility, and extreme resilience against data storms.
Component Diagram
graph TD
subgraph FactoryFloor ["Factory Floor"]
PLC["PLCs / Sensors"] -->|MQTT/Sparkplug| Broker1["Local Broker"]
Cloud["AWS IoT / Azure"] -->|MQTTS| Broker2["Cloud Broker"]
CSV["Legacy Systems"] -->|CSV Streams| FileParser["Data Parsers"]
end
subgraph UNSViewer ["Korelate (Docker)"]
Backend["Node.js Server"]
DuckDB[("DuckDB Hot Storage")]
Mapper["ETL Engine V8"]
Alerts["Alert Manager & AI Agent"]
MCP["MCP Server (External AI Gateway)"]
Backend <-->|Subscribe| Broker1
Backend <-->|Subscribe| Broker2
Backend <-->|Loopback Stream| FileParser
Backend -->|Write| DuckDB
Backend <-->|Execute| Mapper
Backend <-->|Orchestrate| Alerts
MCP <-->|Context Query| Backend
end
subgraph PerennialStorage ["Perennial Storage"]
Timescale[("TimescaleDB / PostgreSQL")]
end
subgraph Users ["Users"]
Browser["Web Browser"] <-->|WebSocket/HTTP| Backend
Claude["Claude / ChatGPT"] <-->|HTTP/SSE| MCP
end
Backend -.->|Async Write| Timescale
Storage & Resilience Strategy
To handle environments ranging from a few updates a minute to thousands of messages per second, the architecture uses a multi-tiered and highly resilient approach:
- Extreme Resilience Layer (Anti-Spam & Backpressure):
- Smart Namespace Rate Limiting: Drops high-frequency spam (>50 msgs/s per namespace) early at the MQTT handler level, protecting CPU/RAM while preserving low-frequency critical events.
- Queue Compaction: Deduplicates topic states in memory before DuckDB insertion to prevent Out-Of-Memory (OOM) errors during packet storms.
- Frontend Backpressure: Uses `requestAnimationFrame` to batch DOM updates, ensuring the browser UI never freezes, even under extreme load.
- Enterprise Observability:
- Prometheus Metrics: Tracks message throughput, error rates, WebSocket connections, and Dead Letter Queue (DLQ) size in real-time.
- Standardized Error Logging: All system errors include a unique `code`, `message`, and distributed `traceId` (mapped from `correlationId`) to enable seamless troubleshooting in enterprise log aggregators (e.g., ELK, Splunk).
- Tier 1: In-Memory (Real-Time): Instant WebSocket broadcasting for live dashboards.
- Tier 2: Embedded OLAP (DuckDB):
  - Stores "Hot Data" locally.
  - Time-series aggregations via native `time_bucket` functions.
  - Auto-pruning prevents disk overflow (`DUCKDB_MAX_SIZE_MB`).
- Tier 3: Perennial (TimescaleDB):
- Optional connector.
- "Fire-and-forget" batched ingestion for long-term archival and compliance.
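The per-namespace rate limiting described above can be sketched as a sliding-window counter. This is a conceptual sketch only — the class name `NamespaceRateLimiter` is hypothetical, not the actual implementation in the MQTT handler:

```javascript
// Illustrative sketch of per-namespace rate limiting (hypothetical names).
// Messages beyond `limit` within a one-second window are dropped early,
// before any parsing or database work, protecting CPU and RAM.
class NamespaceRateLimiter {
  constructor(limit = 50, windowMs = 1000) {
    this.limit = limit;
    this.windowMs = windowMs;
    this.windows = new Map(); // namespace -> { start, count }
  }

  // Returns true if the message should be processed, false if dropped.
  allow(namespace, now = Date.now()) {
    const w = this.windows.get(namespace);
    if (!w || now - w.start >= this.windowMs) {
      this.windows.set(namespace, { start: now, count: 1 });
      return true;
    }
    w.count += 1;
    return w.count <= this.limit;
  }
}

const limiter = new NamespaceRateLimiter(50);
let accepted = 0;
for (let i = 0; i < 200; i++) {
  if (limiter.allow('factory/line1', 1000)) accepted++; // same 1s window
}
console.log(accepted); // 50 — spam beyond the limit is dropped
```

Because the counter is keyed per namespace, a low-frequency critical event on another branch is never starved by a chattering sensor elsewhere.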
🔌 Connectivity & Protocols (Southbound)
Korelate acts as a high-performance protocol gateway, bringing data from various industrial and IT sources into a unified context:
- 📡 MQTT & Sparkplug B: Native high-performance support with auto-decoding.
- ⚙️ OPC UA: Direct connection to industrial PLCs (Kepware, Ignition, etc.).
- 🔡 Modbus TCP: Legacy support for industrial automation.
- ⚙️ Siemens S7: Native S7-Comm protocol for Siemens PLCs.
- 🔌 EtherNet/IP: CIP protocol for Rockwell and Omron systems.
- 🏢 BACnet/IP: Standard for Building Management Systems (BMS).
- 💡 KNX/IP: Event-driven automation for commercial buildings.
- 📶 SNMP: Polling for network equipment (routers, switches).
- 🚀 Apache Kafka: High-throughput bidirectional integration with Kafka clusters.
- 🗄️ SQL Databases: Polling integration for PostgreSQL, MySQL, and MS SQL Server.
- 🌐 REST API Poller: Active polling of external HTTP GET endpoints.
- 🔗 I3X (RFC 001): Inter-server communication with other UNS nodes, featuring Auto-Discovery of remote semantic topologies.
- 📥 HTTP Webhooks: RESTful ingestion for ERPs and legacy software.
🐳 Installation & Deployment
Prerequisites
- Docker & Docker Compose
- Access to MQTT Broker(s), OPC UA server(s), or local CSV files
1. Quick Start (Local Testing)
For fastest setup with embedded test servers:
git clone https://github.com/slalaure/korelate.git
cd korelate
# Start with embedded MQTT & OPC UA servers (preconfigured)
docker-compose -f docker-compose.yml.local up -d
- Dashboard: `http://localhost:8080` (Admin: `admin` / Password: `password`)
- MQTT Broker: `mqtt://localhost:1883`
- OPC UA Server: `opc.tcp://localhost:4840`
- MCP Endpoint: `http://localhost:3000/mcp`
👉 See LOCAL_TESTING.md for complete guide, sample data, and troubleshooting.
2. Standard Deployment
git clone https://github.com/slalaure/korelate.git
cd korelate
# Setup configuration
cp .env.example .env
# Start the stack (connect to your external brokers)
docker-compose up -d
Then configure your MQTT & OPC UA brokers in the web UI at http://localhost:8080/config.html.
3. Configuration (.env)
The application supports extensive configuration via environment variables.
Connectivity & Permissions
Define multiple providers and explicitly set their Read/Write permissions using arrays. Supported types are mqtt, opcua, and file.
# Define multiple providers (Minified JSON)
DATA_PROVIDERS='[{"id":"local_mqtt", "type":"mqtt", "host":"localhost", "port":1883, "protocol":"mqtt", "subscribe":["#"], "publish":["commands/#"]}, {"id":"factory_opc", "type":"opcua", "endpointUrl":"opc.tcp://localhost:4840", "subscribe":[{"nodeId":"ns=1;s=Temperature", "topic":"uns/factory/temperature"}]}, {"id":"rest_ingest", "type":"http", "pathPrefix":"/api/ingest/rest"}]'
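Since the value must be minified JSON, a malformed string is a common cause of failed boots. A standalone sanity check might look like the sketch below — the required fields (`id`, `type`) are inferred from the example above, not an official schema, and Korelate performs its own validation at startup:

```javascript
// Parse and sanity-check DATA_PROVIDERS before starting the server.
// Field expectations are inferred from the example above, not a full schema.
const raw = process.env.DATA_PROVIDERS ??
  '[{"id":"local_mqtt","type":"mqtt","host":"localhost","port":1883,"subscribe":["#"]}]';

let providers;
try {
  providers = JSON.parse(raw);
} catch (err) {
  throw new Error(`DATA_PROVIDERS is not valid JSON: ${err.message}`);
}

for (const p of providers) {
  if (!p.id || !p.type) {
    throw new Error(`Provider missing id/type: ${JSON.stringify(p)}`);
  }
}
console.log(providers.map((p) => `${p.id} (${p.type})`).join(', '));
```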
Storage Tuning
DUCKDB_MAX_SIZE_MB=500 # Limit local DB size. Oldest data is pruned automatically.
DUCKDB_PRUNE_CHUNK_SIZE=5000 # Number of rows to delete per prune cycle.
DB_INSERT_BATCH_SIZE=5000 # Messages buffered in RAM before DB write (Higher = Better Perf).
DB_BATCH_INTERVAL_MS=2000 # Flush interval for DB writes.
# Perennial Storage (Optional)
PERENNIAL_DRIVER=timescale # Enable long-term storage (Options: 'none', 'timescale')
PG_HOST=192.168.1.50 # Postgres connection details
PG_DATABASE=korelate
PG_TABLE_NAME=korelate_events
Authentication & Security
# Web Interface & API Authentication
HTTP_USER=admin # Basic Auth User (Legacy/API fallback)
HTTP_PASSWORD=secure # Basic Auth Password
SESSION_SECRET=change_me # Signing key for session cookies
# Google OAuth (Optional)
GOOGLE_CLIENT_ID=...
GOOGLE_CLIENT_SECRET=...
PUBLIC_URL=http://localhost:8080 # Required for OAuth redirects
# Auto-Provisioning
ADMIN_USERNAME=admin # Creates/Updates a Super Admin on startup
ADMIN_PASSWORD=admin
AI & MCP Capabilities
Control what the AI Agent is allowed to do.
MCP_API_KEY=sk-my-secret-key # Secure the MCP endpoint
LLM_API_URL=... # OpenAI-compatible endpoint (Gemini, ChatGPT, Local)
LLM_API_KEY=... # Key for the internal Chat Assistant
# Granular Tool Permissions (true/false)
LLM_TOOL_ENABLE_READ=true # Inspect DB, topics list, history, and search
LLM_TOOL_ENABLE_SEMANTIC=true # Infer Schema, Model Definitions
LLM_TOOL_ENABLE_PUBLISH=true # Publish MQTT messages
LLM_TOOL_ENABLE_FILES=true # Read/Write files (SVGs, 3D Models, Simulators)
LLM_TOOL_ENABLE_SIMULATOR=true # Start/Stop built-in sims
LLM_TOOL_ENABLE_MAPPER=true # Modify ETL rules
LLM_TOOL_ENABLE_ADMIN=true # Prune History, Restart Server
Analytics
ANALYTICS_ENABLED=false # Enable Microsoft Clarity tracking
📘 User Manual / Power User Wiki
1. Authentication, Roles & Multi-Tenancy
The viewer supports Local (Username/Password) and Google OAuth authentication, enabling secure multi-tenant usage.
- Air-Gapped Ready: Local accounts use dynamically generated embedded SVG avatars, requiring zero internet access to external APIs, perfect for isolated OT networks.
- Role-Based Access Control (RBAC):
  - Standard User: Can view data and create Private Charts, Mappers, and HMI views (stored in their own session workspace: `/data/sessions/<id>`).
  - Administrator: Has full control. Can edit Global configurations (`/data`), access the Admin Dashboard (`/admin`), manage users, execute history imports/pruning, upload 3D models, and deploy live ETL logic.
2. Dynamic Topic Tree
The left panel displays the discovered UNS hierarchy.
- Sparkplug B Support: Topics starting with `spBv1.0/` are automatically decoded from Protobuf to JSON.
- Protocol Agnostic: The root nodes represent your different broker connections (MQTT, OPC UA, or local CSV data parsers).
- Filtering & Animations: You can filter topics on the fly, disable traversal animations for high-frequency branches, and toggle live updates to freeze the payload viewer for copy-pasting.
3. HMI Dashboards & 3D Digital Twins
Create professional HMIs using HTML, standard vector graphics, and 3D scenes.
- Composite Dashboards: Load `.html` files that act as grid layouts combining multiple embedded SVGs (`.embedded-svg`), live charts (`.embedded-chart`), and data bindings into a single, cohesive view.
- 3D Integration (A-Frame): Automatically detects and loads A-Frame to render live 3D Digital Twins directly in the browser using `.glb`/`.gltf` assets.
- Import / Export: Seamlessly download current views (including their JS bindings) or upload new ones directly from the HMI interface.
- Instant Refresh: Views instantly fetch their latest known state from DuckDB using the "AS OF" SQL logic upon activation.
- Layered Storage: Users can see global views and their own private views. Admins can delete global views directly from the UI.
4. Historical Analysis & Data Management
Navigate through time using the embedded DuckDB engine.
- Time Travel: Use the dual-handle slider or quick-select buttons (1h, 24h, 1M, 1Y, Full) to zoom into specific timeframes.
- Export & Import:
  - Download filtered data as JSON or CSV for offline analysis.
  - Admins can import JSON history files via the Admin panel to backfill the database.
- Pruning: Right-click a topic node or use the Admin tools to permanently delete specific topic patterns (using MQTT wildcards) from the database to reclaim space.
5. Advanced Mapper (ETL Engine)
Transform data on the fly using sandboxed JavaScript. The Mapper normalizes raw, proprietary payloads into standard UNS structures.
- Layered Config: Users can "Save As New" to draft personal versions of mapping logic. Only Admins can "Save Live" to execute the logic in production.
- Dynamic Streams & Loopback: You can use dynamically created data providers (like uploaded CSV streams) as fully functional message buses. You can subscribe to them, map them to new topics, and even publish back into the stream.
- Routing Modes:
  - UI Defined (Fan-out): Returns a single `msg` object, which the engine automatically publishes to all comma-separated topics specified in the UI.
  - Code Defined (Advanced): The script returns an array of `{topic, payload}` objects, allowing complex conditional routing and splitting of God-node payloads into multiple semantic topics.
6. Advanced Charting
Visualize correlations instantly with high performance.
- Backend Aggregation: Handles massive time windows by downsampling millions of rows into 500 buckets using DuckDB's `time_bucket`, reducing network payload and browser memory usage.
- Smart Axis & Zoom: Automatically groups variables with similar units on shared Y-axes. Supports drag-to-zoom on the timeline.
- Primitive Support: Directly plot simple numerical/boolean payloads (e.g., `true`/`false` automatically scaled to `0`/`1`).
- Exports & Unsaved Tracking: Easily export charts to CSV or PNG. The UI features pulsating visual indicators to remind you to save your configurations.
- Statistical Modes: Choose from Mean, Min, Max, Median, StdDev, Range, or Sum aggregations.
- AI Learning Studio: Highlight a time range and click "Profile & Learn" to have the AI analyze statistical fingerprints (Frequency, Chatter, Boundaries) and propose UNS model updates or smart alert rules.
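The 500-bucket downsampling idea can be illustrated in plain JavaScript. This is a conceptual sketch only — the real work happens server-side in DuckDB SQL via `time_bucket`, and the `downsample` helper here is hypothetical:

```javascript
// Conceptual sketch: reduce an arbitrary series to a fixed number of buckets.
// The server does this in SQL with DuckDB's time_bucket; this pure-JS version
// just mirrors the idea of collapsing millions of rows into ~500 points.
function downsample(points, bucketCount, agg = 'mean') {
  if (points.length === 0) return [];
  const t0 = points[0].t;
  const t1 = points[points.length - 1].t;
  const span = Math.max(t1 - t0, 1);
  const buckets = new Map();
  for (const { t, v } of points) {
    const idx = Math.min(Math.floor(((t - t0) / span) * bucketCount), bucketCount - 1);
    if (!buckets.has(idx)) buckets.set(idx, []);
    buckets.get(idx).push(v);
  }
  return [...buckets.entries()].map(([idx, vals]) => ({
    t: t0 + (idx * span) / bucketCount,
    v: agg === 'max' ? Math.max(...vals) : vals.reduce((a, b) => a + b, 0) / vals.length,
  }));
}

// 10,000 raw points collapse into at most 500 buckets.
const raw = Array.from({ length: 10000 }, (_, i) => ({ t: i, v: Math.sin(i / 100) }));
console.log(downsample(raw, 500).length); // 500
```

Whatever the size of the time window, the browser only ever receives a bounded payload, which is what keeps the chart responsive.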
7. AI Chat Assistant (Multimodal & Multi-Model)
A floating assistant powered by LLMs running a recursive agentic loop.
- Multi-Model Support: Configure and switch between multiple LLMs (e.g., `gemini-2.0-flash`, `gpt-4o`, `claude-3-5-sonnet`) on the fly directly from the Chat UI or the AI Learning Studio. Assign dedicated, high-tier models specifically for background tasks like Autonomous Alert Analysis.
- Multimodal Inputs:
- Voice (STT/TTS): Speak to the assistant (continuous listening mode) and hear responses natively.
- Vision: Use your device's camera or upload images/logs to give the AI context on physical equipment.
- Capabilities: Can search data, infer schemas, generate SQL, configure mapping rules, create HMI dashboards (`create_hmi_view`), and control built-in simulators.
- AI Safety (Approval Workflow): For sensitive operations (modifying the model, creating files, updating the mapper), the UI prompts for user approval before execution. You can "Approve Once" or "Approve for Session" to streamline your workflow.
- Proxy-Resilient Streaming: Uses NDJSON HTTP streaming with WebSocket fallbacks so the "Thinking..." and "Executing tool..." statuses work flawlessly behind strict reverse proxies. Displays execution durations for full transparency.
- Session Management: Slide out the left menu to switch between historical chats, start new ones, or delete them.
8. Intelligent Alerting & Workflow Engine
Define sophisticated detection rules using JavaScript conditions.
- Rule-Based Engine: Write sandboxed JS conditions (e.g., `return msg.payload.metrics[0].value > 80`) to trigger alerts.
- Autonomous AI Analyst: When an alert triggers, the AI Agent automatically wakes up, reads the rule's custom `workflow_prompt` (e.g., "Check maintenance logs for this machine"), queries DuckDB, and generates a structured Markdown incident report.
- Webhooks & Triggers: Automatically push the AI's Markdown report to external systems via HTTP POST (Slack/Teams). You can manage and manually test these webhook endpoints directly from the Admin panel.
- Live Dashboard: Track active alerts, acknowledge/resolve them, and trace exactly which user (or AI) handled the incident and when.
9. I3X Standard API (RFC 001)
Korelate provides a native, northbound implementation of the I3X (Industrial Information Interface eXchange) standard.
- Full Compliance: Implements RFC 001 exploratory, value, and subscription methods.
- Client Auto-Discovery: When connecting to a remote I3X server, Korelate automatically discovers, imports, and visually recreates the remote semantic hierarchy without manual configuration.
- Engineering Units (EngUnit): Automatically extracts and attaches measurement units to VQT (Value-Quality-Timestamp) responses from both static metadata and dynamic MQTT payloads.
- Recursive Context: Supports `maxDepth` recursion in both Last Known Value and Historical queries, allowing clients to fetch entire equipment hierarchies in a single call.
- Write-Back Capability: Supports RFC 4.2.2.1, enabling authenticated clients to send command values back to the factory floor via the standardized interface.
10. CDM Modeler (Core Data Model)
Korelate includes a powerful graphical editor to define your plant's semantic hierarchy and metadata.
- Concept Definition: Map industrial concepts to physical MQTT topics.
- Ontology Support: Native support for ISA-95 (OT) and Brick Schema (BMS) conventions for professional-grade UNS structures.
- Responsive 3-Pane IDE: The interface smoothly scales from a widescreen desktop down to a tablet or mobile device.
- Advanced Editing: Includes a "⚙️ Raw" JSON mode for bulk model editing and a specialized Profiling UI to manage nominal values, expected ranges, and data quality levels.
- Data Governance & Security: Explicitly tag data nodes with Sensitivity Levels (Public, Internal, Confidential, Secret) and Privacy/Compliance flags (GDPR, Health/HDS, Financial/PCI).
- Graph Visualization: Explore semantic links using an interactive, 100% dependency-free native SVG Force-Directed graph engine (KorelateGraph).
- I3X Relationships: Link concepts using standardized relationships (`HasParent`, `HasComponent`, `SuppliesTo`, etc.) natively in the UI.
11. AI Safety & Governance (Admin)
As the AI Agent gains more autonomy, Korelate provides tools to monitor and control its actions.
- Approval UI: Standardizes the "Human-in-the-loop" pattern for any destructive or modifying action initiated by an LLM.
- AI History Dashboard: Accessible in the Admin panel, this view logs every tool execution, showing the user, timestamp, and arguments.
- Automatic Backups & Rollback: Most AI modifications (HMI files, model changes) automatically create point-in-time backups. You can revert any AI action with a single click or by asking the chat assistant ("Undo your last change").
12. Configuration Interface (Admin Only)
Accessible via the Cog icon (/config.html) or the Admin Tab.
- Interactive Wizard & Advanced Editor: Configure your server step-by-step using the intuitive wizard, or use the advanced `.env` editor if you prefer raw access.
- Import / Export: Easily back up your entire configuration to a JSON file or restore it with a single click.
- Environment: Modify `.env` variables (Brokers, LLM settings, Limits) and restart the server directly from the UI.
- Certificates: Upload SSL/TLS certificates for secure MQTT mTLS connections.
- UNS Model: Edit the semantic model (`uns_model.json`) used by the AI for structured concept searching.
- Database Maintenance: Execute Full Reset (Truncate/Vacuum), import data, and manage users.
- HMI Assets Management: Upload, list, and delete 3D models (`.glb`), HTML views, JS scripts, and images required for your Digital Twins.
👨💻 Developer Guide
Project Structure
📦 root
┣ 📂 data/ # Persistent Volume (Global Configs & DB)
┃ ┣ 📂 certs/ # MQTT Certificates
┃ ┣ 📂 sessions/ # User Data (Private Charts/HMIs/Chats/Mappers)
┃ ┣ 📄 ai_tools_manifest.json # SSOT for AI Tools Definitions
┃ ┣ 📄 charts.json # Global Saved Charts
┃ ┣ 📄 mappings.json # Global ETL Rules
┃ ┣ 📄 uns_model.json # Semantic Model Definition
┃ ┗ 📄 korelate_events.duckdb # Hot DB
┣ 📂 connectors/ # Southbound DB Adapters (MQTT, OPC UA, File)
┣ 📂 storage/ # Database Repositories (DuckDB, Timescale, User)
┣ 📂 core/ # Agnostic Processing Core
┃ ┣ 📂 engine/ # Alert Manager & Mapper Engine
┃ ┗ 📄 messageDispatcher.js # Central Message Hub
┣ 📂 interfaces/ # Northbound API Layers
┃ ┣ 📂 web/ # Express REST Routes for Frontend UI
┃ ┣ 📂 i3x/ # I3X API standard (RFC 001) implementation
┃ ┗ 📂 mcp/ # Model Context Protocol Server for external AI
┣ 📂 public/ # Frontend (Vanilla JS SPA)
┗ 📄 server.js # Main Entry Point
AI Tools Manifest (ai_tools_manifest.json)
The application uses a Single Source of Truth (SSOT) for all LLM tools. Found in public/ai_tools_manifest.json, this file dictates the JSON Schemas, descriptions, and categories for tools used by both the internal Chat API and the external MCP server. Edit this file to expose new capabilities to the AI.
Working with AI Assistants (GEMINI.md)
If you are using an AI coding assistant (like Gemini or Claude) to contribute to Korelate, refer to the GEMINI.md file at the root of the project. It contains the architectural mandates, prompt structures, and guidelines required to maintain consistency with the project's standards.
Testing Strategy
The project maintains a comprehensive test suite (see test_plan.md for full details).
- Unit & Integration Tests (Backend): Use `npx jest` to run the Jest suite.
- End-to-End Tests (Frontend): Use `npx playwright test` to run the Playwright browser tests.
- Global Test Runner: Use `node tests/run_all.js` to execute all tests (Unit + E2E) sequentially and generate a consolidated Markdown report in `test-results/`.
HMI Scripting API
To add logic (animations, color changes) to an HTML Dashboard or SVG, create a file named `[filename].js` alongside your `.html` or `.svg` file.
// data/factory_dashboard.js
window.registerHmiBindings({
  // Called on load
  initialize: (hmiRoot) => { console.log("View Loaded"); },
  // Called on EVERY incoming MQTT message
  update: (sourceId, topic, payload, hmiRoot) => {
    // Safe parse: payload may arrive as a raw JSON string
    const msg = (typeof payload === 'string') ? JSON.parse(payload) : payload;
    if (topic.includes('fan_speed')) {
      const fan = hmiRoot.querySelector('#fan_blade');
      const speed = msg.value || 0;
      fan.style.transform = `rotate(${Date.now() % 360}deg)`;
      // Faster speed => shorter animation cycle (guard against divide-by-zero)
      fan.style.animationDuration = `${speed > 0 ? 1000 / speed : 0}ms`;
    }
    if (topic.includes('status')) {
      const rect = hmiRoot.querySelector('#status_box');
      rect.setAttribute('fill', msg.active ? 'green' : 'red');
    }
  },
  // Called when resetting the view
  reset: (hmiRoot) => { }
});
Mapper Engine (ETL) Advanced Routing
The Mapper runs inside a secure Node.js vm sandbox. It supports asynchronous DuckDB queries (await db.all(sql)).
By setting the Routing Mode to "Code Defined", you can return an array of messages to split complex payloads.
Example: Splitting a God-Node payload into a structured UNS
// Source Topic: dt/iot/bgs/maintained/FACTORY_01/full
// 1. Safe access to variables
const vars = msg.payload.variables || msg.payload;
// 2. Map distinct semantic topics
const msgAnalyse = {
topic: "france/factory_01/epuration/analyse",
payload: { CH4: vars.AI_AI8402_CH4, CO2: vars.AI_AT8461_CO2 }
};
const msgComp = {
topic: "france/factory_01/epuration/compression",
payload: { Power: vars.AI_C3101_POWER, Pressure: vars.AI_PT3101 }
};
// 3. Return array to publish both messages
return [msgAnalyse, msgComp];
Advanced Example: calculating a moving average
// Calculate average of last 10 readings before publishing
const history = await db.all(`
SELECT CAST(payload->>'value' AS FLOAT) as val
FROM korelate_events
WHERE topic = '${msg.topic}'
ORDER BY timestamp DESC LIMIT 10
`);
const sum = history.reduce((a, b) => a + b.val, 0) + msg.payload.value;
const avg = sum / (history.length + 1);
return {
...msg,
payload: {
current: msg.payload.value,
moving_avg: avg,
timestamp: new Date().toISOString()
}
};
Custom Simulators
Create a file data/simulator-myplant.js. It will be automatically loaded on server start/restart.
module.exports = (logger, publish, isSparkplug) => {
let interval;
return {
start: () => {
interval = setInterval(() => {
const payload = JSON.stringify({ temp: Math.random() * 100 });
publish('factory/line1/sensor', payload, false);
}, 1000);
},
stop: () => clearInterval(interval)
};
};
🧠 AI Integration (Model Context Protocol)
The Korelate MCP Server allows you to connect external AI Agents (like Claude Desktop) directly to your factory floor data context, mirroring the capabilities of the internal Chat API.
Capabilities Exposed to AI:
- `get_topics_list`: Discover what machines are online.
- `search_uns_concept`: "Find all machines with a 'Temperature' metric > 50".
- `aggregate_time_series`: "Give me the max pressure per hour for the last 7 days".
- `infer_schema`: "Give me the JSON schema for the ERP work orders".
- `get_topic_history`: "Analyze the last hour of data for anomalies".
- `publish_message`: "Turn on the warning light".
- `list_active_alerts`: See current issues.
- `create_alert_rule`: Define new detection logic.
- `update_alert_status`: Acknowledge or resolve alerts.
- `create_hmi_view`: Generate full HTML/3D/SVG dashboards dynamically.
Client Config (Claude Desktop config.json):
{
"mcpServers": {
"mqtt_viewer": {
"command": "node",
"args": ["path/to/mcp-client.js"], // Or run via Docker command
"env": {
"MCP_API_KEY": "your-key",
"MCP_URL": "http://localhost:3000/mcp"
}
}
}
}
🔌 API Reference
The application exposes a comprehensive REST API.
| Method | Endpoint | Description | Auth Required |
|---|---|---|---|
| POST | `/api/external/publish` | Publish data from 3rd-party apps. Requires `x-api-key`. | ✅ (API Key) |
| GET | `/api/context/status` | Get DB size and connection status. | ✅ (Session/Basic) |
| GET | `/api/context/last-known` | Gets the precise state of the UNS at a specific timestamp. | ✅ (Session/Basic) |
| POST | `/api/context/aggregate` | Returns downsampled time-series data using DuckDB `time_bucket`. | ✅ (Session/Basic) |
| POST | `/api/context/profile` | High-performance statistical profiling (Min, Max, Freq, Chatter) for a time range. | ✅ (Session/Basic) |
| POST | `/api/context/learn` | AI-powered synthesis of data profiles to suggest UNS model updates. | ✅ (Session/Basic) |
| POST | `/api/context/apply-learn` | Hot-reloads approved AI suggestions into the active Semantic Model and Alert Rules. | ✅ (Admin) |
| POST | `/api/publish/message` | Publish an MQTT message. | ✅ (Session/Basic) |
| GET | `/api/chat/sessions` | List chat history sessions. | ✅ (Session/Basic) |
| GET | `/api/chat/session/:id` | Load a specific session history. | ✅ (Session/Basic) |
| DELETE | `/api/chat/session/:id` | Delete a chat session. | ✅ (Session/Basic) |
| POST | `/api/chat/stop` | Abort the current generation. | ✅ (Session/Basic) |
| POST | `/api/chat/completion` | Streamed LLM completion with tools. | ✅ (Session/Basic) |
| GET | `/api/alerts/active` | List triggered alerts. | ✅ (Session/Basic) |
| POST | `/api/alerts/rules` | Create a new alert rule. | ✅ (Session/Basic) |
| POST | `/api/alerts/:id/status` | Acknowledge/resolve an alert. | ✅ (Session/Basic) |
| GET | `/api/hmi/list` | List available HMI dashboards (HTML/SVG). | ✅ (Session/Basic) |
| GET | `/api/hmi/file` | Serve an HMI asset (HTML, SVG, GLB, JS, etc.). | ✅ (Session/Basic) |
| GET | `/api/hmi/bindings.js` | Fetches custom logic scripts for HMI views. | ✅ (Session/Basic) |
| GET | `/api/admin/users` | List registered users. | ✅ (Admin) |
| POST | `/api/admin/import-db` | Import JSON history data to DuckDB/Timescale. | ✅ (Admin) |
| POST | `/api/admin/reset-db` | Truncates the database completely. | ✅ (Admin) |
| GET | `/api/admin/hmi-assets` | List all HMI assets stored globally. | ✅ (Admin) |
| POST | `/api/admin/hmi-assets` | Upload new HMI assets (.glb, .svg, .html, ...). | ✅ (Admin) |
| GET | `/api/admin/ai_history` | View history of AI-initiated modifications. | ✅ (Admin) |
| POST | `/api/admin/ai_history/:id/revert` | Revert a specific AI action using its backup. | ✅ (Admin) |
| POST | `/api/admin/data-parsers/csv` | Upload and start a dynamic CSV data stream. | ✅ (Admin) |
| GET | `/api/admin/webhooks` | List all registered webhooks. | ✅ (Admin) |
| POST | `/api/admin/webhooks` | Register a new webhook subscription. | ✅ (Admin) |
| DELETE | `/api/admin/webhooks/:id` | Delete a webhook subscription. | ✅ (Admin) |
| POST | `/api/admin/webhooks/:id/test` | Trigger a manual test payload for a webhook. | ✅ (Admin) |
| POST | `/api/env/restart` | Restart the application server. | ✅ (Admin) |
| GET | `/api/metrics` | Prometheus-formatted metrics (throughput, WS connections, errors, DLQ size). | ✅ (IP Filtered) |
| GET/POST | `/api/i3x/*` | Full I3X (RFC 001) API implementation (Namespaces, Objects, Value, History, Subs). | ✅ (IP Filtered) |
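As an example of third-party ingestion, a request to `/api/external/publish` might be assembled like this. The endpoint and the `x-api-key` header come from the table above, but the body shape (`{topic, payload}`) and the `buildPublishRequest` helper are assumptions for illustration, not the documented schema:

```javascript
// Build a request for the external publish endpoint. The {topic, payload}
// body shape is an assumption for illustration; check the server's schema.
function buildPublishRequest(baseUrl, apiKey, topic, payload) {
  return {
    url: `${baseUrl}/api/external/publish`,
    options: {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'x-api-key': apiKey,
      },
      body: JSON.stringify({ topic, payload }),
    },
  };
}

const req = buildPublishRequest(
  'http://localhost:8080', 'sk-my-secret-key',
  'factory/line1/sensor', { temp: 42.5 }
);
// fetch(req.url, req.options) would perform the actual publish.
console.log(req.url);
```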
🤝 Contributing
We welcome contributions! We believe in Community Driven Innovation.
- Fork the repository.
- Create a feature branch (`git checkout -b feature/AmazingFeature`).
- Commit your changes (English commit messages, please).
- Push to the branch.
- Open a Pull Request.
🛡 License
Licensed under the Apache License, Version 2.0.
You are free to use, modify, and distribute this software, even for commercial purposes, under the terms of the license.
Copyright (c) 2025-2026 Sebastien Lalaurette