rocketride-server
High-performance AI pipeline engine with a C++ core and 50+ Python-extensible nodes. Build, debug, and scale LLM workflows with 13+ model providers, 8+ vector databases, and agent orchestration, all from your IDE. Includes VS Code extension, TypeScript/Python SDKs, and Docker deployment.
Open-source, developer-native AI pipeline tool.
Build, debug, and deploy production AI workflows - without leaving your IDE.
RocketRide is an open-source data pipeline builder and runtime built for AI and ML workloads. With 50+ pipeline nodes spanning 13 LLM providers, 8 vector databases, OCR, NER, and more — pipelines are defined as portable JSON, built visually in VS Code, and executed by a multithreaded C++ runtime. From real-time data processing to multimodal AI search, RocketRide runs entirely on your own infrastructure.
Home | Documentation | Python SDK | TypeScript SDK | MCP Server
Design, test, and ship complex AI workflows from a visual canvas, right where you write code.
Drop pipelines into any Python or TypeScript app with a few lines of code, no infrastructure glue required.
Features
- VS Code Extension — Build, visualize, and monitor pipelines directly in your editor. The visual pipeline builder lets you drag, connect, and configure nodes without writing boilerplate. Real-time observability tracks token usage, LLM calls, latency, and execution — all without leaving VS Code. Pipelines are defined as portable JSON, meaning they're version-controllable, shareable, and runnable anywhere.
- High-performance C++ runtime — RocketRide's runtime is built in C++ with native multithreading, purpose-built for the throughput demands of AI and data workloads. No bottlenecks, no compromises for production scale.
- 50+ pipeline nodes — A comprehensive library of pre-built nodes covering 13 LLM providers, 8 vector databases, OCR, NER, PII anonymization, chunking strategies, embedding models, and more. All nodes are Python-extensible, so you can build and publish your own.
- Multi-agent workflows — Orchestrate and scale complex agent pipelines with built-in support for CrewAI and LangChain. Chain agents, share memory across pipeline runs, and manage multi-step reasoning workflows at scale. Switch between agentic frameworks in a few clicks to find the best fit for your task.
- Coding agent ready — Install the VS Code extension and RocketRide automatically detects and configures your coding agent — Claude, Cursor, and more. Your agent can build, modify, and deploy pipelines through natural language.
- TypeScript, Python & MCP SDKs — Integrate pipelines into native applications, expose them as callable tools for AI assistants, or build programmatic pipeline workflows into your existing codebase.
- Zero dependency headaches — RocketRide manages Python environments, C++ toolchains, Java/Tika, and all node dependencies automatically. Clone, build, run — no manual setup, no version conflicts, no glue scripts.
- One-click deploy — Run on Docker, on-prem, or RocketRide Cloud (coming soon). RocketRide's architecture is designed for production from day one — not retrofitted from a demo.
Quick Start
Install the extension for your IDE: search for RocketRide in the extension marketplace and click the RocketRide extension to install it.
Deploy a server - you'll be prompted on how you want to run the server. Choose the option that fits your setup:
- Local (Recommended) - This pulls the server directly into your IDE without any additional setup.
- On-Premises - Run the server on your own hardware for full control and data residency. Pull the image and deploy to Docker or clone this repo and build from source.
Building Your First Pipe
All pipelines are recognized by the *.pipe format. Each pipeline and its configuration is a JSON object, but the extension renders it in our visual builder canvas. All pipelines begin with a source node: webhook, chat, or dropper. For specific usage, examples, and inspiration on how to build pipelines, check out our guides and documentation.
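Since a pipeline is plain JSON, it can be built and inspected with nothing but the standard library. The sketch below is a minimal, hypothetical *.pipe definition: the field names ("nodes", "edges") and node types are illustrative assumptions, not the documented schema — the visual builder generates the real structure for you.

```python
import json

# Hypothetical *.pipe structure; field names and node types are assumptions.
pipeline = {
    "name": "hello-pipeline",
    "nodes": [
        {"id": "in", "type": "webhook"},  # every pipeline begins with a source node
        {"id": "model", "type": "llm", "config": {"provider": "openai"}},
    ],
    "edges": [
        {"from": "in", "to": "model"},  # wire the source into the LLM node
    ],
}

# Plain JSON round-trips cleanly, which is what makes pipelines
# version-controllable and shareable.
serialized = json.dumps(pipeline, indent=2)
assert json.loads(serialized) == pipeline
```

Because the on-disk format is just JSON, the same file can be committed to git, diffed in review, and executed unchanged on any RocketRide runtime.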
Connect input lanes and output lanes by type to properly wire your pipeline. Some nodes, like agents or LLMs, can be invoked as tools by a parent node.
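The lane-typing rule above can be sketched as a simple compatibility check: an output lane may only feed an input lane of the same type. The lane types ("text", "embedding") and node shapes below are illustrative assumptions, not RocketRide's real schema.

```python
# Hypothetical node catalog; lane names and types are assumptions for
# illustrating the "connect lanes by type" rule.
nodes = {
    "chunker": {"outputs": {"chunks": "text"}},
    "embedder": {"inputs": {"text": "text"}, "outputs": {"vectors": "embedding"}},
    "vector_db": {"inputs": {"vectors": "embedding"}},
}

def lanes_compatible(src, out_lane, dst, in_lane):
    """An edge is valid only when the output lane type matches the input lane type."""
    return nodes[src]["outputs"][out_lane] == nodes[dst]["inputs"][in_lane]

assert lanes_compatible("chunker", "chunks", "embedder", "text")        # text -> text
assert lanes_compatible("embedder", "vectors", "vector_db", "vectors")  # embedding -> embedding
assert not lanes_compatible("chunker", "chunks", "vector_db", "vectors")  # text -/-> embedding
```

In the visual builder this check happens as you drag a connection, so incompatible lanes simply won't wire together.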
You can run a pipeline from the canvas by pressing the ▶️ button on the source node, or directly from the Connection Manager.
Deployment
Deploy your pipelines on your own infrastructure.
Docker - Download the RocketRide server image and create a container. Requires Docker to be installed.
docker pull ghcr.io/rocketride-org/rocketride-engine:latest
docker create --name rocketride-engine -p 5565:5565 ghcr.io/rocketride-org/rocketride-engine:latest
Local deployment - Download the runtime of your choice as a standalone process in the 'Deploy' page of the Connection Manager.
Run your pipelines as standalone processes, or integrate them into your existing Python and TypeScript/JS applications using our SDKs.
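As a sketch of what calling a deployed pipeline from application code might look like, the snippet below builds an HTTP request against a running server. The endpoint path ("/pipelines/<name>/run") and payload shape are assumptions, not the documented SDK or HTTP API; only the default port 5565 comes from the Docker example above.

```python
import json
import urllib.request

def build_run_request(name: str, payload: dict, host: str = "localhost",
                      port: int = 5565) -> urllib.request.Request:
    """Build a POST request to a hypothetical pipeline-run endpoint.

    The URL scheme and JSON body shape here are illustrative assumptions.
    """
    body = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        url=f"http://{host}:{port}/pipelines/{name}/run",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_run_request("hello-pipeline", {"input": "What is RocketRide?"})
# urllib.request.urlopen(req) would dispatch it to a running RocketRide server.
```

In practice the Python and TypeScript SDKs wrap this kind of plumbing for you; the point is that a deployed pipeline is reachable as an ordinary network service on your own infrastructure.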
Observability
Selecting a running pipeline opens in-depth analytics. Trace call trees, token usage, memory consumption, and more to optimize your pipelines before scaling and deploying. Find the models, agents, and tools that best fit your task.
Contributors
RocketRide is built by a growing community of contributors. Whether you've fixed a bug, added a node, improved docs, or helped someone on Discord, thank you. New contributions are always welcome - check out our contributing guide to get started.
Made with 🤍 in SF & EU