Open-source text-to-SQL and text-to-chart GenBI agent with a semantic layer. Ask your database questions in natural language — get accurate SQL, charts, and BI insights. Supports 12+ data sources (PostgreSQL, BigQuery, Snowflake, etc.) and any LLM (OpenAI, Claude, Gemini, Ollama).

Wren AI - Open-Source GenBI Agent


Ask your database anything in plain English. Wren AI generates accurate SQL, charts, and BI insights — backed by a semantic layer that keeps LLM outputs grounded and trustworthy.


😍 Demos

https://github.com/user-attachments/assets/f9c1cb34-5a95-4580-8890-ec9644da4160

▶️ Watch the full GenBI walkthrough — end-to-end from question to chart

💡 Why a Semantic Layer?

Feeding raw DDL to an LLM gets you SQL that looks right but means the wrong thing — "revenue" joins the wrong tables, "active user" uses the wrong filter. Wren AI's semantic layer (MDL) encodes your business definitions once, then every generated query is grounded in that shared understanding. The LLM doesn't guess what your metrics mean; the semantic layer tells it.
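For instance, a "revenue" metric can be declared once in MDL and reused by every generated query. The fragment below is an illustrative sketch only; the field names approximate the MDL schema and may not match the current spec exactly:

```json
{
  "models": [
    {
      "name": "orders",
      "tableReference": { "schema": "public", "table": "orders" },
      "columns": [
        { "name": "o_orderkey", "type": "integer" },
        { "name": "o_custkey", "type": "integer" },
        {
          "name": "revenue",
          "type": "double",
          "isCalculated": true,
          "expression": "SUM(o_totalprice)"
        }
      ]
    }
  ],
  "relationships": [
    {
      "name": "orders_customer",
      "models": ["orders", "customers"],
      "joinType": "MANY_TO_ONE",
      "condition": "orders.o_custkey = customers.c_custkey"
    }
  ]
}
```

With a definition like this in place, any question about "revenue" resolves to the same expression and the same join path, regardless of how the LLM phrases the SQL.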

🤖 Features

| Feature | What you get | Why it matters |
| --- | --- | --- |
| Talk to Your Data | Ask in any language → precise SQL & answers | Slash the SQL learning curve |
| GenBI Insights | AI-written summaries, charts & reports | Decision-ready context in one click |
| Embed via API | Generate queries & charts inside your apps (API Docs) | Build custom agents, SaaS features, chatbots (Streamlit Live Demo) |
| Semantic Layer | MDL models encode schema, metrics, joins | Keeps LLM outputs accurate & governed |
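Embedding via the API boils down to an authenticated POST with the user's question. The sketch below uses only the Python standard library; the endpoint path (`/api/v1/ask`) and payload field are assumptions for illustration, not the documented contract, so check the API Docs for the real shape:

```python
import json
import urllib.request


def build_ask_request(question: str, base_url: str, api_key: str) -> urllib.request.Request:
    # NOTE: endpoint path and payload shape are illustrative assumptions,
    # not the official Wren AI API contract.
    payload = json.dumps({"question": question}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/api/v1/ask",  # hypothetical path
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


# Sending it is then one call:
# with urllib.request.urlopen(build_ask_request(q, url, key)) as resp:
#     answer = json.load(resp)
```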

🤩 Learn more about GenBI

🔌 Data Sources

| Cloud Warehouses | Databases | Query Engines |
| --- | --- | --- |
| BigQuery | PostgreSQL | Trino |
| Snowflake | MySQL | Athena (Trino) |
| Redshift | Microsoft SQL Server | DuckDB |
| Databricks | ClickHouse | |
| | Oracle | |

Don't see yours? Vote for it — community votes drive our connector roadmap.

🧠 LLM Models

Wren AI works with any LLM provider you're already using:

| Cloud APIs | Platform Services | Self-hosted |
| --- | --- | --- |
| OpenAI | Azure OpenAI | Ollama |
| Anthropic | Google AI Studio (Gemini) | |
| DeepSeek | Vertex AI (Gemini + Anthropic) | |
| Groq | AWS Bedrock | |
| | Databricks | |

> [!TIP]
> For best results, use a frontier model (GPT-4o, Claude Sonnet, Gemini Pro). Wren AI works with smaller and local models too — accuracy scales with model capability. See configuration examples for setup guides.
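Provider selection lives in Wren AI's configuration file. The fragment below is a loose sketch of the shape only; key names are assumptions that vary across versions, so copy from the official configuration examples rather than from here:

```yaml
# Loose sketch only: key names here are assumptions and vary by version.
# Use the official configuration examples as the source of truth.
type: llm
provider: litellm_llm
models:
  - model: gpt-4o              # any LiteLLM-style model id, e.g. ollama/llama3
    api_key_name: LLM_OPENAI_API_KEY
    kwargs:
      temperature: 0
```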

🚀 Getting Started

Two ways to get started — pick what fits:

| Option | Best for | Link |
| --- | --- | --- |
| Self-hosted (Docker) | Full control, local data | Installation guide |
| Wren AI Cloud | Try it without setup | getwren.ai |

Compare OSS vs. Cloud plans. Full documentation at docs.getwren.ai.


🏗️ Architecture

(Architecture diagram)

User questions flow from the Next.js UI → Apollo GraphQL → AI Service (RAG + LLM) → Wren Engine (semantic query execution) → your database. The semantic layer (MDL) sits at the center, making sure the LLM's SQL reflects your actual business definitions.

👉 Deep dive into the design

🧑‍💻 For Developers

WrenAI is a full-stack AI system with interesting problems at every layer — semantic modeling, RAG retrieval, LLM-driven SQL generation, and query execution across heterogeneous data sources. Here's what the stack actually looks like under the hood:

| Layer | What it does |
| --- | --- |
| wren-ui | Next.js + Apollo GraphQL — semantic modeling UI and the BFF that wires everything together |
| wren-ai-service | Python/FastAPI pipeline — intent classification, vector retrieval from Qdrant, LLM prompting, and SQL correction loops |
| wren-engine | Rust + Apache DataFusion — the query execution core that resolves MDL semantics (metrics, joins, access controls) before SQL reaches the database |
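The "SQL correction loops" mentioned for wren-ai-service can be pictured as a retry cycle that feeds the database error back into the next prompt. A minimal sketch with a stubbed LLM callable — the function names are illustrative, not the service's real internals:

```python
import sqlite3
from typing import Callable


def generate_sql_with_correction(
    question: str,
    llm: Callable[[str], str],   # prompt -> SQL; stand-in for a real LLM client
    conn: sqlite3.Connection,
    max_retries: int = 3,
) -> str:
    """Ask the LLM for SQL; on execution error, retry with the error message."""
    prompt = f"Write SQLite SQL for: {question}"
    for _ in range(max_retries):
        sql = llm(prompt)
        try:
            # Dry-run the statement so invalid SQL fails before real execution.
            conn.execute(f"EXPLAIN QUERY PLAN {sql}")
            return sql
        except sqlite3.Error as err:
            # Feed the failure back so the next attempt can self-correct.
            prompt = (f"The query `{sql}` failed with: {err}. "
                      f"Fix it. Original question: {question}")
    raise RuntimeError("could not produce valid SQL")
```

In the real service the validation step runs through the semantic layer rather than straight SQLite, but the shape of the loop is the same.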

wren-engine is a separate open-source project and the part of the stack closest to the metal. It's where MDL definitions get translated into actual query plans across 15+ data sources. If you work with Rust, DataFusion, or database connectors, it's worth a look — the codebase is approachable and there are real unsolved problems around query planning, semantic resolution, and MCP (Model Context Protocol) agent integration.

Some areas where contributions tend to have the most impact across both repos:

  • Data source connectors — wren-engine supports 15+ sources; new connectors are always useful
  • MCP integration — wren-engine exposes an MCP server; agent-native workflows are still early and evolving
  • SQL generation quality — prompt engineering, correction loop heuristics, and eval harnesses in wren-ai-service
  • Semantic layer tooling — MDL schema inference, validation, and developer ergonomics in wren-ui
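An eval harness for SQL generation quality (the third bullet) usually scores execution accuracy: run the generated query and a gold query, then compare result sets. A minimal sketch under that definition, not the project's actual harness:

```python
import sqlite3


def execution_match(conn: sqlite3.Connection, generated: str, gold: str) -> bool:
    """True if both queries return the same rows, order-insensitive."""
    try:
        got = sorted(conn.execute(generated).fetchall())
    except sqlite3.Error:
        return False  # invalid generated SQL counts as a miss
    want = sorted(conn.execute(gold).fetchall())
    return got == want
```

Running this over a labeled question/SQL set gives a single accuracy number, which is what makes prompt and correction-loop changes comparable.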

🛠️ Contribution

  1. Read the Contribution Guidelines for setup and PR conventions.
  2. Open an issue for bugs, feature requests, or discussion.
  3. If Wren AI is useful to you, a ⭐ goes a long way — it helps more people find the project.

⭐️ Community

We follow a Code of Conduct to keep the community welcoming for everyone.

🎉 Our Contributors

⬆️ Back to Top
