vobase
The app framework built for AI coding agents.
Own every line. Your AI already knows how to build on it.


Bun TypeScript Hono Drizzle PostgreSQL Better Auth React Vite TanStack Tailwind CSS shadcn/ui

what you get · get started · code · skills · compare · docs


A full-stack TypeScript framework that gives you auth, database, storage, and jobs in a single process. PGlite (embedded Postgres) for local dev, managed Postgres in production. Like a self-hosted Supabase — but you own every line of code. Like Pocketbase — but it's TypeScript you can read and modify.

AI coding agents (Claude Code, Cursor, Codex) understand vobase out of the box. Strict conventions and agent skills mean generated code works on the first try — not the third.

You own the code. You own the data. You own the infrastructure.


what you get

One bun create vobase and you have a working full-stack app:

Primitive What it does
Runtime Bun — native TypeScript, ~50ms startup, built-in test runner. One process, one container.
Database PostgreSQL via Drizzle. PGlite for zero-config local dev, managed Postgres in production. Full SQL, ACID transactions, pgvector for embeddings.
Auth better-auth. Sessions, passwords, CSRF. RBAC with role guards, API keys, and optional organization/team support. Org/SSO/2FA as plugins.
API Hono — ~14KB, typed routing, Bun-first. Every AI coding tool already knows Hono.
Audit Built-in audit log, record change tracking, and auth event hooks. Every mutation is traceable.
Sequences Gap-free business number generation (INV-0001, PO-0042). Transaction-safe, never skips.
Storage File storage with virtual buckets. Local or S3 backends. Metadata tracked in Postgres.
Channels Multi-channel messaging with pluggable adapters. WhatsApp (Cloud API), email (Resend, SMTP). Inbound webhooks, outbound sends, delivery tracking. All messages logged.
Integrations Encrypted credential vault for external services (OAuth providers, APIs). AES-256-GCM at rest. Platform-aware: opt-in multi-tenant OAuth handoff via HMAC-signed JWT.
Jobs Background tasks with retries, cron, and job chains. pg-boss backed — Postgres only, no Redis.
Knowledge Base Upload PDF, DOCX, XLSX, PPTX, images, HTML. Auto-extract to Markdown, chunk, embed, and search. Hybrid search with RRF + HyDE. Gemini OCR for scanned docs.
AI Agents Declarative agents via Mastra in a top-level mastra/ directory. Multi-provider (OpenAI, Anthropic, Google). Tools, workflows, memory processors, eval scorers. Embedded Mastra Studio at /studio for dev. Frontend stays on AI SDK useChat.
Frontend React + TanStack Router + shadcn/ui + Tailwind v4. Type-safe routing with codegen, code-splitting. You own the component source — no tailwind.config.js needed.
Skills Domain knowledge packs that teach AI agents your app's patterns and conventions.
MCP Module-aware tools with API key auth via @modelcontextprotocol/sdk. AI tools can read your schema, list modules, and view logs before generating code. Same process, shared port.
Deploy Dockerfile + railway.toml included. One railway up or docker build and you're live.

Locally, everything runs in one Bun process with PGlite — no Docker, no external services. bun run dev and you're building. In production, point DATABASE_URL at any Postgres instance.


quick start

bun create vobase my-app
cd my-app
bun run dev

Backend on :3000, frontend on :5173. Ships with a dashboard and audit log viewer out of the box.


what you can build

Every module is a self-contained directory: schema, handlers, jobs, pages. No plugins, no marketplace. Just TypeScript you own.

Use Case What Ships
SaaS Starter User accounts, billing integration, subscription management, admin dashboard. Auth + jobs + webhooks handle the plumbing.
Internal Tools Admin panels, operations dashboards, approval workflows. Status machines enforce business logic. Audit trails track every change.
CRM & Contacts Companies, contacts, interaction timelines, deal tracking. Cross-module references keep things decoupled.
Project Tracker Tasks, assignments, status workflows, notifications. Background jobs handle reminders and escalations.
Billing & Invoicing Invoices, line items, payments, aging reports. Integer money ensures exact arithmetic. Gap-free numbering via transactions.
Your Vertical Property management, fleet tracking, field services — whatever the business needs. Describe it to your AI tool. It generates the module.

Module starters ship as skills: vobase add skill <name>. Like npx shadcn add button — files get copied, you own the code.


how it works

Vobase makes itself legible to every AI coding tool on the market.

The framework ships with strict conventions and agent skills — domain knowledge packs that teach AI tools how your app works. When you need a new capability:

  1. Open your AI tool and describe the requirement
  2. The AI reads your existing schema, module conventions, and the relevant skills
  3. It generates a complete module — schema, handlers, jobs, pages, tests, seed data
  4. You review the diff, run bun run dev, and it works

Skills cover the parts where apps get tricky: money stored as integer cents (never floats), status transitions as explicit state machines (not arbitrary string updates), gap-free business numbers generated inside database transactions (not auto-increment IDs that leave holes).

These conventions are what make AI-generated modules work on the first try.
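The three conventions above can be sketched in a few lines (identifiers are illustrative, not vobase API — the real counters and guards live in the framework's modules and skills):

```typescript
// 1. Money as integer cents — exact arithmetic, round once at the boundary.
const subtotalCents = 19_99 * 3                      // 5997, no float drift
const taxCents = Math.round(subtotalCents * 0.0825)  // 495

// 2. Status transitions as an explicit state machine, not free-form string updates.
const TRANSITIONS: Record<string, string[]> = {
  draft: ['sent'],
  sent: ['paid', 'void'],
  paid: [],
  void: [],
}
function assertTransition(from: string, to: string): void {
  if (!TRANSITIONS[from]?.includes(to)) {
    throw new Error(`invalid transition: ${from} -> ${to}`)
  }
}

// 3. Business numbers formatted from a transaction-safe counter (the counter
// itself is the _sequences module's job; only the formatting is shown here).
const invoiceNo = `INV-${String(42).padStart(4, '0')}`  // 'INV-0042'
```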

The thesis: your specs and domain knowledge are the asset. AI tools are the compiler. The compiler improves every quarter. Your skills compound forever.


what a module looks like

Every module declares itself through defineModule(). This convention is what AI tools rely on to generate correct code.

// modules/projects/index.ts
import { defineModule } from '@vobase/core'
import * as schema from './schema'
import { routes } from './handlers'
import { jobs } from './jobs'
import * as pages from './pages'
import seed from './seed'

export default defineModule({
  name: 'projects',
  schema,
  routes,
  jobs,
  pages,
  seed,
  init: (ctx) => {
    // Optional: run setup logic at boot with access to db, scheduler, http, storage, channels
  },
})
modules/projects/
  schema.ts           ← Drizzle table definitions
  handlers.ts         ← Hono routes (HTTP API)
  handlers.test.ts    ← colocated tests (bun test)
  jobs.ts             ← background tasks (pg-boss, no Redis)
  pages/              ← React pages (list, detail, create)
  seed.ts             ← sample data for dev
  index.ts            ← defineModule()
schema example — Drizzle + PostgreSQL with typed columns, timestamps, status enums
// modules/projects/schema.ts
import { pgTable, text, integer, timestamp } from 'drizzle-orm/pg-core'
import { nanoidPrimaryKey } from '@vobase/core'

export const projects = pgTable('projects', {
  id: nanoidPrimaryKey(),
  name: text('name').notNull(),
  description: text('description'),
  status: text('status').notNull().default('active'),    // active -> archived -> deleted
  owner_id: text('owner_id').notNull(),
  created_at: timestamp('created_at', { withTimezone: true }).notNull().defaultNow(),
})

export const tasks = pgTable('tasks', {
  id: nanoidPrimaryKey(),
  project_id: text('project_id').references(() => projects.id),
  title: text('title').notNull(),
  status: text('status').notNull().default('todo'),       // todo -> in_progress -> done
  assignee_id: text('assignee_id'),
  priority: integer('priority').notNull().default(0),
})
handler example — Hono routes with typed context and authorization
// modules/projects/handlers.ts
import { Hono } from 'hono'
import { getCtx } from '@vobase/core'
import { projects } from './schema'

export const routes = new Hono()

routes.get('/projects', async (c) => {
  const ctx = getCtx(c)
  return c.json(await ctx.db.select().from(projects))
})

routes.post('/projects', async (c) => {
  const ctx = getCtx(c)
  const body = await c.req.json()

  // .returning() makes Drizzle return the inserted row; without it you get a driver result
  const [project] = await ctx.db.insert(projects).values({
    ...body,
    owner_id: ctx.user.id,
  }).returning()

  return c.json(project)
})

The frontend gets fully typed API calls via codegen:

import { hc } from 'hono/client'
import type { AppType } from './api-types.generated'

const client = hc<AppType>('/')
const res = await client.api.projects.$get()
const projects = await res.json()  // fully typed — autocomplete on every route and response

AppType is code-generated from your server's route tree, giving you end-to-end type safety from handler return values to frontend consumption.

job example — background tasks via pg-boss, no Redis
// modules/projects/jobs.ts
import { defineJob } from '@vobase/core'
import { eq } from 'drizzle-orm'
import { db } from '../../mastra/lib/deps'  // jobs import dependencies directly — no request ctx
import { tasks } from './schema'

export const sendReminder = defineJob('projects:sendReminder',
  async (data: { taskId: string }) => {
    const [task] = await db.select().from(tasks)
      .where(eq(tasks.id, data.taskId))
    // send notification, update status, log the action
  }
)

Schedule from handlers: ctx.scheduler.add('projects:sendReminder', { taskId }, { delay: '1d' })

Retries, cron scheduling, and priority queues — all Postgres-backed via pg-boss.


the ctx object

Every HTTP handler gets a context object with runtime capabilities. Current surface:

Property What it does
ctx.db Drizzle instance. Full PostgreSQL — reads, writes, transactions.
ctx.user { id, email, name, role, activeOrganizationId? }. From better-auth session. Used for authorization checks. RBAC middlewares: requireRole(), requirePermission(), requireOrg().
ctx.scheduler Job queue. add(jobName, data, options) to schedule background work.
ctx.storage StorageService — virtual buckets with local/S3 backends. ctx.storage.bucket('avatars').upload(key, data).
ctx.channels ChannelsService — email and WhatsApp sends. ctx.channels.email.send(msg). All messages logged.
ctx.integrations IntegrationsService — encrypted credential vault. ctx.integrations.getActive(provider) returns decrypted config or null. Platform-managed providers connected via HMAC-signed forwarding.
ctx.http Typed HTTP client with retries, timeouts, and circuit breakers.
ctx.realtime RealtimeService — event-driven server-push via PostgreSQL LISTEN/NOTIFY + SSE. ctx.realtime.notify({ table, id?, action? }, tx?) after mutations.

For jobs, pass dependencies through closures/factories (or import what you need) when calling defineJob(...).
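The closure/factory pattern can be sketched as follows (nothing here is vobase API beyond the shape the job handler takes; all names are illustrative):

```typescript
// A factory closes over the dependencies a job needs, then returns the handler.
type Notifier = (message: string) => void

function makeSendReminder(deps: { getTaskTitle: (id: string) => string; notify: Notifier }) {
  // returned function has the (data) => Promise<void> shape a job handler expects
  return async (data: { taskId: string }) => {
    const title = deps.getTaskTitle(data.taskId)
    deps.notify(`Reminder: "${title}" is still open`)
  }
}

// usage with fakes — real code would inject the Drizzle db and a channels-backed notifier
const sent: string[] = []
const job = makeSendReminder({
  getTaskTitle: (id) => `task-${id}`,
  notify: (m) => { sent.push(m) },
})
job({ taskId: '42' })
```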

module init context

Modules can declare an init hook that receives a ModuleInitContext at boot — same services as request context (db, scheduler, http, storage, channels, realtime). Unconfigured services use throw-proxies that give descriptive errors if accessed.
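The throw-proxy idea can be sketched like this (a hypothetical reimplementation of the concept, not the actual vobase internals):

```typescript
// Any property access on an unconfigured service raises a descriptive error
// instead of a vague "undefined is not a function" somewhere downstream.
function throwProxy(serviceName: string): any {
  return new Proxy({}, {
    get(_target, prop) {
      throw new Error(
        `${serviceName}.${String(prop)} was accessed, but ${serviceName} is not ` +
        `configured — enable it in vobase.config.ts`,
      )
    },
  })
}

// e.g. with storage disabled, ctx.storage could be throwProxy('storage');
// calling storage.bucket('avatars') then fails loudly with the message above
const storage = throwProxy('storage')
```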

ctx extensions for external integrations

Beyond local capabilities (database, user, scheduler, storage), ctx provides outbound connectivity and inbound event handling:

Property What it does
ctx.http Typed fetch wrapper with retries, timeouts, circuit breakers, and structured error responses. Configurable per-app via http in vobase.config.ts.
webhooks (app-level) Inbound webhook receiver with HMAC signature verification, deduplication, and automatic enqueue-to-job. Configured in vobase.config.ts, mounted as /webhooks/* routes — not a ctx property.
// vobase.config.ts
export default defineConfig({
  database: process.env.DATABASE_URL || './data/pgdata',
  integrations: { enabled: true },      // opt-in: encrypted credential store, provider configs
  storage: {                            // opt-in: file storage
    provider: { type: 'local', basePath: './data/files' },
    buckets: { avatars: { maxSize: 5_000_000 }, documents: {} },
  },
  channels: {                           // opt-in: email + WhatsApp
    email: { provider: 'resend', from: '[email protected]', resend: { apiKey: '...' } },
  },
  http: {
    timeout: 10_000,
    retries: 3,
    retryDelay: 500,
    circuitBreaker: { threshold: 5, resetTimeout: 30_000 },
  },
  webhooks: {
    'stripe-events': {
      path: '/webhooks/stripe',
      secret: process.env.STRIPE_WEBHOOK_SECRET,
      handler: 'system:processWebhook',
      signatureHeader: 'stripe-signature',
      dedup: true,
    },
  },
})

Credentials stay in .env. Config declares the shape.
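The signature-verification step of the webhook receiver can be sketched as follows (hex encoding and SHA-256 are assumptions; real providers like Stripe also sign a timestamp to prevent replay):

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto'

// Verify that signatureHex is the HMAC-SHA256 of the raw request body under our secret.
export function verifyWebhook(rawBody: string, secret: string, signatureHex: string): boolean {
  const expected = createHmac('sha256', secret).update(rawBody).digest()
  const given = Buffer.from(signatureHex, 'hex')
  // length check first: timingSafeEqual throws on mismatched lengths
  if (given.length !== expected.length) return false
  // constant-time comparison prevents timing attacks on the signature
  return timingSafeEqual(expected, given)
}
```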


vs the alternatives

|                   | Vobase                                            | Supabase                                              | Pocketbase                                 | Rails / Laravel          |
|-------------------|---------------------------------------------------|-------------------------------------------------------|--------------------------------------------|--------------------------|
| What you get      | Full-stack scaffold (backend + frontend + skills) | Backend-as-a-service (db + auth + storage + functions)| Backend binary (db + auth + storage + API) | Full-stack framework     |
| Language          | TypeScript end-to-end                             | TypeScript (client) + PostgreSQL                      | Go (single prebuilt binary)                | Ruby / PHP               |
| Database          | PostgreSQL (PGlite local, managed prod)           | PostgreSQL (managed)                                  | SQLite (embedded)                          | PostgreSQL / MySQL       |
| Self-hosted       | One process, one container                        | 10+ Docker containers                                 | One binary                                 | Multi-process            |
| You own the code  | Yes — all source in your project                  | No — managed service                                  | No — Go source upstream, not in your app   | Yes — but no AI conventions |
| AI integration    | Agent skills + MCP + strict conventions           | None                                                  | None                                       | None                     |
| How you customize | Edit the code. AI reads it.                       | Dashboard + RLS policies                              | Admin UI + hooks                           | Edit the code            |
| Hosting cost      | As low as $15/mo                                  | $25/mo+ (or complex self-host)                        | Free (self-host)                           | Varies                   |
| Data isolation    | Physical (one db per app)                         | Logical (RLS)                                         | Physical                                   | Varies                   |
| License           | MIT                                               | Apache 2.0                                            | MIT                                        | MIT                      |

vs Supabase: Self-hosted Supabase is 10+ Docker containers. RLS policies are hard to reason about. You don't own the backend code. Vobase is one process, you own every line — AI agents can read and modify everything.

vs Pocketbase: Pocketbase ships as a single prebuilt Go binary. The source is open upstream, but it isn't part of your project — custom business logic means writing Go hooks, rebuilding the binary, or calling external services. Vobase is TypeScript you own — AI agents understand and extend it natively.

vs Rails / Laravel: Great frameworks, but they weren't designed for AI coding agents. Vobase's strict conventions and agent skills mean AI-generated code follows your patterns consistently. Plus: simpler stack (no Redis, single process, TypeScript end-to-end).


runtime architecture

One Bun process. One Docker container. One app.

Docker container (--restart=always)
  └── Bun process (PID 1)
        ├── Hono server
        │     ├── /auth/*       → better-auth (sessions, passwords, CSRF)
        │     ├── /api/*        → module handlers (session-validated)
        │     ├── /api/mastra/* → Mastra agent/tool/workflow API
        │     ├── /studio       → Mastra Studio SPA (dev-only)
        │     ├── /mcp          → MCP server (same process, shared port)
        │     ├── /webhooks/*   → inbound event receiver (signature verified, dedup)
        │     └── /*            → frontend (static, from dist/)
        ├── Drizzle (PGlite local / bun:sql production)
        ├── Built-in modules
        │     ├── _auth         → better-auth behind AuthAdapter contract
        │     ├── _audit        → audit log, record tracking, auth hooks
        │     ├── _sequences    → gap-free business number counters
        │     ├── _integrations → encrypted credential vault, platform OAuth handoff (opt-in)
        │     ├── _storage      → virtual buckets, local/S3 (opt-in)
        │     └── _channels     → unified messaging, adapter pattern (opt-in)
        ├── pg-boss (Postgres-backed job queue)
        ├── Outbound HTTP (typed fetch, retries, circuit breakers)
        └── Audit middleware (all mutations → _audit_log)

mcp server

Runs in the same Bun process on the same port. Authenticated via API keys (better-auth apiKey plugin). When you connect Claude Code, Codex, Cursor, or any MCP-compatible tool, it sees your app:

Tool What it does
list_modules List all registered modules (built-in + user)
read_module Read table names from a specific module schema
get_schema List all table names across every module
view_logs Return recent audit log entries

The AI sees your exact data model, your existing modules, and the conventions before it writes a single line of code.
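For example, Claude Code can be pointed at the endpoint with an .mcp.json entry along these lines (the URL, port, and auth header are assumptions based on the defaults above — check your tool's MCP configuration docs for the exact shape):

```json
{
  "mcpServers": {
    "my-app": {
      "type": "http",
      "url": "http://localhost:3000/mcp",
      "headers": { "Authorization": "Bearer <api-key>" }
    }
  }
}
```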


deployment

Ship a Docker image. Railway, Fly.io, or any Docker host. Set DATABASE_URL for a managed Postgres connection.

Railway (quickest):

railway up

The template ships with Dockerfile and railway.toml pre-configured. Add a Postgres plugin and Railway sets DATABASE_URL automatically.

Docker Compose:

# docker-compose.yml
services:
  vobase:
    image: your-registry/my-vobase:latest
    restart: always
    depends_on:
      - db
    environment:
      DATABASE_URL: postgres://user:pass@db:5432/vobase
    ports:
      - "3000:3000"
  db:
    image: postgres:17
    volumes:
      - pgdata:/var/lib/postgresql/data
    environment:
      POSTGRES_DB: vobase
      POSTGRES_USER: user
      POSTGRES_PASSWORD: pass
volumes:
  pgdata:

project commands

After scaffolding, your project uses standard tools directly — no wrapper CLI:

Command What it does
bun run dev Start Bun backend with --watch and Vite frontend. Auto-restarts on changes.
bun run db:current Apply SQL fixtures (nanoid function, extensions) to the database.
bun run db:push Push schema to database (dev). No migrations needed.
bun run db:generate Generate migration files for production.
bun run db:migrate Run migrations against the database.
bun run db:seed Seed default admin user and sample data.
bun run db:reset Delete database, re-apply fixtures, push schema, and seed.
bun run db:studio Open Drizzle Studio for visual database browsing.

project structure

my-app/
  .env
  .env.example
  package.json            ← depends on @vobase/core
  drizzle.config.ts
  vobase.config.ts        ← database path, auth, connections, webhooks
  vite.config.ts          ← Vite + TanStack Router + path aliases
  index.html
  server.ts               ← createApp() entry + Mastra init + Studio mount
  AGENTS.md               ← project context and guardrails
  .agents/
    skills/
      integer-money/
        SKILL.md          ← core: all money as integer cents
  mastra/                 ← Mastra primitives (follows Mastra project conventions)
    index.ts              ← Mastra singleton: initMastra(), getMastra(), getMemory()
    studio.ts             ← dev-only Studio SPA middleware
    agents/
      index.ts            ← agent registry
      assistant.ts        ← Vobase Assistant (Claude Sonnet, KB search)
      quick-helper.ts     ← Lead Qualifier (Gemini Pro, escalation)
    tools/
      search-kb.ts        ← RAG tool: hybrid search over knowledge base
      escalate.ts         ← hand off conversation to human staff
    workflows/
      escalation.ts       ← human-in-the-loop escalation flow
      follow-up.ts        ← delayed follow-up scheduling
    processors/
      index.ts            ← dynamic input/output processor factories
      moderation.ts       ← content moderation input processor
      memory/             ← EverMemOS: MemCells → Episodes → Facts
        memory-processor.ts  ← retrieval (input) + boundary detection (output)
        retriever.ts      ← hybrid search (BM25 + vector) with RRF
        formation.ts      ← extract episodes + facts, embed, store
        boundary-detector.ts
        extractors.ts
    evals/                ← eval framework (scorers, runner)
    mcp/                  ← AI module MCP server
    lib/
      deps.ts             ← module-level DI (db, scheduler)
      models.ts           ← model aliases
      observability.ts    ← tracing config
      storage/
        pglite-store.ts   ← PGlite adapter for Mastra storage
  modules/
    ai/                   ← AI dashboard module (schema, routes, jobs, pages)
      index.ts            ← defineModule() — imports from ../../mastra/
      schema.ts           ← EverMemOS tables (mem_cells, episodes, facts, etc.)
      handlers.ts         ← memory API, evals, guardrails, workflow routes
      jobs.ts             ← memory formation, eval runs, follow-up resume
      pages/              ← agent config, memory explorer, evals, workflows, guardrails
    system/               ← admin dashboard (scaffolded)
      index.ts            ← defineModule()
      schema.ts
      handlers.ts         ← health, audit log, sequences, record audits
      pages/
    knowledge-base/       ← document ingestion + hybrid search
      index.ts
      schema.ts
      handlers.ts
      jobs.ts             ← async document processing via queue
      lib/
        extract.ts        ← PDF, DOCX, XLSX, PPTX, HTML, image extraction
        chunker.ts        ← recursive text chunking
        embeddings.ts     ← vector embeddings via AI SDK
        pipeline.ts       ← chunk → embed → store pipeline
        search.ts         ← RRF hybrid search with fast/deep modes
      pages/
    messaging/            ← AI chat + multi-channel replies
      index.ts
      schema.ts
      handlers.ts         ← thread CRUD, streaming chat, channel webhooks
      jobs.ts             ← outbox delivery, channel polling
      lib/
        chat.ts           ← streaming chat via agent.stream()
        channel-reply.ts  ← non-streaming replies via agent.generate()
        memory-bridge.ts  ← bridge to Mastra Memory API
      pages/
    index.ts              ← module registry
    your-module/          ← modules you add
      index.ts            ← defineModule()
      schema.ts
      handlers.ts
      jobs.ts
      pages/
  src/
    main.tsx
    home.tsx
    root.tsx
    routes.ts             ← generated route definitions
    routeTree.gen.ts      ← generated TanStack route tree
    lib/
      api-client.ts
      auth-client.ts
      utils.ts
    components/
      ui/                 ← shadcn/ui (owned by you)
    shell/
      layout.tsx
      sidebar.tsx
      auth/
        login.tsx
        signup.tsx
    styles/
      app.css
  data/
    pgdata/               ← PGlite database (local dev)
    files/                ← optional, created on first upload

Star History


Star the repo if it has helped you.


license

MIT. Own everything.
