mcp-chat
Health: Pass
- License — NOASSERTION
- Description — Repository has a description
- Active repo — Last push 0 days ago
- Community trust — 181 GitHub stars
Code: Warn
- process.env — Environment variable access in app/(auth)/auth.ts
- process.env — Environment variable access in app/(chat)/api/chat/route.ts
Permissions: Pass
- Permissions — No dangerous permissions requested
This project is an open-source chat application and reference example that demonstrates how to integrate Pipedream's Model Context Protocol (MCP) server into AI agents, providing access to thousands of external APIs and tools.
Security Assessment
Overall Risk: Low. The tool does not execute arbitrary shell commands, contains no hardcoded secrets, and does not request dangerous system permissions. It handles API keys and authentication secrets through environment variables, which is the practice that triggered the automated warnings in the scan. By design, the app makes external network requests to interact with LLM providers (such as OpenAI and Anthropic), authenticate users via Google, and connect to Pipedream's APIs. No malicious or unexpected network behavior was identified.
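The environment-variable warnings above reflect a pattern like the following minimal sketch. `requireEnv` is a hypothetical helper, not code from the repository; it shows the idea the scanner flagged: secrets are read from the environment at runtime rather than hardcoded, and missing values fail fast.

```typescript
// Hypothetical helper illustrating the flagged pattern: secrets come from
// the environment at runtime, never from source code.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (value === undefined || value === '') {
    // Fail fast so a misconfigured deployment is caught at startup.
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Example: an API key resolved when a route module loads.
// const openaiKey = requireEnv('OPENAI_API_KEY');
```

Access like this is benign in itself; the scan flags it only so reviewers can confirm no secret values are committed alongside it.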
Quality Assessment
The project is actively maintained, evidenced by repository updates as recent as today. It enjoys a solid level of community trust with over 180 GitHub stars. The only drawback is that its license is reported as "NOASSERTION," meaning automated scanning could not determine a license; developers should verify the repository's license files manually before using it in commercial or enterprise production environments.
Verdict
Safe to use.
Examples of using Pipedream's MCP server in your app or AI agent.
MCP Chat by Pipedream
MCP Chat is a free, open-source chat app built with the AI SDK and Pipedream MCP, which provides access to nearly 3,000 APIs and more than 10,000 tools. Use this as a reference to build powerful AI chat applications.
Features · Model Providers · Prerequisites · Deploy Your Own · Running Locally
Check out the app in production at chat.pipedream.com and refer to Pipedream's developer docs for the most up-to-date information.
Features
- MCP integrations: Connect to thousands of APIs through Pipedream's MCP server with built-in auth
- Automatic tool discovery: Execute tool calls across different APIs via chat
- Uses the AI SDK: Unified API for generating text, structured objects, and tool calls with LLMs
- Flexible LLM and framework support: Works with any LLM provider or framework
- Data persistence: Uses Neon Serverless Postgres for saving chat history and user data and Auth.js for simple and secure sign-in
Model Providers
The demo app currently uses models from Anthropic, OpenAI, and Gemini, but the AI SDK supports many more.
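Supporting multiple providers behind one chat interface can be sketched as a lookup from model id to provider. The model ids and provider names below are illustrative assumptions, not the app's actual configuration:

```typescript
// Illustrative model-to-provider routing; the ids and names here are
// assumptions for the sketch, not the app's real configuration.
const MODEL_PROVIDERS: Record<string, string> = {
  'claude-3-5-sonnet': 'anthropic',
  'gpt-4o': 'openai',
  'gemini-1.5-pro': 'google',
};

function providerFor(modelId: string): string {
  const provider = MODEL_PROVIDERS[modelId];
  if (!provider) {
    throw new Error(`Unsupported model: ${modelId}`);
  }
  return provider;
}
```

Because the AI SDK exposes a unified API across providers, swapping or adding a model is largely a matter of extending a mapping like this one.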
Prerequisites
To run or deploy this app, you'll need:
- A Pipedream account
- A Pipedream project. Accounts connected via MCP will be stored here.
- Pipedream OAuth credentials
- An OpenAI API key
Deploy Your Own
Deploy this app to Vercel with one click.
Running Locally
- Copy the environment file and add your credentials:
cp .env.example .env # Edit with your values
Note that for easier development, chat persistence and application sign-in are disabled by default in the .env.example file:
# In your .env file
DISABLE_AUTH=true
DISABLE_PERSISTENCE=true
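The prerequisites above (Pipedream OAuth credentials, an OpenAI API key) also belong in this file. The variable names below are illustrative only; check .env.example for the exact keys the app expects:

```shell
# In your .env file — variable names are illustrative; see .env.example
PIPEDREAM_CLIENT_ID=your-oauth-client-id
PIPEDREAM_CLIENT_SECRET=your-oauth-client-secret
OPENAI_API_KEY=your-openai-api-key
```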
- Install dependencies and start the app:
We recommend using asdf to manage core dependencies like Node. Install it, then run:
asdf install
Then:
pnpm install
pnpm dev
Your local app should now be running on http://localhost:3000 🎉
Enabling chat persistence
- Run all required local services:
docker compose up -d
- Run migrations:
POSTGRES_URL=postgresql://postgres@localhost:5432/postgres pnpm db:migrate