mcp-chat

Security Audit
Warning
Health — Passed
  • License — NOASSERTION (no license detected)
  • Description — Repository has a description
  • Active repo — Last pushed today
  • Community trust — 181 GitHub stars
Code — Warning
  • process.env — Environment variable access in app/(auth)/auth.ts
  • process.env — Environment variable access in app/(chat)/api/chat/route.ts
Permissions — Passed
  • Permissions — No dangerous permissions requested
Purpose
This project is an open-source chat application and reference example that demonstrates how to integrate Pipedream's Model Context Protocol (MCP) server into AI agents, providing access to thousands of external APIs and tools.

Security Assessment
Overall Risk: Low. The tool does not execute arbitrary shell commands, contains no hardcoded secrets, and does not request dangerous system permissions. It relies on environment variables to handle API keys and authentication secrets, which is the standard secure practice and is what triggered the automated warnings in the scan. By design, the app makes external network requests to interact with LLM providers (such as OpenAI and Anthropic), authenticate users via Google, and connect to Pipedream's APIs. No malicious or unexpected network behavior was identified.
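The flagged `process.env` accesses reflect exactly this pattern: secrets are read from the environment at runtime rather than committed to source. A minimal TypeScript sketch of fail-fast validation, where the helper `requireEnv` is hypothetical (not the app's actual code) and `OPENAI_API_KEY` mirrors one of the app's documented prerequisites:

```typescript
// Hypothetical helper illustrating the env-var pattern the scan flagged:
// secrets live in the environment, never in source code.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    // Fail fast at startup instead of failing at the first API call.
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Demo only: set a placeholder value so the lookup succeeds.
process.env.OPENAI_API_KEY = "sk-example";
const apiKey = requireEnv("OPENAI_API_KEY");
console.log(apiKey); // prints "sk-example"
```

Validating required variables once at startup turns a misconfigured deployment into an immediate, descriptive error rather than a confusing runtime failure.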

Quality Assessment
The project is actively maintained, with repository updates as recent as today, and enjoys solid community trust with over 180 GitHub stars. The only drawback is that its license is marked "NOASSERTION," so developers will need to verify the repository's license files manually before using it in commercial or enterprise production environments.

Verdict
Safe to use.
Summary

Examples of using Pipedream's MCP server in your app or AI agent.

README.md
MCP Chat by Pipedream

MCP Chat is a free, open-source chat app built with the AI SDK and Pipedream MCP, which provides access to nearly 3,000 APIs and more than 10,000 tools. Use this as a reference for building powerful AI chat applications.

Features · Model Providers · Prerequisites · Deploy Your Own · Running Locally


Check out the app in production at chat.pipedream.com and refer to Pipedream's developer docs for the most up-to-date information.

Features

  • MCP integrations: Connect to thousands of APIs through Pipedream's MCP server with built-in auth
  • Automatic tool discovery: Execute tool calls across different APIs via chat
  • Uses the AI SDK: Unified API for generating text, structured objects, and tool calls with LLMs
  • Flexible LLM and framework support: Works with any LLM provider or framework
  • Data persistence: Uses Neon Serverless Postgres to save chat history and user data, and Auth.js for simple, secure sign-in

Model Providers

The demo app currently uses models from Anthropic, OpenAI, and Google (Gemini), but the AI SDK supports many more.

Prerequisites

To run or deploy this app, you'll need:

  1. A Pipedream account
  2. A Pipedream project. Accounts connected via MCP will be stored here.
  3. Pipedream OAuth credentials
  4. An OpenAI API key

Deploy Your Own

One-click deploy this app to Vercel:

Deploy with Vercel

Running locally

  1. Copy the environment file and add your credentials:
cp .env.example .env  # Edit with your values

Note that for easier development, chat persistence and application sign-in are disabled by default in the .env.example file:

# In your .env file
DISABLE_AUTH=true
DISABLE_PERSISTENCE=true
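Since `process.env` values are always strings, flags like these are typically consumed by comparing against the literal `"true"`. A hypothetical sketch (the real app's flag handling may differ):

```typescript
// Hypothetical flag reader; env values are strings, so "true" is
// compared literally rather than coerced to a boolean.
function flagEnabled(name: string): boolean {
  return process.env[name] === "true";
}

// Mirror the .env.example defaults for this demo.
process.env.DISABLE_AUTH = "true";
process.env.DISABLE_PERSISTENCE = "true";

if (flagEnabled("DISABLE_AUTH")) {
  console.log("Auth disabled: skipping sign-in for local development");
}
```

A common pitfall this avoids: `if (process.env.DISABLE_AUTH)` is truthy even when the variable is set to `"false"`, because any non-empty string is truthy in JavaScript.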
  2. Install dependencies and start the app:

We recommend using asdf to manage core dependencies like Node. Install it, then run:

asdf install

Then:

pnpm install
pnpm dev

Your local app should now be running on http://localhost:3000 🎉

Enabling chat persistence

  1. Run all required local services:
docker compose up -d
  2. Run migrations:
POSTGRES_URL=postgresql://postgres@localhost:5432/postgres pnpm db:migrate
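The connection string above follows the standard Postgres URL format, and Node's built-in WHATWG `URL` class can sanity-check it before you run migrations. A quick sketch:

```typescript
// Parse the migration connection string with Node's built-in URL class
// to confirm user, host, port, and database name before migrating.
const dbUrl = new URL("postgresql://postgres@localhost:5432/postgres");

console.log(dbUrl.username);          // "postgres" (role)
console.log(dbUrl.hostname);          // "localhost"
console.log(dbUrl.port);              // "5432"
console.log(dbUrl.pathname.slice(1)); // "postgres" (database name)
```

This matches the defaults a stock Postgres container from the docker compose step typically exposes; adjust the URL if your compose file maps a different port or database.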
