scrapfly-mcp

mcp
Security Audit
Warn
Health Warn
  • No license — Repository has no license file
  • Description — Repository has a description
  • Active repo — Last push 0 days ago
  • Low visibility — Only 7 GitHub stars
Code Pass
  • Code scan — Scanned 6 files during light audit, no dangerous patterns found
Permissions Pass
  • Permissions — No dangerous permissions requested
Purpose
This server connects AI assistants (via the Model Context Protocol) to the Scrapfly web scraping API, allowing them to fetch real-time data, bypass anti-bot protections, extract structured information, and capture screenshots from live websites.

Security Assessment
Overall risk: Medium.
The tool acts as a bridge between your local environment and Scrapfly's external cloud infrastructure, so it inherently makes network requests to pass instructions and retrieve data. The automated light code scan found no dangerous patterns, hardcoded secrets, or risky local permission requests. However, because it gives AI assistants the ability to scrape and interact with arbitrary websites using your Scrapfly API key, it handles sensitive account credentials. It also routes data through a third-party cloud service rather than processing everything locally.

Quality Assessment
The project is actively maintained, with its most recent push occurring today. The codebase appears clean and strictly focused on its intended API integration. However, there are notable visibility and licensing concerns. The repository currently lacks a formal license file, which means the legal terms for using, modifying, or distributing the code are technically undefined. Furthermore, with only 7 stars on GitHub, the tool has very low community visibility, meaning it has not been widely battle-tested or reviewed by the broader open-source community.

Verdict
Use with caution — the code is actively maintained and appears safe from local vulnerabilities, but the lack of a formal license, low community adoption, and heavy reliance on routing data through a third-party service require careful consideration before deploying in sensitive environments.
SUMMARY

Official Scrapfly MCP server for Cursor, Claude Desktop, and any MCP-compatible client. Enterprise-grade web scraping, AI extraction, and anti-bot–aware data access as first-class tools.

README.md

Scrapfly MCP Server


Give your AI real-time access to any website

🌐 Landing Page · 📖 Documentation · 🎮 Live Demo · 🔑 Get API Key


What is Scrapfly MCP?

The Scrapfly MCP Server connects your AI assistants to live web data through the Model Context Protocol. Transform your AI from being limited by training data to having real-time access to any website.
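Under the hood, MCP clients speak JSON-RPC 2.0 to the server. As a rough sketch (the `protocolVersion` and `clientInfo` values below are illustrative placeholders, not Scrapfly-specific requirements), the first message a client sends to an HTTP endpoint like `https://mcp.scrapfly.io/mcp` is an `initialize` request:

```python
import json

# Minimal sketch of the JSON-RPC "initialize" request an MCP client
# sends on first connect. Values in clientInfo are placeholders.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# Serialize the payload as it would appear on the wire.
body = json.dumps(initialize_request)
```

In practice your IDE or MCP client library handles this handshake for you; the snippet only shows what travels over the HTTP connection configured below.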

✨ What Your AI Can Do

| Capability | Description |
|---|---|
| 🌐 Scrape Live Data | Pull current prices, listings, news, or any webpage content in real time |
| 🛡️ Bypass Anti-Bot Systems | Automatically handle CAPTCHAs, proxies, JavaScript rendering, and rate limits |
| Extract Structured Data | Parse complex websites into clean JSON using AI-powered extraction |
| 📸 Capture Screenshots | Take visual snapshots of pages or specific elements for analysis |

🏆 Why Scrapfly?

Built on battle-tested infrastructure used by thousands of developers:

📖 Learn more: Why Scrapfly MCP?


🚀 Quick Install

Click one of the buttons below to install the MCP server in your preferred IDE:

Install in VS Code
Install in VS Code Insiders
Install in Visual Studio
Install in Cursor


📦 Manual Installation

Standard Configuration

Works with most MCP-compatible tools:

{
  "servers": {
    "scrapfly-cloud-mcp": {
      "type": "http",
      "url": "https://mcp.scrapfly.io/mcp"
    }
  }
}

Cloud Configuration (NPX)

For tools that require a local process:

{
  "mcpServers": {
    "scrapfly": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.scrapfly.io/mcp"
      ]
    }
  }
}

🔧 IDE-Specific Setup

VS Code

One-Click Install

Install in VS Code

Manual Install

Follow the VS Code MCP guide or use the CLI:

code --add-mcp '{"name":"scrapfly-cloud-mcp","type":"http","url":"https://mcp.scrapfly.io/mcp"}'

After installation, Scrapfly tools will be available in GitHub Copilot Chat.

📖 Full guide: VS Code Integration

VS Code Insiders

One-Click Install

Install in VS Code Insiders

Manual Install

code-insiders --add-mcp '{"name":"scrapfly-cloud-mcp","type":"http","url":"https://mcp.scrapfly.io/mcp"}'

📖 Full guide: VS Code Integration

Visual Studio

One-Click Install

Install in Visual Studio

Manual Install

  1. Open Visual Studio
  2. Navigate to GitHub Copilot Chat window
  3. Click the tools icon (🛠️) in the chat toolbar
  4. Click + Add Server to open the configuration dialog
  5. Configure:
    • Server ID: scrapfly-cloud-mcp
    • Type: http/sse
    • URL: https://mcp.scrapfly.io/mcp
  6. Click Save

📖 Full guide: Visual Studio MCP documentation

Cursor

One-Click Install

Install in Cursor

Manual Install

  1. Go to Cursor Settings → MCP → Add new MCP Server
  2. Use the standard configuration above
  3. Click Edit to verify or add arguments

📖 Full guide: Cursor Integration

Claude Code

Use the Claude Code CLI:

claude mcp add scrapfly-cloud-mcp --url https://mcp.scrapfly.io/mcp

📖 Full guide: Claude Code Integration

Claude Desktop

Add to your Claude Desktop configuration file:

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json

{
  "mcpServers": {
    "scrapfly": {
      "command": "npx",
      "args": ["mcp-remote", "https://mcp.scrapfly.io/mcp"]
    }
  }
}

📖 Full guide: Claude Desktop Integration

Cline

Add to your Cline MCP settings:

{
  "scrapfly-cloud-mcp": {
    "type": "http",
    "url": "https://mcp.scrapfly.io/mcp"
  }
}

📖 Full guide: Cline Integration

Windsurf

Follow the Windsurf MCP documentation using the standard configuration.

📖 Full guide: Windsurf Integration

Zed

Add to your Zed settings:

{
  "context_servers": {
    "scrapfly-cloud-mcp": {
      "type": "http",
      "url": "https://mcp.scrapfly.io/mcp"
    }
  }
}

📖 Full guide: Zed Integration

OpenAI Codex

Create or edit ~/.codex/config.toml:

[mcp_servers.scrapfly-cloud-mcp]
url = "https://mcp.scrapfly.io/mcp"

📖 More info: Codex MCP documentation

Gemini CLI

Follow the Gemini CLI MCP guide using the standard configuration.

OpenCode

Add to ~/.config/opencode/opencode.json:

{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "scrapfly-cloud-mcp": {
      "type": "http",
      "url": "https://mcp.scrapfly.io/mcp",
      "enabled": true
    }
  }
}

📖 More info: OpenCode MCP documentation


🛠️ Available Tools

The Scrapfly MCP Server provides 5 powerful tools covering 99% of web scraping use cases:

| Tool | Description | Use Case |
|---|---|---|
| scraping_instruction_enhanced | Get best practices & POW token | Always call first! |
| web_get_page | Quick page fetch with smart defaults | Simple scraping tasks |
| web_scrape | Full control with browser automation | Complex scraping, login flows |
| screenshot | Capture page screenshots | Visual analysis, monitoring |
| info_account | Check usage & quota | Account management |

📖 Full reference: Tools & API Specification

Example: Scrape a Page

User: "What are the top posts on Hacker News right now?"

AI: Uses web_get_page to fetch https://news.ycombinator.com and returns current top stories

Example: Extract Structured Data

User: "Get all product prices from this Amazon page"

AI: Uses web_scrape with extraction_model="product_listing" to return structured JSON

📖 More examples: Real-World Examples
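At the protocol level, the two examples above correspond to JSON-RPC `tools/call` requests. The tool names come from the table in this README, but the exact argument names (`url`, `extraction_model`) are assumptions based on the README's prose, not a verified API schema:

```python
import json

# Sketch of a tools/call payload for the Hacker News example.
# Argument names are assumed, not taken from an official schema.
fetch_hn = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "web_get_page",
        "arguments": {"url": "https://news.ycombinator.com"},
    },
}

# Sketch of a structured-extraction call using web_scrape.
extract_products = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
        "name": "web_scrape",
        "arguments": {
            # Hypothetical target page for illustration only.
            "url": "https://example.com/products",
            "extraction_model": "product_listing",
        },
    },
}

wire_payload = json.dumps(fetch_hn)
```

Your AI assistant constructs these calls itself; the sketch is only meant to show how a natural-language request maps onto a specific tool invocation.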


🔐 Authentication

Scrapfly MCP supports multiple authentication methods:

| Method | Best For | Documentation |
|---|---|---|
| OAuth2 | Production, multi-user apps | OAuth2 Setup |
| API Key | Personal use, development | API Key Setup |
| Header Auth | Custom integrations | Header Auth |

🔑 Get your API key: Scrapfly Dashboard


📊 Configuration Reference

| Setting | Value |
|---|---|
| Server Name | scrapfly-cloud-mcp |
| Type | Remote HTTP Server |
| URL | https://mcp.scrapfly.io/mcp |
| Protocol | MCP over HTTP/SSE |

🖥️ Self-Hosted / Local Deployment

You can run the Scrapfly MCP server locally or self-host it.

CLI Arguments

| Flag | Description |
|---|---|
| `-http <address>` | Start an HTTP server at the specified address (e.g., `:8080`). Takes precedence over the `PORT` env var. |
| `-apikey <key>` | Use this API key instead of the `SCRAPFLY_API_KEY` environment variable. |

Environment Variables

| Variable | Description |
|---|---|
| `PORT` | HTTP port to listen on. Used if the `-http` flag is not set. |
| `SCRAPFLY_API_KEY` | Default Scrapfly API key. Can also be passed at runtime via the query parameter `?apiKey=xxx`. |
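As one possible way to use the query-parameter option, a client configuration could point at a self-hosted instance and pass the key in the URL. The `localhost` address, `/mcp` path, and key value below are all illustrative placeholders:

```json
{
  "servers": {
    "scrapfly-cloud-mcp": {
      "type": "http",
      "url": "http://localhost:8080/mcp?apiKey=scp-live-xxxx"
    }
  }
}
```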

Examples

# Start HTTP server on port 8080
./scrapfly-mcp -http :8080

# Start HTTP server using PORT env var
PORT=8080 ./scrapfly-mcp

# Start with API key
./scrapfly-mcp -http :8080 -apikey scp-live-xxxx

# Start in stdio mode (for local MCP clients)
./scrapfly-mcp

Docker

# Build
docker build -t scrapfly-mcp .

# Run (Smithery compatible - uses PORT env var)
docker run -p 8080:8080 scrapfly-mcp

# Run with custom port
docker run -e PORT=9000 -p 9000:9000 scrapfly-mcp

🤝 Framework Integrations

Scrapfly MCP also works with AI frameworks and automation tools:

| Framework | Documentation |
|---|---|
| LangChain | LangChain Integration |
| LlamaIndex | LlamaIndex Integration |
| CrewAI | CrewAI Integration |
| OpenAI | OpenAI Integration |
| n8n | n8n Integration |
| Make | Make Integration |
| Zapier | Zapier Integration |

📖 All integrations: Integration Index


📚 Resources


💬 Need Help?


Scrapfly
Made with ❤️ by Scrapfly
The Web Scraping API for Developers
