scrapfly-mcp
Health — Warnings
- No license — Repository has no license file
- Description — Repository has a description
- Active repo — Last push 0 days ago
- Low visibility — Only 7 GitHub stars
Code — Passed
- Code scan — Scanned 6 files during light audit, no dangerous patterns found
Permissions — Passed
- Permissions — No dangerous permissions requested
This server connects AI assistants (via the Model Context Protocol) to the Scrapfly web scraping API, allowing them to fetch real-time data, bypass anti-bot protections, extract structured information, and capture screenshots from live websites.
Security Assessment
Overall risk: Medium.
The tool acts as a bridge between your local environment and Scrapfly's external cloud infrastructure, so it inherently makes network requests to pass instructions and retrieve data. The automated light code scan found no dangerous patterns, hardcoded secrets, or risky local permissions. However, because it gives AI assistants the ability to scrape and interact with arbitrary websites using your Scrapfly API key, it handles sensitive account credentials, and it routes data through a third-party cloud service rather than processing everything locally.
Quality Assessment
The project is actively maintained, with its most recent push occurring today. The codebase appears clean and strictly focused on its intended API integration. However, there are notable visibility and licensing concerns. The repository currently lacks a formal license file, which means the legal terms for using, modifying, or distributing the code are technically undefined. Furthermore, with only 7 stars on GitHub, the tool has very low community visibility, meaning it has not been widely battle-tested or reviewed by the broader open-source community.
Verdict
Use with caution — the code is actively maintained and appears safe from local vulnerabilities, but the lack of a formal license, low community adoption, and heavy reliance on routing data through a third-party service require careful consideration before deploying in sensitive environments.
Official Scrapfly MCP server for Cursor, Claude Desktop, and any MCP-compatible client. Enterprise-grade web scraping, AI extraction, and anti-bot–aware data access as first-class tools.
Scrapfly MCP Server
Give your AI real-time access to any website
🌐 Landing Page • 📖 Documentation • 🎮 Live Demo • 🔑 Get API Key
What is Scrapfly MCP?
The Scrapfly MCP Server connects your AI assistants to live web data through the Model Context Protocol. Transform your AI from being limited by training data to having real-time access to any website.
✨ What Your AI Can Do
| Capability | Description |
|---|---|
| 🌐 Scrape Live Data | Pull current prices, listings, news, or any webpage content in real-time |
| 🛡️ Bypass Anti-Bot Systems | Automatically handle CAPTCHAs, proxies, JavaScript rendering, and rate limits |
| ⚡ Extract Structured Data | Parse complex websites into clean JSON using AI-powered extraction |
| 📸 Capture Screenshots | Take visual snapshots of pages or specific elements for analysis |
🏆 Why Scrapfly?
Built on battle-tested infrastructure used by thousands of developers:
- 99.9% Uptime — Enterprise-grade reliability
- 100+ Countries — Global proxy network with datacenter & residential IPs
- Anti-Bot Bypass — Advanced ASP technology defeats modern protections
- OAuth2 Security — Enterprise authentication for production deployments
📖 Learn more: Why Scrapfly MCP?
🚀 Quick Install
Click one of the buttons below to install the MCP server in your preferred IDE:
📦 Manual Installation
Standard Configuration
Works with most MCP-compatible tools:
```json
{
  "servers": {
    "scrapfly-cloud-mcp": {
      "type": "http",
      "url": "https://mcp.scrapfly.io/mcp"
    }
  }
}
```
Cloud Configuration (NPX)
For tools that require a local process:
```json
{
  "mcpServers": {
    "scrapfly": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.scrapfly.io/mcp"
      ]
    }
  }
}
```
🔧 IDE-Specific Setup
VS Code
One-Click Install
Manual Install
Follow the VS Code MCP guide or use the CLI:
```sh
code --add-mcp '{"name":"scrapfly-cloud-mcp","type":"http","url":"https://mcp.scrapfly.io/mcp"}'
```
After installation, Scrapfly tools will be available in GitHub Copilot Chat.
📖 Full guide: VS Code Integration
VS Code Insiders
One-Click Install
Manual Install
```sh
code-insiders --add-mcp '{"name":"scrapfly-cloud-mcp","type":"http","url":"https://mcp.scrapfly.io/mcp"}'
```
📖 Full guide: VS Code Integration
Visual Studio
One-Click Install
Manual Install
- Open Visual Studio
- Navigate to GitHub Copilot Chat window
- Click the tools icon (🛠️) in the chat toolbar
- Click + Add Server to open the configuration dialog
- Configure:
  - Server ID: scrapfly-cloud-mcp
  - Type: http/sse
  - URL: https://mcp.scrapfly.io/mcp
- Click Save
📖 Full guide: Visual Studio MCP documentation
Cursor
One-Click Install
Manual Install
- Go to Cursor Settings → MCP → Add new MCP Server
- Use the standard configuration above
- Click Edit to verify or add arguments
📖 Full guide: Cursor Integration
Claude Code
Use the Claude Code CLI:
```sh
claude mcp add scrapfly-cloud-mcp --url https://mcp.scrapfly.io/mcp
```
📖 Full guide: Claude Code Integration
Claude Desktop
Add to your Claude Desktop configuration file:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
```json
{
  "mcpServers": {
    "scrapfly": {
      "command": "npx",
      "args": ["mcp-remote", "https://mcp.scrapfly.io/mcp"]
    }
  }
}
```
📖 Full guide: Claude Desktop Integration
Cline
Add to your Cline MCP settings:
```json
{
  "scrapfly-cloud-mcp": {
    "type": "http",
    "url": "https://mcp.scrapfly.io/mcp"
  }
}
```
📖 Full guide: Cline Integration
Windsurf
Follow the Windsurf MCP documentation using the standard configuration.
📖 Full guide: Windsurf Integration
Zed
Add to your Zed settings:
```json
{
  "context_servers": {
    "scrapfly-cloud-mcp": {
      "type": "http",
      "url": "https://mcp.scrapfly.io/mcp"
    }
  }
}
```
📖 Full guide: Zed Integration
OpenAI Codex
Create or edit ~/.codex/config.toml:
```toml
[mcp_servers.scrapfly-cloud-mcp]
url = "https://mcp.scrapfly.io/mcp"
```
📖 More info: Codex MCP documentation
Gemini CLI
Follow the Gemini CLI MCP guide using the standard configuration.
OpenCode
Add to ~/.config/opencode/opencode.json:
```json
{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "scrapfly-cloud-mcp": {
      "type": "http",
      "url": "https://mcp.scrapfly.io/mcp",
      "enabled": true
    }
  }
}
```
📖 More info: OpenCode MCP documentation
🛠️ Available Tools
The Scrapfly MCP Server provides 5 powerful tools covering 99% of web scraping use cases:
| Tool | Description | Use Case |
|---|---|---|
| `scraping_instruction_enhanced` | Get best practices & POW token | Always call first! |
| `web_get_page` | Quick page fetch with smart defaults | Simple scraping tasks |
| `web_scrape` | Full control with browser automation | Complex scraping, login flows |
| `screenshot` | Capture page screenshots | Visual analysis, monitoring |
| `info_account` | Check usage & quota | Account management |
📖 Full reference: Tools & API Specification
Example: Scrape a Page
User: "What are the top posts on Hacker News right now?"
AI: Uses web_get_page to fetch https://news.ycombinator.com and returns current top stories
Example: Extract Structured Data
User: "Get all product prices from this Amazon page"
AI: Uses web_scrape with extraction_model="product_listing" to return structured JSON
📖 More examples: Real-World Examples
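Under the Model Context Protocol, tool invocations like the ones above travel as JSON-RPC 2.0 `tools/call` requests. As an illustrative sketch only (a real client would use an MCP SDK, and the `url` argument name is an assumption — check the Tools & API Specification for each tool's actual input schema):

```python
import json

def build_tool_call(tool: str, arguments: dict, request_id: int = 2) -> dict:
    """Wrap an MCP tool invocation in a JSON-RPC 2.0 `tools/call` request."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }

# Hypothetical argument shape for illustration; not the tool's verified schema.
request = build_tool_call("web_get_page", {"url": "https://news.ycombinator.com"})
print(json.dumps(request, indent=2))
```

Your MCP client builds and sends these messages for you; the sketch is only meant to show what "the AI uses `web_get_page`" means on the wire.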
🔐 Authentication
Scrapfly MCP supports multiple authentication methods:
| Method | Best For | Documentation |
|---|---|---|
| OAuth2 | Production, multi-user apps | OAuth2 Setup |
| API Key | Personal use, development | API Key Setup |
| Header Auth | Custom integrations | Header Auth |
🔑 Get your API key: Scrapfly Dashboard
📊 Configuration Reference
| Setting | Value |
|---|---|
| Server Name | scrapfly-cloud-mcp |
| Type | Remote HTTP Server |
| URL | https://mcp.scrapfly.io/mcp |
| Protocol | MCP over HTTP/SSE |
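"MCP over HTTP/SSE" in the table above means the transport carries JSON-RPC 2.0 messages, starting with an `initialize` handshake. As a minimal sketch of that handshake payload (the `clientInfo` values are placeholders, and the protocol version shown is one published MCP spec revision — servers may negotiate another):

```python
import json

MCP_URL = "https://mcp.scrapfly.io/mcp"

def build_initialize_request(request_id: int = 1) -> dict:
    """Build a JSON-RPC 2.0 `initialize` request per the MCP specification."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",  # assumed spec revision
            "capabilities": {},               # this toy client advertises none
            "clientInfo": {"name": "example-client", "version": "0.0.1"},
        },
    }

if __name__ == "__main__":
    # A real client POSTs this to MCP_URL with
    # `Accept: application/json, text/event-stream` and reuses the session it gets back.
    print(json.dumps(build_initialize_request(), indent=2))
```

In practice the IDE integrations listed earlier perform this handshake automatically; the sketch just makes the "Remote HTTP Server" row concrete.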
🖥️ Self-Hosted / Local Deployment
You can run the Scrapfly MCP server locally or self-host it.
CLI Arguments
| Flag | Description |
|---|---|
| `-http <address>` | Start HTTP server at the specified address (e.g., `:8080`). Takes precedence over the `PORT` env var. |
| `-apikey <key>` | Use this API key instead of the `SCRAPFLY_API_KEY` environment variable. |
Environment Variables
| Variable | Description |
|---|---|
| `PORT` | HTTP port to listen on. Used if the `-http` flag is not set. |
| `SCRAPFLY_API_KEY` | Default Scrapfly API key. Can also be passed via the query parameter `?apiKey=xxx` at runtime. |
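Since the server accepts the API key as an `?apiKey=xxx` query parameter at runtime, a client can attach it when constructing the endpoint URL. A small stdlib sketch (the `/mcp` path on a self-hosted instance is an assumption — adjust to wherever your deployment serves MCP):

```python
from urllib.parse import urlencode, urlparse, urlunparse

def with_api_key(base_url: str, api_key: str) -> str:
    """Append the `apiKey` query parameter the server accepts at runtime."""
    parts = urlparse(base_url)
    return urlunparse(parts._replace(query=urlencode({"apiKey": api_key})))

# Hypothetical self-hosted endpoint; the key value is a placeholder.
print(with_api_key("http://localhost:8080/mcp", "scp-live-xxxx"))
# → http://localhost:8080/mcp?apiKey=scp-live-xxxx
```

Prefer the `SCRAPFLY_API_KEY` environment variable where possible, since query parameters can end up in logs.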
Examples
```sh
# Start HTTP server on port 8080
./scrapfly-mcp -http :8080

# Start HTTP server using PORT env var
PORT=8080 ./scrapfly-mcp

# Start with API key
./scrapfly-mcp -http :8080 -apikey scp-live-xxxx

# Start in stdio mode (for local MCP clients)
./scrapfly-mcp
```
Docker
```sh
# Build
docker build -t scrapfly-mcp .

# Run (Smithery compatible - uses PORT env var)
docker run -p 8080:8080 scrapfly-mcp

# Run with custom port
docker run -e PORT=9000 -p 9000:9000 scrapfly-mcp
```
🤝 Framework Integrations
Scrapfly MCP also works with AI frameworks and automation tools:
| Framework | Documentation |
|---|---|
| LangChain | LangChain Integration |
| LlamaIndex | LlamaIndex Integration |
| CrewAI | CrewAI Integration |
| OpenAI | OpenAI Integration |
| n8n | n8n Integration |
| Make | Make Integration |
| Zapier | Zapier Integration |
📖 All integrations: Integration Index
📚 Resources
- 🌐 MCP Cloud Landing Page — Product overview & features
- 🎮 Live n8n Demo — Try it in your browser
- 📖 Full Documentation
- 🛠️ Tools Reference
- 💡 Examples & Use Cases
- ❓ FAQ
- 🔐 Authentication Guide
💬 Need Help?
Made with ❤️ by Scrapfly
The Web Scraping API for Developers