
@crawlab/mcp
A Model Context Protocol (MCP) server for interacting with Crawlab, a distributed web crawler management platform. This server provides tools to manage spiders, tasks, schedules, and monitor your Crawlab cluster through an AI assistant.
# Install dependencies and build
npm install
npm run build
# Start the MCP server
mcp-server-crawlab <crawlab_url> [api_token]
# Examples:
mcp-server-crawlab http://localhost:8080
mcp-server-crawlab https://crawlab.example.com your-api-token
You can also set the API token via environment variable:
export CRAWLAB_API_TOKEN=your-api-token
mcp-server-crawlab http://localhost:8080
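The startup logic can be sketched as follows. This is an illustrative sketch, not the package's actual source: the function name `resolveConfig` and the precedence (a CLI token argument winning over `CRAWLAB_API_TOKEN`) are assumptions for illustration.

```typescript
// Illustrative sketch (assumed behavior, not the package's actual code):
// resolve the Crawlab base URL and API token from CLI arguments, falling
// back to the CRAWLAB_API_TOKEN environment variable when no token
// argument is given.
interface CrawlabConfig {
  baseUrl: string;
  apiToken?: string;
}

function resolveConfig(
  argv: string[],
  env: Record<string, string | undefined>,
): CrawlabConfig {
  const [baseUrl, tokenArg] = argv;
  if (!baseUrl) {
    throw new Error("Usage: mcp-server-crawlab <crawlab_url> [api_token]");
  }
  // A token passed on the command line takes precedence over the
  // environment variable (assumed ordering).
  return { baseUrl, apiToken: tokenArg ?? env["CRAWLAB_API_TOKEN"] };
}
```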
For development and testing, you can use the MCP Inspector:
npm run inspect
This MCP server is designed to work with AI assistants that support the Model Context Protocol. Configure your AI assistant to connect to this server to enable Crawlab management capabilities.
- crawlab_list_spiders - List all spiders with optional pagination
- crawlab_get_spider - Get detailed information about a specific spider
- crawlab_create_spider - Create a new spider
- crawlab_update_spider - Update spider configuration
- crawlab_delete_spider - Delete a spider
- crawlab_run_spider - Execute a spider
- crawlab_list_spider_files - Browse spider files and directories
- crawlab_get_spider_file_content - Read spider file content
- crawlab_save_spider_file - Save content to spider files
- crawlab_list_tasks - List tasks with filtering options
- crawlab_get_task - Get detailed task information
- crawlab_cancel_task - Cancel a running task
- crawlab_restart_task - Restart a completed or failed task
- crawlab_delete_task - Delete a task
- crawlab_get_task_logs - Retrieve task execution logs
- crawlab_get_task_results - Get data collected by a task
- crawlab_list_schedules - List all schedules
- crawlab_get_schedule - Get schedule details
- crawlab_create_schedule - Create a new cron schedule
- crawlab_update_schedule - Update schedule configuration
- crawlab_delete_schedule - Delete a schedule
- crawlab_enable_schedule - Enable a schedule
- crawlab_disable_schedule - Disable a schedule
- crawlab_list_nodes - List cluster nodes
- crawlab_get_node - Get node details and status
- crawlab_health_check - Check system health
- crawlab_system_status - Get comprehensive system overview

The server includes several helpful prompts for common workflows:
spider-analysis
Analyze spider performance and provide optimization insights.
Parameters:
- spider_id (required) - ID of the spider to analyze
- time_range (optional) - Time range for analysis (e.g., '7d', '30d', '90d')

task-debugging
Debug failed tasks and identify root causes.
Parameters:
- task_id (required) - ID of the failed task

spider-setup
Guide for creating and configuring new spiders.
Parameters:
- spider_name (required) - Name for the new spider
- target_website (optional) - Target website to scrape
- spider_type (optional) - Type of spider (scrapy, selenium, custom)

system-monitoring
Monitor system health and performance.
Parameters:
- focus_area (optional) - Area to focus on (nodes, tasks, storage, overall)

AI: I'll help you create a new spider for scraping news articles.
[Uses crawlab_create_spider with appropriate parameters]
[Uses crawlab_run_spider to test the spider]
[Uses crawlab_get_task_logs to check execution]
User: "My task abc123 failed, can you help me debug it?"
[Uses task-debugging prompt]
[AI retrieves task details, logs, and provides analysis]
User: "How is my Crawlab cluster performing?"
[Uses system-monitoring prompt]
[AI provides comprehensive health overview and recommendations]
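Under the hood, each bracketed tool step in the conversations above is an ordinary MCP `tools/call` JSON-RPC request from the client to this server. A request might look like the following sketch; the `task_id` argument name is an assumption for illustration and may differ from this server's actual tool schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "crawlab_get_task_logs",
    "arguments": {
      "task_id": "abc123"
    }
  }
}
```

MCP clients normally construct these requests automatically from the tool schemas the server advertises, so you rarely write them by hand.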
Ensure your Crawlab instance is accessible and, optionally, configure API authentication.
Add this server to your MCP client configuration:
{
"servers": {
"crawlab": {
"command": "mcp-server-crawlab",
"args": ["http://localhost:8080", "your-api-token"]
}
}
}
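To keep the token out of the `args` array, many MCP clients also accept an environment map for the spawned server process. Assuming your client supports an `env` key (check its documentation; this key name is an assumption here), the configuration could look like:

```json
{
  "servers": {
    "crawlab": {
      "command": "mcp-server-crawlab",
      "args": ["http://localhost:8080"],
      "env": {
        "CRAWLAB_API_TOKEN": "your-api-token"
      }
    }
  }
}
```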
# Build the project
npm run build
# Rebuild on file changes
npm run watch
# Run tests
npm test
# Lint, and auto-fix lint issues
npm run lint
npm run lint:fix
MIT License
For issues and questions: