# crawlio-mcp

MCP server for Crawlio, the AI-native macOS website crawler: control the crawler from AI assistants.

Exposes 36 tools and 4 resources over stdio transport, compatible with any MCP client (Claude Code, Cursor, Windsurf, VS Code, Zed, etc.).
```shell
npx crawlio-mcp init
```
Downloads the binary automatically and configures all detected MCP clients.
```shell
brew install crawlio-app/tap/crawlio-mcp
crawlio-mcp init
```
```shell
brew install --cask crawlio-app/tap/crawlio
```
The desktop app bundles the MCP server — the cask symlinks it to your PATH automatically.
Run init to auto-detect and configure your AI clients:
```shell
npx crawlio-mcp init           # Configure all detected clients
npx crawlio-mcp init --full    # Enable all 36 tools (default: 6 code-mode tools)
npx crawlio-mcp init --portal  # HTTP transport + launchd auto-start
npx crawlio-mcp init --dry-run # Preview changes without writing
```
Add to your MCP client config (`.mcp.json`, `mcp.json`, etc.):

```json
{
  "mcpServers": {
    "crawlio": {
      "command": "npx",
      "args": ["-y", "crawlio-mcp"]
    }
  }
}
```
This npm package is a thin wrapper that locates or downloads the native CrawlioMCP binary and forwards stdio through it. Binary resolution order:
1. `$CRAWLIO_MCP_BINARY` environment variable
2. `~/.crawlio/bin/CrawlioMCP` (npm auto-download cache)
3. `/Applications/Crawlio.app/Contents/Helpers/CrawlioMCP`
4. `~/Applications/Crawlio.app/Contents/Helpers/CrawlioMCP`
5. `/opt/homebrew/bin/crawlio-mcp` (Homebrew)
6. `/usr/local/bin/crawlio-mcp`

If no binary is found, it downloads from GitHub Releases on first run.
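The lookup order above can be sketched as a first-match search over candidate paths. This is a hypothetical illustration, not the package's actual source; the real wrapper also handles the GitHub Releases download when every candidate is missing.

```typescript
// Sketch of the binary resolution order (illustrative, not the real implementation).
import { existsSync } from "node:fs";
import { homedir } from "node:os";
import { join } from "node:path";

// Returns the first existing candidate path, or null to signal that the
// wrapper should fall back to downloading from GitHub Releases.
export function resolveBinary(
  env: Record<string, string | undefined> = process.env
): string | null {
  const home = homedir();
  const candidates = [
    env.CRAWLIO_MCP_BINARY,                                  // explicit override
    join(home, ".crawlio/bin/CrawlioMCP"),                   // npm auto-download cache
    "/Applications/Crawlio.app/Contents/Helpers/CrawlioMCP", // desktop app (system-wide)
    join(home, "Applications/Crawlio.app/Contents/Helpers/CrawlioMCP"), // desktop app (per-user)
    "/opt/homebrew/bin/crawlio-mcp",                         // Homebrew (Apple Silicon)
    "/usr/local/bin/crawlio-mcp",                            // Homebrew (Intel)
  ];
  for (const candidate of candidates) {
    if (candidate && existsSync(candidate)) return candidate;
  }
  return null;
}
```

Because the environment variable is checked first, setting `CRAWLIO_MCP_BINARY` always wins over any installed copy.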
36 tools across 9 categories:
| Category | Count | Tools |
|---|---|---|
| Status & Monitoring | 6 | get_crawl_status, get_crawl_logs, get_errors, get_downloads, get_failed_urls, get_site_tree |
| Control | 4 | start_crawl, stop_crawl, pause_crawl, resume_crawl |
| Settings & Config | 3 | get_settings, update_settings, recrawl_urls |
| Projects | 5 | list_projects, save_project, load_project, delete_project, get_project |
| Export & Extraction | 5 | export_site, get_export_status, extract_site, get_extraction_status, trigger_capture |
| Composite Analysis | 2 | analyze_page (capture + poll + status in one call), compare_pages (two-site comparison with summary) |
| Enrichment | 6 | get_enrichment, submit_enrichment_bundle, submit_enrichment_framework, submit_enrichment_network, submit_enrichment_console, submit_enrichment_dom |
| Observations & Findings | 4 | get_observations, create_finding, get_findings, get_crawled_urls |
| OCR | 1 | extract_text_from_image |
Plus 3 HTTP-only endpoints accessible via execute_api: get_health, get_debug_metrics, dump_state.
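Tool invocations over stdio follow the standard MCP JSON-RPC `tools/call` shape. A minimal sketch for `analyze_page` is below; the `arguments` object is illustrative, not the tool's documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "analyze_page",
    "arguments": { "url": "https://example.com" }
  }
}
```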
| URI | Description |
|---|---|
| `crawlio://status` | Engine state and progress |
| `crawlio://settings` | Current crawl settings |
| `crawlio://site-tree` | Downloaded file tree |
| `crawlio://enrichment` | Browser enrichment data |
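Resources are fetched with the standard MCP `resources/read` method. A minimal sketch, assuming the URIs in the table above:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "resources/read",
  "params": { "uri": "crawlio://status" }
}
```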