
# websearch-mcp
A Model Context Protocol (MCP) server implementation that provides real-time web search capabilities over stdio transport. The server integrates with a WebSearch Crawler API to retrieve search results.
WebSearch-MCP is a Model Context Protocol server that provides web search capabilities to AI assistants that support MCP. It allows AI models like Claude to search the web in real-time, retrieving up-to-date information about any topic.
The server integrates with a Crawler API service that handles the actual web searches, and communicates with AI assistants using the standardized Model Context Protocol.
## Installation

To install WebSearch-MCP for Claude Desktop automatically via Smithery:

```bash
npx -y @smithery/cli install @mnhlt/WebSearch-MCP --client claude
```

Or install globally with npm:

```bash
npm install -g websearch-mcp
```

Or use it without installing:

```bash
npx websearch-mcp
```
## Configuration

The WebSearch MCP server can be configured using environment variables:

- `API_URL`: The URL of the WebSearch Crawler API (default: `http://localhost:3001`)
- `MAX_SEARCH_RESULT`: Maximum number of search results to return when not specified in the request (default: `5`)

Examples:
```bash
# Configure API URL
API_URL=https://crawler.example.com npx websearch-mcp

# Configure maximum search results
MAX_SEARCH_RESULT=10 npx websearch-mcp

# Configure both
API_URL=https://crawler.example.com MAX_SEARCH_RESULT=10 npx websearch-mcp
```
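Internally, resolving these settings presumably follows the documented defaults; a minimal sketch of that logic (illustrative only, not the package's actual source):

```typescript
// Resolve server configuration from environment variables, falling back
// to the defaults documented above. Illustrative sketch only; the real
// package may structure this differently.
interface ServerConfig {
  apiUrl: string;
  maxSearchResult: number;
}

function loadConfig(env: Record<string, string | undefined>): ServerConfig {
  const raw = env.MAX_SEARCH_RESULT;
  const parsed = raw === undefined ? NaN : Number(raw);
  return {
    apiUrl: env.API_URL ?? "http://localhost:3001",
    // Fall back to the documented default of 5 on missing or invalid input.
    maxSearchResult: Number.isFinite(parsed) && parsed > 0 ? parsed : 5,
  };
}
```

Invalid values (for example, a non-numeric `MAX_SEARCH_RESULT`) fall back to the defaults rather than crashing the server.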
## Setup

Setting up WebSearch-MCP involves two main parts: configuring the crawler service that performs the actual web searches, and integrating the MCP server with your AI client applications.
The WebSearch MCP server requires a crawler service to perform the actual web searches. You can easily set up the crawler service using Docker Compose.
Create a `docker-compose.yml` file with the following content:

```yaml
version: '3.8'
services:
  crawler:
    image: laituanmanh/websearch-crawler:latest
    container_name: websearch-api
    restart: unless-stopped
    ports:
      - "3001:3001"
    environment:
      - NODE_ENV=production
      - PORT=3001
      - LOG_LEVEL=info
      - FLARESOLVERR_URL=http://flaresolverr:8191/v1
    depends_on:
      - flaresolverr
    volumes:
      - crawler_storage:/app/storage

  flaresolverr:
    image: 21hsmw/flaresolverr:nodriver
    container_name: flaresolverr
    restart: unless-stopped
    environment:
      - LOG_LEVEL=info
      - TZ=UTC

volumes:
  crawler_storage:
```
Workaround for Mac Apple Silicon (pins each image to a compatible platform):
```yaml
version: '3.8'
services:
  crawler:
    image: laituanmanh/websearch-crawler:latest
    container_name: websearch-api
    platform: "linux/amd64"
    restart: unless-stopped
    ports:
      - "3001:3001"
    environment:
      - NODE_ENV=production
      - PORT=3001
      - LOG_LEVEL=info
      - FLARESOLVERR_URL=http://flaresolverr:8191/v1
    depends_on:
      - flaresolverr
    volumes:
      - crawler_storage:/app/storage

  flaresolverr:
    image: 21hsmw/flaresolverr:nodriver
    platform: "linux/arm64"
    container_name: flaresolverr
    restart: unless-stopped
    environment:
      - LOG_LEVEL=info
      - TZ=UTC

volumes:
  crawler_storage:
```
Start the services:

```bash
docker-compose up -d
```

Check that the containers are running:

```bash
docker-compose ps
```

Then verify the health of the crawler API:

```bash
curl http://localhost:3001/health
```
Expected response:

```json
{
  "status": "ok",
  "details": {
    "status": "ok",
    "flaresolverr": true,
    "google": true,
    "message": null
  }
}
```
The crawler API will be available at `http://localhost:3001`.
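A client script can gate its startup on this health payload. A minimal readiness check, using the field names from the example response above (a sketch, not an official client API):

```typescript
// Shape of the /health response shown above.
interface HealthDetails {
  status: string;
  flaresolverr: boolean;
  google: boolean;
  message: string | null;
}

interface HealthResponse {
  status: string;
  details: HealthDetails;
}

// The crawler is usable only when the API itself and both of its
// upstream dependencies (FlareSolverr, Google) report healthy.
function isHealthy(h: HealthResponse): boolean {
  return (
    h.status === "ok" &&
    h.details.status === "ok" &&
    h.details.flaresolverr &&
    h.details.google
  );
}
```

Fetch `http://localhost:3001/health`, parse the JSON body, and retry until `isHealthy` returns `true` before wiring the MCP server to a client.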
You can test the crawler API directly using curl:

```bash
curl -X POST http://localhost:3001/crawl \
  -H "Content-Type: application/json" \
  -d '{
    "query": "typescript best practices",
    "numResults": 2,
    "language": "en",
    "filters": {
      "excludeDomains": ["youtube.com"],
      "resultType": "all"
    }
  }'
```
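The same request body can be built programmatically. A small helper mirroring the fields in the curl example (the field names come from that example; anything else here is illustrative):

```typescript
// Build a request body for the crawler's POST /crawl endpoint, using the
// field names from the curl example above.
interface CrawlFilters {
  excludeDomains?: string[];
  resultType?: "all" | "news" | "blogs";
}

interface CrawlRequest {
  query: string;
  numResults: number;
  language: string;
  filters?: CrawlFilters;
}

function buildCrawlRequest(
  query: string,
  numResults = 5,
  language = "en",
  filters?: CrawlFilters,
): CrawlRequest {
  const req: CrawlRequest = { query, numResults, language };
  // Only attach filters when the caller supplied them.
  if (filters) req.filters = filters;
  return req;
}

// The same request as the curl call above:
const body = buildCrawlRequest("typescript best practices", 2, "en", {
  excludeDomains: ["youtube.com"],
  resultType: "all",
});
```

POST this body as JSON to `<API_URL>/crawl` with a `Content-Type: application/json` header, exactly as the curl example does.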
You can customize the crawler service by modifying the environment variables in the `docker-compose.yml` file:

- `PORT`: The port on which the crawler API listens (default: `3001`)
- `LOG_LEVEL`: Logging level (options: `debug`, `info`, `warn`, `error`)
- `FLARESOLVERR_URL`: URL of the FlareSolverr service (for bypassing Cloudflare protection)

Here's a quick reference for MCP configuration across different clients:
```json
{
  "mcpServers": {
    "websearch": {
      "command": "npx",
      "args": ["websearch-mcp"],
      "environment": {
        "API_URL": "http://localhost:3001",
        "MAX_SEARCH_RESULT": "5"
      }
    }
  }
}
```

Reduce `MAX_SEARCH_RESULT` to save tokens, or increase it for wider information gain.
Workaround for Windows, due to a known issue (wrap `npx` in `cmd /c`):
```json
{
  "mcpServers": {
    "websearch": {
      "command": "cmd",
      "args": ["/c", "npx", "websearch-mcp"],
      "environment": {
        "API_URL": "http://localhost:3001",
        "MAX_SEARCH_RESULT": "1"
      }
    }
  }
}
```
## Usage

This package implements an MCP server over stdio transport that exposes a `web_search` tool with the following parameters:

- `query` (required): The search query to look up
- `numResults` (optional): Number of results to return (default: 5)
- `language` (optional): Language code for search results (e.g., 'en')
- `region` (optional): Region code for search results (e.g., 'us')
- `excludeDomains` (optional): Domains to exclude from results
- `includeDomains` (optional): Only include these domains in results
- `excludeTerms` (optional): Terms to exclude from results
- `resultType` (optional): Type of results to return ('all', 'news', or 'blogs')

Here's an example of a search response:
```json
{
  "query": "machine learning trends",
  "results": [
    {
      "title": "Top Machine Learning Trends in 2025",
      "snippet": "The key machine learning trends for 2025 include multimodal AI, generative models, and quantum machine learning applications in enterprise...",
      "url": "https://example.com/machine-learning-trends-2025",
      "siteName": "AI Research Today",
      "byline": "Dr. Jane Smith"
    },
    {
      "title": "The Evolution of Machine Learning: 2020-2025",
      "snippet": "Over the past five years, machine learning has evolved from primarily supervised learning approaches to more sophisticated self-supervised and reinforcement learning paradigms...",
      "url": "https://example.com/ml-evolution",
      "siteName": "Tech Insights",
      "byline": "John Doe"
    }
  ]
}
```
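For reference, a tool invocation that exercises several of the optional parameters might look like this (the parameter names come from the list above; the argument values are hypothetical):

```json
{
  "name": "web_search",
  "arguments": {
    "query": "machine learning trends",
    "numResults": 2,
    "language": "en",
    "region": "us",
    "excludeTerms": ["course", "tutorial"],
    "resultType": "news"
  }
}
```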
To test the WebSearch MCP server locally, you can use the included test client:

```bash
npm run test-client
```

This will start the MCP server and a simple command-line interface that allows you to enter search queries and see the results.

You can also configure the `API_URL` for the test client:

```bash
API_URL=https://crawler.example.com npm run test-client
```
You can use this package programmatically:

```typescript
import { createMCPClient } from '@modelcontextprotocol/sdk';

// Create an MCP client that spawns the server as a subprocess
const client = createMCPClient({
  transport: { type: 'subprocess', command: 'npx websearch-mcp' }
});

// Execute a web search
const response = await client.request({
  method: 'call_tool',
  params: {
    name: 'web_search',
    arguments: {
      query: 'your search query',
      numResults: 5,
      language: 'en'
    }
  }
});

console.log(response.result);
```
## Troubleshooting

If searches fail, inspect the service logs:

```bash
docker-compose logs crawler
docker-compose logs flaresolverr
```

If the MCP SDK is out of date, update it:

```bash
npm install -g @modelcontextprotocol/sdk@latest
```
## Development

To work on this project:

```bash
npm install     # install dependencies
npm run build   # build the project
npm run dev     # start in development mode
```
The server expects a WebSearch Crawler API as defined in the included `swagger.json` file. Make sure the API is running at the configured `API_URL`.
Project layout:

- `.gitignore`: Specifies files that Git should ignore (node_modules, dist, logs, etc.)
- `.npmignore`: Specifies files that shouldn't be included when publishing to npm
- `package.json`: Project metadata and dependencies
- `src/`: Source TypeScript files
- `dist/`: Compiled JavaScript files (generated when building)

To publish this package to npm:
1. Log in to npm (`npm login`)
2. Bump the version (`npm version patch|minor|major`)
3. Publish (`npm publish`)

The `.npmignore` file ensures that only the necessary files are included in the published package:

- `dist/`
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## License

ISC
The npm package websearch-mcp receives a total of 128 weekly downloads, classifying it as not popular. It has nevertheless demonstrated a healthy version release cadence and project activity: the last version was released less than a year ago, and one open source maintainer collaborates on the project.