
n8n-nodes-mcp
Important Note: The Server-Sent Events (SSE) transport is deprecated and replaced by the new HTTP Streamable transport. SSE remains available for legacy compatibility, but HTTP Streamable is now the recommended method for all new implementations.
This is an n8n community node that lets you interact with Model Context Protocol (MCP) servers in your n8n workflows.
MCP is a protocol that enables AI models to interact with external tools and data sources in a standardized way. This node allows you to connect to MCP servers, access resources, execute tools, and use prompts.
n8n is a fair-code licensed workflow automation platform.
Contents: Installation, Credentials, Environment Variables, Operations, Using as a Tool, Compatibility, Resources
Official Quickstart Video:
Shoutout to all the creators of the following n8n community videos that are great resources for learning how to use this node:
If you have a great video that you'd like to share, please let me know and I'll add it to the list!
Check out my YouTube Series MCP Explained for more information about the Model Context Protocol.
Follow the installation guide in the n8n community nodes documentation.
Also pay attention to Environment Variables for using tools in AI Agents. It is mandatory to set the N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE environment variable to true if you want to use the MCP Client node as a tool in AI Agents.
The MCP Client node supports three types of credentials to connect to an MCP server: command-line (STDIO), Server-Sent Events (SSE), and HTTP Streamable.
HTTP Streamable is the recommended and modern method for all new integrations, providing better efficiency and flexibility compared to SSE.
This example shows how to connect to a locally running MCP server using HTTP Streamable:
Start a local MCP server that supports HTTP Streamable:
npx @modelcontextprotocol/server-example-streamable
Configure MCP Client credentials:
- Connection type: HTTP Streamable
- URL: http://localhost:3001/stream
Create a workflow using the MCP Client node:
- Connection type: HTTP Streamable
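Before wiring the server into a workflow, you can sanity-check the endpoint by sending the MCP initialize request by hand. This is only a sketch: it assumes the server at http://localhost:3001/stream implements the standard Streamable HTTP transport (JSON-RPC over POST), and the headers and protocol version string come from the MCP specification, not from this node.

curl -X POST http://localhost:3001/stream \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"curl-check","version":"0.0.0"}}}'

A JSON-RPC result containing the server's name and capabilities means the URL is reachable and can be used in the credentials above.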
Deprecated: SSE is deprecated and will not receive further updates, but remains available for legacy compatibility. For new projects, use HTTP Streamable.
This example shows how to connect to a locally running MCP server using Server-Sent Events (SSE):
Start a local MCP server that supports SSE:
npx @modelcontextprotocol/server-example-sse
Configure MCP Client credentials:
- Connection type: Server-Sent Events (SSE)
- SSE URL: http://localhost:3001/sse
Create a workflow using the MCP Client node:
- Connection type: Server-Sent Events (SSE)
Note: For new projects, HTTP Streamable is strongly recommended.
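As with HTTP Streamable, you can confirm an SSE server is reachable before configuring credentials. A minimal sketch, assuming the server follows the standard MCP SSE transport, where the first event it sends announces the endpoint to POST messages to:

curl -N -H 'Accept: text/event-stream' http://localhost:3001/sse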
When using the command-line based transport, the MCP Client node supports passing environment variables to MCP servers in two ways:
You can add environment variables directly in the credentials configuration:
This method is useful for individual setups and testing. The values are stored securely as credentials in n8n.
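The expected format is plain NAME=value pairs (the field accepts space, comma, or newline separated entries, as noted in the Brave Search example below). A minimal sketch; the variable names are illustrative and depend on what the particular MCP server expects:

BRAVE_API_KEY=your-api-key
WEATHER_API_KEY=your-weather-api-key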
For Docker deployments, you can pass environment variables directly to your MCP servers by prefixing them with MCP_:
version: '3'
services:
  n8n:
    image: n8nio/n8n
    environment:
      - MCP_BRAVE_API_KEY=your-api-key-here
      - MCP_OPENAI_API_KEY=your-openai-key-here
      - MCP_CUSTOM_SETTING=some-value
      # other configuration...
These environment variables will be automatically passed to your MCP servers when they are executed.
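If you start n8n with docker run rather than Compose, the same variables can be passed with -e flags. A sketch only; the image, port, and volume mirror the Compose examples in this README:

docker run -it --rm \
  -p 5678:5678 \
  -v ~/.n8n:/home/node/.n8n \
  -e MCP_BRAVE_API_KEY=your-api-key-here \
  -e MCP_OPENAI_API_KEY=your-openai-key-here \
  -e MCP_CUSTOM_SETTING=some-value \
  -e N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true \
  n8nio/n8n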
This example shows how to set up and use the Brave Search MCP server:
Install the Brave Search MCP server:
npm install -g @modelcontextprotocol/server-brave-search
Configure MCP Client credentials:
- Command: npx
- Arguments: -y @modelcontextprotocol/server-brave-search
- Environment variables: BRAVE_API_KEY=your-api-key (variables can be space, comma, or newline separated)
Create a workflow that uses the MCP Client node:
- Operation: Execute Tool
- Tool parameters: {"query": "latest AI news"}
The node will execute the search and return the results in the output.
This example demonstrates how to set up multiple MCP servers in a production environment and use them with an AI agent:
version: '3'
services:
  n8n:
    image: n8nio/n8n
    environment:
      # MCP server environment variables
      - MCP_BRAVE_API_KEY=your-brave-api-key
      - MCP_OPENAI_API_KEY=your-openai-key
      - MCP_SERPER_API_KEY=your-serper-key
      - MCP_WEATHER_API_KEY=your-weather-api-key
      # Enable community nodes as tools
      - N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
    ports:
      - "5678:5678"
    volumes:
      - ~/.n8n:/home/node/.n8n
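To keep API keys out of the Compose file itself, the same variables can be moved into a separate file and loaded with Compose's standard env_file option. A sketch, assuming a file named mcp.env sits next to docker-compose.yml:

# mcp.env (one variable per line)
MCP_BRAVE_API_KEY=your-brave-api-key
MCP_OPENAI_API_KEY=your-openai-key
MCP_SERPER_API_KEY=your-serper-key
MCP_WEATHER_API_KEY=your-weather-api-key

# docker-compose.yml (excerpt)
services:
  n8n:
    image: n8nio/n8n
    env_file:
      - mcp.env
    environment:
      - N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true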
Create multiple MCP Client credentials in n8n:
Brave Search Credentials:
- Command: npx
- Arguments: -y @modelcontextprotocol/server-brave-search
OpenAI Tools Credentials:
- Command: npx
- Arguments: -y @modelcontextprotocol/server-openai
Web Search Credentials:
- Command: npx
- Arguments: -y @modelcontextprotocol/server-serper
Weather API Credentials:
- Command: npx
- Arguments: -y @modelcontextprotocol/server-weather
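Each of these credentials uses the command-line (STDIO) transport: when a workflow runs, the node starts the configured command and the server picks up its API key from the environment. As a rough illustration only (not literally how n8n invokes it), the Brave Search credential amounts to:

BRAVE_API_KEY=your-brave-api-key npx -y @modelcontextprotocol/server-brave-search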
Create an AI Agent workflow:
Example AI Agent prompt:
I need you to help me plan a trip. First, search for popular destinations in {destination_country}.
Then, check the current weather in the top 3 cities.
Finally, find some recent news about travel restrictions for these places.
With this setup, the AI agent can use multiple MCP tools across different servers, all using environment variables configured in your Docker deployment.
The MCP Client node supports the following operations:
The List Tools operation returns all available tools from the MCP server, including their names, descriptions, and parameter schemas.
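The exact node output may differ by version, but MCP tool listings themselves have the shape sketched below; the tool name, description, and schema are illustrative, borrowed from the Brave Search example above rather than taken from real output:

[
  {
    "name": "brave_web_search",
    "description": "Search the web with the Brave Search API",
    "inputSchema": {
      "type": "object",
      "properties": {
        "query": { "type": "string", "description": "The search query" }
      },
      "required": ["query"]
    }
  }
]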
The Execute Tool operation allows you to execute a specific tool with parameters. Make sure to select the tool you want to execute from the dropdown menu.
This node can be used as a tool in n8n AI Agents. To enable community nodes as tools, you need to set the N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE environment variable to true.
If you're using a bash/zsh shell:
export N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
n8n start
If you're using Docker: Add to your docker-compose.yml file:
environment:
  - N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
If you're using the desktop app:
Create a .env file in the n8n directory:
N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
If you want to set it permanently on Mac/Linux:
Add to your ~/.zshrc or ~/.bash_profile:
export N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
Example of AI Agent workflow results:
After setting this environment variable and restarting n8n, your MCP Client node will be available as a tool in AI Agent nodes.