If you're searching for "n8n MCP", you're trying to solve a specific problem: how do you let AI agents—Claude, ChatGPT, Lovable, or custom agents—trigger and interact with your n8n workflows?
Model Context Protocol (MCP) is the answer. It's a standard that lets AI clients discover and use tools exposed by external servers. n8n recently shipped instance-level MCP support, which means any workflow you build can become a tool that AI agents understand and can call directly.
This tutorial covers:
What n8n's MCP integration actually does
How to expose workflows to AI clients
Connecting Claude Desktop, ChatGPT, and Lovable to your n8n instance
Building AI-powered automation that goes beyond simple triggers
What is MCP and why does it matter for n8n?
MCP (Model Context Protocol) is a communication standard that lets AI assistants talk to external tools. Instead of hardcoding API calls or scraping documentation, AI clients can ask an MCP server: "What tools do you have?" and get back structured schemas they can use.
n8n's instance-level MCP integration puts your instance on the server side of that exchange. Every workflow you mark as "MCP accessible" becomes a tool that any MCP-compatible AI client can discover, understand, and execute.
The practical result: you can tell Claude "create a new task in my project tracker" and Claude calls your n8n workflow directly. No copy-pasting, no manual API configuration, no webhook URLs to remember.
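Under the hood, that exchange is JSON-RPC. A minimal sketch of the two message shapes involved — the method names follow the MCP specification, while the "create_task" tool and its arguments are invented for illustration:

```python
import json

# Discovery: the client asks the MCP server "what tools do you have?"
list_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# Invocation: once a tool is discovered, the client calls it by name.
# (The tool name and arguments here are hypothetical; real ones come
# from whatever workflows your n8n instance exposes.)
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "create_task",
        "arguments": {"title": "Review Q1 report", "due_date": "2025-03-07"},
    },
}

print(json.dumps(list_request))
```

The client never needs to know your workflow's internals — it only sees the tool name, description, and input schema the server advertises.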
n8n's MCP architecture
n8n supports MCP in two directions:
n8n as MCP Server (instance-level)
Your n8n instance exposes workflows as tools. AI clients connect to your instance and can run any workflow you've marked as accessible. This is the new instance-level feature—it takes 30 seconds to enable in settings.
n8n as MCP Client (workflow nodes)
Your n8n workflows can call external MCP servers. The MCP Client node lets you use tools from other MCP servers as steps in your automation. The MCP Client Tool node lets AI agents within n8n use external MCP tools.
Most "n8n MCP" searches are about the first pattern: exposing your workflows to external AI clients.
Enabling instance-level MCP
Open your n8n instance settings
Find the MCP section
Enable "MCP Server"
Copy the connection URL
That's it. Your instance now speaks MCP.
To make specific workflows accessible:
Open the workflow you want to expose
Go to workflow settings
Enable "MCP accessible"
Add a clear name and description (this is what AI clients see)
The name and description matter. When Claude asks your MCP server "what tools do you have?", it receives these descriptions. Write them like you're explaining the workflow to a colleague: "Create a new task in Notion with title, due date, and project."
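Concretely, a well-described workflow surfaces to clients roughly like this. The field names follow MCP's tool-listing format; the workflow itself is hypothetical:

```json
{
  "name": "create_notion_task",
  "description": "Create a new task in Notion with title, due date, and project.",
  "inputSchema": {
    "type": "object",
    "properties": {
      "title": { "type": "string", "description": "Task title" },
      "due_date": { "type": "string", "description": "ISO date, e.g. 2025-03-01" },
      "project": { "type": "string", "description": "Project name" }
    },
    "required": ["title"]
  }
}
```

The description and the per-field descriptions are what the AI reads when deciding whether, and how, to call your workflow — vague descriptions produce wrong or skipped calls.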
Connecting AI clients to n8n
Claude Desktop
Claude Desktop supports MCP natively. Add your n8n instance to Claude's MCP configuration:
Open Claude Desktop settings
Navigate to MCP servers
Add your n8n instance URL
Authenticate if required
Once connected, Claude can see all your MCP-accessible workflows. Ask Claude to "show me my available tools" to verify the connection.
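On current builds, Claude Desktop reads its MCP configuration from a claude_desktop_config.json file, and remote (HTTP) servers such as a hosted n8n instance are commonly bridged with the mcp-remote package. A sketch — the server name and URL are placeholders; use the connection URL you copied from your instance settings:

```json
{
  "mcpServers": {
    "n8n": {
      "command": "npx",
      "args": ["mcp-remote", "https://your-n8n-instance.example.com/mcp"]
    }
  }
}
```

After editing the file, restart Claude Desktop so it re-reads the configuration.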
ChatGPT
ChatGPT's MCP support works through its connector (custom tool) system. Connect your n8n instance as an external tool source, and ChatGPT can discover and call your workflows.
Lovable
Lovable (the AI app builder) supports MCP for connecting to external services. Point it at your n8n instance and your automation becomes available during app development.
Practical examples
Example 1: Task creation workflow
Build a workflow that creates tasks in your project management tool:
Input: task title, description, due date, project
Action: create task via your tool's API (Notion, Linear, ClickUp, etc.)
Output: confirmation with task URL
Mark it MCP accessible. Now any AI client connected to your n8n instance can create tasks in your system by asking naturally: "Create a task to review the Q1 report, due Friday, in the Marketing project."
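The workflow's contract can be sketched as a plain function: validate the inputs, create the task via your tracker's API (stubbed here — the real workflow would call Notion, Linear, or ClickUp), and return a confirmation the AI can relay. All names and the URL scheme are hypothetical:

```python
def create_task(title: str, due_date: str, project: str, description: str = "") -> dict:
    """Sketch of the workflow's contract; the tracker API call is stubbed."""
    if not title:
        raise ValueError("title is required")
    # In the real workflow, this step calls your project tool's API.
    task_id = "TASK-123"  # placeholder for the ID the tracker returns
    return {
        "status": "created",
        "task_id": task_id,
        "url": f"https://tracker.example.com/{project}/{task_id}",
    }

result = create_task("Review the Q1 report", "2025-03-07", "Marketing")
```

The AI client handles the natural-language side (resolving "Friday" to a date, picking the project); the workflow only ever sees structured arguments.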
Example 2: Data lookup workflow
Build a workflow that queries your internal systems:
Input: search query, filters
Action: query your database or API
Output: structured results
Your AI assistant can now answer questions about your data: "What were our top 5 deals last month?" The AI calls your workflow, gets the data, and summarizes it.
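The lookup contract is the same shape: structured input in, structured results out, summarization left to the AI. A sketch with the data source stubbed — a real workflow would query your database or API:

```python
# Stub data standing in for your CRM or database.
DEALS = [
    {"name": "Acme renewal", "amount": 42_000},
    {"name": "Globex expansion", "amount": 35_000},
    {"name": "Initech pilot", "amount": 12_500},
]

def top_deals(limit: int = 5) -> list[dict]:
    """Return the largest deals, biggest first, for the AI client to summarize."""
    return sorted(DEALS, key=lambda d: d["amount"], reverse=True)[:limit]

results = top_deals(2)
```

Returning structured records rather than pre-formatted text is deliberate: it lets the AI answer follow-up questions ("which of those closed in February?") without another workflow run.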
Example 3: Multi-step operations
Build workflows that handle complex operations:
Input: high-level instruction
Actions: multiple steps (fetch data, transform, update systems, notify)
Output: summary of what happened
The AI client treats this as a single tool. "Process this week's expense reports" could trigger a workflow that fetches receipts, categorizes them, updates your accounting system, and sends a summary.
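The key property is that the whole pipeline is a single tool call from the client's perspective. A sketch with every internal step stubbed (the expense-report logic is invented for illustration):

```python
def process_expense_reports(week: str) -> dict:
    """One MCP tool, several internal steps; each step below is a stub."""
    # Step 1: fetch receipts (stub for an API or inbox query).
    receipts = [{"vendor": "CloudCo", "amount": 90.0}, {"vendor": "Cafe", "amount": 12.0}]
    # Step 2: transform/categorize.
    categorized = [
        {**r, "category": "software" if r["vendor"] == "CloudCo" else "meals"}
        for r in receipts
    ]
    # Steps 3-4 (update accounting system, send notification) would go here.
    total = sum(r["amount"] for r in categorized)
    return {"week": week, "processed": len(categorized), "total": total}

summary = process_expense_reports("2025-W10")
```

The AI sees only the final summary, which keeps the conversation short and the workflow in control of the side effects.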
MCP vs webhooks vs API calls
You might wonder: why MCP instead of just using webhooks?
Webhooks require you to configure URLs, handle authentication, and write integration code. The AI client needs hardcoded knowledge of your webhook structure.
Direct API calls require even more setup—authentication, endpoint discovery, request formatting.
MCP handles discovery automatically. The AI client asks "what can you do?" and gets back a schema. No hardcoding, no manual configuration. When you add a new workflow, it's immediately available to all connected AI clients.
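The difference shows up clearly in code: with discovery, the client hardcodes no tool names — it lists what the server offers and calls tools by whatever names come back. A toy illustration using an in-memory registry in place of a real MCP server:

```python
# Toy registry standing in for an MCP server's exposed workflows.
REGISTRY = {
    "create_task": lambda args: {"ok": True, "tool": "create_task"},
    "lookup_deals": lambda args: {"ok": True, "tool": "lookup_deals"},
}

def list_tools() -> list[str]:
    """Discovery: the moral equivalent of MCP's tools/list."""
    return sorted(REGISTRY)

def call_tool(name: str, args: dict) -> dict:
    """Invocation: the moral equivalent of MCP's tools/call."""
    return REGISTRY[name](args)

# Adding a workflow just adds a registry entry --
# no client-side code changes, unlike a new webhook URL.
REGISTRY["send_summary"] = lambda args: {"ok": True, "tool": "send_summary"}
available = list_tools()
```

With webhooks, the "registry" lives in the client's code; with MCP, it lives on the server, so every connected client picks up new workflows automatically.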
n8n MCP nodes for building AI agents
If you're building AI agents inside n8n (not just exposing workflows), n8n has specific nodes:
MCP Client node: Use tools from external MCP servers as workflow steps. Connect to any MCP server and call its tools programmatically.
MCP Client Tool node: Give your n8n AI agents access to external MCP tools. The agent can dynamically discover and use tools from connected MCP servers.
MCP Server Trigger node: Create custom MCP endpoints within workflows for more complex scenarios.
Self-hosting considerations
n8n's MCP features work best when your instance is reliably accessible. If you're self-hosting:
Ensure your instance has a stable URL
Configure authentication appropriately
Consider network access if AI clients are connecting from outside your network
For a detailed self-hosting guide, see: How to Self-Host n8n.
The bigger picture: AI + automation
MCP represents a shift in how AI systems interact with tools. Instead of building one-off integrations for each AI client, you build workflows once and expose them via a standard protocol.
n8n's MCP integration positions your automation as AI-native infrastructure. Any MCP-compatible AI client—current and future—can use your workflows without additional development.
If you're already using n8n for automation, enabling MCP is the fastest way to make that automation AI-accessible. If you're evaluating automation platforms, native MCP support is increasingly important as AI assistants become primary interfaces for work.
For more on building AI-powered automation: