Creating and Using MCP inside Langflow (No Code) - Part II
- Harsh Dhariwal
What is MCP?
MCP is an open protocol that standardizes how applications provide context to LLMs. Think of MCP as a USB-C port for AI applications: the same way USB-C provides a standardized way to connect your devices to various peripherals and accessories, MCP provides a standardized way to connect AI models to different data sources and tools.
Why do we need MCP?
When giving LLMs access to external data, we often face these problems:
LLMs are trained on data with a cutoff, so the knowledge inside them may be a year or more old
Models have an input context limit. For example, for the query "How's the weather in Delhi?", we cannot provide the weather details of the entire country
The more context you provide, the more each request costs, since models are priced per token of usage
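The per-token pricing point can be made concrete with a little arithmetic. The sketch below uses a hypothetical price, not any real provider's rate card, and the token counts are made up for illustration:

```python
# Rough illustration of per-token pricing: the more context you send,
# the more each request costs. The price below is hypothetical.
PRICE_PER_1K_INPUT_TOKENS = 0.01  # assumed price in USD per 1,000 input tokens

def request_cost(context_tokens: int, query_tokens: int) -> float:
    """Cost of a single request, given the total tokens sent as input."""
    total = context_tokens + query_tokens
    return total / 1000 * PRICE_PER_1K_INPUT_TOKENS

# Sending weather data for one city vs. the whole country:
small = request_cost(context_tokens=500, query_tokens=20)     # one city
large = request_cost(context_tokens=50_000, query_tokens=20)  # entire country

print(f"${small:.4f} vs ${large:.4f}")  # the larger context costs ~100x more
```

This is exactly why fetching only the relevant context on demand, rather than stuffing everything into the prompt, matters.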
With increasing expectations from LLMs, it becomes important that our LLM has access to the latest data and not year-old training data. Although LLMs do not have access to the latest and most relevant information themselves, they have the intelligence to use available tools, make queries, or call APIs to fetch that missing information. MCP helps you build agents and complex workflows on top of LLMs: LLMs frequently need to integrate with data and tools, and MCP provides a standard way to do so.
MCP offers:
A growing list of pre-built integrations that your LLM can directly plug into
The flexibility to switch between LLM providers and vendors
Best practices for securing your data within your infrastructure
Architecture of MCP

MCP Hosts: user interfaces or AI tools that have access to MCPs
MCP Clients: protocol clients that maintain 1:1 connections with servers
MCP Servers: programs that each expose specific capabilities through the standardized Model Context Protocol
Local Data Sources: local files, databases, and services that your MCP server can access
Remote Services: external resources your MCP server can access, such as APIs, web searches, and other integrations
Using MCP with a no-code solution
LangFlow brings together a thoughtfully organized and expansive suite of pre-built components that streamline the creation of AI workflows. These components are categorized for clarity and ease of access, covering nearly every aspect of building, testing, and deploying LLM-powered solutions. (https://www.letsai.tech/post/extending-langflow-components-part-i)
Langflow offers an MCP component that can operate with both transport protocols:
Stdio - standard input/output mode in your terminal; the MCP server must run on the same machine
SSE - Server-Sent Events (API mode), in which the MCP server can be hosted on a different machine
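The two transports correspond to two kinds of entries in an MCP host's configuration file: a command to launch locally (stdio) versus a URL to connect to (SSE). A sketch of what that looks like, where the server names and the command line are placeholders, not real packages:

```json
{
  "mcpServers": {
    "my-local-server": {
      "command": "uvx",
      "args": ["some-mcp-server"]
    },
    "my-remote-server": {
      "url": "http://localhost:7860/api/v1/mcp/sse"
    }
  }
}
```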

For this tutorial, we use the SSE endpoint that Langflow offers at http://localhost:7860/api/v1/mcp/sse, where http://localhost:7860 is the address at which your Langflow instance is running.
An MCP can be connected to an Agent and provided as a toolset.
Create an Agent

Then we can connect our MCP component as a tool, and a Chat Input, to our Agent

Now we can open the Playground to see it working

To create an MCP tool using Langflow
Create a flow that has access to the third-party and local resources
Install uv to run uvx commands. uvx is included with uv in the Langflow package.
To add a Langflow server, add an entry for your Langflow server's /api/v1/mcp/sse endpoint. This example assumes the default Langflow server address of http://127.0.0.1:7860.
```json
{
  "mcpServers": {
    "langflow": {
      "url": "http://127.0.0.1:7860/api/v1/mcp/sse"
    }
  }
}
```
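As a sanity check, the configuration above can be generated and validated with a few lines of Python. This is a standard-library-only sketch; it writes to a temporary directory, whereas your real mcp.json lives wherever your MCP host expects it:

```python
import json
import tempfile
from pathlib import Path

# Build the same mcp.json entry shown above. The URL assumes the
# default Langflow address used in this tutorial.
config = {
    "mcpServers": {
        "langflow": {
            "url": "http://127.0.0.1:7860/api/v1/mcp/sse"
        }
    }
}

# Write to a temporary directory so we do not clobber a real config file.
path = Path(tempfile.mkdtemp()) / "mcp.json"
path.write_text(json.dumps(config, indent=2))

# Read it back to confirm the file is valid JSON with the expected URL.
loaded = json.loads(path.read_text())
print(loaded["mcpServers"]["langflow"]["url"])
```

Round-tripping the file like this catches the most common failure mode here: a typo in the endpoint path that only surfaces later as a silent connection error in the MCP host.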
Save the mcp.json file, and then click the Reload icon.
Your Langflow server is now available to the MCP host as an MCP server, and all of its flows are registered as tools. You can now use your flows as tools in Cursor. Cursor determines when to use tools based on your queries, and requests permissions when necessary.
This blog covered what MCP is, why we need it, what it offers, and how MCPs can be built and consumed using a no-code solution like Langflow.
#Langflow #NoCode #MCP #ModelContextProtocol #AIWorkflows #AIIntegration #AIApplications #LLMTools #Automation #OpenAI #DataIntegration #AIArchitecture #AIAgents #AIStandardization #AIWorkflowAutomation #NoCodeAI #AIDevelopment #Langchain #MCPinLangflow #AIProtocols