Alpha Notice: These docs cover the v1-alpha release. Content is incomplete and subject to change. For the latest stable version, see the current LangGraph Python or LangGraph JavaScript docs.
Model Context Protocol (MCP) is an open protocol that standardizes how applications provide tools and context to LLMs. LangChain agents can use tools defined on MCP servers via the langchain-mcp-adapters library.

Install the langchain-mcp-adapters library to use MCP tools in LangGraph:
pip install langchain-mcp-adapters

Transport types

MCP supports different transport mechanisms for client-server communication:
  • stdio: Client launches server as a subprocess and communicates via standard input/output. Best for local tools and simple setups.
  • Streamable HTTP: Server runs as an independent process handling HTTP requests. Supports remote connections and multiple clients.
  • Server-Sent Events (SSE): an older HTTP-based transport that streams server-to-client messages over a persistent connection; newer versions of the MCP specification favor streamable HTTP over SSE.
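As a sketch, each transport maps onto a connection entry in the config consumed by MultiServerMCPClient. The server names here are arbitrary labels, and the "sse" entry with its /sse path is an assumption based on common MCP setups; the stdio and streamable_http shapes mirror the full example below.

```python
# Illustrative per-transport connection entries.
# Server names ("local", "remote", "legacy") are arbitrary labels.
connections = {
    "local": {
        "transport": "stdio",            # spawn a local subprocess
        "command": "python",
        "args": ["/path/to/server.py"],
    },
    "remote": {
        "transport": "streamable_http",  # independent HTTP server
        "url": "http://localhost:8000/mcp",
    },
    "legacy": {
        "transport": "sse",              # older SSE endpoint (assumed /sse path)
        "url": "http://localhost:8000/sse",
    },
}
```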

Use MCP tools

langchain-mcp-adapters enables agents to use tools defined across one or more MCP servers.
Accessing multiple MCP servers
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain.agents import create_agent

client = MultiServerMCPClient(
    {
        "math": {
            "transport": "stdio",  # Local subprocess communication
            "command": "python",
            # Absolute path to your math_server.py file
            "args": ["/path/to/math_server.py"],
        },
        "weather": {
            "transport": "streamable_http",  # HTTP-based remote server
            # Ensure you start your weather server on port 8000
            "url": "http://localhost:8000/mcp",
        }
    }
)

tools = await client.get_tools()
agent = create_agent(
    "anthropic:claude-3-7-sonnet-latest",
    tools
)
math_response = await agent.ainvoke(
    {"messages": [{"role": "user", "content": "what's (3 + 5) x 12?"}]}
)
weather_response = await agent.ainvoke(
    {"messages": [{"role": "user", "content": "what is the weather in nyc?"}]}
)
MultiServerMCPClient is stateless by default. Each tool invocation creates a fresh MCP ClientSession, executes the tool, and then cleans up.
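To illustrate that behavior with a runnable toy (this is not the real MCP API, just an analogy): each tool call below opens and tears down its own session, the way MultiServerMCPClient creates a fresh ClientSession per invocation.

```python
import asyncio
from contextlib import asynccontextmanager

opened = 0  # counts how many sessions were created

@asynccontextmanager
async def toy_session():
    """Stand-in for an MCP ClientSession (illustration only)."""
    global opened
    opened += 1
    try:
        yield
    finally:
        pass  # session cleanup would happen here

async def call_tool():
    # Stateless pattern: a fresh session per tool invocation.
    async with toy_session():
        pass

async def main():
    await call_tool()
    await call_tool()

asyncio.run(main())
# After two tool calls, two separate sessions were opened.
```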

Custom MCP servers

To create your own MCP servers, you can use the mcp library. This library provides a simple way to define tools and run them as servers.
pip install mcp
Use the following reference implementations to test your agent with MCP tool servers.
Math server (stdio transport)
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Math")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

@mcp.tool()
def multiply(a: int, b: int) -> int:
    """Multiply two numbers"""
    return a * b

if __name__ == "__main__":
    mcp.run(transport="stdio")
Weather server (streamable HTTP transport)
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Weather")

@mcp.tool()
async def get_weather(location: str) -> str:
    """Get weather for location."""
    return "It's always sunny in New York"

if __name__ == "__main__":
    mcp.run(transport="streamable-http")
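To try these servers with the client example above, start the weather server in its own terminal first; FastMCP's streamable HTTP server listens on port 8000 by default, serving http://localhost:8000/mcp. The filename here is an assumption — use whatever path you saved the script to. The math server needs no manual start, since the stdio transport launches it as a subprocess.

```shell
# Start the streamable HTTP weather server in a separate terminal
python weather_server.py
```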

Expose LangChain tools via MCP

You can also expose existing LangChain tools through an MCP server using the to_fastmcp function. This allows you to make your LangChain tools available to any MCP client.
Make LangChain tools available via MCP
from langchain_core.tools import tool
from langchain_mcp_adapters.tools import to_fastmcp
from mcp.server.fastmcp import FastMCP


@tool
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

@tool
def get_user_info(user_id: str) -> str:
    """Get information about a user"""
    return f"User {user_id} is active"


# Convert LangChain tools to FastMCP
fastmcp_tools = [to_fastmcp(tool) for tool in (add, get_user_info)]

# Create server using converted tools
mcp = FastMCP("LangChain Tools", tools=fastmcp_tools)
mcp.run(transport="stdio")
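Any MCP client can then connect to this server over stdio. As a sketch, assuming the script above is saved as langchain_tools_server.py (a hypothetical filename), a MultiServerMCPClient connection entry would look like:

```python
# Connection entry for the stdio server defined above.
# "langchain_tools" is an arbitrary server name; the script path is
# an assumption -- point it at wherever you saved the server script.
connection = {
    "langchain_tools": {
        "transport": "stdio",
        "command": "python",
        "args": ["/path/to/langchain_tools_server.py"],
    }
}
```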

Stateful tool usage

For stateful servers that maintain context between tool calls, use client.session() to create a persistent ClientSession.
Using MCP ClientSession for stateful tool usage
from langchain.agents import create_agent
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_mcp_adapters.tools import load_mcp_tools

client = MultiServerMCPClient({...})
async with client.session("math") as session:
    tools = await load_mcp_tools(session)
    # The tools are bound to this session, so create and invoke the
    # agent before the block exits; all calls share one ClientSession.
    agent = create_agent("anthropic:claude-3-7-sonnet-latest", tools)

Additional resources