Eager to build your own MCP (Model Context Protocol) server using FastMCP and test it with Strands and LangGraph? This tutorial walks you through setting up a lightweight Python-based MCP server, exposing tools, validating them with Strands, and integrating them with a LangGraph agent, so your AI can call custom tool endpoints in real time. Let's dive in and see how to build an MCP server with FastMCP in Python.
Prerequisites
Before jumping into the tutorial, make sure you have uv installed.

TL;DR: uv is a blazing-fast Python package installer and resolver, built in Rust. It's designed to be a drop-in replacement for pip and pip-tools, making dependency management quicker and smoother.
You can install it by following the steps here.
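For reference, uv's standalone installers are one-liners (these commands are taken from uv's documentation; check the install page linked above for the current versions):

# macOS / Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# Windows (PowerShell)
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"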
Creating the MCP server with FastMCP
Now, let's create a simple MCP server with FastMCP. We are going to create two tools: the first returns the weather for any city, and the second performs addition. We will also wrap our MCP server in FastAPI to make it easier to deploy on a server.
First, we need to create a virtual environment for our project using UV. Open your IDE and run the following commands in the terminal inside your project folder:
PS D:\mcp_demo> uv init
PS D:\mcp_demo> uv venv
This will initialize a UV-based project and generate the files UV needs to manage it, sketched below.
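For reference, uv init typically generates a pyproject.toml along these lines (the exact contents vary by uv version and project name, so treat this purely as an illustration):

[project]
name = "mcp-demo"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.12"
dependencies = []

Now, activate the virtual environment.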
For Windows:
PS D:\mcp_demo> .\.venv\Scripts\activate
For Mac:
source .venv/bin/activate
Now, install the necessary packages for MCP:
uv add fastmcp requests fastapi
Now, create a **main.py** file, which will be our MCP server.
# import packages
from fastmcp import FastMCP
import requests
from fastapi import FastAPI

# initiate instance of MCP
mcp = FastMCP(name="Hello MCP")

# create tools
@mcp.tool()
def add(x: float, y: float):
    """Adds two numbers"""
    return x + y

@mcp.tool()
def get_weather_by_city(city_name: str):
    """
    Fetches current weather information for a given city using
    OpenStreetMap Nominatim and Open-Meteo APIs.
    Returns a dictionary with temperature, windspeed, and weather code.
    """
    # Step 1: Get latitude and longitude from city name
    geo_url = "https://nominatim.openstreetmap.org/search"
    geo_params = {"q": city_name, "format": "json", "limit": 1}
    geo_resp = requests.get(geo_url, params=geo_params, headers={"User-Agent": "weather-app"})
    if geo_resp.status_code != 200 or not geo_resp.json():
        return {"error": "City not found"}
    geo_data = geo_resp.json()[0]
    lat, lon = geo_data["lat"], geo_data["lon"]

    # Step 2: Fetch weather using Open-Meteo
    weather_url = (
        "https://api.open-meteo.com/v1/forecast"
        f"?latitude={lat}&longitude={lon}&current_weather=true"
    )
    weather_resp = requests.get(weather_url)
    if weather_resp.status_code == 200:
        return weather_resp.json().get("current_weather", {})
    else:
        return {"error": f"Failed to fetch weather: {weather_resp.status_code}"}

# Wrap it in FastAPI
mcp_app = mcp.http_app(path='/mcp')
app = FastAPI(lifespan=mcp_app.lifespan)
app.mount('/mcp-server', mcp_app)

# add healthcheck endpoint
@app.get('/health')
def health_check():
    return {"status": "healthy"}

if __name__ == '__main__':
    # Make sure transport is streamable-http. It is mandatory for running MCP remotely.
    # transport = 'stdio' is suitable for local MCP.
    mcp.run(transport="streamable-http", host='0.0.0.0')
As you can see above, we have wrapped our MCP tools in FastAPI, so we can run the app with uvicorn and perform a health check on our MCP server once it is deployed. Our MCP server with FastMCP is now ready.
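Before building full agent clients, you can optionally sanity-check the tools in-process with FastMCP's bundled client. A minimal sketch, assuming FastMCP 2.x, where fastmcp.Client can connect directly to an in-memory server instance:

# quick_test.py - optional in-process check (assumes FastMCP 2.x Client)
import asyncio
from fastmcp import Client

from main import mcp  # the FastMCP instance defined above

async def sanity_check():
    # connecting the Client to the server object directly skips HTTP entirely
    async with Client(mcp) as client:
        tools = await client.list_tools()
        print([t.name for t in tools])  # expect: ['add', 'get_weather_by_city']
        result = await client.call_tool("add", {"x": 2, "y": 3})
        print(result)  # should contain 5

asyncio.run(sanity_check())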
Test the server using MCP Clients
To test the MCP server, first start it by running the following command:
uv run uvicorn main:app
This will start our app at http://127.0.0.1:8000 (this base URL will be used when establishing client connections).
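You can confirm the server is up by hitting the health-check endpoint we added, for example from a second terminal:

curl http://127.0.0.1:8000/health
# {"status":"healthy"}

The MCP endpoint itself is served at http://127.0.0.1:8000/mcp-server/mcp, i.e., the /mcp-server mount point combined with the /mcp path we passed to http_app.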
There are various ways to test your MCP server with a client; for example, you can connect FastMCP to the Claude desktop app, to the Cursor IDE, and so on.
But here we will build an MCP client programmatically. For that, you can use the AWS Strands library or a ReAct agent from LangGraph. We will try both.
1. Using AWS Strands
Strands Agents is a simple yet powerful SDK that takes a model-driven approach to building and running AI agents. From simple conversational assistants to complex autonomous workflows, from local development to production deployment, Strands Agents scales with your needs.
Install Strands with the following command:
uv add strands-agents
Now let’s build the MCP client.
from mcp.client.streamable_http import streamablehttp_client
from strands.tools.mcp import MCPClient
from strands import Agent

# the URL combines the /mcp-server mount with the /mcp path from http_app
streamable_http_mcp_client = MCPClient(lambda: streamablehttp_client("http://127.0.0.1:8000/mcp-server/mcp"))

with streamable_http_mcp_client:
    agent = Agent(
        model="us.anthropic.claude-3-5-sonnet-20241022-v2:0",
        tools=streamable_http_mcp_client.list_tools_sync(),
    )
    response = agent("Tell me the weather of New York. Also add 5 celsius to the temperature and give me the resultant temperature.")
    print(response.message['content'][0]['text'])
This will print the following:
Current weather in New York:
- Temperature: 30.7°C
- Wind Speed: 8.7 m/s
- Weather Code: 3 (partly cloudy)

After adding 5°C to the current temperature:
The resultant temperature would be 35.7°C
Voila! As you can see, our Agent is running as expected.
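One note on the model argument: passing a plain model ID string, as above, relies on Strands' Bedrock defaults and on AWS credentials already being configured in your environment. If you need to pin a region or generation parameters, Strands also accepts a model object; a minimal sketch, assuming the strands.models.BedrockModel API (the region shown is a hypothetical example):

from strands.models import BedrockModel
from strands import Agent

# region and temperature values are illustrative, not requirements
bedrock_model = BedrockModel(
    model_id="us.anthropic.claude-3-5-sonnet-20241022-v2:0",
    region_name="us-west-2",
    temperature=0.0,
)
agent = Agent(model=bedrock_model, tools=streamable_http_mcp_client.list_tools_sync())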
2. Using LangGraph's ReAct Agent
You can also create an MCP client with LangGraph. Install the following package for it:
uv add langgraph langchain-aws langchain-mcp-adapters
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_aws import ChatBedrockConverse
from langgraph.prebuilt import create_react_agent
import asyncio

async def main():
    client = MultiServerMCPClient(
        {
            "test": {
                # same endpoint the Strands client used
                "url": "http://127.0.0.1:8000/mcp-server/mcp",
                "transport": "streamable_http",
            }
        }
    )
    tools = await client.get_tools()
    llm = ChatBedrockConverse(model="us.anthropic.claude-3-5-sonnet-20241022-v2:0", temperature=0)
    agent = create_react_agent(model=llm, tools=tools)
    res = await agent.ainvoke({"messages": [{"role": "user", "content": "What is the weather in New York? Also add 5 celsius to the temperature and give me the resultant temperature."}]})
    return res

# run
res = asyncio.run(main())
print(res)
As before, the agent reports that the resultant temperature would be 35.7°C after adding 5°C.
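Note that ainvoke returns the agent's full graph state, so the print above dumps the entire message history. To show only the final answer, you can index the last message (assuming the standard create_react_agent state shape):

print(res["messages"][-1].content)  # just the agent's final reply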
Conclusion
You’ve now built a fully functional MCP server with FastMCP wrapped in FastAPI, leveraging UV for environment and dependency management, and successfully tested it with intelligent agents from AWS Strands and LangGraph. This setup forms a solid foundation for creating modular, scalable, and interoperable AI tooling. With this structure in place, you can now extend tool capabilities, harden the system for production, and plug it directly into larger AI pipelines or orchestration frameworks.
Also Read: How to Create an MCP Server in Python: A Beginner's Guide