Step-by-Step Guide: Building & Deploying MCP Servers
What We've Built
You now have two working MCP server examples in `external_mcp_servers/`:

- `app_time_mcp_server.py` (Original)
  - Simple, parameter-less tool.
  - Returns Berlin time only.
  - Great for first-time understanding.
- `app_world_time_mcp_server.py` (Upgraded)
  - Accepts a `city` parameter.
  - Returns time for 25+ cities.
  - Demonstrates how LLMs pass arguments to tools.
Understanding the Components
1. The Core Function
Simple (Berlin):

```python
def get_berlin_time():
    # No arguments needed
    return {"time": "..."}
```
Advanced (World):

```python
def get_time_for_city(city: str = "Berlin"):
    # Takes an argument!
    # The LLM will send: {"city": "Tokyo"}
    return {"city": city, "time": "..."}
```
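Fleshed out, the world-time function might look like the sketch below. The city-to-timezone table here is illustrative (the real server covers 25+ cities), and the exact return format is an assumption:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Hypothetical city -> IANA timezone table; the real server covers 25+ cities.
CITY_TIMEZONES = {
    "Berlin": "Europe/Berlin",
    "Tokyo": "Asia/Tokyo",
    "New York": "America/New_York",
}

def get_time_for_city(city: str = "Berlin") -> dict:
    """Return the current time for a supported city."""
    tz_name = CITY_TIMEZONES.get(city)
    if tz_name is None:
        return {"error": f"Unknown city: {city}"}
    now = datetime.now(ZoneInfo(tz_name))
    return {"city": city, "time": now.strftime("%Y-%m-%d %H:%M:%S %Z")}
```

Returning a dict (rather than raising) for unknown cities keeps the tool's output JSON-friendly, which is what the `gr.JSON` output component expects.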
2. The Gradio Interface
```python
demo = gr.Interface(
    fn=get_time_for_city,
    inputs=gr.Textbox(...),  # Defines the input schema for MCP
    outputs=gr.JSON(...),
    api_name="get_time_for_city",
)
```
3. The MCP Magic
```python
demo.launch(mcp_server=True)
```
This single line turns your web app into an AI tool server!
Deploying to HuggingFace Spaces (Critical Details)
Deploying is easy, but there are two common pitfalls to watch out for.
Step 1: Configure the Entry File
You don't need to rename your file to app.py! You can tell HuggingFace which file to run.
- Go to your Space's Files tab.
- Click on `README.md` to edit it.
- Look at the YAML header (the metadata at the top, between `---` markers).
- Change the `app_file` line:
For Berlin Time:

```yaml
app_file: app_time_mcp_server.py
```

For World Time:

```yaml
app_file: app_world_time_mcp_server.py
```
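Put together, the YAML header might look like the fragment below. The `title` and `sdk` values are illustrative; keep whatever your Space already has and change only `app_file`:

```yaml
---
title: World Time MCP Server
sdk: gradio
app_file: app_world_time_mcp_server.py
---
```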
This is much cleaner than renaming files!
Step 2: Check the Port Number
This is the most common error!
HuggingFace Spaces must run on port 7860.
- `app_time_mcp_server.py` is already set to 7860.
- `app_world_time_mcp_server.py` is set to 7861 (for local testing).
You MUST change this line in `app_world_time_mcp_server.py` before deploying:
```python
# CHANGE THIS:
server_port=7861
# TO THIS:
server_port=7860
```
If you forget this, you will see: `OSError: Cannot find empty port`.
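One way to sidestep the trap entirely (an illustrative pattern, not part of the original files) is to read the port from an environment variable and default to 7860:

```python
import os

# APP_SERVER_PORT is a hypothetical variable name for local overrides;
# on HF Spaces, where it is unset, this falls back to the required 7860.
server_port = int(os.environ.get("APP_SERVER_PORT", "7860"))

# Then: demo.launch(mcp_server=True, server_port=server_port)
```

Locally you can run with `APP_SERVER_PORT=7861` for testing, and the deployed file needs no edits.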
Step 3: Upload to Spaces
- Create a new Space (SDK: Gradio).
- Upload: `app.py` (your chosen server) and `requirements.txt`.
- Wait for "Running" status.
Connecting Your Agent
Once deployed, your agent needs to know where to look.
Update src/config/settings.py
```python
servers["berlin_time"] = {
    # 1. Use your Space URL
    # Format: https://USERNAME-SPACE_NAME.hf.space/gradio_api/mcp/
    "url": "https://gfiamon-date-time-mpc-server-tool.hf.space/gradio_api/mcp/",
    # 2. Use 'sse' transport (Server-Sent Events)
    "transport": "sse",
}
```
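For reference, the direct `*.hf.space` hostname is derived from your username and Space name. A small helper (a sketch, assuming the standard `USERNAME-SPACE_NAME.hf.space` hostname pattern) can build the endpoint:

```python
def mcp_endpoint(username: str, space_name: str) -> str:
    """Build the Gradio MCP endpoint URL for a HuggingFace Space.

    Assumes the standard Space hostname pattern:
    https://{username}-{space-name}.hf.space
    """
    slug = space_name.replace("_", "-").lower()
    return f"https://{username.lower()}-{slug}.hf.space/gradio_api/mcp/"
```

For example, `mcp_endpoint("gfiamon", "date-time-mpc-server-tool")` produces the URL used in the config above.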
How It Works
- Agent Starts: Connects to that URL via SSE.
- Discovery: Asks "What tools do you have?"
- World Time: Server replies "I have `get_time_for_city`, which takes a `city` string."
- User Asks: "Time in Tokyo?"
- LLM: "Call `get_time_for_city(city='Tokyo')`."
- Agent: Sends the command to the HF Space -> gets the result -> shows the user.
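The discovery-then-call loop above can be sketched in a few lines. The registry and `handle` function here are illustrative stand-ins, not the real MCP client API:

```python
# Tool registry the server would advertise during discovery.
TOOLS = {
    "get_time_for_city": lambda city="Berlin": {"city": city, "time": "..."},
}

def list_tools():
    # Discovery step: answers "What tools do you have?"
    return list(TOOLS)

def handle(tool_name: str, arguments: dict):
    # Call step: the agent forwards the LLM's tool call to the server.
    return TOOLS[tool_name](**arguments)

result = handle("get_time_for_city", {"city": "Tokyo"})
```

The key point: the agent never hardcodes the tool list; it asks the server at startup, which is what makes dynamic discovery (below) possible.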
Teaching This to Colleagues
Key Teaching Points:
One Codebase, Two Interfaces:
- Humans use the Web UI (click buttons).
- AI Agents use the MCP API (hidden endpoint).
- Both are powered by the same Python function.
Deployment Simplicity:
- No Docker, no Nginx, no complex config.
- Just `app.py` + `requirements.txt` on HF Spaces.
The "Port Trap":
- Remind them: Local dev can use any port, but HF Spaces enforces port 7860.
Dynamic Discovery:
- Show them how you can update the tool on HF (e.g., add "Mars Time"), restart the agent, and it instantly knows about the new feature without changing agent code.