Gradio is an open-source Python package for creating AI-powered web applications. Gradio is compliant with the MCP server protocol and powers hundreds of MCP servers hosted on Hugging Face Spaces. The Gradio team is betting big on Gradio and Spaces being one of the best ways to build and host AI-powered MCP servers.
To that end, here are a few of the big improvements we have added to Gradio MCP servers as of version 5.38.0.
Seamless Local File Support
If you’ve tried to use a remote Gradio MCP server that takes a file as input (image, video, audio), you’ve probably encountered this error:

This happens because the Gradio server is hosted on a different machine, meaning any input files must be accessible via a public URL so that they can be downloaded remotely.
While many ways exist to host files online, they all add a manual step to your workflow. In the age of LLM agents, shouldn’t we expect them to handle this for us?
Gradio now includes a “File Upload” MCP server that agents can use to upload files directly to your Gradio application. If any tools in your Gradio MCP server require file inputs, the connection documentation will now show you how to start the “File Upload” MCP server:

Learn more about using this server (and important security considerations) in the Gradio Guides.
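As an illustrative sketch only (the exact command, arguments, and endpoint paths are shown in your app’s connection docs; the ones below are assumptions, not authoritative syntax), an MCP client configuration might register your Gradio server alongside the file-upload helper like this:

```json
{
  "mcpServers": {
    "gradio": {
      "url": "http://127.0.0.1:7860/gradio_api/mcp/sse"
    },
    "upload-files": {
      "command": "uvx",
      "args": ["--from", "gradio[mcp]", "upload-mcp", "http://127.0.0.1:7860", "/path/to/allowed/uploads"]
    }
  }
}
```

With both entries in place, the agent can upload a local file through the upload server, then pass the resulting URL to any file-input tool on the Gradio server.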
Real-time Progress Notifications
Depending on the AI task, getting results can take a while. Now, Gradio streams progress notifications to your MCP client, allowing you to monitor the status in real time!
As an MCP developer, it’s highly recommended that you implement your MCP tools to emit these progress statuses. Our guide shows you how.
Transform OpenAPI Specs to MCP in One Line
If you want to integrate an existing backend API into an LLM, you have to manually map API endpoints to MCP tools. This can be a time-consuming and error-prone chore. With this release, Gradio can automate the entire process for you! With a single line of code, you can integrate your business backend into any MCP-compatible LLM.
OpenAPI is a widely adopted standard for describing RESTful APIs in a machine-readable format, typically as a JSON file. Gradio now features the gr.load_openapi function, which creates a Gradio application directly from an OpenAPI schema. You can then launch the app with mcp_server=True to automatically create an MCP server for your API!
import gradio as gr

demo = gr.load_openapi(
    openapi_spec="https://petstore3.swagger.io/api/v3/openapi.json",
    base_url="https://petstore3.swagger.io/api/v3",
    paths=["/pet.*"],
    methods=["get", "post"],
)
demo.launch(mcp_server=True)
Find more details in the Gradio Guides.
Improvements to Authentication
A common pattern in MCP server development is to use authentication headers to call services on behalf of your users. As an MCP server developer, you want to clearly communicate to your users which credentials they need to provide to use your server correctly.
To make this possible, you can now type your MCP server arguments as gr.Header. Gradio will automatically extract that header from the incoming request (if it exists) and pass it to your function. The advantage of using gr.Header is that the MCP connection docs will automatically display the headers you need to supply when connecting to the server!
In the example below, the X-API-Token header is extracted from the incoming request and passed as the x_api_token argument to make_api_request_on_behalf_of_user.
import gradio as gr

def make_api_request_on_behalf_of_user(prompt: str, x_api_token: gr.Header):
    """Make a request to everyone's favorite API.

    Args:
        prompt: The prompt to send to the API.
    Returns:
        The response from the API.
    Raises:
        AssertionError: If the API token is not valid.
    """
    return "Hello from the API" if not x_api_token else "Hello from the API with token!"

demo = gr.Interface(
    make_api_request_on_behalf_of_user,
    [gr.Textbox(label="Prompt")],
    gr.Textbox(label="Response"),
)
demo.launch(mcp_server=True)
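On the client side, how the header is supplied depends on your MCP client. As an illustrative sketch for a client that supports custom headers in its config (the URL and token are placeholders, and the exact config format varies by client):

```json
{
  "mcpServers": {
    "gradio": {
      "url": "http://127.0.0.1:7860/gradio_api/mcp/sse",
      "headers": {
        "X-API-Token": "<YOUR_TOKEN>"
      }
    }
  }
}
```

With this in place, every tool call sends the X-API-Token header, which Gradio extracts and passes to your function as shown above.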
You can read more about this in the Gradio Guides.
Modifying Tool Descriptions
Gradio automatically generates tool descriptions from your function names and docstrings. Now you can customize the tool description even further with the api_description parameter. In this example, the tool description will read “Apply a sepia filter to any image.”
import gradio as gr
import numpy as np

def sepia(input_img):
    """
    Args:
        input_img (np.ndarray): The input image to apply the sepia filter to.
    Returns:
        The sepia-filtered image.
    """
    sepia_filter = np.array([
        [0.393, 0.769, 0.189],
        [0.349, 0.686, 0.168],
        [0.272, 0.534, 0.131],
    ])
    sepia_img = input_img.dot(sepia_filter.T)
    sepia_img /= sepia_img.max()
    return sepia_img

demo = gr.Interface(sepia, "image", "image",
                    api_description="Apply a sepia filter to any image.")
demo.launch(mcp_server=True)
Read more in the guide.
Conclusion
Want us to add a new MCP-related feature to Gradio? Let us know in the comments of the blog or on GitHub. Also, if you’ve built a cool MCP server or Gradio app, let us know in the comments and we’ll amplify it!

