Auden

What is FastAPI MCP? Effortless AI Integration for Your FastAPI APIs

Today, let's dive into an exceptionally practical tool: FastAPI MCP. If you're building APIs with FastAPI and want your endpoints to be directly accessible to AI models (like GPT, Claude, and others), this guide is for you.


What Is FastAPI MCP?

Put simply, FastAPI MCP is a zero-configuration solution that automatically exposes your FastAPI endpoints as Model Context Protocol (MCP) tools. With MCP compatibility, AI models can directly interact with your APIs in a seamless, standardized way.

In essence, FastAPI MCP acts as a bridge: it makes your APIs discoverable and callable by various AI models. Imagine enabling Claude or GPT—through tools like Cursor or Claude Desktop—to fetch data, process information, or trigger business logic by simply calling your API endpoints. That's not just powerful, it's fun!


Why Use FastAPI MCP?

When developing AI-powered applications, it's increasingly common for an LLM to need to interact with external services, for example:

  • Querying your database

  • Invoking computational services

  • Accessing internal tools

  • ... and much more

Traditionally, this required building custom endpoints or writing dedicated adapters for each AI integration. With FastAPI MCP, just a few lines of code can make your existing API directly available to AI models—saving significant development time and effort.

Step-by-Step: Getting Started with FastAPI MCP Locally

Step 1: Prepare Your Python Environment

First, ensure you have Python installed (version 3.10 or above is recommended). You can check your Python version by running:

python --version
# or
python3 --version

If you see a proper version number (like Python 3.10.x), your environment is ready.
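The same check can be done from Python itself. A minimal sketch (the 3.10 minimum mirrors the recommendation above; the helper name is my own):

```python
import sys

# Minimal sketch: verify the running interpreter meets the recommended 3.10 minimum.
def check_python_version(minimum=(3, 10)) -> bool:
    """Return True if the current interpreter is at least `minimum`."""
    return sys.version_info[:2] >= minimum

if check_python_version():
    print("Python version OK:", sys.version.split()[0])
else:
    print("Warning: Python 3.10+ is recommended for FastAPI MCP")
```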


Step 2: Install Required Packages

Install FastAPI, Uvicorn, and FastAPI MCP:

pip install fastapi uvicorn fastapi-mcp

Step 3: Create a Simple FastAPI Application

To ensure compatibility and minimize configuration, start with a basic example. In a new file called main.py, add the following:

from fastapi import FastAPI
from fastapi_mcp import FastApiMCP

app = FastAPI(title="Simple API")

@app.get("/hello", operation_id="say_hello")
async def hello():
    """A simple greeting endpoint"""
    return {"message": "Hello World"}

# Expose MCP server
mcp = FastApiMCP(app, name="Simple MCP Service")
mcp.mount()

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="127.0.0.1", port=8000)

This example includes a single /hello endpoint that returns a greeting.

Step 4: Run and Test Your Application

Now start your server:

python main.py

or

uvicorn main:app --reload

You should see your FastAPI app running at http://127.0.0.1:8000.


Step 5: Explore the MCP Endpoint

Open your browser at http://127.0.0.1:8000/mcp.

Note: Unlike standard REST APIs, the MCP endpoint uses Server-Sent Events (SSE), so you'll see output such as:

event: endpoint
data: /mcp/messages/?session_id=a543519a5f3848febfd4f40b5ad3b5c7

This means the MCP server is up and ready to accept connections from AI clients.
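If you're curious what an MCP client does with that stream, each SSE event is just an `event:`/`data:` pair separated by a blank line. A small sketch of parsing the sample event above (the parser is illustrative, not part of fastapi-mcp):

```python
# Hypothetical sketch: parse a single SSE event like the one the /mcp endpoint emits.
# The raw_event text mirrors the sample output above.
raw_event = (
    "event: endpoint\n"
    "data: /mcp/messages/?session_id=a543519a5f3848febfd4f40b5ad3b5c7\n"
    "\n"
)

def parse_sse_event(text: str) -> dict:
    """Parse one SSE event block into a {field: value} dict."""
    fields = {}
    for line in text.splitlines():
        if ":" in line:
            field, _, value = line.partition(":")
            fields[field.strip()] = value.strip()
    return fields

event = parse_sse_event(raw_event)
print(event["event"])  # endpoint
print(event["data"])   # the session-specific messages URL
```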


Connecting FastAPI MCP to an AI Client

Suppose you want to connect FastAPI MCP to a client like Cursor. Here’s how:

Method 1: SSE (Server-Sent Events) Connection

Most modern MCP clients (Claude Desktop, Cursor, Windsurf, etc.) support SSE. In the client settings, use a configuration like:

{
  "mcpServers": {
    "fastapi-mcp": {
      "url": "http://localhost:8000/mcp"
    }
  }
}
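If you maintain several environments, the same configuration can be built and serialized from Python rather than edited by hand (the URL here assumes the local dev server from earlier):

```python
import json

# Sketch: build the client configuration shown above programmatically,
# e.g. to template mcp.json per environment.
config = {
    "mcpServers": {
        "fastapi-mcp": {
            "url": "http://localhost:8000/mcp",
        }
    }
}

mcp_json = json.dumps(config, indent=2)
print(mcp_json)
```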

For example, in Cursor, go to Settings → "MCP" → "Add new global MCP server", and add the above config in your mcp.json.


Once your FastAPI MCP server is running, the AI IDE will automatically detect it and enable new functionality.


Method 2: Using mcp-remote as a Bridge

If you need authentication support or your MCP client does not support SSE, you can use mcp-remote as a bridge:

{
  "mcpServers": {
    "fastapi-mcp": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "http://localhost:8000/mcp",
        "8080"
      ]
    }
  }
}

In Practice: FastAPI MCP in Action

Once you’ve configured your AI client to talk to your FastAPI MCP server, you can simply ask—for example in Cursor’s Agent tab:

“Call the /hello endpoint for me”.

The AI will run the MCP tool and return the endpoint result, just as any human developer might.


Advanced Usage Tips

Once you’ve mastered the basics, FastAPI MCP offers powerful features for production scenarios:

Selectively Expose Endpoints

You likely won’t want every endpoint exposed to AI. FastAPI MCP lets you fine-tune which are accessible:

# Only expose specific operations
mcp = FastApiMCP(app, include_operations=["say_hello", "get_user_info"])

# Exclude certain operations
mcp = FastApiMCP(app, exclude_operations=["delete_user", "update_settings"])

# Expose only endpoints with specific tags
mcp = FastApiMCP(app, include_tags=["public", "read_only"])

# Exclude endpoints with specific tags
mcp = FastApiMCP(app, exclude_tags=["admin", "sensitive"])

Add Authentication

To secure your MCP endpoints, leverage FastAPI dependencies:

from fastapi import FastAPI, Depends, Security, HTTPException
from fastapi.security import APIKeyHeader

api_key_header = APIKeyHeader(name="X-API-Key")

async def verify_api_key(api_key: str = Security(api_key_header)):
    if api_key != "your-secret-key":
        raise HTTPException(status_code=403, detail="Invalid API key")
    return api_key

mcp = FastApiMCP(app, mcp_dependencies=[Depends(verify_api_key)])
mcp.mount()

Now, every MCP call requires a valid API key in the request headers.

Custom Response Processing

You may wish to include metadata or customize responses sent to AI:

from datetime import datetime

async def response_processor(request, response, response_data):
    response_data["processed_by"] = "custom_processor"
    response_data["timestamp"] = datetime.now().isoformat()
    return response_data

mcp = FastApiMCP(app, response_processor=response_processor)
mcp.mount()
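To see the effect without a running server, you can apply the same transformation to a sample response dict. A minimal sketch (the payload shape mirrors the /hello endpoint from earlier):

```python
from datetime import datetime

# Sketch: what the processor above does to a response payload before it
# reaches the AI client, demonstrated on a plain dict.
def enrich(response_data: dict) -> dict:
    """Attach provenance metadata to a response payload."""
    response_data["processed_by"] = "custom_processor"
    response_data["timestamp"] = datetime.now().isoformat()
    return response_data

sample = {"message": "Hello World"}
print(enrich(sample))
```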

Separate Deployment

For complex setups, you may want to host your MCP server separately from your main API:

# api_app.py
from fastapi import FastAPI
api_app = FastAPI()

# mcp_app.py
from fastapi import FastAPI
from fastapi_mcp import FastApiMCP
from api_app import api_app

mcp_app = FastAPI()
mcp = FastApiMCP(api_app)
mcp.mount(mcp_app)

Deploy api_app and mcp_app independently as needed.

Security Best Practices

When using FastAPI MCP:

  1. Only expose safe, read-only endpoints; avoid exposing destructive operations such as DELETE and PUT.

  2. Require authentication where appropriate.

  3. Use Pydantic models for strict parameter validation.

  4. Consider filtering or masking sensitive data in responses.
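As an illustration of point 4, masking can be as simple as replacing known-sensitive keys before a payload leaves your service. A hedged sketch (the field names and helper are assumptions, not part of fastapi-mcp):

```python
# Hypothetical sketch: mask sensitive fields before a response is handed
# to an AI client. The set of field names below is an assumption.
SENSITIVE_FIELDS = {"password", "api_key", "ssn"}

def mask_sensitive(data: dict) -> dict:
    """Return a copy of `data` with sensitive values replaced by '***'."""
    return {k: ("***" if k in SENSITIVE_FIELDS else v) for k, v in data.items()}

user = {"name": "Alice", "password": "hunter2", "api_key": "sk-123"}
print(mask_sensitive(user))
```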

Conclusion

Getting started with FastAPI MCP is remarkably straightforward. With minimal effort, you can turn your API into an AI-accessible interface without rewriting existing code or spending time on custom adapters.

In summary, the core steps are:

  1. Set up Python

  2. Install dependencies (fastapi, uvicorn, and fastapi-mcp)

  3. Create a FastAPI app

  4. Add MCP support with just a few lines of code

  5. Connect and test from your favorite AI IDE or tool

For further capabilities and the latest features, see the official FastAPI MCP documentation on GitHub.


Other Recommended MCP Servers

Apidog MCP Server allows you to provide your Apidog API documentation to AI-powered IDEs like Cursor, as well as other tools that support MCP. It covers multiple use cases: you can connect it to APIs documented within your Apidog projects, access publicly published API documentation, or even use OpenAPI/Swagger files.

Setup is extremely straightforward—just make sure you have Node.js (version 18 or newer) installed. Choose the appropriate configuration method based on your scenario. For private deployments, you can also add custom API base URLs. With Apidog MCP Server, developers can leverage AI assistants to generate code from API docs, modify code, search API documentation, and more—all dramatically enhancing development efficiency.

In practice, you simply instruct the AI assistant what you want to do with the API documentation (for example: "Generate all MVC code for the /users endpoint based on the API docs"), and the AI will understand and complete the task. This is especially valuable in collaborative team settings, ensuring all developers work from a unified, standardized API documentation source.

