DEV Community

Hitesh

FastAPI Request Handling

In FastAPI, how route handlers (endpoints) behave in terms of parallelism and concurrency depends on whether they are defined using async def or def, and whether the work inside them is I/O-bound or CPU-bound.


Here are the four combinations of route handlers and how they affect parallel or concurrent handling of requests:


✅ 1. async def with async I/O-bound work (e.g., await asyncio.sleep, database calls)

import asyncio
from fastapi import APIRouter

router = APIRouter()

@router.get("/async-io")
async def async_io_route():
    await asyncio.sleep(2)  # yields to the event loop while waiting
    return {"status": "async io"}
  • Handled concurrently
  • Non-blocking — multiple such requests can be handled at the same time.
  • Best performance for I/O tasks like DB queries, network calls, file access.
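The concurrency benefit is easy to see outside a server with plain asyncio: three simulated I/O calls awaited together finish in roughly the time of one. (The `fetch` helper and 0.2s delay below are illustrative, not part of the article's code.)

```python
import asyncio
import time

async def fetch(delay: float) -> str:
    # Stand-in for any awaitable I/O call (DB query, HTTP request, ...)
    await asyncio.sleep(delay)
    return "ok"

async def main():
    start = time.perf_counter()
    # All three "requests" wait concurrently on one event loop
    results = await asyncio.gather(fetch(0.2), fetch(0.2), fetch(0.2))
    return results, time.perf_counter() - start

results, elapsed = asyncio.run(main())
print(results, round(elapsed, 1))  # total is ~0.2s, not 0.6s
```

This is exactly what happens when several clients hit an `async def` I/O-bound endpoint at once: the waits overlap instead of queueing.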

⚠️ 2. async def with CPU-bound work (e.g., heavy computation, no await)

@router.get("/async-cpu")
async def async_cpu_route():
    # CPU-bound loop with no await: the event loop is blocked until it finishes
    result = sum(i * i for i in range(10**7))
    return {"result": result}
  • Not truly concurrent for CPU-bound work.
  • Blocks the event loop — slows down other async endpoints.
  • BAD practice — use a thread pool for CPU-bound tasks instead.
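One way to keep the event loop responsive is to hand the computation to the loop's default thread pool with `run_in_executor`. A minimal sketch (the `cpu_task` helper is a hypothetical name, not from the article):

```python
import asyncio

def cpu_task() -> int:
    # Pure-Python CPU work; awaited inline it would block the event loop
    return sum(i * i for i in range(10**6))

async def handler() -> int:
    loop = asyncio.get_running_loop()
    # None selects the loop's default ThreadPoolExecutor; other coroutines
    # keep running while a worker thread grinds through cpu_task
    return await loop.run_in_executor(None, cpu_task)

result = asyncio.run(handler())
```

Because of the GIL, a thread only keeps the event loop free; for true multi-core speedup, pass a `concurrent.futures.ProcessPoolExecutor` instead of `None`.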

✅ 3. def with CPU-bound work

@router.get("/sync-cpu")
def sync_cpu_route():
    # def endpoint: FastAPI runs this in a worker thread, off the event loop
    result = sum(i * i for i in range(10**7))
    return {"result": result}
  • Runs in a thread pool executor automatically (Starlette/FastAPI handles this for def routes).
  • Doesn't block the event loop, though for pure-Python code the GIL still limits true CPU parallelism across threads.
  • Acceptable for moderate CPU-bound work when the thread pool is properly limited.
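"Properly limited" matters because the pool is finite. Recent Starlette versions (which FastAPI builds on) run def endpoints through AnyIO's default thread limiter, capped at 40 worker threads by default. A hedged sketch of resizing it at startup, assuming an existing `app` object and an AnyIO-based Starlette (this is a config fragment, not a runnable script):

```python
import anyio.to_thread

@app.on_event("startup")
async def tune_thread_pool():
    # Raise (or lower) the cap on worker threads used for def endpoints;
    # more threads increase throughput but also memory and context switching
    limiter = anyio.to_thread.current_default_thread_limiter()
    limiter.total_tokens = 100
```

Under high traffic, a saturated pool makes def endpoints queue behind each other even though the event loop itself is idle, so this cap is worth load-testing rather than guessing.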

❌ 4. def with I/O-bound work (e.g., time.sleep)

import time

@router.get("/sync-io")
def sync_io_route():
    time.sleep(2)  # blocks a worker thread for the full 2 seconds
    return {"status": "sync io"}
  • Blocks a worker thread for the full duration and wastes resources.
  • Neither concurrent nor parallel in a performant way.
  • Worst option: avoid blocking I/O in sync routes.
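When a blocking call can't be replaced with an async library, the stdlib offers an escape hatch: `asyncio.to_thread` moves it to a worker thread so the event loop stays free. A sketch with a 0.2s sleep standing in for a blocking driver call (FastAPI also ships a similar `run_in_threadpool` helper):

```python
import asyncio
import time

def blocking_io() -> str:
    time.sleep(0.2)  # stands in for a blocking DB driver or requests call
    return "sync io"

async def handler() -> str:
    # The blocking call runs in a worker thread; the event loop stays free
    return await asyncio.to_thread(blocking_io)

async def main():
    start = time.perf_counter()
    # Three blocking calls overlap in separate threads
    results = await asyncio.gather(handler(), handler(), handler())
    return results, time.perf_counter() - start

results, elapsed = asyncio.run(main())
```

This trades event-loop blocking for thread usage, so the thread-pool limits discussed above still apply.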

Summary Table

| Route Type | I/O Type | Concurrent? | Notes |
| --- | --- | --- | --- |
| async def | Async I/O | ✅ Yes | Best option for scalable I/O-bound endpoints |
| async def | CPU-bound | ❌ No | Blocks the event loop — BAD |
| def | CPU-bound | ✅ Parallel | Runs in thread pool — acceptable for CPU tasks |
| def | Blocking I/O | ❌ No | Blocks threads — worst case, avoid |

Best Practices

  • Use async def + await for I/O-bound operations.
  • Offload CPU-heavy operations to a thread/process pool (e.g., run_in_executor()).
  • Avoid blocking operations like time.sleep() in FastAPI routes.
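For CPU work heavy enough that the GIL matters, a process pool gives true multi-core parallelism while the event loop stays responsive. A minimal sketch (the `heavy` helper and pool size are illustrative):

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

def heavy(n: int) -> int:
    # Pure-Python CPU work: threads share the GIL, separate processes do not
    return sum(i * i for i in range(n))

async def handler(pool: ProcessPoolExecutor, n: int) -> int:
    loop = asyncio.get_running_loop()
    # The event loop stays free while a worker process does the math
    return await loop.run_in_executor(pool, heavy, n)

if __name__ == "__main__":
    # Guard is required so worker processes can import this module safely
    with ProcessPoolExecutor(max_workers=2) as pool:
        print(asyncio.run(handler(pool, 10**5)))
```

In a real app the pool would be created once at startup and reused across requests, since spawning processes per request is expensive.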

Here’s a clear and concise table showing different FastAPI route types, the kind of operation they perform, and whether the request handling is parallel or concurrent:


🧩 FastAPI Route Behavior Comparison

| Route Type | Operation Type | Example Code Snippet | Behavior | Notes |
| --- | --- | --- | --- | --- |
| async def | Async I/O-bound | `await asyncio.sleep(1)` | ✅ Concurrent | Best for DB queries, API calls, file I/O, etc. |
| async def | CPU-bound | `sum(i * i for i in range(10**7))` | ❌ Blocking | Blocks event loop – BAD pattern |
| async def | CPU-bound (offload) | `await loop.run_in_executor(None, cpu_task)` | ✅ Parallel | Offloads to thread pool – does not block event loop |
| async def | CPU-bound (multi-core) | `await loop.run_in_executor(process_pool, cpu_task)` | ✅✅ True Parallel | Uses multiple CPU cores – best for heavy computations |
| def | CPU-bound | `sum(i * i for i in range(10**7))` | ✅ Parallel | Runs in thread pool – doesn't block event loop |
| def | Blocking I/O | `time.sleep(2)` | ❌ Blocking | Wastes threads – avoid blocking I/O in sync functions |

✅ Legend

  • Concurrent: Multiple tasks share the same thread (async I/O).
  • Parallel: Tasks run in separate threads or processes simultaneously.
  • Blocking: One task prevents others from proceeding.


Top comments (1)

Ansil Graves:
This is a good overview, but I feel like it glosses over how thread pools can become a bottleneck themselves, especially with high traffic. Maybe a bit more detail about thread pool limitations and tuning would be helpful?
