I recently hit a production issue where one of our Azure Functions unexpectedly fired hundreds of API calls within seconds, and it sent me down a rabbit hole.
🔍 What caused it?
Node.js was the culprit — or rather, my misunderstanding of how Node.js handles caching and concurrency.
A cold start reset the in-memory cache, and with Azure's high parallelism settings, every parallel invocation missed the cache and triggered its own set of calls. Boom: API surge.
This got me asking...
How does Node.js actually work behind the scenes?
Is Node.js really "single-threaded"?
How does it compare to Java and other backend technologies?
Let’s break it down.
🔧 Node.js Architecture — A Quick Recap
At its core, Node.js is built around:
- 🧵 A single-threaded event loop
- ⏳ A non-blocking I/O model
- 🔧 An internal thread pool (libuv) for some tasks
- 🧠 Optional worker threads for CPU-heavy work
This model shines in high I/O scenarios: chat apps, API gateways, real-time apps, etc.
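To make the non-blocking model concrete, here is a minimal sketch (reading the current file stands in for any I/O; the numbered logs show the ordering):

```js
const fs = require('fs');

console.log('1. start');

// Non-blocking I/O: the file read is handed off (to libuv's thread pool),
// and the callback runs later, once the event loop picks up the completed work.
fs.readFile(__filename, 'utf8', (err, data) => {
  if (err) throw err;
  console.log('3. file read finished:', data.length, 'chars');
});

console.log('2. end of synchronous code');
// Prints 1, 2, 3: the single thread never waits on the disk.
```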
🚦 How Request Handling Works
- I/O-bound requests (e.g., DB reads, file reads) → delegated to non-blocking handlers
- CPU-heavy work (e.g., image processing) → needs special treatment or it blocks the event loop
- Event Loop picks up completed work and executes callbacks
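To see why CPU-heavy work is the exception, the sketch below blocks the event loop with a busy loop, which delays even an already-expired timer:

```js
// Illustrative: synchronous CPU-bound work blocks the single event loop thread.
setTimeout(() => console.log('timer fired'), 0);

const start = Date.now();
while (Date.now() - start < 2000) {
  // busy-wait: stands in for image processing or other CPU-bound work
}
console.log('blocking work done after', Date.now() - start, 'ms');
// Only now does "timer fired" print, roughly 2 seconds late. This is why such
// work belongs in worker threads or a separate service.
```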
⚡ What Makes Node.js Different?
✅ Strengths:
| Feature | Node.js |
| --- | --- |
| Event Loop | Yes |
| Single Threaded | By default |
| Asynchronous by Design | Yes |
| CPU-bound Work | Needs manual scaling |
| Native Multithreading | Now possible via Worker Threads |
| Scales via Cluster or Workers | Yes |
| Best Use | APIs, Microservices, Real-time apps |
🔄 Comparing with Java (and Others)
| Feature | Node.js | Java |
| --- | --- | --- |
| Thread Model | Single (default) + workers | Multi-threaded |
| Default Concurrency | Event Loop | Thread Pools |
| Memory Overhead | Low | Higher |
| Scaling | Manual (cluster/fork/worker) | Automatic via thread pools |
| Heavy CPU Work | Needs worker threads | Native threading |
| Ideal Use | Real-time, async I/O | Complex business logic, CPU-heavy apps |
💡 Java inherently manages threads and thread pools.
Node requires explicit configuration to utilize more CPU cores (via cluster, worker threads, etc.), as sketched below.
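For example, a minimal cluster sketch that forks one process per core (the port and response body are placeholders):

```js
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isPrimary) {
  // Fork one worker process per CPU core; each worker runs its own event loop.
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
} else {
  http.createServer((req, res) => {
    res.end(`handled by worker ${process.pid}\n`);
  }).listen(3000);
}
```

(`cluster.isPrimary` is the Node 16+ name; older versions call it `cluster.isMaster`.)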
🛠️ Worker Threads in Node.js — Multithreading in Action
```js
const { Worker } = require('worker_threads');

// Wraps a worker's lifecycle in a Promise: resolve on the first message,
// reject on an error or a non-zero exit code.
function runWorker(path, data) {
  return new Promise((resolve, reject) => {
    const worker = new Worker(path, { workerData: data });
    worker.on('message', resolve);
    worker.on('error', reject);
    worker.on('exit', (code) => {
      if (code !== 0) reject(new Error(`Worker stopped with code ${code}`));
    });
  });
}

// Usage:
runWorker('./heavyTask.js', { payload: 123 }).then(console.log);
```
Using this, you can spawn real threads; in practice, cap them at roughly the number of CPU cores available on the system.
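The worker file itself isn't shown here, but a hypothetical `heavyTask.js` would read `workerData` and post its result back to the main thread, something like:

```js
// heavyTask.js (illustrative only; the real task isn't part of this article)
const { parentPort, workerData } = require('worker_threads');

// Simulate CPU-heavy work on the data passed in via workerData.
let result = 0;
for (let i = 0; i < 1e8; i++) {
  result += i % (workerData.payload || 1);
}

// Sending a message back resolves the Promise in runWorker.
parentPort.postMessage({ payload: workerData.payload, result });
```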
🤯 Real-World Problem: Azure + Node.js + Cold Start = Disaster
In my recent experience:
- Node.js Azure Function used in-memory cache
- Cold start triggered cache reset
- High degree of parallelism (16 threads!) meant every invocation missed the cache
- Result: hundreds of external API calls = system overload
This led me to study Node’s internals more deeply and compare with Java, which would’ve used:
- JVM-level thread pooling
- Shared memory space
- Potentially avoided the burst entirely
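For illustration only (not the fix applied in this incident): in Node, one common way to blunt this kind of burst is to let concurrent cache misses share a single in-flight request instead of each firing its own API call. `fetchFromApi` and `cache` below are hypothetical placeholders:

```js
// Memoize the in-flight Promise per key so parallel invocations that all
// miss the cache trigger only one upstream call.
const inFlight = new Map();

async function getCached(key, fetchFromApi, cache) {
  if (cache.has(key)) return cache.get(key);

  if (!inFlight.has(key)) {
    inFlight.set(
      key,
      fetchFromApi(key)
        .then((value) => {
          cache.set(key, value);
          return value;
        })
        .finally(() => inFlight.delete(key))
    );
  }
  return inFlight.get(key);
}
```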
📊 Thread Utilization and Scaling
- A system with 8 physical cores and 16 logical processors can run up to 16 threads in parallel
- But the OS and other system processes need CPU time too, so leave 2–4 logical processors for them
- That leaves roughly 12–14 threads for your application (Java threads or Node workers)
In Node.js:
```js
const os = require('os');
const usableThreads = os.cpus().length - 2; // e.g., 14 on a 16-logical-CPU machine
```
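Building on the `runWorker` helper from earlier, a rough (hypothetical) way to respect that cap when fanning out work:

```js
// Process `tasks` (a placeholder array of workerData objects) in batches
// no larger than the usable thread count.
async function runAll(tasks) {
  const results = [];
  for (let i = 0; i < tasks.length; i += usableThreads) {
    const batch = tasks
      .slice(i, i + usableThreads)
      .map((data) => runWorker('./heavyTask.js', data));
    results.push(...(await Promise.all(batch)));
  }
  return results;
}
```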
📌 When Should You Choose Node.js?
Choose Node.js if:
- You’re building real-time apps (chat, WebSocket)
- You need API gateways, proxies, or high I/O throughput
- You can split heavy work into services or offload to queues
Avoid Node.js if:
- Your app involves heavy CPU-bound synchronous logic
- You need robust multithreading natively
- You prefer strongly typed OOP-heavy codebases
✅ Final Thoughts
- Node.js isn’t just single-threaded — it’s smartly single-threaded, with options to scale via worker threads or clusters.
- But those choices are manual, and you must understand the architecture to avoid pitfalls like I did on Azure.
- If you’re choosing a backend tech stack, factor in:
  - Type of load: I/O-heavy vs CPU-heavy
  - Scaling model: thread-based or event-driven
  - Operational complexity