Understanding Node.js, the Event Loop and the Safe Use of Singletons

Node.js is well-known for its single-threaded event loop, but that doesn't mean it's limited to handling one request at a time. In fact, Node.js is designed to handle high concurrency efficiently - without spawning a new thread per request. Let's explore how it works, and how to safely use singleton patterns in this environment.

🌀 How Node.js Handles Concurrency

Node.js uses a single-threaded event loop, but it offloads time-consuming operations (disk I/O, database queries, network requests) to the background via the libuv library - network sockets use the operating system's non-blocking I/O, while file system work and DNS lookups run on libuv's thread pool. This allows the main thread to remain responsive.
Here's what happens when a request comes in:
1- Node.js begins processing the request.
2- If it encounters an asynchronous operation, it delegates the work to libuv's thread pool or the operating system.
3- While the async task is running, the event loop continues with other requests.
4- When the async task completes, its callback is pushed to the event queue.
5- The event loop picks it up when ready and executes it.
This design allows Node.js to serve a very large number of requests concurrently - as long as the tasks are non-blocking and stateless. A minimal sketch of this flow follows.
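Here is a small, self-contained sketch of those steps (the file name and numbered log messages are illustrative, not from this article; the exact interleaving of steps 3 and 4 depends on timing):

// event-loop-demo.js (hypothetical) - the async read is delegated to libuv,
// so the main thread keeps running while the file is read in the background.
const fs = require('fs');

console.log('1. Request received');

// Step 2: the read is handed off to libuv's thread pool.
fs.readFile(__filename, 'utf8', (err, data) => {
  if (err) throw err;
  // Steps 4-5: the callback is queued and later executed by the event loop.
  console.log('4. File read finished, callback executed');
});

// Step 3: the event loop keeps serving other work in the meantime.
setImmediate(() => console.log('3. Other callbacks still run'));

console.log('2. Main thread continues without waiting');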
✅ Safe Singleton Example
Consider the following configuration class:

// config.js
class Config {
  constructor() {
    this.settings = { dbHost: 'localhost', dbPort: 3306 };
  }

  get(key) {
    return this.settings[key];
  }
}

const config = new Config();
module.exports = config;

This singleton is instantiated once, the first time the module is required; Node's module cache then hands the same instance to every file that imports it.
Now, even if a million requests call config.get('dbHost'), there's no problem because:
- It's a read-only operation.
- It accesses in-memory data.
- It doesn't involve any I/O.
- It doesn't mutate shared state.

Each call is fast, safe, and independent.
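A quick way to confirm the shared instance - a hedged sketch that relies only on Node's module cache and assumes it runs next to config.js:

// Node caches modules by resolved path, so both requires return the
// exact same Config instance created in config.js.
const configA = require('./config');
const configB = require('./config');

console.log(configA === configB);   // true - one shared singleton
console.log(configA.get('dbHost')); // 'localhost'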

📦 Real-World Example Using Express

const express = require('express');
const app = express();
const config = require('./config');

app.get('/', (req, res) => {
  const dbHost = config.get('dbHost');
  res.send(`Database host is ${dbHost}`);
});

app.listen(3000, () => {
  console.log('Server is running on port 3000');
});

If this server receives 1 million requests:
- Each request calls config.get('dbHost').
- That's just a simple in-memory lookup.
- The event loop stays unblocked and efficient.
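One way to check that claim under load - a hedged sketch using Node's built-in perf_hooks, not something from the original example - is to sample the event loop delay while the server is being hit:

// Samples event loop delay with the built-in histogram (Node 12+).
// If handlers only do in-memory work like config.get(), the mean stays low.
const { monitorEventLoopDelay } = require('perf_hooks');

const histogram = monitorEventLoopDelay({ resolution: 20 });
histogram.enable();

setInterval(() => {
  // histogram.mean is in nanoseconds; convert to milliseconds for readability.
  console.log(`event loop delay ~${(histogram.mean / 1e6).toFixed(2)} ms`);
  histogram.reset();
}, 5000);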

⚠️ When a Singleton Becomes a Problem

Singletons are not always safe. Here are two common pitfalls:
a) Blocking I/O Operations

const fs = require('fs');

class Logger {
  constructor() {
    this.logFile = fs.createWriteStream('./app.log');
  }

  log(message) {
    this.logFile.write(`${new Date().toISOString()} ${message}\n`);
  }
}

- All requests share the same file stream.
- Every write is I/O-bound, and the code ignores backpressure, so the stream's internal buffer can grow without limit.
- Under high load this degrades memory use and throughput, and a synchronous variant would block the event loop outright.
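For contrast, here is what a genuinely blocking version would look like (a hedged sketch, not code from the article) - every request waits for the disk before the handler can continue:

const fs = require('fs');

class BlockingLogger {
  log(message) {
    // appendFileSync blocks the event loop until the write reaches the disk,
    // so no other request can be processed during the call.
    fs.appendFileSync('./app.log', `${new Date().toISOString()} ${message}\n`);
  }
}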

b) Shared Mutable State

class EmailSender {
  constructor() {
    this.recipients = [];
  }

  setRecipients(list) {
    this.recipients = list;
  }

  send(message) {
    this.recipients.forEach(email => {
      console.log(`Sending "${message}" to ${email}`);
    });
  }
}

If two requests interleave across an async boundary - one calls setRecipients(), yields to the event loop, and the other calls setRecipients() before the first one sends - the second list overwrites the first.
This leads to race conditions and emails going to the wrong recipients.
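A sketch of how that interleaving happens in an Express handler (loadTemplate() is a hypothetical async helper, used only to create the gap between the two calls):

const emailer = new EmailSender(); // one shared instance

app.post('/notify', async (req, res) => {
  emailer.setRecipients(req.body.recipients);

  // Any await yields the event loop, so another request can run here
  // and replace emailer.recipients with its own list...
  await loadTemplate(req.body.templateId); // hypothetical async helper

  // ...which means this send() may target the other request's recipients.
  emailer.send(req.body.message);
  res.send('Emails sent!');
});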

🛠️ Solutions for Singleton Pitfalls

a) Async Logging with Batching

const fs = require('fs');
const { setImmediate } = require('timers');

class AsyncLogger {
  constructor() {
    this.logQueue = [];
    this.processing = false;
    this.logFile = './app.log';
  }

  log(message) {
    this.logQueue.push(`${new Date().toISOString()} ${message}\n`);
    if (!this.processing) {
      this.processQueue();
    }
  }

  async processQueue() {
    this.processing = true;

    while (this.logQueue.length > 0) {
      const batch = this.logQueue.splice(0, 100);
      await fs.promises.appendFile(this.logFile, batch.join(''));
      await new Promise(resolve => setImmediate(resolve));
    }

    this.processing = false;
  }
}

- Queues log messages instead of writing immediately
- Batches writes to minimize I/O overhead
- Uses setImmediate() to yield control to the event loop between batches
- Scales better under load without blocking
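Exported as a singleton, it is used the same way as config.js above (the file name logger.js is an assumption for illustration):

// logger.js (hypothetical) - one shared AsyncLogger for the whole app
module.exports = new AsyncLogger();

// In a request handler elsewhere:
// const logger = require('./logger');
// logger.log(`GET / handled for ${req.ip}`); // returns immediately; the write is queued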

b) Avoid Shared State with AsyncLocalStorage

const { AsyncLocalStorage } = require('async_hooks');

class ThreadSafeEmailSender {
  constructor() {
    this.localContext = new AsyncLocalStorage();
  }

  withRecipients(list, callback) {
    this.localContext.run({ recipients: list }, callback);
  }

  send(message) {
    const store = this.localContext.getStore();
    if (!store || !store.recipients) {
      throw new Error('No recipients set for this context');
    }

    store.recipients.forEach(email => {
      console.log(`Sending "${message}" to ${email}`);
    });
  }
}
// Usage in Express
const emailer = new ThreadSafeEmailSender();

app.post('/send-email', (req, res) => {
  emailer.withRecipients(req.body.recipients, () => {
    emailer.send(req.body.message);
    res.send('Emails sent!');
  });
});

- Creates isolated contexts for each request
- Prevents shared state conflicts
- Maintains the singleton pattern safely

🚀 Conclusion

Node.js excels at handling high concurrency as long as your code avoids blocking and shared state. Singletons are powerful - but must be used with care:
✅ Safe: Read-only, in-memory, non-blocking
⚠️ Unsafe: Blocking I/O or shared mutable state
Use patterns like queuing, batching, and context isolation to keep your app scalable and performant.

Follow me for more:

https://x.com/DevUnionX


Top comments (1)

Sawyer Wolfe

Great overview! I appreciate the clear examples. Have you found any particular challenges when using AsyncLocalStorage in larger apps?