Mofajjal Rasul
Node.js Memory Apocalypse: Why Your App Dies on Big Files (And How to Stop It Forever)

Your Node.js script works perfectly with test data. Then you feed it a real 10GB log file. Suddenly: crash. No warnings, just ENOMEM. Here's why even seasoned developers make this mistake—and the bulletproof solution.


The Root of All Evil: fs.readFile

fs.readFile is the equivalent of dumping a dump truck’s contents into your living room. It loads every single byte into RAM before you can touch it. Observe:

// Processing a 3GB database dump? Enjoy 3GB of RAM usage
const fs = require('fs');

fs.readFile('./mega-database.sql', 'utf8', (err, data) => {
  if (err) throw err;
  parseSQL(data); // Hope you have 3GB to spare
});
  • CLI tools crash processing large CSVs
  • Data pipelines implode on video files
  • Background services die silently at 3AM

This isn’t “bad code”—it’s how fs.readFile operates. And it’s why your production system fails catastrophically.


Streams: The Memory Ninja Technique

Streams process data like a conveyor belt—small chunks enter, get processed, then leave memory forever. No RAM explosions:

// Process a 100GB file with ~50MB of memory
const fs = require('fs');
const stream = fs.createReadStream('./giant-dataset.csv');

stream.on('data', (chunk) => {
  analyzeChunk(chunk); // Work with 64KB-1MB pieces
});

stream.on('end', () => {
  console.log('Processed entire file without going nuclear');
});

Real-World Massacre: File Processing

The Suicide Approach (Common Mistake)

// Data import script that crashes on big files
const fs = require('fs');

function importUsers() {
  fs.readFile('./users.json', 'utf8', (err, data) => {
    if (err) throw err;
    JSON.parse(data).forEach(insertIntoDatabase); // 💀 whole file in RAM
  });
}

The Stream Survival Guide (Correct Way)

// Process a 50GB NDJSON file (one JSON object per line) without memory issues
const { Transform } = require('stream');
const jsonParser = new Transform({
  readableObjectMode: true,
  transform(chunk, enc, cb) { // NOTE: a robust version buffers partial lines
    chunk.toString().split('\n').filter(Boolean)
      .forEach((line) => this.push(JSON.parse(line)));
    cb();
  },
});
fs.createReadStream('./users.ndjson').pipe(jsonParser)
  .on('data', (user) => insertIntoDatabase(user));

Streams handle terabyte-scale files like they’re nothing. Your app stays responsive—no OOM crashes.


Pro Tip: Streams + Pipelines = Unstoppable

Combine streams with Node.js’ pipeline for error-proof processing:

const { pipeline } = require('stream');
const fs = require('fs');
const zlib = require('zlib');

// Compress 20GB log file with constant memory
pipeline(
  fs.createReadStream('./server.log'),
  zlib.createGzip(), // Compress chunk-by-chunk
  fs.createWriteStream('./server.log.gz'),
  (err) => {
    if (err) console.error('Pipeline failed:', err);
    else console.log('Compressed 20GB file like a boss');
  }
);

When to Use Which Weapon

fs.readFile (Handle With Care):

  • Configuration files (<5MB)
  • Small static assets (icons, tiny JSON)
  • Only when you absolutely need all data in memory

fs.createReadStream (Default Choice):

  • Log processing
  • Media encoding/transcoding
  • Database imports/exports
  • Any file bigger than your phone’s RAM

The Harsh Truth

fs.readFile is Node.js’ version of a loaded gun—safe in controlled environments, deadly in production. Streams aren’t “advanced” techniques; they’re essential survival skills for any serious Node.js developer.

Next time you write fs.readFile, ask: “Will this file ever grow?” If the answer is yes (and it always is), you’ve just found your memory leak. Switch to streams—before your pager goes off at midnight.
