Skip to main content

Command Palette

Search for a command to run...

Stop Using JSON.parse on Huge Payloads: Streaming JSON in Node.js

Let's handle massive JSON data in Node.js using streams instead of blocking your server with JSON.parse()

Updated
3 min read
Stop Using JSON.parse on Huge Payloads: Streaming JSON in Node.js

Think JSON.parse() is harmless? Think again.

When you're dealing with massive logs, analytics dumps, or API exports, that innocent line of code can quietly bring your Node.js app to its knees — eating up memory, blocking the event loop, and even crashing your server.

In this post, I'll show you the smarter, scalable way to handle huge JSON payloads using streaming parsers that process data piece by piece — no memory explosions, no drama.

🚨 Introduction: The Hidden Danger of JSON.parse()

  • Everyone uses JSON.parse(), but it loads the entire payload into memory before parsing.

  • This is fine for small payloads, but deadly for:

    • Logs

    • API exports

    • Analytics events

    • Large nested objects

Result?

  • High memory usage

  • Long GC pauses

  • App crashes or OOM (Out of Memory) errors

🔍 Why JSON.parse() Fails at Scale

const data = fs.readFileSync('huge.json');
const parsed = JSON.parse(data); // 💣 if file is >200MB+
  • Memory spikes because the entire buffer is loaded before parsing

  • Node.js event loop blocked → server becomes unresponsive

💡 The Solution: Streaming JSON Parsing

Meet the heroes:

They allow:

  • Chunked reading of JSON

  • Event-based parsing of deeply nested objects

  • Low memory footprint

🛠️ Example with stream-json

📁 Sample: huge.json

{
  "users": [
    { "id": 1, "name": "Alice" },
    { "id": 2, "name": "Bob" },
    ...
  ]
}

✅ Stream it:

const fs = require('fs');
const { parser } = require('stream-json');
const { streamArray } = require('stream-json/streamers/StreamArray');

const stream = fs.createReadStream('huge.json')
  .pipe(parser())
  .pipe(streamArray());

stream.on('data', ({ value }) => {
  console.log('User:', value);
});

stream.on('end', () => {
  console.log('Done processing large JSON.');
});

🎯 Result:

  • Processes one item at a time

  • Uses minimal memory

  • Ideal for ETL, analytics, or data migration scripts

📉 Performance Benchmark

MethodMemory UsedTime Taken
JSON.parse()500MB+8s
stream-json~50MB9s

✅ Slightly slower, but way safer and much more scalable

🧠 When to Use Streaming JSON Parsing

Use streaming if:

  • File or payload size > 10MB

  • You’re only processing parts of the JSON

  • You're building:

    • Log processors

    • Data transformers

    • API data sync tools

🚫 Gotchas & Best Practices

  • Streaming JSON needs well-formed JSON (no trailing commas, etc.)

  • Use libraries like stream-json with known structure (e.g., arrays or object roots)

  • You can combine stream-json with Transform streams for batch processing

Conclusion: Stream Don’t Crash

If you’re working with large JSON files, stop loading them all at once.

Use stream-based parsing for safer, cleaner, and more scalable Node.js apps.

📣 Call to Action:

"Have you ever crashed a Node app because of a giant JSON? Let’s hear your horror stories—or your fixes—in the comments. 😅"

Scaling JavaScript & Node.js

Part 8 of 16

Learn how to build scalable, high-performance applications in JavaScript and Node.js — this series breaks down real-world solutions for handling large datasets, improving API speed, and writing production-grade backend code.

Up next

Offset vs Cursor vs Keyset Pagination: Best Practices for Scalable APIs

How to paginate large datasets efficiently using Offset, Cursor, and Keyset strategies—complete with pros, cons, and real-world use cases