Summary

Node.js v18 introduces the standardized Web Streams API, providing ReadableStream, WritableStream, and TransformStream globals alongside utilities to convert existing Node.js streams. This unifies streaming code across browser and server environments and simplifies flow control, backpressure handling, and data transformation (nodejs.org).

1. Introduction

For years, Node.js applications relied on the callback- and event-based stream module to process data in chunks. While powerful, it differs from the WHATWG Streams spec used by browsers. Node.js v18 bridges that gap by shipping the Web Streams API natively, allowing identical streaming code to run in browser, Deno, and server contexts (2ality.com, nodejs.org).

2. Web Streams Basics in Node.js v18

2.1 Global Constructors

  • ReadableStream: A source of data chunks, with methods like getReader() and async iteration.
  • WritableStream: A destination for data, exposing getWriter().
  • TransformStream: A pair of a readable and writable stream that applies a transform algorithm. (nodejs.org)
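As a quick illustration of these constructors (names here are invented for illustration; no imports are needed in Node.js v18, where the constructors are global), a stream can be built and then drained with `for await…of`, since Node's ReadableStream is async-iterable:

```javascript
// A ReadableStream that emits three chunks, then closes.
const stream = new ReadableStream({
  start(controller) {
    for (const word of ['node', 'web', 'streams']) controller.enqueue(word);
    controller.close();
  }
});

// In Node.js, ReadableStream is async-iterable, so getReader() is optional.
const chunks = [];
for await (const chunk of stream) chunks.push(chunk);
console.log(chunks); // ['node', 'web', 'streams']
```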

2.2 Core Methods & Piping

  1. Reading:

    const reader = readableStream.getReader();
    const { value, done } = await reader.read();
    
  2. Writing:

    const writer = writableStream.getWriter();
    await writer.write(chunk);
    await writer.close();
    
  3. Pipe Through: chain transforms elegantly:

    const upperCase = new TransformStream({
      transform(chunk, controller) {
        controller.enqueue(chunk.toUpperCase());
      }
    });
    const res = await fetch('/large.txt');
    await res.body
      .pipeThrough(new TextDecoderStream()) // res.body yields bytes; decode to strings first
      .pipeThrough(upperCase)
      .pipeTo(fileWritable); // fileWritable: any WritableStream destination
    (nodejs.org)
    
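The same chaining works without fetch. A minimal, self-contained sketch (all names invented for illustration) that pipes a source through the upperCase transform into a collecting sink:

```javascript
const upperCase = new TransformStream({
  transform(chunk, controller) { controller.enqueue(chunk.toUpperCase()); }
});

const source = new ReadableStream({
  start(controller) {
    controller.enqueue('hello ');
    controller.enqueue('streams');
    controller.close();
  }
});

let result = '';
const sink = new WritableStream({
  write(chunk) { result += chunk; }
});

// pipeTo resolves once the source is drained and the sink is closed.
await source.pipeThrough(upperCase).pipeTo(sink);
console.log(result); // 'HELLO STREAMS'
```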

3. Converting Between Node Streams and Web Streams

Node.js v18 provides helper methods on the node:stream module to interoperate:

import { Readable, Writable } from 'node:stream';

// Node → Web
const webReadable = Readable.toWeb(nodeReadable);
const webWritable = Writable.toWeb(nodeWritable);

// Web → Node
const nodeReadable2 = Readable.fromWeb(webReadable);
const nodeWritable2 = Writable.fromWeb(webWritable);

Under the hood, these methods wrap existing stream implementations and wire up backpressure signals (nodejs.org).
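As a round-trip sketch (variable names illustrative), a Node Readable built from an array becomes a web ReadableStream that can be drained with a reader:

```javascript
import { Readable } from 'node:stream';

// Node → Web: wrap a Node Readable in a web ReadableStream.
const nodeReadable = Readable.from(['a', 'b', 'c']);
const webReadable = Readable.toWeb(nodeReadable);

// Drain it chunk by chunk with a reader.
const reader = webReadable.getReader();
const parts = [];
for (;;) {
  const { value, done } = await reader.read();
  if (done) break;
  parts.push(value);
}
console.log(parts.join('')); // 'abc'
```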

4. Use Cases & Examples

4.1 Streaming File Transformation

Compress a large CSV on-the-fly:

import fs from 'node:fs';
import { Readable, Writable } from 'node:stream';
import { CompressionStream } from 'node:stream/web';

const fileIn = fs.createReadStream('data.csv');
const webIn = Readable.toWeb(fileIn);
const gzipStream = new CompressionStream('gzip');
const webOut = webIn.pipeThrough(gzipStream);
const fileOut = fs.createWriteStream('data.csv.gz');
await webOut.pipeTo(Writable.toWeb(fileOut));
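The reverse direction uses DecompressionStream. A small in-memory round trip, assuming the Node.js v18 globals Blob, CompressionStream, and DecompressionStream (no files involved):

```javascript
// Compress: string → bytes → gzip
const compressed = new Blob(['hello web streams']).stream()
  .pipeThrough(new CompressionStream('gzip'));

// Decompress: gzip → bytes → string
const decompressed = compressed.pipeThrough(new DecompressionStream('gzip'));

let text = '';
const decoder = new TextDecoder();
for await (const chunk of decompressed) text += decoder.decode(chunk, { stream: true });
text += decoder.decode(); // flush any buffered multi-byte sequence
console.log(text); // 'hello web streams'
```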

4.2 Backpressure-Controlled Uploads

const uploadStream = new WritableStream({
  write(chunk) { /* send chunk to remote */ }
});
await webReadable.pipeTo(uploadStream);

// Or hand the stream straight to fetch; Node's fetch requires duplex: 'half'
// when the request body is a stream:
fetch('/upload', { method: 'POST', body: webReadable, duplex: 'half' })
  .then(res => console.log('Done'));

The Web Streams pipeline automatically pauses reading when the upload is slow, avoiding memory blowups (nodejs.org).
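Backpressure can also be honored explicitly when writing by hand: awaiting writer.ready blocks until the sink has capacity. A sketch with a deliberately slow sink (all names illustrative):

```javascript
const written = [];
const slowSink = new WritableStream({
  async write(chunk) {
    await new Promise(resolve => setTimeout(resolve, 10)); // simulate a slow consumer
    written.push(chunk);
  }
}, new CountQueuingStrategy({ highWaterMark: 1 })); // queue at most one chunk

const writer = slowSink.getWriter();
for (const chunk of ['a', 'b', 'c']) {
  await writer.ready;        // wait until the sink has capacity
  await writer.write(chunk); // resolves when the chunk is processed
}
await writer.close();
console.log(written); // ['a', 'b', 'c']
```

pipeTo() performs this ready/write dance for you; reaching for getWriter() directly is mainly useful when chunks are produced on demand rather than read from another stream.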

5. Pitfalls & Tips

  • Legacy Streams: Some APIs still return Node streams; convert them with toWeb()/fromWeb() before mixing with Web Streams.
  • Performance: Web Streams carry slightly more overhead due to spec compliance; for ultra-low-latency paths, benchmark both APIs (nodejs.org).
  • Backpressure Semantics: Web Streams propagate backpressure differently than Node’s classic streams; familiarize yourself with the differences to avoid subtle bugs.

6. Full Example Walkthrough

Build a simple CSV line counter:

import fs from 'node:fs';
import { Readable } from 'node:stream';

// 1. Node stream → Web stream
const fileStream = fs.createReadStream('big.csv');
const webStream = Readable.toWeb(fileStream);

// 2. Split lines via TransformStream (buffering partial lines across chunks)
const decoder = new TextDecoder();
let carry = '';
const lineSplitter = new TransformStream({
  transform(chunk, controller) {
    const lines = (carry + decoder.decode(chunk, { stream: true })).split(/\r?\n/);
    carry = lines.pop(); // last piece may be an incomplete line
    for (const line of lines) controller.enqueue(line);
  },
  flush(controller) { if (carry) controller.enqueue(carry); }
});

// 3. Count lines
let count = 0;
const counter = new WritableStream({
  write(line) { if (line) count++; }
});
await webStream.pipeThrough(lineSplitter).pipeTo(counter);
console.log('Lines:', count);

7. Conclusion & Further Reading

The Web Streams API in Node.js v18 unifies streaming patterns across environments, simplifies pipeline authoring, and standardizes backpressure. While some legacy modules still rely on old streams, conversion helpers make adoption seamless. For more: