
readable-stream

Package overview

  • Dependencies: 0
  • Maintainers: 1
  • Versions: 103
  • Install size: 21.0 kB

An exploration of a new kind of readable streams for Node.js
Package description

What is readable-stream?

The readable-stream package is a userland stream module, compatible with the built-in stream module provided by Node.js. It offers the same interface and functionality as the native module, but with additional updates and bug fixes. It is particularly useful for ensuring consistent stream behavior across different Node.js versions.

What are readable-stream's main functionalities?

Creating a readable stream

This feature allows you to create a readable stream that you can pipe to other streams or consume manually. The 'read' method is called when the stream wants to pull more data.

const { Readable } = require('readable-stream');
const myReadableStream = new Readable({
  read(size) {
    this.push('some data');
    this.push(null); // No more data
  }
});
myReadableStream.on('data', (chunk) => {
  console.log(chunk.toString());
});

Creating a writable stream

This feature allows you to create a writable stream where you can write data. The 'write' method is called when the stream receives data to write.

const { Writable } = require('readable-stream');
const myWritableStream = new Writable({
  write(chunk, encoding, callback) {
    process.stdout.write(chunk);
    callback();
  }
});
process.stdin.pipe(myWritableStream);

Creating a transform stream

This feature allows you to create a transform stream that can modify data as it is read from a readable stream before it is written to a writable stream.

const { Transform } = require('readable-stream');
const myTransformStream = new Transform({
  transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});
process.stdin.pipe(myTransformStream).pipe(process.stdout);

Creating a duplex stream

This feature allows you to create a duplex stream that is both readable and writable. It can be used to read data from one source and write to another.

const { Duplex } = require('readable-stream');
const myDuplexStream = new Duplex({
  read(size) {
    this.push('data from read method');
    this.push(null);
  },
  write(chunk, encoding, callback) {
    console.log(chunk.toString());
    callback();
  }
});
myDuplexStream.on('data', (chunk) => {
  console.log(chunk.toString());
});
myDuplexStream.write('data for write method');


Readme


readable-stream

Stability: 1 - Experimental

An exploration of a new kind of readable streams for Node.js

This is an abstract class designed to be extended. It also provides a wrap method that adapts streams with the older "readable stream" interface of Node 0.8 and before to this simpler readable API.

Usage

var Readable = require('readable-stream');
var r = new Readable();

r.read = function(n) {
  // your magic goes here.
  // return n bytes, or null if there is nothing to be read.
  // if you return null, then you MUST emit 'readable' at some
  // point in the future if there are bytes available, or 'end'
  // if you are not going to have any more data.
  //
  // You MUST NOT emit either 'end' or 'readable' before
  // returning from this function, but you MAY emit 'end' or
  // 'readable' in process.nextTick().
};

r.on('end', function() {
  // no more bytes will be provided.
});

r.on('readable', function() {
  // now is the time to call read() again.
});

Justification

Writable streams in Node are very straightforward to use and extend. The write() method returns true if the bytes were completely handled and another write may be performed immediately, or false if the writer should back off; in the latter case, a drain event at some point in the future signals that writing may continue. The end() method lets the user indicate that no more bytes will be written. That is pretty much the entire required interface for writing.

However, readable streams in Node 0.8 and before are rather complicated.

  1. The data events start coming right away, no matter what. There is no way to do other actions before consuming data, without handling buffering yourself.
  2. If you extend the interface in userland programs, then you must implement pause() and resume() methods, and take care of buffering yourself.

So, while writers only have to implement write() and end() and emit drain, readers have to implement (at minimum):

  • pause() method
  • resume() method
  • data event
  • end event

If you are using a readable stream, and want to just get the first 10 bytes, make a decision, and then pass the rest off to somewhere else, then you have to handle buffering, pausing, and so on. This is all rather brittle and easy to get wrong for all but the most trivial use cases.

Additionally, this all made the reader.pipe(writer) method unnecessarily complicated and difficult to extend without breaking something. Backpressure and error handling are especially challenging and brittle.

Solution

The reader does not have pause/resume methods. If you want to consume the bytes, you call read(). If bytes are not being consumed, then effectively the stream is in a paused state. It exerts backpressure on upstream connections, doesn't read from files, etc.

If read() returns null, then a future readable event will be fired when there are more bytes ready to be consumed.

This is simpler and conceptually closer to the underlying mechanisms. The resulting pipe() method is much shorter and simpler.

Compatibility

It's not particularly difficult to wrap older-style streams in this new interface, or to wrap this type of stream in the older-style interface.

The Readable class takes an argument which is an old-style stream with data events and pause() and resume() methods, and uses that as the data source. For example:

var r = new Readable(oldReadableStream);

// now you can use r.read(), and it will emit 'readable' events

The Readable class will also automatically convert into an old-style data-emitting stream if any listeners are added to the data event. So, this works fine, though you of course lose a lot of the benefits of the new interface:

var r = new ReadableThing();

r.on('data', function(chunk) {
  // ...
});

// now pause, resume, etc. are patched into place, and r will
// continually call read() until it returns null, emitting the
// returned chunks in 'data' events.

r.on('end', function() {
  // ...
});


Last updated on 27 Jul 2012
