readable-stream
Advanced tools
Package description
The readable-stream package is a userland stream module, compatible with the built-in stream module provided by Node.js. It offers the same interface and functionality as the native module, but with additional updates and bug fixes. It is particularly useful for ensuring consistent stream behavior across different Node.js versions.
Creating a readable stream
This feature allows you to create a readable stream that you can pipe to other streams or consume manually. The 'read' method is called when the stream wants to pull more data.
```js
const { Readable } = require('readable-stream');

const myReadableStream = new Readable({
  read(size) {
    this.push('some data');
    this.push(null); // No more data
  }
});

myReadableStream.on('data', (chunk) => {
  console.log(chunk.toString());
});
```
Creating a writable stream
This feature allows you to create a writable stream where you can write data. The 'write' method is called when the stream receives data to write.
```js
const { Writable } = require('readable-stream');

const myWritableStream = new Writable({
  write(chunk, encoding, callback) {
    process.stdout.write(chunk);
    callback();
  }
});

process.stdin.pipe(myWritableStream);
```
Creating a transform stream
This feature allows you to create a transform stream that can modify data as it is read from a readable stream before it is written to a writable stream.
```js
const { Transform } = require('readable-stream');

const myTransformStream = new Transform({
  transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});

process.stdin.pipe(myTransformStream).pipe(process.stdout);
```
Creating a duplex stream
This feature allows you to create a duplex stream that is both readable and writable. It can be used to read data from one source and write to another.
```js
const { Duplex } = require('readable-stream');

const myDuplexStream = new Duplex({
  read(size) {
    this.push('data from read method');
    this.push(null);
  },
  write(chunk, encoding, callback) {
    console.log(chunk.toString());
    callback();
  }
});

myDuplexStream.on('data', (chunk) => {
  console.log(chunk.toString());
});

myDuplexStream.write('data for write method');
```
Through2 is a tiny wrapper around Node.js's stream.Transform that makes it easier to create transform streams. It is similar to readable-stream's Transform, but with a simpler API for the most common use cases.
Highland.js manages synchronous and asynchronous code easily, using nothing more than standard JavaScript and Node-like streams. It is more functional in nature compared to readable-stream and provides a higher level abstraction for handling streams.
Stream-browserify is a browser-compatible version of Node.js' core stream module, similar to readable-stream. It allows the use of Node.js-style streams in the browser, but it is specifically designed to polyfill the native Node.js stream module for browser use.
Buffer List (bl) is a storage object for collections of Node Buffers, which can be used with streams. Unlike readable-stream, it focuses on buffering and manipulating binary data rather than providing the stream API itself.
Readme
Stability: 1 - Experimental
An exploration of a new kind of readable streams for Node.js
This is an abstract class designed to be extended. It also provides a `wrap` method that you can use to provide the simpler readable API for streams that have the "readable stream" interface of Node 0.8 and before.
```js
var Readable = require('readable-stream');

var r = new Readable();

r.read = function(n) {
  // your magic goes here.
  // return n bytes, or null if there is nothing to be read.
  // if you return null, then you MUST emit 'readable' at some
  // point in the future if there are bytes available, or 'end'
  // if you are not going to have any more data.
  //
  // You MUST NOT emit either 'end' or 'readable' before
  // returning from this function, but you MAY emit 'end' or
  // 'readable' in process.nextTick().
};

r.on('end', function() {
  // no more bytes will be provided.
});

r.on('readable', function() {
  // now is the time to call read() again.
});
```
Writable streams in Node are very straightforward to use and extend. The `write` method either returns `true` if the bytes could be completely handled and another `write` should be performed, or `false` if you would like the user to back off a bit, in which case a `drain` event at some point in the future will let them continue writing. The `end()` method lets the user indicate that no more bytes will be written. That's pretty much the entire required interface for writing.
However, readable streams in Node 0.8 and before are rather complicated. `data` events start coming right away, no matter what; there is no way to do other actions before consuming data without handling buffering yourself. You must implement `pause()` and `resume()` methods, and take care of buffering yourself.

So, while writers only have to implement `write()`, `end()`, and `drain`, readers have to implement (at minimum):

- a `pause()` method
- a `resume()` method
- a `data` event
- an `end` event

If you are using a readable stream, and want to just get the first 10 bytes, make a decision, and then pass the rest off to somewhere else, then you have to handle buffering, pausing, and so on. This is all rather brittle and easy to get wrong for all but the most trivial use cases.
Additionally, this all made the `reader.pipe(writer)` method unnecessarily complicated and difficult to extend without breaking something. Backpressure and error handling are especially challenging and brittle.
The reader does not have pause/resume methods. If you want to consume the bytes, you call `read()`. If bytes are not being consumed, then effectively the stream is in a paused state. It exerts backpressure on upstream connections, doesn't read from files, etc.
If `read()` returns `null`, then a future `readable` event will be fired when there are more bytes ready to be consumed.

This is simpler and conceptually closer to the underlying mechanisms. The resulting `pipe()` method is much shorter and simpler.
It's not particularly difficult to wrap older-style streams in this new interface, or to wrap this type of stream in the older-style interface.
The `Readable` class takes an argument which is an old-style stream with `data` events and `pause()` and `resume()` methods, and uses that as the data source. For example:

```js
var r = new Readable(oldReadableStream);

// now you can use r.read(), and it will emit 'readable' events
```
The `Readable` class will also automatically convert into an old-style `data`-emitting stream if any listeners are added to the `data` event.
So, this works fine, though you of course lose a lot of the benefits of
the new interface:
```js
var r = new ReadableThing();

r.on('data', function(chunk) {
  // ...
});

// now pause, resume, etc. are patched into place, and r will
// continually call read() until it returns null, emitting the
// returned chunks in 'data' events.

r.on('end', function() {
  // ...
});
```
FAQs
The npm package readable-stream receives a total of 108,617,811 weekly downloads, and is accordingly classified as a popular package.
We found that readable-stream demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago. It has 3 open source maintainers collaborating on the project.