What is minipass-pipeline?
The minipass-pipeline package is a Node.js module that lets you compose a sequence of streams (typically transform streams) into a single pipeline. Data written to the pipeline flows through each stream in order, and reads come from the last stream in the chain. It is built on Minipass, a small, fast stream implementation.
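As a minimal sketch of the idea (using Node's built-in zlib transforms purely for illustration, not an example from the package docs), data written to the pipeline goes into the first stream and comes back out of the last:

const Pipeline = require('minipass-pipeline')
const zlib = require('zlib')

// gzip then gunzip: writes go to the first stream, reads come from the last,
// so the data comes back out unchanged after the round trip
const p = new Pipeline(zlib.createGzip(), zlib.createGunzip())
p.on('data', chunk => console.log(chunk.toString())) // prints 'hello'
p.end('hello')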
What are minipass-pipeline's main functionalities?
Stream Pipeline Creation
This feature allows you to create a pipeline of streams. Data written to the pipeline will be processed by each stream in turn. In this example, a file is read, compressed using gzip, and then written to a new file.
{"const Pipeline = require('minipass-pipeline')
const fs = require('fs')
const zlib = require('zlib')
const pipeline = new Pipeline(
fs.createReadStream('input.txt'),
zlib.createGzip(),
fs.createWriteStream('output.txt.gz')
)
pipeline.write('some data')
pipeline.end()"}
Error Handling
This feature allows you to handle errors that may occur in the pipeline. The 'error' event is emitted if any stream in the pipeline emits an 'error' event. In this example, an error handler is attached to the pipeline to log any errors that occur.
{"const Pipeline = require('minipass-pipeline')
const fs = require('fs')
const zlib = require('zlib')
const pipeline = new Pipeline(
fs.createReadStream('input.txt'),
zlib.createGzip(),
fs.createWriteStream('output.txt.gz')
)
pipeline.on('error', (err) => {
console.error('Pipeline error:', err)
})"}
Other packages similar to minipass-pipeline
pump
The 'pump' package is similar to minipass-pipeline in that it is used to pipe between streams and handle their close and error events properly. Unlike minipass-pipeline, 'pump' does not create a new stream instance but is a function that pipes streams together and calls a callback when the pipeline is fully done or an error occurs.
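For comparison, a typical pump call (a sketch, not taken from either package's docs) wires up the same gzip example and reports completion through a callback rather than returning a writable pipeline:

const pump = require('pump')
const fs = require('fs')
const zlib = require('zlib')

// pump pipes the streams together and calls the callback once everything
// has finished or any stream in the chain errors
pump(
  fs.createReadStream('input.txt'),
  zlib.createGzip(),
  fs.createWriteStream('output.txt.gz'),
  (err) => {
    if (err) console.error('pipeline failed', err)
    else console.log('pipeline succeeded')
  }
)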
through2
The 'through2' package is a tiny wrapper around Node.js streams.Transform (a subclass of stream) to avoid explicit subclassing noise. It's similar in the sense that it can be used to create transform streams that can be part of a pipeline, but it does not provide pipeline functionality itself.
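As a small sketch (again, not from either package's docs), a through2 transform can be dropped into a minipass-pipeline like any other stream:

const Pipeline = require('minipass-pipeline')
const through2 = require('through2')

// a transform stream that upper-cases whatever passes through it
const uppercase = through2(function (chunk, enc, callback) {
  callback(null, chunk.toString().toUpperCase())
})

const p = new Pipeline(uppercase)
p.on('data', chunk => console.log(chunk.toString())) // prints 'HELLO'
p.end('hello')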
minipass-pipeline
Create a pipeline of streams using Minipass.
Calls .pipe() on all the streams in the list. Returns a stream where writes go to the first stream in the chain, and reads come from the last.
Errors are proxied along the chain and emitted on the Pipeline stream.
USAGE
const Pipeline = require('minipass-pipeline')
const fs = require('fs')
const tar = require('tar')

// writes go to the first stream in the chain, reads come from the last
const p = new Pipeline(input, transform, output)
p.write('foo')                        // written to `input`
p.on('data', chunk => doSomething())  // read from `output`

// e.g. unpack a .tbz archive; `bunzipDecoder` stands in for any bzip2
// decompression transform stream
const decode = new bunzipDecoder()
const unpack = tar.extract({ cwd: 'target-dir' })
const tbz = new Pipeline(decode, unpack)
fs.createReadStream('archive.tbz').pipe(tbz)
Pipeline is a minipass stream, so it's as
synchronous as the streams it wraps. It will buffer data until there is a
reader, but no longer, so make sure to attach your listeners before you
pipe it somewhere else.
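For instance (a small sketch, using zlib transforms as stand-ins for real work), attach the consumer first and only then pipe the source in:

const Pipeline = require('minipass-pipeline')
const fs = require('fs')
const zlib = require('zlib')

// attach the consumer before any data flows, so nothing sits buffered
const p = new Pipeline(zlib.createGzip(), zlib.createGunzip())
p.on('data', chunk => process.stdout.write(chunk))

// only then feed data into the pipeline
fs.createReadStream('input.txt').pipe(p)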