
minipass-pipeline

create a pipeline of streams using Minipass


Package description

What is minipass-pipeline?

The minipass-pipeline package is a Node.js module that allows you to create a pipeline of streams (typically transform streams) that data can be written to. Once written, the data will flow through each stream in the pipeline in order. It is built on the Minipass stream library, which is a small, fast stream implementation.

What are minipass-pipeline's main functionalities?

Stream Pipeline Creation

This feature allows you to create a pipeline of streams. Data written to the pipeline will be processed by each stream in turn. In this example, a file is read, compressed using gzip, and then written to a new file.

const Pipeline = require('minipass-pipeline')
const fs = require('fs')
const zlib = require('zlib')

// the read stream at the head drives the pipeline:
// input.txt -> gzip -> output.txt.gz
const pipeline = new Pipeline(
  fs.createReadStream('input.txt'),
  zlib.createGzip(),
  fs.createWriteStream('output.txt.gz')
)

Error Handling

This feature allows you to handle errors that may occur in the pipeline. The 'error' event is emitted if any stream in the pipeline emits an 'error' event. In this example, an error handler is attached to the pipeline to log any errors that occur.

const Pipeline = require('minipass-pipeline')
const fs = require('fs')
const zlib = require('zlib')

const pipeline = new Pipeline(
  fs.createReadStream('input.txt'),
  zlib.createGzip(),
  fs.createWriteStream('output.txt.gz')
)

pipeline.on('error', (err) => {
  console.error('Pipeline error:', err)
})

minipass-pipeline

Create a pipeline of streams using Minipass.

Calls .pipe() on all the streams in the list. Returns a stream where writes go to the first pipe in the chain, and reads come from the last.

Errors are proxied along the chain and emitted on the Pipeline stream.

USAGE

const Pipeline = require('minipass-pipeline')

// the list of streams to pipeline together,
// a bit like `input | transform | output` in bash
const p = new Pipeline(input, transform, output)

p.write('foo') // writes to input
p.on('data', chunk => doSomething()) // reads from output stream

// less contrived example (but still pretty contrived)...
const decode = new bunzipDecoder()
const unpack = tar.extract({ cwd: 'target-dir' })
const tbz = new Pipeline(decode, unpack)

fs.createReadStream('archive.tbz').pipe(tbz)

// specify any minipass options if you like, as the first argument
// it'll only try to pipeline event emitters with a .pipe() method
const p2 = new Pipeline({ objectMode: true }, input, transform, output)

// If you don't know the things to pipe in right away, that's fine.
// use p.push(stream) to add to the end, or p.unshift(stream) to the front
const databaseDecoderStreamDoohickey = (connectionInfo) => {
  const p = new Pipeline()
  logIntoDatabase(connectionInfo).then(connection => {
    initializeDecoderRing(connectionInfo).then(decoderRing => {
      p.push(connection, decoderRing)
      getUpstreamSource(upstream => {
        p.unshift(upstream)
      })
    })
  })
  // return to caller right away
  // emitted data will be upstream -> connection -> decoderRing pipeline
  return p
}

Pipeline is a minipass stream, so it's as synchronous as the streams it wraps. It will buffer data until there is a reader, but no longer, so make sure to attach your listeners before you pipe it somewhere else.

new Pipeline(opts = {}, ...streams)

Create a new Pipeline with the specified Minipass options and any streams provided.

pipeline.push(stream, ...)

Attach one or more streams to the pipeline at the end (read) side of the pipe chain.

pipeline.unshift(stream, ...)

Attach one or more streams to the pipeline at the start (write) side of the pipe chain.

Last updated on 28 Jul 2020
