What is stream-chain?
The stream-chain npm package is designed to facilitate the creation and management of processing pipelines for streams. It allows you to chain together multiple stream processing steps in a flexible and efficient manner.
What are stream-chain's main functionalities?
Creating a Stream Chain
This code demonstrates how to create a stream chain that parses JSON data, picks a specific part of the JSON, and then streams the values. The pipeline processes a JSON string and logs the value of the 'data' key.
const { chain } = require('stream-chain');
const { parser } = require('stream-json');
const { pick } = require('stream-json/filters/Pick');
const { streamValues } = require('stream-json/streamers/StreamValues');

const pipeline = chain([
  parser(),                 // tokenize the incoming JSON text
  pick({ filter: 'data' }), // keep only tokens under the 'data' key
  streamValues(),           // assemble the picked tokens into values
]);

pipeline.on('data', (data) => {
  console.log(data.value); // { key: 'value' }
});

pipeline.write('{"data": {"key": "value"}}');
pipeline.end();
Combining Multiple Streams
This example shows how to combine multiple streams in a chain to process an array of JSON objects. The pipeline parses the JSON, picks the 'items' array, and then streams each value in the array.
const { chain } = require('stream-chain');
const { parser } = require('stream-json');
const { pick } = require('stream-json/filters/Pick');
const { streamArray } = require('stream-json/streamers/StreamArray');

const pipeline = chain([
  parser(),                  // tokenize the incoming JSON text
  pick({ filter: 'items' }), // keep only tokens under the 'items' key
  streamArray(),             // emit each array element as { key: index, value }
]);

pipeline.on('data', (data) => {
  console.log(data.value); // { key: 'value1' }, then { key: 'value2' }
});

pipeline.write('{"items": [{"key": "value1"}, {"key": "value2"}]}');
pipeline.end();
Other packages similar to stream-chain
through2
The through2 package is a tiny wrapper around Node.js streams.Transform. It simplifies the creation of transform streams, allowing you to easily process data as it passes through the stream. Unlike stream-chain, through2 focuses on creating individual transform streams rather than chaining multiple streams together.
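For instance, a minimal through2 transform might look like this (the uppercasing step is just an illustration):

const through2 = require('through2');

// uppercase every chunk as it passes through
const upperCaser = through2((chunk, encoding, callback) => {
  callback(null, chunk.toString().toUpperCase());
});

process.stdin.pipe(upperCaser).pipe(process.stdout);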
highland
Highland is a high-level stream library for Node.js that provides a more functional approach to working with streams. It allows you to create and manipulate streams using a variety of functional programming techniques. Highland offers more comprehensive stream manipulation capabilities compared to stream-chain, which is more focused on chaining existing streams.
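For instance, a small highland sketch (the numbers and predicates are illustrative):

const _ = require('highland');

// square each number, then keep only the odd results
_([1, 2, 3, 4])
  .map(x => x * x)
  .filter(x => x % 2 === 1)
  .toArray(result => console.log(result)); // [1, 9]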
mississippi
Mississippi is a collection of useful stream utility modules that make working with streams easier. It includes modules for creating, combining, and consuming streams. Mississippi provides a broader set of utilities for stream management compared to stream-chain, which is specifically designed for chaining streams together.
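For instance, mississippi's pipe() wires streams together with unified error handling (the file names are illustrative):

const miss = require('mississippi');
const fs = require('fs');
const zlib = require('zlib');

miss.pipe(
  fs.createReadStream('input.txt'),
  zlib.createGzip(),
  fs.createWriteStream('output.txt.gz'),
  err => {
    if (err) return console.error('pipeline failed', err);
    console.log('pipeline succeeded');
  }
);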
stream-chain
stream-chain creates a chain of streams out of regular functions, asynchronous functions, generator functions, and existing streams, while properly handling backpressure. The resulting chain is represented as a Duplex stream, which can be combined with other streams the usual way. It eliminates boilerplate, helping you concentrate on functionality without losing performance, and makes it especially easy to build object-mode data processing pipelines.
Originally stream-chain was used internally with stream-fork and stream-json to create flexible data processing pipelines.
stream-chain is a lightweight, no-dependencies micro-package. It is distributed under the New BSD license.
Intro
const Chain = require('stream-chain');

const fs = require('fs');
const zlib = require('zlib');
const {Transform} = require('stream');

const chain = new Chain([
  // transform a value
  x => x * x,
  // produce multiple values from one
  x => [x - 1, x, x + 1],
  // wait for an asynchronous operation
  // (getTotalFromDatabaseByKey is an illustrative placeholder)
  async x => await getTotalFromDatabaseByKey(x),
  // produce multiple values with a generator
  function* (x) {
    for (let i = x; i > 0; --i) {
      yield i;
    }
    return 0;
  },
  // filter out even values (null produces no output)
  x => x % 2 ? x : null,
  // use an arbitrary transform stream
  new Transform({
    writableObjectMode: true,
    transform(x, _, callback) {
      callback(null, x.toString());
    }
  }),
  // use any standard stream
  zlib.createGzip()
]);
chain.on('error', error => console.log(error));
dataSource.pipe(chain).pipe(fs.createWriteStream('output.txt.gz'));
Making processing pipelines appears to be easy: just chain functions one after another, and we are done. Real-life pipelines filter objects out and/or produce more objects out of a few. On top of that, we have to deal with asynchronous operations while processing or producing data: networking, databases, files, user responses, and so on. An unequal number of values per stage and unequal throughput of stages introduce problems like backpressure, which require the algorithms implemented by streams.
While a lot of API improvements were made to make streams easy to use, in reality a lot of boilerplate is required when creating a pipeline. stream-chain eliminates most of it.
Installation
npm i --save stream-chain
Documentation
Chain, which is returned by require('stream-chain'), is based on Duplex. It chains its dependents in a single pipeline, optionally binding error events.
Many details about this package can be discovered by looking at the test files located in tests/ and in the source code (index.js).
Constructor: new Chain(fns[, options])
The constructor accepts the following arguments:
- fns is an array of functions, arrays, or stream instances.
- If a value is a function, a Transform stream is created, which calls this function with two parameters: chunk (an object) and an optional encoding. See Node's documentation for more details on those parameters. The function will be called in the context of the created stream.
- If it is a regular function, it can return a regular value (null and undefined produce no output, while an array is treated as multiple values), or a special value such as a promise or a generator object, whose eventual values are processed in the same manner.
- If it is an asynchronous function, it can return a regular value.
- In essence, it is covered under "special values" as a function that returns a promise.
async x => {
  await new Promise(resolve => setTimeout(() => resolve(), 500));
  return x;
}
- If it is a generator function, each yield should produce a regular value.
- In essence, it is covered under "special values" as a function that returns a generator object.
function* (x) {
  for (let i = -1; i <= 1; ++i) {
    if (i) yield x + i;
  }
  return x;
}
- (since 2.2.0) If it is an asynchronous generator function, each yield should produce a regular value.
- In essence, it is covered under "special values" as a function that returns a generator object.
async function* (x) {
  for (let i = -1; i <= 1; ++i) {
    if (i) {
      await new Promise(resolve => setTimeout(() => resolve(), 50));
      yield x + i;
    }
  }
  return x;
}
- (since 2.1.0) If a value is an array, it is assumed to be an array of regular functions. Their values are passed along the chain (see the sketch below). All values (including null, undefined, and arrays) are allowed and passed without modifications. The last value is subject to the processing defined above for regular functions.
- Empty arrays are ignored.
- If any function returns a value produced by Chain.final(value) (see below), it terminates the chain using value as the final value of the chain.
- This feature bypasses streams. It is implemented for performance reasons.
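For instance, a minimal sketch of a composed pipeline (the functions and values are illustrative):

const {chain} = require('stream-chain');

// [x => x * x, x => x + 1] is fused into a single Transform:
// each chunk x becomes (x * x) + 1 with no intermediate streams
const squarePlusOne = chain([
  [x => x * x, x => x + 1]
]);
squarePlusOne.on('data', x => console.log(x)); // prints 2, then 5
squarePlusOne.write(1);
squarePlusOne.write(2);
squarePlusOne.end();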
- If a value is a valid stream, it is included as is in the pipeline:
- Transform.
- Duplex.
- The very first stream can be Readable. In this case a Chain instance ignores all possible writes to the front, and ends when the first stream ends.
- The very last stream can be Writable. In this case a Chain instance does not produce any output, and finishes when the last stream finishes. Because the 'data' event is not used in this case, the instance resumes itself automatically. Read about it in Node's documentation.
- options is an optional object detailed in Node's documentation.
An instance can be used to attach handlers for stream events.
const chain = new Chain([x => x * x, x => [x - 1, x, x + 1]]);
chain.on('error', error => console.error(error));
dataSource.pipe(chain);
Properties
The following public properties are available:
- streams is an array of streams created by the constructor. Its values are either Transform streams that use the corresponding functions from the constructor parameter, or user-provided streams. All streams are piped sequentially, starting from the beginning.
- input is the beginning of the pipeline. Effectively it is the first item of streams.
- output is the end of the pipeline. Effectively it is the last item of streams.
Generally, a Chain instance should be used to represent a chain:
const chain = new Chain([
  x => x * x,
  x => [x - 1, x, x + 1],
  new Transform({
    writableObjectMode: true,
    transform(chunk, _, callback) {
      callback(null, chunk.toString());
    }
  })
]);
dataSource
  .pipe(chain)
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('output.txt.gz'));
But in some cases input and output provide better control over how a data processing pipeline should be organized:
chain.output
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('output.txt.gz'));
dataSource.pipe(chain.input);
Please select which style you want to use, and never mix them together on the same object.
Static methods
The following static methods are available:
- chain(fns[, options]) is a helper factory function, which has the same arguments as the constructor and returns a Chain instance.
const {chain} = require('stream-chain');

dataSource
  .pipe(chain([x => x * x, x => [x - 1, x, x + 1]]));

// a chain can also include its data source and sink
// as the first and last items:
chain([
  dataSource,
  x => x * x,
  x => [x - 1, x, x + 1],
  zlib.createGzip(),
  fs.createWriteStream('output.txt.gz')
]);
- (since 2.1.0) final(value) is a helper factory function, which can be used by chained functions (see the array of functions above). It returns a special value, which terminates the chain and uses the passed value as the result of the chain.
const {chain, final} = require('stream-chain');

// produces 2*x*x + 1 for every x
dataSource
  .pipe(chain([[x => x * x, x => 2 * x + 1]]));

// final(x) terminates the chain early: produces x*x
dataSource
  .pipe(chain([[
    x => x * x,
    x => final(x),
    x => 2 * x + 1 // never called
  ]]));

// final() without a value terminates the chain and produces nothing
dataSource
  .pipe(chain([[
    x => x * x,
    x => final(),
    x => 2 * x + 1 // never called
  ]]));

// conditional termination: odd squares are dropped,
// even squares continue down the chain
dataSource
  .pipe(chain([[
    x => x * x,
    x => x % 2 ? final() : x,
    x => 2 * x + 1
  ]]));

// the special value can be created once and reused
const none = final();
dataSource
  .pipe(chain([[
    x => x * x,
    x => x % 2 ? none : x,
    x => 2 * x + 1
  ]]));
- (since 2.1.0) many(array) is a helper factory function, which is used to wrap arrays to be interpreted as multiple values returned from a function. At the moment it is redundant: you can use a simple array to indicate that, but naked arrays are being deprecated and in future versions they will be passed as is. The thinking is that using many() better indicates the intention. Additionally, in future versions it will be used by arrays of functions (see above).
const {chain, many} = require('stream-chain');

// each x produces three values: x, x + 1, and x + 2
dataSource
  .pipe(chain([x => many([x, x + 1, x + 2])]));
Release History
- 2.2.1 Technical release: new symbols namespace, explicit license (thx Keen Yee Liau), added Greenkeeper.
- 2.2.0 Added utilities: take, takeWhile, skip, skipWhile, fold, scan, Reduce, comp.
- 2.1.0 Added simple transducers, dropped Node 6.
- 2.0.3 Added TypeScript typings and the badge.
- 2.0.2 Workaround for Node 6: use the 'finish' event instead of _final().
- 2.0.1 Improved documentation.
- 2.0.0 Upgraded to use Duplex instead of EventEmitter as the base.
- 1.0.3 Improved documentation.
- 1.0.2 Better README.
- 1.0.1 Fixed the README.
- 1.0.0 The initial release.