stream-chopper
The stream-chopper npm package is designed to split or 'chop' streams into smaller chunks based on specified criteria, such as time intervals or size limits. This can be particularly useful for logging, data processing, or any scenario where managing large streams in smaller, more manageable pieces is beneficial.
Chop by Time Interval
This feature allows you to chop a stream into smaller chunks based on a specified time interval. In this example, a new output stream is emitted every 1000 milliseconds (1 second) and each one is written to its own file (the output filenames are illustrative).
const StreamChopper = require('stream-chopper');
const fs = require('fs');
const chopper = new StreamChopper({ time: 1000 }); // Chop every 1000 milliseconds
let index = 0;
chopper.on('stream', (stream, next) => {
  stream.pipe(fs.createWriteStream(`output-${index++}.log`));
  stream.on('end', next); // Ready for the next output stream
});
fs.createReadStream('largefile.log').pipe(chopper);
Chop by Size
This feature allows you to chop a stream into smaller chunks based on a specified size limit. In this example, a new output stream is emitted every 1MB.
const StreamChopper = require('stream-chopper');
const fs = require('fs');
const chopper = new StreamChopper({ size: 1024 * 1024 }); // Chop every 1MB
let index = 0;
chopper.on('stream', (stream, next) => {
  stream.pipe(fs.createWriteStream(`output-${index++}.log`));
  stream.on('end', next); // Ready for the next output stream
});
fs.createReadStream('largefile.log').pipe(chopper);
Custom Chop Conditions
stream-chopper has no built-in predicate option, but you can implement custom conditions by calling chopper.chop() manually. In this example, the stream is chopped whenever the string 'ERROR' is found in a chunk.
const StreamChopper = require('stream-chopper');
const fs = require('fs');
const chopper = new StreamChopper();
let index = 0;
chopper.on('stream', (stream, next) => {
  stream.pipe(fs.createWriteStream(`output-${index++}.log`));
  stream.on('end', next);
});
const inputStream = fs.createReadStream('largefile.log');
inputStream.on('data', (chunk) => {
  chopper.write(chunk);
  if (chunk.includes('ERROR')) chopper.chop(); // Chop when 'ERROR' is found in the chunk
});
inputStream.on('end', () => chopper.end());
The split2 package splits a stream into lines or other delimiter-based chunks. Like stream-chopper, it breaks a stream into smaller pieces, but it focuses on line- or delimiter-based splitting rather than time- or size-based chopping.
The stream-splitter package splits a stream on a specified delimiter. Like stream-chopper, it breaks a stream into smaller pieces, but it is delimiter-based rather than time- or size-based.
The rotating-file-stream package creates a rotating write stream that can rotate files based on time intervals or size limits. Like stream-chopper, it manages large streams by breaking them into smaller chunks, but it focuses on file rotation rather than general stream chopping.
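For contrast with the time- and size-based examples above, here is a minimal split2 sketch (delimiter-based splitting, one chunk per line); the filename is illustrative:
const split2 = require('split2');
const fs = require('fs');
fs.createReadStream('largefile.log')
  .pipe(split2()) // Emits one chunk per line
  .on('data', (line) => {
    console.log('line:', line);
  });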
Chop a single stream of data into a series of readable streams.
Stream Chopper is useful in situations where you have a stream of data you want to chop up into smaller pieces, either based on time or size. Each piece will be emitted as a readable stream (called an output stream).
Possible use-cases include log rotation, splitting up large data sets, or chopping up a live stream of data into finite chunks that can then be stored.
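For instance, a minimal log-rotation sketch along those lines (the hourly interval and the app.log.N filenames are illustrative, not part of the API):
const StreamChopper = require('stream-chopper')
const fs = require('fs')
const chopper = new StreamChopper({ time: 60 * 60 * 1000 }) // new output stream every hour
let index = 0
chopper.on('stream', function (stream, next) {
  stream.pipe(fs.createWriteStream(`app.log.${index++}`)) // one file per output stream
  stream.on('end', next)
})
chopper.write('app started\n') // use the chopper as the log sink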
Sometimes it's important to ensure that a chunk written to the input stream isn't split up and divided over two output streams. Stream Chopper allows you to specify the chopping algorithm used (via the type option) when a chunk is too large to fit into the current output stream.
By default a chunk too large to fit in the current output stream is split between it and the next. Alternatively you can decide to either allow the chunk to "overflow" the size limit, in which case it will be written to the current output stream, or to "underflow" the size limit, in which case the current output stream will be ended and the chunk written to the next output stream.
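To make this concrete, here is a small sketch using StreamChopper.underflow, where a written chunk is never split (the 16-byte limit and the example strings are arbitrary):
const StreamChopper = require('stream-chopper')
const chopper = new StreamChopper({
  size: 16,                      // chop stream when it reaches 16 bytes,
  type: StreamChopper.underflow  // but never split a single written chunk
})
chopper.on('stream', function (stream, next) {
  stream.pipe(process.stdout)
  stream.on('end', next)
})
chopper.write('first record\n')  // 13 bytes, fits in the current output stream
chopper.write('second record\n') // would cross the 16-byte limit, so the current
                                 // stream is ended and the chunk goes to the next one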
npm install stream-chopper --save
Example app:
const StreamChopper = require('stream-chopper')
const chopper = new StreamChopper({
  size: 30,                    // chop stream when it reaches 30 bytes,
  time: 10000,                 // or when it's been open for 10s,
  type: StreamChopper.overflow // but allow stream to exceed size slightly
})
chopper.on('stream', function (stream, next) {
  console.log('>> Got a new stream! <<')
  stream.pipe(process.stdout)
  stream.on('end', next) // call next when you're ready to receive a new stream
})
chopper.write('This write contains more than 30 bytes\n')
chopper.write('This write contains less\n')
chopper.write('This is the last write\n')
Output:
>> Got a new stream! <<
This write contains more than 30 bytes
>> Got a new stream! <<
This write contains less
This is the last write
chopper = new StreamChopper([options])
Instantiate a StreamChopper instance. StreamChopper is a writable stream.
Takes an optional options object which, besides the normal options accepted by the Writable class, accepts the following config options:
size - The maximum number of bytes that can be written to the chopper stream before a new output stream is emitted (default: Infinity)
time - The maximum number of milliseconds that an output stream can be in use before a new output stream is emitted (default: -1, which means no limit)
type - Change the algorithm used to determine how a written chunk that cannot fit into the current output stream should be handled. The following values are possible:
  StreamChopper.split - Fit as much data from the chunk as possible into the current stream and write the remainder to the next stream (default)
  StreamChopper.overflow - Allow the entire chunk to be written to the current stream. After writing, the stream is ended
  StreamChopper.underflow - End the current output stream and write the entire chunk to the next stream
If type is StreamChopper.underflow and the size of the chunk to be written is larger than size, an error is emitted.
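For example, a sketch of that error case (assuming the error surfaces on the chopper itself, so an 'error' listener is attached):
const StreamChopper = require('stream-chopper')
const chopper = new StreamChopper({
  size: 10,
  type: StreamChopper.underflow
})
chopper.on('stream', function (stream, next) {
  stream.resume() // discard the data in this sketch
  stream.on('end', next)
})
chopper.on('error', function (err) {
  console.error('cannot fit chunk:', err.message)
})
chopper.write('this single chunk is larger than 10 bytes') // can never fit in any output stream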
stream
Emitted every time a new output stream is ready. You must listen for this event.
The listener function is called with two arguments:
stream - A readable output stream
next - A function you must call when you're ready to receive a new output stream. If called with an error, the chopper stream is destroyed
chopper.chop([callback])
Manually chop the stream. Forces the current output stream to end even if its size limit or time timeout hasn't been reached yet.
Arguments:
callback - An optional callback which will be called once the output stream has ended
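A short usage sketch of chopper.chop() (the written data is arbitrary):
const StreamChopper = require('stream-chopper')
const chopper = new StreamChopper({ size: 1024 })
chopper.on('stream', function (stream, next) {
  stream.pipe(process.stdout)
  stream.on('end', next)
})
chopper.write('some data\n')
// end the current output stream right away, even though neither
// the 1024-byte size limit nor any time limit has been reached
chopper.chop(function () {
  console.log('output stream has ended')
})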
FAQs
Chop a single stream of data into a series of readable streams.
The npm package stream-chopper receives a total of 234,278 weekly downloads, which classifies it as a popular package.
We found that stream-chopper demonstrates an unhealthy version release cadence and project activity: the last version was released a year ago. It has one open source maintainer.