stream-chopper
Advanced tools
The stream-chopper npm package is designed to split or 'chop' streams into smaller chunks based on specified criteria, such as time intervals or size limits. This can be particularly useful for logging, data processing, or any scenario where managing large streams in smaller, more manageable pieces is beneficial.
Chop by Time Interval
This feature lets you chop a stream into smaller pieces based on a specified time interval (the time option). In this example, a new output stream is started every 1000 milliseconds (1 second) and each piece is written to its own file.
const StreamChopper = require('stream-chopper');
const fs = require('fs');
const chopper = new StreamChopper({ time: 1000 }); // emit a new output stream every 1000 milliseconds
let n = 0;
chopper.on('stream', (stream, next) => {
  stream.pipe(fs.createWriteStream(`output-${n++}.log`)); // write each piece to its own file
  stream.on('end', next); // ready to receive the next output stream
});
fs.createReadStream('largefile.log').pipe(chopper);
Chop by Size
This feature lets you chop a stream into smaller pieces based on a specified size limit (the size option). In this example, a new output stream is started every 1MB.
const StreamChopper = require('stream-chopper');
const fs = require('fs');
const chopper = new StreamChopper({ size: 1024 * 1024 }); // emit a new output stream every 1MB
let n = 0;
chopper.on('stream', (stream, next) => {
  stream.pipe(fs.createWriteStream(`output-${n++}.log`));
  stream.on('end', next);
});
fs.createReadStream('largefile.log').pipe(chopper);
Custom Chop Conditions
stream-chopper does not take a custom predicate option, but you can implement custom chop conditions by calling chopper.chop() yourself. In this example, the stream is chopped whenever the string 'ERROR' is found in a written chunk.
const StreamChopper = require('stream-chopper');
const fs = require('fs');
const chopper = new StreamChopper();
let n = 0;
chopper.on('stream', (stream, next) => {
  stream.pipe(fs.createWriteStream(`output-${n++}.log`));
  stream.on('end', next);
});
const input = fs.createReadStream('largefile.log');
input.on('data', (chunk) => {
  chopper.write(chunk);
  if (chunk.includes('ERROR')) chopper.chop(); // start a new output stream when 'ERROR' is found
});
input.on('end', () => chopper.end());
The split2 package is used to split a stream into lines or other delimiter-based chunks. It is similar to stream-chopper in that it allows for breaking down a stream into smaller pieces, but it is more focused on line or delimiter-based splitting rather than time or size-based chopping.
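For comparison, a minimal split2 sketch (the file name is just a placeholder) that reads a log file line by line:
const split2 = require('split2');
const fs = require('fs');

fs.createReadStream('largefile.log')
  .pipe(split2()) // emits one 'data' event per line
  .on('data', (line) => {
    console.log('line:', line);
  });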
The stream-splitter package provides functionality to split a stream based on a specified delimiter. It is similar to stream-chopper in that it allows for breaking down a stream into smaller pieces, but it is more focused on delimiter-based splitting rather than time or size-based chopping.
The rotating-file-stream package is used to create a rotating write stream, which can rotate files based on time intervals or size limits. It is similar to stream-chopper in that it allows for managing large streams by breaking them into smaller chunks, but it is more focused on file rotation rather than general stream chopping.
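For comparison, a minimal rotating-file-stream sketch based on its documented createStream API (the file name and limits are illustrative):
const rfs = require('rotating-file-stream');

// Rotate the file daily, or sooner if it reaches 10 MB
const stream = rfs.createStream('app.log', {
  interval: '1d',
  size: '10M'
});

stream.write('a log line\n');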
Chop a single stream of data into a series of readable streams.
Stream Chopper is useful in situations where you have a stream of data you want to chop up into smaller pieces, either based on time or size. Each piece will be emitted as a readable stream (called output streams).
Possible use-cases include log rotation, splitting up large data sets, or chopping up a live stream of data into finite chunks that can then be stored.
Sometimes it's important to ensure that a chunk written to the input stream isn't split up and divided over two output streams. Stream Chopper allows you to specify the chopping algorithm used (via the type option) when a chunk is too large to fit into the current output stream.
By default a chunk too large to fit in the current output stream is split between it and the next. Alternatively you can decide to either allow the chunk to "overflow" the size limit, in which case it will be written to the current output stream, or to "underflow" the size limit, in which case the current output stream will be ended and the chunk written to the next output stream.
npm install stream-chopper --save
Example app:
const StreamChopper = require('stream-chopper')
const chopper = new StreamChopper({
size: 30, // chop stream when it reaches 30 bytes,
time: 10000, // or when it's been open for 10s,
type: StreamChopper.overflow // but allow stream to exceed size slightly
})
chopper.on('stream', function (stream, next) {
console.log('>> Got a new stream! <<')
stream.pipe(process.stdout)
stream.on('end', next) // call next when you're ready to receive a new stream
})
chopper.write('This write contains more than 30 bytes\n')
chopper.write('This write contains less\n')
chopper.write('This is the last write\n')
Output:
>> Got a new stream! <<
This write contains more than 30 bytes
>> Got a new stream! <<
This write contains less
This is the last write
chopper = new StreamChopper([options])

Instantiate a StreamChopper instance. StreamChopper is a writable stream.

Takes an optional options object which, besides the normal options accepted by the Writable class, accepts the following config options:

size - The maximum number of bytes that can be written to the chopper stream before a new output stream is emitted (default: Infinity)

time - The maximum number of milliseconds that an output stream can be in use before a new output stream is emitted (default: -1, which means no limit)

type - Change the algorithm used to determine how a written chunk that cannot fit into the current output stream should be handled. The following values are possible:

StreamChopper.split - Fit as much data from the chunk as possible into the current stream and write the remainder to the next stream (default)

StreamChopper.overflow - Allow the entire chunk to be written to the current stream. After writing, the stream is ended

StreamChopper.underflow - End the current output stream and write the entire chunk to the next stream

transform - An optional function that returns a transform stream used for transforming the data in some way (e.g. a zlib Gzip stream). If used, the size option will count towards the size of the output chunks. This config option cannot be used together with the StreamChopper.split type

If type is StreamChopper.underflow and the size of the chunk to be written is larger than size, an error is emitted.
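As a rough sketch of the transform option (file names and the size limit are illustrative), a gzip transform can be paired with the overflow type, since transform cannot be combined with StreamChopper.split:
const StreamChopper = require('stream-chopper')
const fs = require('fs')
const zlib = require('zlib')

const chopper = new StreamChopper({
  size: 1024 * 1024, // counts towards the size of the (compressed) output chunks
  type: StreamChopper.overflow, // transform cannot be used with StreamChopper.split
  transform () { return zlib.createGzip() } // each output stream gets its own gzip stream
})

let n = 0
chopper.on('stream', function (stream, next) {
  stream.pipe(fs.createWriteStream(`chunk-${n++}.log.gz`))
  stream.on('end', next)
})

chopper.write('some log line\n')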
Event: stream

Emitted every time a new output stream is ready. You must listen for this event.

The listener function is called with two arguments:

stream - A readable output stream

next - A function you must call when you're ready to receive a new output stream. If called with an error, the chopper stream is destroyed

chopper.size

The maximum number of bytes that can be written to the chopper stream before a new output stream is emitted.
Use this property to override it with a new value. The new value will take effect immediately on the current stream.
chopper.time

The maximum number of milliseconds that an output stream can be in use before a new output stream is emitted.

Use this property to override it with a new value. The new value will take effect when the next stream is initialized. To change the current timer, see chopper.resetTimer().

Set to -1 for no time limit.
chopper.type

The algorithm used to determine how a written chunk that cannot fit into the current output stream should be handled. The following values are possible:

StreamChopper.split - Fit as much data from the chunk as possible into the current stream and write the remainder to the next stream

StreamChopper.overflow - Allow the entire chunk to be written to the current stream. After writing, the stream is ended

StreamChopper.underflow - End the current output stream and write the entire chunk to the next stream

Use this property to override it with a new value. The new value will take effect immediately on the current stream.
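As a small sketch (the values are illustrative), the properties above can be adjusted at runtime; per the notes above, size and type changes apply to the current output stream immediately, while a new time applies when the next stream is initialized.
const StreamChopper = require('stream-chopper')

const chopper = new StreamChopper({ size: 1024, time: 10000 })

chopper.on('stream', function (stream, next) {
  stream.resume() // consume each output stream (pipe it somewhere in a real app)
  stream.on('end', next)
})

// Reconfigure on the fly
chopper.size = 64 * 1024              // takes effect on the current output stream immediately
chopper.type = StreamChopper.overflow // takes effect on the current output stream immediately
chopper.time = 30000                  // takes effect when the next output stream is initialized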
chopper.chop([callback])
Manually chop the stream. Forces the current output stream to end even if its size limit or time timeout hasn't been reached yet.

Arguments:

callback - An optional callback which will be called once the output stream has ended

chopper.resetTimer([time])
Use this function to reset the current timer (configured via the time config option). Calling this function will force the current timer to start over.

If the optional time argument is provided, this value is used as the new time. This is equivalent to calling:

chopper.time = time
chopper.resetTimer()

If the function is called with time set to -1, the current timer is cancelled and the time limit is disabled for all future streams.
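Below is a small usage sketch of chop() and resetTimer() based on the API described above; the trigger and the new timeout value are purely illustrative.
const StreamChopper = require('stream-chopper')

const chopper = new StreamChopper({ time: 60000 }) // start a new output stream at least every minute

chopper.on('stream', function (stream, next) {
  stream.resume() // consume each output stream (pipe it somewhere in a real app)
  stream.on('end', next)
})

chopper.write('some data\n')

// Force the current output stream to end right away
chopper.chop(function () {
  // The current output stream has ended; the next write starts a new one
  chopper.resetTimer(30000) // set the time limit to 30s and restart the timer
})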
FAQs

Chop a single stream of data into a series of readable streams.

We found that stream-chopper demonstrated an unhealthy version release cadence and limited project activity because the last version was released a year ago. It has 1 open source maintainer collaborating on the project.