
bufferstream
painless stream buffering, cutting and piping.
npm install bufferstream
BufferStream is a full node.js Stream, so it has the APIs of both Writable Stream and Readable Stream.
BufferStream = require('bufferstream')
stream = new BufferStream([{encoding:'utf8', size:'none'}]) // default
encoding: default encoding for writing strings
blocking: if true and the source is a child_process, the stream will block the entire process (timeouts won't work anymore, but splitting and listening on data still works, because they work synchronously)
size: defines the buffer level or sets the buffer to the given size (see setSize below for more)
disabled: immediately calls disable
split: short form for:
stream.split(token, function (chunk) {stream.emit('data', chunk)})
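For instance, the split short form can be set directly in the constructor; a minimal sketch, assuming a single token is accepted for the split option (the input string and logged lines are only illustrative):
BufferStream = require('bufferstream')
stream = new BufferStream({
    encoding: 'utf8',
    size: 'flexible',
    split: '\n'    // short form: each found chunk is re-emitted as a data event
})
stream.on('data', function (line) {
    console.log(line.toString())    // line comes through without the '\n'
})
stream.write("first\nsecond\n")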
stream.setSize(size) // can be one of ['none', 'flexible', <number>]
different buffer behaviors can be triggered by size:
none: when the output drains, bufferstream drains too
flexible: buffers everything that it gets and does not pipe it out
<number>: (TODO) the buffer has the given size; buffers everything until the buffer is full, and when the buffer is full the stream will drain
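A rough sketch of what the two implemented sizes mean in practice (the writes and the logged value are only illustrative):
stream = new BufferStream({encoding: 'utf8'})
stream.setSize('flexible')          // buffer everything, don't pipe it out
stream.write('buffered ')
stream.write('until read')
console.log(stream.toString())      // 'buffered until read'
stream.setSize('none')              // back to draining whenever the output drains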
stream.enable()
enables stream buffering (default)
stream.disable()
flushes the buffer and disables stream buffering. BufferStream now pipes all data as long as the output is accepting data. when the output is draining, BufferStream will buffer all input temporarily.
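A small sketch of toggling buffering around a pipe (process.stdout merely stands in for any writable destination):
stream = new BufferStream({encoding: 'utf8', size: 'flexible'})
stream.pipe(process.stdout)
stream.disable()      // flush what is buffered, then pass writes straight through
stream.write('goes straight to stdout\n')
stream.enable()       // buffer again until the next flush
stream.write('held in stream.buffer for now\n')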
stream.disable(token, ...)
stream.disable(tokens) // Array
token[s]: buffer splitters (should be String or Buffer)
disables the given tokens. won't flush until no splitter tokens are left.
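Based on the description above, removing splitters one at a time might look like this (a sketch, not verbatim library output):
stream = new BufferStream({encoding: 'utf8', size: 'flexible'})
stream.split('//', ':')
stream.disable(':')     // ':' no longer splits; '//' is still active
stream.disable('//')    // last splitter removed, so the buffer flushes and buffering is disabled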
stream.split(token, ...)
stream.split(tokens) // Array
token[s]: buffer splitters (should be String or Buffer)
each time BufferStream finds a splitter token in the input data it will emit a split event. this also works for binary data.
stream.on('split', function (chunk, token) {…})
stream.split(token, function (chunk, token) {…}) // only get called for this particular token
whenever the stream is enabled it will try to find all splitter tokens in stream.buffer,
cut them off and emit each chunk (without the token) as a split event.
this data will be lost when not handled.
the chunk is the part of stream.buffer that was cut off, without the token.
Warning: try to avoid calling stream.emit('data', newchunk) more than once, because this will likely throw Error: Offset is out of bounds.
stream.getBuffer()
// or just
stream.buffer
returns its Buffer.
stream.toString()
shortcut for stream.buffer.toString()
stream.length
shortcut for stream.buffer.length
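For example, with a flexible buffer holding three bytes (the commented values are what one would expect, not captured output):
stream = new BufferStream({size: 'flexible'})
stream.write('abc')
console.log(stream.getBuffer())    // the underlying Buffer, same as stream.buffer
console.log(stream.toString())     // 'abc'
console.log(stream.length)         // 3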
PostBuffer = require('bufferstream/postbuffer')
post = new PostBuffer(req)
req: http.ServerRequest
for when you want to get all the post data from an http server request and do some db request beforehand.
post.onEnd(function (data) {…});
set a callback to get all post data from an http server request
post.pipe(stream, options);
pumps data into another stream to allow incoming streams.
given options will be passed to Stream.pipe.
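A minimal sketch of using PostBuffer inside an http server (the port and response text are arbitrary placeholders):
http = require('http')
PostBuffer = require('bufferstream/postbuffer')

http.createServer(function (req, res) {
    post = new PostBuffer(req)
    post.onEnd(function (data) {    // fires once all post data has arrived
        res.end('received ' + data.length + ' bytes')
    })
}).listen(8080)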
To improve platform independence, bufferstream has used bufferjs instead of buffertools since version 0.6.0.
Just run npm install buffertools to use their implementation of Buffer.indexOf, which is slightly faster than bufferjs's version.
if you're forced to use the javascript-only version of Buffer.indexOf (like on Windows), you can disable the warning with:
require('bufferstream').fn.warn = false
BufferStream = require('bufferstream')
stream = new BufferStream({encoding:'utf8', size:'flexible'})
stream.split('//', ':')
stream.on('split', function (chunk, token) {
console.log("got '%s' by '%s'", chunk.toString(), token.toString())
})
stream.write("buffer:stream//23")
console.log(stream.toString())
results in
got 'buffer' by ':'
got 'stream' by '//'
23
I'm not sure from your readme what the split event emits?
you can specify more than one split token .. so it's emitted whenever a token is found.
does it emit the buffer up to the just before the token starts?
yes.
also, does it join buffers together if they do not already end in a token?
when size is flexible it joins everything it gets together into one buffer
(accessible through stream.buffer or stream.getBuffer()).
whenever it gets data, it will try to find all tokens.
in other words, can I use this to rechunk a stream so that the chunks always break on newlines, for example?
yes.
stream = new BufferStream({size:'flexible'});
stream.split('\n', function (line) { // line doesn't have a '\n' anymore
stream.emit('data', line); // Buffer.isBuffer(line) === true
});
FAQs
painless stream buffering and cutting
The npm package bufferstream receives a total of 558 weekly downloads. As such, its popularity was classified as not popular.
We found that bufferstream demonstrates an unhealthy version release cadence and project activity because the last version was released a year ago. It has 1 open source maintainer collaborating on the project.