bufferstream 0.4.7

painless stream buffering and cutting

BufferStream

painless stream buffering, cutting and piping.

install

make sure you have node-waf installed (it is contained in the nodejs-dev package).

npm install bufferstream

api

BufferStream is a full node.js Stream, so it has the APIs of both Writable Stream and Readable Stream.

BufferStream

BufferStream = require('bufferstream')
stream = new BufferStream([{encoding:'utf8', size:'none'}]) // default
  • encoding default encoding for writing strings
  • size defines buffer level or sets buffer to given size (see ↓setSize for more)
  • split short form for:
stream.split(token, function (chunk) {stream.emit('data', chunk)})

stream.setSize

stream.setSize(size) // can be one of ['none', 'flexible', <number>]

different buffer behaviors can be triggered by size:

  • none when the output drains, bufferstream drains too
  • flexible buffers everything it gets and does not pipe it out
  • <number> TODO buffer has the given size. buffers everything until the buffer is full; when the buffer is full, the stream will drain

stream.enable

stream.enable()

enables stream buffering (this is the default).

stream.disable

stream.disable()

flushes the buffer and disables stream buffering. BufferStream now pipes all data as long as the output is accepting data. when the output is draining, BufferStream will buffer all input temporarily.

stream.disable(token, ...)
stream.disable(tokens) // Array
  • token[s] buffer splitters (should be String or Buffer)

disables the given tokens. won't flush until no splitter tokens are left.

stream.split

stream.split(token, ...)
stream.split(tokens) // Array
  • token[s] buffer splitters (should be String or Buffer)

each time BufferStream finds a splitter token in the input data it will emit a split event. this also works for binary data.

Event: 'split'

stream.on('split', function (chunk, token) {…})
stream.split(token, function (chunk, token) {…}) // only get called for this particular token

whenever the stream is enabled, it will try to find all splitter tokens in stream.buffer, cut them off and emit each chunk (without the token) as a split event. this data is lost when not handled.

the chunk is the part of stream.buffer that was cut off, without the token.

Warning: try to avoid calling stream.emit('data', newchunk) more than once, because this will likely throw Error: Offset is out of bounds.
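the split semantics can be sketched with plain Buffer operations: find the earliest occurrence of any token, emit the chunk before it (without the token), and keep scanning the remainder. Buffer.indexOf makes this work for binary data as well as text. `splitOnTokens` is an invented helper name, not part of the bufferstream API:

```javascript
// Sketch of the split semantics: repeatedly cut off everything before the
// earliest token match, record which token matched, and keep the remainder
// buffered (as bufferstream does until more data arrives).
function splitOnTokens(input, rawTokens) {
  const tokens = rawTokens.map((t) => Buffer.from(t));
  const chunks = [];
  let rest = input;
  for (;;) {
    let best = -1;
    let bestToken = null;
    for (const token of tokens) {
      const i = rest.indexOf(token);
      if (i !== -1 && (best === -1 || i < best)) {
        best = i;
        bestToken = token;
      }
    }
    if (best === -1) break; // no token left; keep the remainder buffered
    chunks.push({ chunk: rest.subarray(0, best), token: bestToken });
    rest = rest.subarray(best + bestToken.length);
  }
  return { chunks, rest };
}

const { chunks, rest } = splitOnTokens(Buffer.from('buffer:stream//23'), ['//', ':']);
// chunks[0]: 'buffer' by ':'; chunks[1]: 'stream' by '//'; rest: '23'
```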

stream.getBuffer

stream.getBuffer()
// or just
stream.buffer

returns its Buffer.

stream.toString

stream.toString()

shortcut for stream.buffer.toString()

stream.length

stream.length

shortcut for stream.buffer.length
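both shortcuts simply forward to the underlying Buffer, so a plain Buffer shows the same behavior:

```javascript
// stream.toString() and stream.length just forward to the internal Buffer,
// so they behave like Buffer#toString and Buffer#length.
const internal = Buffer.from('bufferstream');
console.log(internal.toString()); // "bufferstream" — what stream.toString() returns
console.log(internal.length);     // 12 — what stream.length returns
```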

PostBuffer

PostBuffer = require('bufferstream/postbuffer')
post = new PostBuffer(req)
  • req http.ServerRequest

use this if you want to get all the post data from an http server request and do some db request first.


post.onEnd

post.onEnd(function (data) {…});

set a callback to get all post data from a http server request

post.pipe

post.pipe(stream, options);

pumps data into another stream to allow chaining of incoming streams. the given options will be passed to Stream.pipe.

example

BufferStream = require('bufferstream')
stream = new BufferStream({encoding:'utf8', size:'flexible'})
stream.split('//', ':')
stream.on('split', function (chunk, token) {
    console.log("got '%s' by '%s'", chunk.toString(), token.toString())
})
stream.write("buffer:stream//23")
console.log(stream.toString())

results in

got 'buffer' by ':'
got 'stream' by '//'
23

FAQ

I'm not sure from your readme what the split event emits?

you can specify more than one split token, so a split event is emitted whenever any of the tokens is found.

does it emit the buffer up to just before the token starts?

yes.

also, does it join buffers together if they do not already end in a token?

when size is 'flexible', it joins everything it gets into one buffer (accessible through stream.buffer or stream.getBuffer()). whenever it gets data, it will try to find all tokens in it.

in other words, can I use this to rechunk a stream so that the chunks always break on newlines, for example?

yes.

stream = new BufferStream({size:'flexible'});
stream.split('\n', function (line) { // line doesn't have a '\n' anymore
    stream.emit('data', line); // Buffer.isBuffer(line) === true
});


Package last updated on 21 Sep 2011
