# chunkosaurus

It's not the most efficient chunker, but it is the easiest to understand.
## Overview

Chunkosaurus provides a function for splitting large strings into small, easy-to-follow
chunks, and a class for reassembling those chunks back into the original data. Data is not
serialized, though hypothetically it could be in a future version with no change to
the Chunkosaurus API.
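As a rough mental model, splitting a string into fixed-size slices looks like the sketch below. `naiveChunkify` is a hypothetical illustration, not the library's implementation; real chunks also carry an id and index metadata (see `isChunk` below):

```js
// Hypothetical sketch, not the library's code: slices a string into
// pieces of at most chunkSize characters.
function naiveChunkify(data, chunkSize) {
  const pieces = [];
  for (let i = 0; i < data.length; i += chunkSize) {
    pieces.push(data.slice(i, i + chunkSize));
  }
  return pieces;
}

naiveChunkify('some data here', 5); // ['some ', 'data ', 'here']
```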
## chunkify(data: string, chunkSize: number)

Splits `data` up into chunks to be ingested by Chunkosaurus. While chunks have an
accessible data structure, it's best not to rely on accessing them in your code
in case the structure changes in the future. Treat chunks like a black box.

```js
const chunks = chunkify('some data here', 5);
```
## Chunkosaurus

An EventEmitter class that receives `chunk` events and emits `data` events
after it has processed all of the chunks in a piece of data. You can also call
`.feedChunk(chunk)`; it's identical to `.emit('chunk', chunk)`.
```js
import { Chunkosaurus, chunkify } from 'chunkosaurus';

const c = new Chunkosaurus();

c.on('data', (data) => {
  console.log(`Full data: ${data}`);
});

const chunks = chunkify('some data here', 5);

c.emit('chunk', chunks[0]);
c.emit('chunk', chunks[1]);
c.emit('chunk', chunks[2]);
```
## isChunk

Utility function for determining whether an object is a chunk. Useful for streams where you
only want to chunk large data, but want to send smaller bits of data as-is.

```js
isChunk({
  id: '2b518fa0-6157-4165-aeb4-c93db2710f1a',
  chunks: 3,
  chunk: 1,
  data: 'some data here'
}); // true

isChunk('some data here'); // false
```
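In a stream handler, this kind of guard lets a receiver route chunked and unchunked messages differently. The `looksLikeChunk` and `route` helpers below are hypothetical stand-ins written for illustration; in practice you would use the library's own `isChunk`:

```js
// Hypothetical stand-in for isChunk: checks the chunk shape shown above.
function looksLikeChunk(value) {
  return (
    typeof value === 'object' && value !== null &&
    typeof value.id === 'string' &&
    typeof value.chunks === 'number' &&
    typeof value.chunk === 'number' &&
    typeof value.data === 'string'
  );
}

// A receiver can feed chunks to Chunkosaurus and pass small messages through.
function route(message, onSmall, feedChunk) {
  if (looksLikeChunk(message)) {
    feedChunk(message); // e.g. c.feedChunk(message)
  } else {
    onSmall(message);
  }
}
```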