@dlenroc/binary-decoder
A lightweight library for implementing incremental binary data parsers using generators.
```sh
npm install @dlenroc/binary-decoder
```
The yield syntax allows parsers to:

- `yield N` reads N bytes.
- `yield -N` reads up to |N| bytes, with a minimum of 1 byte.
- `yield ArrayBufferView` (e.g., `Uint8Array`) returns bytes to the buffer.
- `yield <other>` adds data to the final result.

🚨 `yield N` returns a view of the input passed to `decode`, making a copy only if multiple chunks are needed to satisfy the request.
```ts
import { BinaryDecoder } from '@dlenroc/binary-decoder';

const decoder = new BinaryDecoder(function* () {
  // ✨ N ≥ 0 | Read exactly N bytes.
  const fixedLengthBytes: Uint8Array = yield 2;

  // ✨ N < 0 | Read at most |N| bytes, at least 1 byte.
  const variableLengthBytes: Uint8Array = yield -2;

  // ✨ ArrayBufferView | Pushback bytes to the internal buffer.
  yield Uint8Array.of(40, 50);

  // ✨ Other | Enqueue parsed data.
  yield { fixedLengthBytes, variableLengthBytes, extraBytes: yield -Infinity };
});

console.log(decoder.decode(Uint8Array.of(10, 20, 30)));
// [
//   {
//     fixedLengthBytes: Uint8Array(2) [ 10, 20 ],
//     variableLengthBytes: Uint8Array(1) [ 30 ],
//     extraBytes: Uint8Array(2) [ 40, 50 ]
//   }
// ]
```
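Because the bytes handed back by `yield N` are usually views into the chunk passed to `decode`, a parser that keeps those bytes beyond the current call should copy them. The following is a minimal sketch (not from the package docs) that copies with the standard `Uint8Array.prototype.slice()`; the `headerDecoder` name and the two-byte format are made up for illustration:

```ts
import { BinaryDecoder } from '@dlenroc/binary-decoder';

// Hypothetical parser that retains the bytes it reads.
const headerDecoder = new BinaryDecoder(function* () {
  const header: Uint8Array = yield 2; // typically a view into the caller's chunk
  yield { header: header.slice() };   // .slice() copies, so the result owns its bytes
});

const scratch = Uint8Array.of(1, 2);
const messages = headerDecoder.decode(scratch);
scratch.fill(0); // reusing the input buffer no longer affects the parsed result
console.log(messages); // [ { header: Uint8Array(2) [ 1, 2 ] } ]
```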
Stateful Parsing

Suppose we need to parse a simple protocol defined as follows:
| No. of bytes | Type [Value] | Description |
|---|---|---|
| 1 | U8 [1] | message-type |
| 4 | U32 | length |
| length | U8 array | text |
```ts
import { BinaryDecoder, type Decoder } from '@dlenroc/binary-decoder';

type Echo = { type: 1; text: string };

function* parse(): Decoder<Echo> {
  while (true) {
    const [type] = yield 1;

    switch (type) {
      case 1:
        yield* parseEcho();
        break;
      default:
        throw new Error(`Unknown message type: ${type}`);
    }
  }
}

function* parseEcho(): Decoder<Echo> {
  const bytes = yield 4;
  const view = new DataView(bytes.buffer, bytes.byteOffset, bytes.byteLength);
  const text = new TextDecoder().decode(yield view.getUint32(0));
  yield { type: 1, text };
}

const decoder = new BinaryDecoder(parse);

// Create a message
const textBytes = new TextEncoder().encode('Hello, World!');
const chunk = new Uint8Array(5 + textBytes.byteLength);
const view = new DataView(chunk.buffer);
view.setUint8(0, 1);
view.setUint32(1, textBytes.byteLength);
chunk.set(textBytes, 5);

// [1] Parse a message
console.log(decoder.decode(chunk));
// [
//   { type: 1, text: 'Hello, World!' }
// ]

// [2] Parse multiple messages
console.log(decoder.decode(Uint8Array.of(...chunk, ...chunk)));
// [
//   { type: 1, text: 'Hello, World!' },
//   { type: 1, text: 'Hello, World!' }
// ]

// [3] Parse a message split across chunks
console.log(decoder.decode(chunk.subarray(0, 3)));
// []

console.log(decoder.decode(chunk.subarray(3, 5)));
// []

console.log(decoder.decode(chunk.subarray(5)));
// [
//   { type: 1, text: 'Hello, World!' }
// ]
```
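Because the decoder keeps its parsing state between `decode` calls, it pairs naturally with sources that deliver bytes in arbitrary chunks. Below is a minimal sketch (not from the package docs) that drains a Web `ReadableStream<Uint8Array>`, such as a `fetch` response body, through the `parse` generator and `Echo` type defined above; the `consume` helper and its parameters are hypothetical names:

```ts
import { BinaryDecoder } from '@dlenroc/binary-decoder';

// Hypothetical helper: drain a byte stream through the decoder and
// handle each fully parsed message as soon as it becomes available.
async function consume(
  source: ReadableStream<Uint8Array>,
  onMessage: (message: Echo) => void,
) {
  const decoder = new BinaryDecoder(parse);
  const reader = source.getReader();

  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;

    // Each call returns the messages completed by this chunk (possibly none);
    // partially received messages stay buffered inside the decoder.
    for (const message of decoder.decode(value)) {
      onMessage(message);
    }
  }
}
```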
Streaming

In the Stateful Parsing example, the Echo message’s text length is encoded as a U32, which can represent sizes up to 4.29 GB. For processing such large messages, streaming is more practical than buffering the entire message in memory.
Here’s how to implement streaming for large messages:
```ts
type Echo = { type: 1; text: ReadableStream<string> };

function* parseEcho(): Decoder<Echo> {
  const textDecoderStream = new TextDecoderStream();
  const writer = textDecoderStream.writable.getWriter();

  try {
    yield { type: 1, text: textDecoderStream.readable };

    const bytes = yield 4;
    const view = new DataView(bytes.buffer, bytes.byteOffset, bytes.byteLength);

    let eta = view.getUint32(0);
    while (eta > 0) {
      const chunk = yield -eta;
      writer.write(chunk);
      eta -= chunk.byteLength;
    }
  } finally {
    writer.close();
  }
}
```
```ts
// ...

// [3] Parse a message split across chunks (streaming output)
console.log(decoder.decode(chunk.subarray(0, 3)));
// [
//   {
//     type: 1,
//     text: ReadableStream { locked: false, state: 'readable', supportsBYOB: false }
//   }
// ]

console.log(decoder.decode(chunk.subarray(3, 5)));
// []

console.log(decoder.decode(chunk.subarray(5)));
// []
```
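The streamed text itself is consumed asynchronously through the standard `ReadableStream` reader API. A minimal sketch (not from the package docs), assuming the streaming `parseEcho` above and the `decoder`/`chunk` set up earlier, run in an async context such as an ES module with top-level await:

```ts
// Feed the whole message in one call; the Echo result carries a
// ReadableStream<string> that receives text as the parser decodes it.
const [echo] = decoder.decode(chunk);

const reader = echo.text.getReader();
let text = '';
for (;;) {
  const { done, value } = await reader.read();
  if (done) break;
  text += value;
}
console.log(text); // 'Hello, World!'
```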