@discoveryjs/json-ext
The @discoveryjs/json-ext npm package provides utilities for working with JSON data, including features for streaming, parsing, and stringifying large JSON objects and arrays. It is designed to handle JSON data efficiently, making it easier to work with large JSON files or streams of JSON data.
Streaming JSON parse
This feature allows parsing JSON from a stream in chunks, enabling efficient processing of large JSON files without loading them entirely into memory.
const { createReadStream } = require('fs');
const { parseChunked } = require('@discoveryjs/json-ext');
const stream = createReadStream('path/to/your/file.json');
parseChunked(stream).then((data) => {
  console.log(data);
});
Stringify JSON to stream
This feature allows stringifying JSON values incrementally and piping the result to a writable stream, which is useful for writing large JSON files or streaming JSON data over the network.
const { createWriteStream } = require('fs');
const { Readable } = require('stream');
const { stringifyChunked } = require('@discoveryjs/json-ext');
const stream = createWriteStream('path/to/your/output.json');
const data = { foo: 'bar', baz: 'qux' };
Readable.from(stringifyChunked(data)).pipe(stream);
JSONStream offers similar streaming JSON parsing and stringifying capabilities. It allows for parsing JSON files or streams using JSONPath-like expressions. Compared to @discoveryjs/json-ext, JSONStream focuses more on the streaming aspect and might be more suitable for scenarios where JSONPath expressions are needed for selecting data.
stream-json provides a toolkit for processing JSON as a stream. It includes a variety of stream components for parsing, filtering, and transforming JSON data. While it shares the streaming JSON processing capability with @discoveryjs/json-ext, stream-json offers a more modular approach, allowing users to build custom processing pipelines.
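For a concrete point of comparison, a minimal stream-json pipeline for iterating over the elements of a large top-level JSON array might look like the following sketch (the file path is a placeholder; stream-chain is the companion package stream-json builds on):
const { chain } = require('stream-chain');
const { parser } = require('stream-json');
const { streamArray } = require('stream-json/streamers/StreamArray');
const fs = require('fs');

// Each stage is a separate stream component, illustrating the modular approach
const pipeline = chain([
  fs.createReadStream('path/to/large-array.json'),
  parser(),
  streamArray()
]);

pipeline.on('data', ({ key, value }) => console.log(key, value));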
A set of utilities that extend the use of JSON:
- parseChunked() – functions like JSON.parse(), but iterates over chunks, reconstructing the result object.
- stringifyChunked() – functions like JSON.stringify(), but returns a generator yielding strings instead of a single string.
Features:
JSON.parse() and JSON.stringify() require the entire JSON content before processing. parseChunked() and stringifyChunked() allow processing and sending data incrementally, avoiding large memory consumption at a single time point and reducing GC pressure.
Install:
npm install @discoveryjs/json-ext
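As a quick round-trip sketch of the incremental model, using only the documented API (top-level await assumed, as in the usage examples below):
import { parseChunked, stringifyChunked } from '@discoveryjs/json-ext';

// stringifyChunked() yields string chunks on demand;
// parseChunked() consumes any iterable of chunks and reconstructs the value
const value = { hello: 'world', items: [1, 2, 3] };
const roundTripped = await parseChunked(stringifyChunked(value));
console.log(roundTripped); // { hello: 'world', items: [ 1, 2, 3 ] }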
parseChunked()
Functions like JSON.parse(), but iterates over chunks to reconstruct the result object and returns a Promise.
Note: the reviver parameter is not supported yet.
function parseChunked(input: Iterable<Chunk> | AsyncIterable<Chunk>): Promise<any>;
function parseChunked(input: () => (Iterable<Chunk> | AsyncIterable<Chunk>)): Promise<any>;
type Chunk = string | Buffer | Uint8Array;
Usage:
import { parseChunked } from '@discoveryjs/json-ext';
const data = await parseChunked(chunkEmitter);
The chunkEmitter parameter can be an iterable or async iterable that iterates over chunks, or a function returning such a value. A chunk can be a string, Uint8Array, or Node.js Buffer.
Examples:
parseChunked(function*() {
yield '{ "hello":';
yield Buffer.from(' "wor'); // Node.js only
yield new TextEncoder().encode('ld" }'); // returns Uint8Array
});
parseChunked(async function*() {
for await (const chunk of someAsyncSource) {
yield chunk;
}
});
parseChunked(['{ "hello":', ' "world"}'])
parseChunked(() => ['{ "hello":', ' "world"}'])
Using with a Node.js Readable stream:
import { parseChunked } from '@discoveryjs/json-ext';
import fs from 'node:fs';
parseChunked(fs.createReadStream('path/to/file.json'))
Note: Iterability for Web streams was added to the Web platform relatively recently, so not all environments support it. Consider using parseFromWebStream() for broader compatibility.
Using with a Web stream (fetch()):
const response = await fetch('https://example.com/data.json');
const data = await parseChunked(response.body); // body is ReadableStream
stringifyChunked()
Functions like JSON.stringify(), but returns a generator yielding strings instead of a single string.
Note: Returns "null" when JSON.stringify() returns undefined (since a chunk cannot be undefined).
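A small illustration of that edge case (output noted as comments):
import { stringifyChunked } from '@discoveryjs/json-ext';

console.log(JSON.stringify(undefined));        // undefined
console.log([...stringifyChunked(undefined)]); // [ 'null' ] – a chunk cannot be undefined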
function stringifyChunked(value: any, replacer?: Replacer, space?: Space): Generator<string, void, unknown>;
function stringifyChunked(value: any, options: StringifyOptions): Generator<string, void, unknown>;
type Replacer =
| ((this: any, key: string, value: any) => any)
| (string | number)[]
| null;
type Space = string | number | null;
type StringifyOptions = {
replacer?: Replacer;
space?: Space;
highWaterMark?: number;
};
Usage:
import { stringifyChunked } from '@discoveryjs/json-ext';
const chunks = [...stringifyChunked(data)];
// or
for (const chunk of stringifyChunked(data)) {
console.log(chunk);
}
Examples:
Streaming into a file (Node.js):
import fs from 'node:fs';
import { Readable } from 'node:stream';
Readable.from(stringifyChunked(data))
  .pipe(fs.createWriteStream('path/to/file.json'));
Wrapping into a Promise for piping into a writable Node.js stream:
new Promise((resolve, reject) => {
Readable.from(stringifyChunked(data))
.on('error', reject)
.pipe(stream)
.on('error', reject)
.on('finish', resolve);
});
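On modern Node.js, the same thing can be written more compactly with the promisified pipeline helper (a sketch assuming node:stream/promises is available, i.e. Node.js 15+, and stringifyChunked and data from the examples above):
import fs from 'node:fs';
import { Readable } from 'node:stream';
import { pipeline } from 'node:stream/promises';

// pipeline() wires the streams together and resolves or rejects
// much like the manual Promise wrapper above
await pipeline(
  Readable.from(stringifyChunked(data)),
  fs.createWriteStream('path/to/file.json')
);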
Using with fetch (JSON streaming):
Note: This feature has limited support in browsers; see Streaming requests with the fetch API.
Note: ReadableStream.from() has limited support in browsers; use createStringifyWebStream() instead.
fetch('http://example.com', {
method: 'POST',
duplex: 'half',
body: ReadableStream.from(stringifyChunked(data))
});
Wrapping into a ReadableStream:
Note: Use ReadableStream.from() or createStringifyWebStream() when no extra logic is needed.
new ReadableStream({
start() {
this.generator = stringifyChunked(data);
},
pull(controller) {
const { value, done } = this.generator.next();
if (done) {
controller.close();
} else {
controller.enqueue(value);
}
},
cancel() {
this.generator = null;
}
});
stringifyInfo()
Functions like JSON.stringify(), but returns an object with the expected overall size of the stringify operation and a list of circular references.
export function stringifyInfo(value: any, replacer?: Replacer, space?: Space): StringifyInfoResult;
export function stringifyInfo(value: any, options?: StringifyInfoOptions): StringifyInfoResult;
type StringifyInfoOptions = {
  replacer?: Replacer;
  space?: Space;
  continueOnCircular?: boolean;
}
type StringifyInfoResult = {
  bytes: number;      // size of JSON in bytes
  circular: Object[]; // list of circular references
};
Example:
import { stringifyInfo } from '@discoveryjs/json-ext';
console.log(stringifyInfo({ test: true }));
// {
// bytes: 13, // Buffer.byteLength('{"test":true}')
// circular: []
// }
continueOnCircular
Type: Boolean
Default: false
Determines whether to continue collecting info for a value when a circular reference is found. Setting this option to true allows finding all circular references.
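A short sketch of the option in action (the object shape is arbitrary; with the default of false, analysis stops at the first circular reference found):
import { stringifyInfo } from '@discoveryjs/json-ext';

const obj = { name: 'root' };
obj.self = obj; // create a circular reference

const info = stringifyInfo(obj, { continueOnCircular: true });
console.log(info.circular.length); // 1 – the circularly referenced object is collected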
parseFromWebStream()
A helper function to consume JSON from a Web Stream. You can use parseChunked(stream) instead, but @@asyncIterator on ReadableStream has limited support in browsers (see the ReadableStream compatibility table).
import { parseFromWebStream } from '@discoveryjs/json-ext';
const data = await parseFromWebStream(readableStream);
// equivalent to (when ReadableStream[@@asyncIterator] is supported):
// await parseChunked(readableStream);
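For example, paired with fetch() as a more portable variant of the earlier response.body example (the URL is a placeholder):
import { parseFromWebStream } from '@discoveryjs/json-ext';

const response = await fetch('https://example.com/data.json');
// response.body is a ReadableStream; parseFromWebStream() consumes it
// without relying on ReadableStream[@@asyncIterator] support
const data = await parseFromWebStream(response.body);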
createStringifyWebStream()
A helper function to convert stringifyChunked() into a ReadableStream (Web Stream). You can use ReadableStream.from() instead, but that method has limited support in browsers (see the ReadableStream.from() compatibility table).
import { createStringifyWebStream } from '@discoveryjs/json-ext';
createStringifyWebStream({ test: true });
// equivalent to (when ReadableStream.from() is supported):
// ReadableStream.from(stringifyChunked({ test: true }))
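As a usage sketch, it can serve as a more portable request body for the earlier fetch() example (the endpoint is a placeholder):
import { createStringifyWebStream } from '@discoveryjs/json-ext';

fetch('https://example.com', {
  method: 'POST',
  duplex: 'half',
  // same as ReadableStream.from(stringifyChunked(data)), but works
  // in browsers without ReadableStream.from() support
  body: createStringifyWebStream(data)
});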
License
MIT
Changelog
0.6.0 (2024-07-02)
- Added stringifyChunked() as a generator function
- Added createStringifyWebStream() function
- Added parseFromWebStream() function
- Changed parseChunked() to accept an iterable or async iterable that iterates over string, Buffer, or TypedArray elements
- Removed stringifyStream(); use Readable.from(stringifyChunked()) instead
- Fixed conformance of stringifyChunked() with JSON.stringify() when the replacer is a list of keys and a key refers to an entry in a prototype chain
- stringifyInfo():
  - Aligned the API with stringifyChunked() by accepting options as the second parameter. Now supports:
    - stringifyInfo(value, replacer?, space?)
    - stringifyInfo(value, options?)
  - Renamed the minLength field into bytes in the function's result
  - Removed the async option
  - Removed the async and duplicate fields from the result
  - Fixed conformance with JSON.stringify() when the replacer is a list of keys and a key refers to an entry in a prototype chain
- Removed the version attribute