@discoveryjs/json-ext

A set of utilities that extend the use of JSON

Changelog

0.5.7 (2022-03-09)

  • Fixed adding the entire package.json content to a bundle when the target is a browser

Readme

json-ext


A set of utilities that extend the use of JSON. Designed to be fast and memory efficient

Features:

  • parseChunked() – parse JSON that comes in chunks (e.g. a file read stream or a fetch() response stream)
  • stringifyStream() – stringify a value into a stream (Node.js only)
  • stringifyInfo() – get the estimated size and other facts about the JSON representation before generating a stream or string

Install

npm install @discoveryjs/json-ext

API

parseChunked(chunkEmitter)

Works the same as JSON.parse(), but takes chunkEmitter instead of a string and returns a Promise.

NOTE: The reviver parameter is not supported yet, but will be added in a future release.
NOTE: WHATWG streams aren't supported yet.

When to use:

  • The main thread must not freeze while a big JSON is being parsed, since the parsing work can be spread out over time
  • Huge JSON needs to be parsed (e.g. >500MB on Node.js)
  • Memory pressure needs to be reduced. JSON.parse() must receive the entire JSON string before parsing it, while parseChunked() parses JSON as its first bytes arrive. This avoids holding a huge string in memory at a single point in time, as well as the garbage collection that follows.

Benchmark

Usage:

const { parseChunked } = require('@discoveryjs/json-ext');

// as a regular Promise
parseChunked(chunkEmitter)
    .then(data => {
        /* data is parsed JSON */
    });

// using await (keep in mind that not every runtime supports top-level await)
const data = await parseChunked(chunkEmitter);

Parameter chunkEmitter can be:

  • A Node.js Readable stream, e.g. a file read stream:

const fs = require('fs');
const { parseChunked } = require('@discoveryjs/json-ext');

parseChunked(fs.createReadStream('path/to/file.json'));

  • A generator, an async generator, or a function that returns an iterable of chunks. A chunk may be a string, a Uint8Array, or a Buffer (Node.js only):

const { parseChunked } = require('@discoveryjs/json-ext');
const encoder = new TextEncoder();

// generator
parseChunked(function*() {
    yield '{ "hello":';
    yield Buffer.from(' "wor');    // Node.js only
    yield encoder.encode('ld" }'); // returns Uint8Array(5) [ 108, 100, 34, 32, 125 ]
});

// async generator
parseChunked(async function*() {
    for await (const chunk of someAsyncSource) {
        yield chunk;
    }
});

// function that returns an iterable
parseChunked(() => ['{ "hello":', ' "world"}']);

Using with fetch():

async function loadData(url) {
    const response = await fetch(url);
    const reader = response.body.getReader();

    return parseChunked(async function*() {
        while (true) {
            const { done, value } = await reader.read();

            if (done) {
                break;
            }

            yield value;
        }
    });
}

loadData('https://example.com/data.json')
    .then(data => { /* data is parsed JSON */ });

stringifyStream(value[, replacer[, space]])

Works the same as JSON.stringify(), but returns an instance of ReadableStream instead of a string.

NOTE: WHATWG streams aren't supported yet, so the function is available for Node.js only for now.

Departs from JSON.stringify():

  • Outputs null when JSON.stringify() returns undefined (since streams may not emit undefined)
  • A promise is resolved, and the resulting value is stringified as a regular value
  • A stream in non-object mode is piped to the output as is
  • A stream in object mode is piped to the output as an array of objects
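
A minimal sketch of the first two departures (output is collected through the standard Node.js Readable events, since that's what stringifyStream() returns):

const { stringifyStream } = require('@discoveryjs/json-ext');

// JSON.stringify(undefined) returns undefined,
// while stringifyStream(undefined) emits the string "null"
const chunks = [];
stringifyStream(undefined)
    .on('data', chunk => chunks.push(chunk))
    .on('end', () => console.log(chunks.join(''))); // null

// the promise is resolved and its value is stringified as a regular value
stringifyStream({ answer: Promise.resolve(42) })
    .pipe(process.stdout); // {"answer":42}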

When to use:

  • Huge JSON needs to be generated (e.g. >500MB on Node.js)
  • Memory pressure needs to be reduced. JSON.stringify() must generate the entire JSON string before it can be sent or written anywhere, while stringifyStream() lets you send the result as its first bytes appear. This avoids holding a huge string in memory at a single point in time.
  • The object being serialized contains Promises or Streams (see Usage for examples)

Benchmark

Usage:

const { stringifyStream } = require('@discoveryjs/json-ext');

// handle events
stringifyStream(data)
    .on('data', chunk => console.log(chunk))
    .on('error', error => console.error(error))
    .on('finish', () => console.log('DONE!'));

// pipe into a stream
stringifyStream(data)
    .pipe(writableStream);

Using a Promise or ReadableStream in a serialized object:

const fs = require('fs');
const { stringifyStream } = require('@discoveryjs/json-ext');

// output will be
// {"name":"example","willSerializeResolvedValue":42,"fromFile":[1, 2, 3],"at":{"any":{"level":"promise!"}}}
stringifyStream({
    name: 'example',
    willSerializeResolvedValue: Promise.resolve(42),
    fromFile: fs.createReadStream('path/to/file.json'), // suppose the file content is "[1, 2, 3]"; it'll be inserted as is
    at: {
        any: {
            level: new Promise(resolve => setTimeout(() => resolve('promise!'), 100))
        }
    }
});

// when several async values are used in an object, it's preferred to put the
// fastest ones first, since values are emitted in key order and a slow early
// value delays everything after it
stringifyStream({
    foo: fetch('http://example.com/request_takes_2s').then(req => req.json()),
    bar: fetch('http://example.com/request_takes_5s').then(req => req.json())
});

Using with WritableStream (Node.js only):

const fs = require('fs');
const { stringifyStream } = require('@discoveryjs/json-ext');

// pipe into a console
stringifyStream(data)
    .pipe(process.stdout);

// pipe into a file
stringifyStream(data)
    .pipe(fs.createWriteStream('path/to/file.json'));

// wrapping into a Promise
new Promise((resolve, reject) => {
    stringifyStream(data)
        .on('error', reject)
        .pipe(stream)
        .on('error', reject)
        .on('finish', resolve);
});

stringifyInfo(value[, replacer[, space[, options]]])

The value, replacer, and space arguments are the same as for JSON.stringify().

Result is an object:

{
    minLength: Number, // minimal number of bytes when the value is stringified
    circular: [...],   // list of circular references
    duplicate: [...],  // list of objects that occur more than once
    async: [...]       // list of async values, i.e. promises and streams
}

Example:

const { stringifyInfo } = require('@discoveryjs/json-ext');

console.log(
    stringifyInfo({ test: true }).minLength
);
// > 13
// that equals '{"test":true}'.length

Options
async

Type: Boolean
Default: false

Whether to collect async values (promises and streams).
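
A short sketch of the option's effect; the value here is illustrative, and the option is passed as the fourth stringifyInfo() argument per the signature above:

const { stringifyInfo } = require('@discoveryjs/json-ext');

const value = { deferred: Promise.resolve('later') };

// by default async values are not collected
console.log(stringifyInfo(value).async);
// with async: true, promises and streams are listed in the async array
console.log(stringifyInfo(value, null, null, { async: true }).async);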

continueOnCircular

Type: Boolean
Default: false

Whether to continue collecting info for a value once a circular reference is found. Setting the option to true allows finding all circular references.
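
A short sketch, assuming a self-referencing object (the variable names are illustrative):

const { stringifyInfo } = require('@discoveryjs/json-ext');

const root = { name: 'root' };
root.self = root;              // circular reference
root.child = { parent: root }; // another circular reference

// by default, collecting stops once a circular reference is found
console.log(stringifyInfo(root).circular);
// with continueOnCircular: true, all circular references are reported
console.log(stringifyInfo(root, null, null, { continueOnCircular: true }).circular);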

version

The version of the library, e.g. "0.3.1".
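
For example:

const { version } = require('@discoveryjs/json-ext');

console.log(version); // e.g. "0.5.7"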

License

MIT
