@json2csv/plainjs

Fast and highly configurable JSON to CSV converter. It fully supports conversion following the RFC 4180 specification, as well as other similar text-delimited formats such as TSV.

@json2csv/plainjs exposes plain JavaScript modules of json2csv which can be used in Node.js, the browser, or Deno.

This package includes two modules:

  • Parser: A synchronous JSON to CSV converter written in plain js. It's fast and simple, but it loads the entire dataset in memory and blocks the event loop, so it's not advisable for big datasets or the browser.
  • Stream Parser: An asynchronous JSON to CSV converter written in plain js. Ideal for large datasets and the browser.

Features

  • Fast and lightweight
  • Support for standard JSON as well as NDJSON
  • Scalable to infinitely large datasets (using stream processing)
  • Advanced data selection (automatic field discovery, underscore-like selectors, custom data getters, default values for missing fields, ...)
  • Support for custom input data transformation
  • Support for custom CSV cell formatting
  • Highly customizable (supporting custom quotation marks, delimiters, eol values, etc.)
  • Automatic escaping (preserving new lines, quotes, etc.)
  • Optional headers
  • Unicode encoding support
  • Pretty printing in table format to stdout

Other json2csv packages

There are multiple flavours of json2csv:

  • Plainjs: Includes the Parser API and a new StreamParser API which does the conversion in a streaming fashion in pure js.
  • Node: Includes the Node Transform and Node Async Parser APIs for Node users.
  • WHATWG: Includes the WHATWG Transform Stream and WHATWG Async Parser APIs for users of WHATWG streams (browser, Node or Deno).
  • CLI: Includes the CLI interface.

And a couple of libraries that enable additional configurations:

  • Transforms: Includes the built-in transforms for json2csv (unwind and flatten), allowing the user to transform data before it is parsed (see the sketch after this list).
  • Formatters: Includes the built-in formatters for json2csv (one for each data type, an excel-specific one, etc.). Formatters convert JSON data types into CSV-compatible strings.
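
For instance, the flatten transform can be combined with the Parser from this package. This is a minimal sketch: @json2csv/transforms is a separate install, and the option shape shown here follows the json2csv documentation rather than anything in this README.

import { Parser } from '@json2csv/plainjs';
import { flatten } from '@json2csv/transforms';

// flatten() collapses nested objects into separator-joined column names,
// so { car: { make: 'Audi' } } becomes a "car.make" column.
const parser = new Parser({
  transforms: [flatten({ objects: true, arrays: true, separator: '.' })],
});

console.log(parser.parse([{ car: { make: 'Audi', model: 'A4' }, price: 40000 }]));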

Requirements

  • None

Installation

NPM

You can install json2csv as a dependency using NPM.

$ npm install --save @json2csv/plainjs

Yarn

You can install json2csv as a dependency using Yarn.

$ yarn add @json2csv/plainjs

CDN

The json2csv plainjs modules are packaged as ES6 modules. If your browser supports modules, you can load them directly in the browser from a CDN.

You can import the latest version:

<script type="module">
  import { Parser } from 'https://cdn.jsdelivr.net/npm/@json2csv/plainjs/src/Parser.js';
  import { StreamParser } from 'https://cdn.jsdelivr.net/npm/@json2csv/plainjs/src/StreamParser.js';
</script>

You can also select a specific version:

<script type="module">
  import { Parser } from 'https://cdn.jsdelivr.net/npm/@json2csv/plainjs@6.0.0/src/Parser.js';
  import { StreamParser } from 'https://cdn.jsdelivr.net/npm/@json2csv/plainjs@6.0.0/src/StreamParser.js';
</script>

Parser

json2csv can be used programmatically as a synchronous converter.

This loads the entire JSON in memory and does the whole processing in memory while blocking the JavaScript event loop. For that reason, it's rarely a good idea to use it unless your data is very small or your application doesn't do anything else.

Usage

import { Parser } from '@json2csv/plainjs';

// Sample input; any array of objects (or a single object) works.
const myData = [
  { car: 'Audi', price: 40000 },
  { car: 'BMW', price: 35000 },
];

try {
  const opts = {};
  const parser = new Parser(opts);
  const csv = parser.parse(myData);
  console.log(csv);
} catch (err) {
  console.error(err);
}
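
With the default options and the sample data above, the output should look roughly like this (json2csv quotes string values by default and leaves numbers unquoted):

"car","price"
"Audi",40000
"BMW",35000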
Parameters
Options
  • fields <DataSelector[]> Defaults to top-level JSON attributes (see the sketch after this list).
  • transforms <Transform[]> Array of transforms to apply to the data. A transform is a function that receives a data record and returns a transformed record. Transforms are executed in order.
  • formatters <Formatters> Object where each key is a JavaScript data type and its associated value is a formatter for the given type.
  • defaultValue <Any> value to use when data is missing. Defaults to <empty> if not specified. (Overridden by fields[].default)
  • delimiter <String> delimiter of columns. Defaults to , if not specified.
  • eol <String> overrides the default OS line ending (i.e. \n on Unix and \r\n on Windows).
  • header <Boolean> determines whether or not the CSV file will contain a header row with the column titles. Defaults to true if not specified.
  • includeEmptyRows <Boolean> includes empty rows. Defaults to false.
  • withBOM <Boolean> with BOM character. Defaults to false.
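
As an illustration of the field selectors, here is a minimal sketch; the data and labels are made up, and the label/value/default field shape follows the json2csv documentation:

import { Parser } from '@json2csv/plainjs';

const parser = new Parser({
  fields: [
    'car',                                    // plain top-level attribute
    { label: 'Price (USD)', value: 'price' }, // custom column title
    { value: 'stock', default: 0 },           // per-field default for missing data
  ],
  delimiter: ';',
});

console.log(parser.parse([{ car: 'Audi', price: 40000 }]));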

Complete Documentation

See https://juanjodiaz.github.io/json2csv/#/parsers/parser.

Stream Parser

The synchronous API has the downside of loading the entire JSON array in memory and blocking JavaScript's event loop while processing the data. This means that your server won't be able to process more requests, or your UI will become unresponsive, while data is being processed. For those reasons, it is rarely a good idea to use it unless your data is very small or your application doesn't do anything else.

The async parser processes the data as it comes in, so you don't need the entire input data set loaded in memory and you can avoid blocking the event loop for too long. Thus, it's better suited for large datasets or systems with high concurrency.

The streaming API takes a second options argument to configure objectMode and to fine-tune the underlying JSON parser; NDJSON mode is configured through the main options (see below).

The streaming API supports multiple callbacks to get the resulting CSV, errors, etc.

Usage

import { StreamParser } from '@json2csv/plainjs';

const opts = {};
const asyncOpts = {};
const parser = new StreamParser(opts, asyncOpts);

let csv = '';
parser.onData = (chunk) => (csv += chunk.toString());
parser.onEnd = () => console.log(csv);
parser.onError = (err) => console.error(err);

// You can also listen for events on the conversion and see how the header or the lines are coming out.
parser.onHeader = (header) => console.log(header);
parser.onLine = (line) => console.log(line);
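
The snippet above only registers callbacks; input still has to be pushed into the parser. A minimal sketch, assuming the StreamParser exposes the write() and end() methods of the underlying @streamparser/json parser (check the complete documentation linked below to confirm):

// Push raw JSON text in as many chunks as you like, then signal end of input.
parser.write('[{ "car": "Audi", "price": 40000 },');
parser.write(' { "car": "BMW", "price": 35000 }]');
parser.end();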
Parameters
Options
  • ndjson <Boolean> indicates that the data is in NDJSON format. Only effective when using the streaming API and not in object mode (see the sketch after this list).
  • fields <DataSelector[]> Defaults to top-level JSON attributes.
  • transforms <Transform[]> Array of transforms to apply to the data. A transform is a function that receives a data record and returns a transformed record. Transforms are executed in order.
  • formatters <Formatters> Object where each key is a JavaScript data type and its associated value is a formatter for the given type.
  • defaultValue <Any> value to use when data is missing. Defaults to <empty> if not specified. (Overridden by fields[].default)
  • delimiter <String> delimiter of columns. Defaults to , if not specified.
  • eol <String> overrides the default OS line ending (i.e. \n on Unix and \r\n on Windows).
  • header <Boolean> determines whether or not the CSV file will contain a header row with the column titles. Defaults to true if not specified.
  • includeEmptyRows <Boolean> includes empty rows. Defaults to false.
  • withBOM <Boolean> with BOM character. Defaults to false.
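
For example, NDJSON input (one JSON object per line) could be handled like this; a sketch under the same write()/end() assumption as above:

import { StreamParser } from '@json2csv/plainjs';

const parser = new StreamParser({ ndjson: true });

let csv = '';
parser.onData = (chunk) => (csv += chunk.toString());
parser.onEnd = () => console.log(csv);
parser.onError = (err) => console.error(err);

// NDJSON: each line is an independent JSON object.
parser.write('{ "car": "Audi", "price": 40000 }\n');
parser.write('{ "car": "BMW", "price": 35000 }\n');
parser.end();
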
Async Options

Options used by the underlying parsing library to process the binary or text stream. Not relevant when running in objectMode. Buffering is only relevant if you expect very large strings/numbers in your JSON. See @streamparser/json for more details about buffering.

  • stringBufferSize <number> Size of the buffer used to parse strings. Defaults to 0, which means no buffering. The minimum valid value is 4 (see the sketch after this list).
  • numberBufferSize <number> Size of the buffer used to parse numbers. Defaults to 0 (no buffering).
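
A minimal sketch of tuning these buffers (the sizes are arbitrary examples, not recommendations):

import { StreamParser } from '@json2csv/plainjs';

// Buffer string and number parsing because the input is expected to contain
// very long strings/numbers; stringBufferSize must be at least 4 when non-zero.
const parser = new StreamParser({}, { stringBufferSize: 64 * 1024, numberBufferSize: 64 });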

Complete Documentation

See https://juanjodiaz.github.io/json2csv/#/parsers/stream-parser.

License

See LICENSE.md.
