json2csv

json2csv - npm Package Compare versions

Comparing version 5.0.6 to 6.0.0-alpha.0

.prettierrc.js

14

bin/json2csv.js

@@ -19,2 +19,3 @@ #!/usr/bin/env node

const { unwind, flatten } = json2csv.transforms;
const { string: stringFormatter, stringExcel: stringExcelFormatter } = json2csv.formatters;
const JSON2CSVParser = json2csv.Parser;

@@ -153,5 +154,15 @@ const Json2csvTransform = json2csv.Transform;

}
const formatters = {
string: config.excelStrings
? stringExcelFormatter
: stringFormatter({
quote: config.quote,
escapedQuote: config.escapedQuote,
})
};
const opts = {
transforms,
formatters,
fields: config.fields

@@ -161,7 +172,4 @@ ? (Array.isArray(config.fields) ? config.fields : config.fields.split(','))

defaultValue: config.defaultValue,
quote: config.quote,
escapedQuote: config.escapedQuote,
delimiter: config.delimiter,
eol: config.eol,
excelStrings: config.excelStrings,
header: config.header,

@@ -168,0 +176,0 @@ includeEmptyRows: config.includeEmptyRows,

@@ -5,22 +5,34 @@ # Changelog

Removed (v5.x entries not present in the 6.0.0-alpha.0 changelog):

### [5.0.6](https://github.com/zemirco/json2csv/compare/v5.0.5...v5.0.6) (2021-02-03)

### Bug Fixes

* Escape quotes excel ([#512](https://github.com/zemirco/json2csv/issues/512)) ([ab3bf8a](https://github.com/zemirco/json2csv/commit/ab3bf8ad3ce64cff7d6149e1b70f92b71bebf6c7))

### [5.0.5](https://github.com/zemirco/json2csv/compare/v5.0.4...v5.0.5) (2020-11-16)

### Bug Fixes

* don't mutate original object in unset ([#499](https://github.com/zemirco/json2csv/issues/499)) ([6e4ea5e](https://github.com/zemirco/json2csv/commit/6e4ea5ebdc263006ca6ff45879fbb7e0bd65bef2))

### [5.0.4](https://github.com/zemirco/json2csv/compare/v5.0.3...v5.0.4) (2020-11-10)

### Bug Fixes

* issue with unwind and empty arrays creating an extra column ([#497](https://github.com/zemirco/json2csv/issues/497)) ([3b74735](https://github.com/zemirco/json2csv/commit/3b747359b086ec212a0f6ecb92ec0a40511f75c3))

Added:

## [6.0.0-alpha.0](https://github.com/zemirco/json2csv/compare/v5.0.3...v6.0.0-alpha.0) (2021-04-14)

### ⚠ BREAKING CHANGES

* Drop support for Node < v12
* AsyncParser API has changed, see the `Upgrading from 5.X to 6.X` section for details.
  * fix: consolidate the API of AsyncParser and parseAsync
  * feat: simplify AsyncParser
  * chore: drop support for node 11
  * refactor: improve AsyncParser parse method
  * docs: add links to node docs and fix few small issues
* In the JavaScript modules, `formatters` are introduced and the `quote`, `escapedQuote` and `excelStrings` options are removed. See the migration notes in the readme. CLI hasn't changed.

### Features

* Introduce formatters ([#455](https://github.com/zemirco/json2csv/issues/455)) ([88ed6ee](https://github.com/zemirco/json2csv/commit/88ed6ee780b439d394235c9e8fac7e42b0d614dd))
* use jsonparse for ND-JSON instead of the custom made implementation ([#493](https://github.com/zemirco/json2csv/issues/493)) ([55aa0c7](https://github.com/zemirco/json2csv/commit/55aa0c70374def0dafa342d2a122d077eb87d5e1))

### Bug Fixes

* Fix issue with unwind and empty arrays creating an extra column ([#496](https://github.com/zemirco/json2csv/issues/496)) ([0b331fc](https://github.com/zemirco/json2csv/commit/0b331fc3ad345f8062abe60f64cb3b43dad30fb0))
* consolidate the API of AsyncParser and parseAsync ([#492](https://github.com/zemirco/json2csv/issues/492)) ([bcce91f](https://github.com/zemirco/json2csv/commit/bcce91f953625bb6a3b401d839670bb3cb5ba11a))
* Performance optimizations ([#491](https://github.com/zemirco/json2csv/issues/491)) ([471f5a7](https://github.com/zemirco/json2csv/commit/471f5a7a55375a06a66ce4b0438583d719d6db8f))
* prevents Parser and AsyncParser from caching the fields option between executions causing issues and inconsistencies ([#498](https://github.com/zemirco/json2csv/issues/498)) ([4d8a81a](https://github.com/zemirco/json2csv/commit/4d8a81a3139024c31377fc62e4e39ece29e72c8c))
* simplify stringExcel formatter and support proper escaping ([#513](https://github.com/zemirco/json2csv/issues/513)) ([50062c3](https://github.com/zemirco/json2csv/commit/50062c3e155ff2c12b1bb417085188a2156885a8))

@@ -27,0 +39,0 @@ ### [5.0.3](https://github.com/zemirco/json2csv/compare/v5.0.2...v5.0.3) (2020-09-24)

'use strict';
const { Readable } = require('stream');
const JSON2CSVParser = require('./JSON2CSVParser');
const JSON2CSVAsyncParser = require('./JSON2CSVAsyncParser');
const JSON2CSVTransform = require('./JSON2CSVTransform');
// Transforms
const flatten = require('./transforms/flatten');
const unwind = require('./transforms/unwind');
// Formatters
const defaultFormatter = require('./formatters/default');
const number = require('./formatters/number');
const string = require('./formatters/string');
const stringQuoteOnlyIfNecessary = require('./formatters/stringQuoteOnlyIfNecessary');
const stringExcel = require('./formatters/stringExcel');
const symbol = require('./formatters/symbol');
const object = require('./formatters/object');
module.exports.Parser = JSON2CSVParser;

@@ -16,30 +26,17 @@ module.exports.AsyncParser = JSON2CSVAsyncParser;

module.exports.parse = (data, opts) => new JSON2CSVParser(opts).parse(data);
module.exports.parseAsync = (data, opts, transformOpts) => {
try {
if (!(data instanceof Readable)) {
transformOpts = Object.assign({}, transformOpts, { objectMode: true });
}
module.exports.parseAsync = (data, opts, transformOpts) => new JSON2CSVAsyncParser(opts, transformOpts).parse(data).promise();
const asyncParser = new JSON2CSVAsyncParser(opts, transformOpts);
const promise = asyncParser.promise();
if (Array.isArray(data)) {
data.forEach(item => asyncParser.input.push(item));
asyncParser.input.push(null);
} else if (data instanceof Readable) {
asyncParser.fromInput(data);
} else {
asyncParser.input.push(data);
asyncParser.input.push(null);
}
return promise;
} catch (err) {
return Promise.reject(err);
}
};
module.exports.transforms = {
flatten,
unwind,
};
};
module.exports.formatters = {
default: defaultFormatter,
number,
string,
stringQuoteOnlyIfNecessary,
stringExcel,
symbol,
object,
};
'use strict';
const { Transform } = require('stream');
const { Readable } = require('stream');
const JSON2CSVTransform = require('./JSON2CSVTransform');
const { fastJoin } = require('./utils');
class JSON2CSVAsyncParser {
constructor(opts, transformOpts) {
this.input = new Transform(transformOpts);
this.input._read = () => {};
this.transform = new JSON2CSVTransform(opts, transformOpts);
this.processor = this.input.pipe(this.transform);
this.opts = opts;
this.transformOpts = transformOpts;
}
fromInput(input) {
if (this._input) {
throw new Error('Async parser already has an input.');
/**
* Main function that converts json to csv.
*
* @param {Stream|Array|Object} data Array of JSON objects to be converted to CSV
* @returns {Stream} A stream producing the CSV formated data as a string
*/
parse(data) {
if (typeof data !== 'object') {
throw new Error('Data should not be empty or the "fields" option should be included');
}
this._input = input;
this.input = this._input.pipe(this.processor);
return this;
}
throughTransform(transform) {
if (this._output) {
throw new Error('Can\'t add transforms once an output has been added.');
if (!(data instanceof Readable)) {
data = Readable.from((Array.isArray(data) ? data : [data]).filter(obj => obj !== null));
}
this.processor = this.processor.pipe(transform);
return this;
}
toOutput(output) {
if (this._output) {
throw new Error('Async parser already has an output.');
}
this._output = output;
this.processor = this.processor.pipe(output);
return this;
return data.pipe(new JSON2CSVTransform(this.opts, { objectMode: data.readableObjectMode, ...this.transformOpts }))
}
promise(returnCSV = true) {
return new Promise((resolve, reject) => {
if (!returnCSV) {
this.processor
.on('finish', () => resolve())
.on('error', err => reject(err));
return;
}
let csvBuffer = [];
this.processor
.on('data', chunk => csvBuffer.push(chunk.toString()))
.on('finish', () => resolve(fastJoin(csvBuffer, '')))
.on('error', err => reject(err));
});
}
}
module.exports = JSON2CSVAsyncParser
module.exports = JSON2CSVAsyncParser;

@@ -5,3 +5,8 @@ 'use strict';

const lodashGet = require('lodash.get');
const { getProp, fastJoin, flattenReducer } = require('./utils');
const { getProp } = require('./utils');
const defaultFormatter = require('./formatters/default');
const numberFormatterCtor = require('./formatters/number')
const stringFormatterCtor = require('./formatters/string');
const symbolFormatterCtor = require('./formatters/symbol');
const objectFormatterCtor = require('./formatters/object');

@@ -21,13 +26,32 @@ class JSON2CSVBase {

const processedOpts = Object.assign({}, opts);
if (processedOpts.fields) {
processedOpts.fields = this.preprocessFieldsInfo(processedOpts.fields, processedOpts.defaultValue);
}
processedOpts.transforms = !Array.isArray(processedOpts.transforms)
? (processedOpts.transforms ? [processedOpts.transforms] : [])
: processedOpts.transforms
const stringFormatter = (processedOpts.formatters && processedOpts.formatters['string']) || stringFormatterCtor();
const objectFormatter = objectFormatterCtor({ stringFormatter });
const defaultFormatters = {
header: stringFormatter,
undefined: defaultFormatter,
boolean: defaultFormatter,
number: numberFormatterCtor(),
bigint: defaultFormatter,
string: stringFormatter,
symbol: symbolFormatterCtor({ stringFormatter }),
function: objectFormatter,
object: objectFormatter
};
processedOpts.formatters = {
...defaultFormatters,
...processedOpts.formatters,
};
processedOpts.delimiter = processedOpts.delimiter || ',';
processedOpts.eol = processedOpts.eol || os.EOL;
processedOpts.quote = typeof processedOpts.quote === 'string'
? processedOpts.quote
: '"';
processedOpts.escapedQuote = typeof processedOpts.escapedQuote === 'string'
? processedOpts.escapedQuote
: `${processedOpts.quote}${processedOpts.quote}`;
processedOpts.header = processedOpts.header !== false;

@@ -47,3 +71,3 @@ processedOpts.includeEmptyRows = processedOpts.includeEmptyRows || false;

*/
preprocessFieldsInfo(fields) {
preprocessFieldsInfo(fields, globalDefaultValue) {
return fields.map((fieldInfo) => {

@@ -54,4 +78,4 @@ if (typeof fieldInfo === 'string') {

value: (fieldInfo.includes('.') || fieldInfo.includes('['))
? row => lodashGet(row, fieldInfo, this.opts.defaultValue)
: row => getProp(row, fieldInfo, this.opts.defaultValue),
? row => lodashGet(row, fieldInfo, globalDefaultValue)
: row => getProp(row, fieldInfo, globalDefaultValue),
};

@@ -63,3 +87,3 @@ }

? fieldInfo.default
: this.opts.defaultValue;
: globalDefaultValue;

@@ -99,7 +123,6 @@ if (typeof fieldInfo.value === 'string') {

*/
getHeader() {
return fastJoin(
this.opts.fields.map(fieldInfo => this.processValue(fieldInfo.label)),
this.opts.delimiter
);
getHeader(fields) {
return fields
.map(fieldInfo => this.opts.formatters.header(fieldInfo.label))
.join(this.opts.delimiter);
}

@@ -113,3 +136,3 @@

return this.opts.transforms.reduce((rows, transform) =>
rows.map(row => transform(row)).reduce(flattenReducer, []),
rows.flatMap(row => transform(row)),
[row]

@@ -125,3 +148,3 @@ );

*/
processRow(row) {
processRow(row, fields) {
if (!row) {

@@ -131,12 +154,9 @@ return undefined;

const processedRow = this.opts.fields.map(fieldInfo => this.processCell(row, fieldInfo));
const processedRow = fields.map(fieldInfo => this.processCell(row, fieldInfo));
if (!this.opts.includeEmptyRows && processedRow.every(field => field === undefined)) {
if (!this.opts.includeEmptyRows && processedRow.every(field => field === '')) {
return undefined;
}
return fastJoin(
processedRow,
this.opts.delimiter
);
return processedRow.join(this.opts.delimiter);
}

@@ -162,34 +182,3 @@

processValue(value) {
if (value === null || value === undefined) {
return undefined;
}
const valueType = typeof value;
if (valueType !== 'boolean' && valueType !== 'number' && valueType !== 'string') {
value = JSON.stringify(value);
if (value === undefined) {
return undefined;
}
if (value[0] === '"') {
value = value.replace(/^"(.+)"$/,'$1');
}
}
if (typeof value === 'string') {
if (this.opts.excelStrings) {
if(value.includes(this.opts.quote)) {
value = value.replace(new RegExp(this.opts.quote, 'g'), `${this.opts.escapedQuote}${this.opts.escapedQuote}`);
}
value = `"=""${value}"""`;
} else {
if(value.includes(this.opts.quote)) {
value = value.replace(new RegExp(this.opts.quote, 'g'), this.opts.escapedQuote);
}
value = `${this.opts.quote}${value}${this.opts.quote}`;
}
}
return value;
return this.opts.formatters[typeof value](value);
}

@@ -196,0 +185,0 @@ }

'use strict';
const JSON2CSVBase = require('./JSON2CSVBase');
const { fastJoin, flattenReducer } = require('./utils');

@@ -9,5 +8,2 @@ class JSON2CSVParser extends JSON2CSVBase {

super(opts);
if (this.opts.fields) {
this.opts.fields = this.preprocessFieldsInfo(this.opts.fields);
}
}

@@ -21,21 +17,17 @@ /**

parse(data) {
const processedData = this.preprocessData(data);
const processedData = this.preprocessData(data, this.opts.fields);
if (!this.opts.fields) {
this.opts.fields = processedData
.reduce((fields, item) => {
Object.keys(item).forEach((field) => {
if (!fields.includes(field)) {
fields.push(field)
}
});
const fields = this.opts.fields || this.preprocessFieldsInfo(processedData
.reduce((fields, item) => {
Object.keys(item).forEach((field) => {
if (!fields.includes(field)) {
fields.push(field)
}
});
return fields
}, []);
this.opts.fields = this.preprocessFieldsInfo(this.opts.fields);
}
return fields
}, []));
const header = this.opts.header ? this.getHeader() : '';
const rows = this.processData(processedData);
const header = this.opts.header ? this.getHeader(fields) : '';
const rows = this.processData(processedData, fields);
const csv = (this.opts.withBOM ? '\ufeff' : '')

@@ -55,6 +47,6 @@ + header

*/
preprocessData(data) {
preprocessData(data, fields) {
const processedData = Array.isArray(data) ? data : [data];
if (!this.opts.fields && (processedData.length === 0 || typeof processedData[0] !== 'object')) {
if (!fields && (processedData.length === 0 || typeof processedData[0] !== 'object')) {
throw new Error('Data should not be empty or the "fields" option should be included');

@@ -66,4 +58,3 @@ }

return processedData
.map(row => this.preprocessRow(row))
.reduce(flattenReducer, []);
.flatMap(row => this.preprocessRow(row));
}

@@ -77,7 +68,7 @@

*/
processData(data) {
return fastJoin(
data.map(row => this.processRow(row)).filter(row => row), // Filter empty rows
this.opts.eol
);
processData(data, fields) {
return data
.map(row => this.processRow(row, fields))
.filter(row => row) // Filter empty rows
.join(this.opts.eol);
}

@@ -84,0 +75,0 @@ }

@@ -22,4 +22,2 @@ 'use strict';

this.initObjectModeParse();
} else if (this.opts.ndjson) {
this.initNDJSONParse();
} else {

@@ -34,3 +32,2 @@ this.initJSONParser();

if (this.opts.fields) {
this.opts.fields = this.preprocessFieldsInfo(this.opts.fields);
this.pushHeader();

@@ -58,43 +55,2 @@ }

/**
* Init the transform with a parser to process NDJSON data.
* It maintains a buffer of received data, parses each line
* as JSON and send it to `pushLine for processing.
*/
initNDJSONParse() {
const transform = this;
this.parser = {
_data: '',
write(chunk) {
this._data += chunk.toString();
const lines = this._data
.split('\n')
.map(line => line.trim())
.filter(line => line !== '');
let pendingData = false;
lines
.forEach((line, i) => {
try {
transform.pushLine(JSON.parse(line));
} catch(e) {
if (i === lines.length - 1) {
pendingData = true;
} else {
e.message = `Invalid JSON (${line})`
transform.emit('error', e);
}
}
});
this._data = pendingData
? this._data.slice(this._data.lastIndexOf('\n'))
: '';
},
getPendingData() {
return this._data;
}
};
}
/**
* Init the transform with a parser to process JSON data.

@@ -176,3 +132,3 @@ * It maintains a buffer of received data, parses each as JSON

if (this.opts.header) {
const header = this.getHeader();
const header = this.getHeader(this.opts.fields);
this.emit('header', header);

@@ -198,3 +154,3 @@ this.push(header);

processedData.forEach(row => {
const line = this.processRow(row, this.opts);
const line = this.processRow(row, this.opts.fields);
if (line === undefined) return;

@@ -206,4 +162,14 @@ this.emit('line', line);

}
promise() {
return new Promise((resolve, reject) => {
const csvBuffer = [];
this
.on('data', chunk => csvBuffer.push(chunk.toString()))
.on('finish', () => resolve(csvBuffer.join('')))
.on('error', err => reject(err));
});
}
}
module.exports = JSON2CSVTransform;
const lodashGet = require('lodash.get');
const { setProp, unsetProp, flattenReducer } = require('../utils');
const { setProp, unsetProp } = require('../utils');

@@ -19,4 +19,3 @@ function getUnwindablePaths(obj, currentPath) {

unwindablePaths = unwindablePaths.concat(value
.map(arrObj => getUnwindablePaths(arrObj, newPath))
.reduce(flattenReducer, [])
.flatMap(arrObj => getUnwindablePaths(arrObj, newPath))
.filter((item, index, arr) => arr.indexOf(item) !== index));

@@ -38,3 +37,3 @@ }

return rows
.map(row => {
.flatMap(row => {
const unwindArray = lodashGet(row, unwindPath);

@@ -57,4 +56,3 @@

});
})
.reduce(flattenReducer, []);
});
}

@@ -61,0 +59,0 @@

@@ -11,3 +11,3 @@ 'use strict';

const newValue = pathArray.length > 1 ? setProp(obj[key] || {}, restPath, value) : value;
return Object.assign({}, obj, { [key]: newValue });
return { ...obj, [key]: newValue };
}

@@ -28,3 +28,3 @@

.filter(prop => prop !== key)
.reduce((acc, prop) => Object.assign(acc, { [prop]: obj[prop] }), {});
.reduce((acc, prop) => ({ ...acc, [prop]: obj[prop] }), {});
}

@@ -35,29 +35,2 @@

function flattenReducer(acc, arr) {
try {
// This is faster but susceptible to `RangeError: Maximum call stack size exceeded`
acc.push(...arr);
return acc;
} catch (err) {
// Fallback to a slower but safer option
return acc.concat(arr);
}
}
function fastJoin(arr, separator) {
let isFirst = true;
return arr.reduce((acc, elem) => {
if (elem === null || elem === undefined) {
elem = '';
}
if (isFirst) {
isFirst = false;
return `${elem}`;
}
return `${acc}${separator}${elem}`;
}, '');
}
module.exports = {

@@ -67,4 +40,2 @@ getProp,

unsetProp,
fastJoin,
flattenReducer
};
{
"name": "json2csv",
"version": "5.0.6",
"version": "6.0.0-alpha.0",
"description": "Convert JSON to CSV",

@@ -39,7 +39,7 @@ "keywords": [

"prepublish": "in-publish && npm run before:publish || not-in-publish",
"before:publish": "npm test && npm run build && npm run deploy:docs",
"before:publish": "npm test && npm run build",
"release": "standard-version"
},
"dependencies": {
"commander": "^6.1.0",
"commander": "^6.2.0",
"jsonparse": "^1.3.1",

@@ -49,19 +49,18 @@ "lodash.get": "^4.4.2"

"devDependencies": {
"@babel/core": "^7.3.3",
"@babel/preset-env": "^7.3.1",
"coveralls": "^3.0.3",
"docpress": "^0.8.0",
"eslint": "^6.1.0",
"gh-pages": "^2.0.1",
"in-publish": "^2.0.0",
"nyc": "^14.1.1",
"rollup": "^1.11.0",
"rollup-plugin-babel": "^4.3.2",
"rollup-plugin-commonjs": "^10.0.2",
"rollup-plugin-node-builtins": "^2.1.2",
"rollup-plugin-node-globals": "^1.2.1",
"@babel/core": "^7.12.3",
"@babel/preset-env": "^7.12.1",
"coveralls": "^3.1.0",
"docpress": "^0.8.2",
"eslint": "^7.13.0",
"gh-pages": "^3.1.0",
"in-publish": "^2.0.1",
"nyc": "^15.1.0",
"rollup": "^2.33.2",
"rollup-plugin-babel": "^4.4.0",
"rollup-plugin-commonjs": "^10.1.0",
"rollup-plugin-node-polyfills": "^0.2.1",
"rollup-plugin-node-resolve": "^5.2.0",
"standard-version": "^8.0.1",
"standard-version": "^9.0.0",
"tap-spec": "^5.0.0",
"tape": "^4.10.1"
"tape": "^5.0.1"
},

@@ -73,4 +72,4 @@ "engines": {

"volta": {
"node": "10.19.0"
"node": "12.20.0"
}
}
# json2csv
Converts json into csv with column titles and proper line endings.
Converts JSON into CSV with column titles and proper line endings.
Can be used as a module and from the command line.

@@ -12,4 +12,6 @@

## Features
> :construction: The documentation seen here is for the upcoming v6, which is a work in progress. See https://www.npmjs.com/package/json2csv for the documentation for the latest published version, and the [v5 branch](https://github.com/zemirco/json2csv/tree/v5) for the v5 code.
Features
- Fast and lightweight

@@ -28,3 +30,3 @@ - Scalable to infinitely large datasets (using stream processing)

You can install json2csv as a dependency using NPM.
Requires **Node v10** or higher.
Requires **Node v12** or higher.

@@ -50,4 +52,6 @@ ```sh

## Command Line Interface
## Usage
### Command Line Interface
`json2csv` can be called from the command line if installed globally (using the `-g` flag).

@@ -91,105 +95,332 @@

### CLI examples
For more details, you can check some of our CLI usage [examples](./docs/cli-examples.md) or our [test suite](test/CLI.js).
All examples use this example [input file](https://github.com/zemirco/json2csv/blob/master/test/fixtures/json/default.json).
## Javascript module
#### Input file and specify fields
`json2csv` can also be used programmatically from your JavaScript codebase.
```sh
$ json2csv -i input.json -f carModel,price,color
carModel,price,color
"Audi",10000,"blue"
"BMW",15000,"red"
"Mercedes",20000,"yellow"
"Porsche",30000,"green"
The programmatic APIs take a configuration object very similar to the CLI options. All APIs take the exact same options.
- `fields` - Array of Objects/Strings. Defaults to the top-level JSON attributes. See example below.
- `ndjson` - Boolean, indicates that the data is in NDJSON format. Only effective when using the streaming API and not in object mode.
- `transforms` - Array of transforms. A transform is a function that receives a data record and returns a transformed record. Transforms are executed in order before converting the data record into a CSV row. See below for more details.
- `formatters` - Object where each key is a JavaScript data type and its associated value is a formatter for that type. A formatter is a function that receives the raw JS value of a given type and formats it as a valid CSV cell. Supported types are the types returned by `typeof`, i.e. `undefined`, `boolean`, `number`, `bigint`, `string`, `symbol`, `function` and `object`.
- `defaultValue` - Default value to use when missing data. Defaults to `<empty>` if not specified. (Overridden by `fields[].default`)
- `delimiter` - String, delimiter of columns. Defaults to `,` if not specified.
- `eol` - String, overrides the default OS line ending (i.e. `\n` on Unix and `\r\n` on Windows).
- `header` - Boolean, determines whether or not CSV file will contain a title column. Defaults to `true` if not specified.
- `includeEmptyRows` - Boolean, includes empty rows. Defaults to `false`.
- `withBOM` - Boolean, with BOM character. Defaults to `false`.
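Put together, a typical options object might look like this (the field names and values are purely illustrative):

```javascript
const opts = {
  fields: [
    'carModel',
    // Field objects may carry a label, a value path and a per-field default
    { label: 'Price (k)', value: 'price', default: 0 },
  ],
  transforms: [],
  formatters: {},
  delimiter: ';',
  eol: '\n',
  header: true,
  includeEmptyRows: false,
  withBOM: false,
};
```

Any option that is omitted simply falls back to the defaults listed above.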
### Transforms
json2csv supports transforms. A transform is a function that receives a data record and returns a transformed record.
#### Custom transforms
```js
function doNothing(item) {
// apply transformations or create a new object
return transformedItem;
}
```
#### Input file, specify fields and use pretty logging
or using ES6
```sh
$ json2csv -i input.json -f carModel,price,color -p
```js
const doNothing = (item) => {
// apply transformations or create a new object
return transformedItem;
};
```
![Screenshot](https://s3.amazonaws.com/zeMirco/github/json2csv/json2csv-pretty.png)
For example, let's add a line counter to our CSV, capitalize the car field and change the price to be in Ks (1000s).
#### Generating CSV containing only specific fields
```js
function addCounter() {
let counter = 1;
return (item) => ({
counter: counter++,
...item,
car: item.car.toUpperCase(),
price: item.price / 1000,
});
}
```
```sh
$ json2csv -i input.json -f carModel,price,color -o out.csv
$ cat out.csv
carModel,price,color
"Audi",10000,"blue"
"BMW",15000,"red"
"Mercedes",20000,"yellow"
"Porsche",30000,"green"
Then you can add `addCounter()` to the `transforms` array.
The reason to wrap the actual transform in a factory function is so that the counter always starts at one and the transform can be reused, but it's not strictly necessary.
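A sketch of why the factory pattern is useful, using a hypothetical simplified counter transform:

```javascript
// Hypothetical simplified version of the counter transform above.
function addCounter() {
  let counter = 1;
  return (item) => ({ counter: counter++, ...item });
}

// Each call to addCounter() creates an independent counter, so the
// numbering restarts for every dataset the transform is applied to.
const firstRun = [{ car: 'Audi' }, { car: 'BMW' }].map(addCounter());
const secondRun = [{ car: 'Porsche' }].map(addCounter());

console.log(firstRun);  // rows numbered 1 and 2
console.log(secondRun); // numbering starts again at 1
```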
#### Built-in Transforms
There are a number of built-in transforms provided by the library.
```js
const {
transforms: { unwind, flatten },
} = require('json2csv');
```
Same result will be obtained passing the fields config as a file.
##### Unwind
```sh
$ json2csv -i input.json -c fieldsConfig.json -o out.csv
The `unwind` transform deconstructs an array field from the input item to output a row for each element. It's similar to MongoDB's `$unwind` aggregation.
The transform needs to be instantiated and takes an options object as an argument containing:
- `paths` - Array of Strings, list the paths to the fields to be unwound. It's mandatory and should not be empty.
- `blankOut` - Boolean, unwind using blank values instead of repeating data. Defaults to `false`.
```js
// Default
unwind({ paths: ['fieldToUnwind'] });
// Blanking out repeated data
unwind({ paths: ['fieldToUnwind'], blankOut: true });
```
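The effect of unwinding can be sketched with a simplified, hypothetical re-implementation that only handles a single top-level path (the real transform also supports nested paths and the `blankOut` option):

```javascript
// Hypothetical sketch: produce one row per element of the unwound array.
const unwindPath = (path) => (row) =>
  (Array.isArray(row[path]) ? row[path] : [row[path]])
    .map((value) => ({ ...row, [path]: value }));

const rows = [{ carModel: 'BMW', colors: ['red', 'blue'] }]
  .flatMap(unwindPath('colors'));

console.log(rows);
// [ { carModel: 'BMW', colors: 'red' }, { carModel: 'BMW', colors: 'blue' } ]
```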
where the file `fieldsConfig.json` contains
##### Flatten
```json
[
"carModel",
"price",
"color"
]
Flattens nested JavaScript objects into a single-level object.
The transform needs to be instantiated and takes an options object as an argument containing:
- `objects` - Boolean, whether to flatten JSON objects or not. Defaults to `true`.
- `arrays` - Boolean, whether to flatten Arrays or not. Defaults to `false`.
- `separator` - String, separator to use between nested JSON keys when flattening a field. Defaults to `.`.
```js
// Default
flatten();
// Custom separator '__'
flatten({ separator: '__' });
// Flatten only arrays
flatten({ objects: false, arrays: true });
```
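The effect on nested objects can be sketched with a hypothetical helper (the real transform can also flatten arrays and takes a custom separator):

```javascript
// Hypothetical sketch of object flattening with the default '.' separator.
function flattenObject(obj, prefix = '') {
  return Object.entries(obj).reduce((acc, [key, value]) => {
    const path = prefix ? `${prefix}.${key}` : key;
    // Recurse into plain objects; keep everything else as a leaf value.
    return value !== null && typeof value === 'object' && !Array.isArray(value)
      ? { ...acc, ...flattenObject(value, path) }
      : { ...acc, [path]: value };
  }, {});
}

console.log(flattenObject({ car: { make: 'Audi', specs: { hp: 150 } } }));
// { 'car.make': 'Audi', 'car.specs.hp': 150 }
```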
#### Read input from stdin
### Formatters
```sh
$ json2csv -f price
[{"price":1000},{"price":2000}]
json2csv supports formatters. A formatter is a function that receives the raw JS value of a given type and formats it as a valid CSV cell. Supported types are the types returned by `typeof`, i.e. `undefined`, `boolean`, `number`, `bigint`, `string`, `symbol`, `function` and `object`.
There is a special type of formatter that only applies to the CSV headers, if they are present: the `header` formatter, which by default uses the `string` formatter.
Pay special attention to the `string` formatter, since other formatters, like the `header` or `object` formatters, rely on it for stringification.
#### Custom Formatters
```js
function formatType(itemOfType) {
// format object
return formattedItem;
}
```
Hit <kbd>Enter</kbd> and afterwards <kbd>CTRL</kbd> + <kbd>D</kbd> to end reading from stdin. The terminal should show
or using ES6
```js
const formatType = (itemOfType) => {
// format the value
return formattedItem;
};
```
price
1000
2000
For example, let's format functions as their name or 'unknown'.
```js
const functionNameFormatter = (item) => item.name || 'unknown';
```
#### Appending to existing CSV
Then you can add `{ function: functionNameFormatter }` to the `formatters` object.
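Wired up standalone, the formatter behaves like this (`priceInK` is just an illustrative function):

```javascript
// Format functions as their name, falling back to 'unknown'.
const functionNameFormatter = (item) => item.name || 'unknown';

function priceInK(price) { return price / 1000; }

console.log(functionNameFormatter(priceInK));  // 'priceInK'
console.log(functionNameFormatter(() => {}));  // 'unknown' (anonymous function)
```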
Sometimes you want to add some additional rows with the same columns.
This is how you can do that.
A less trivial example would be to ensure that string cells never take more than 20 characters.
```sh
# Initial creation of csv with headings
$ json2csv -i test.json -f name,version > test.csv
# Append additional rows
$ json2csv -i test.json -f name,version --no-header >> test.csv
```js
const stringFixedFormatter = (stringLength, ellipsis = '...') => (item) =>
item.length <= stringLength
? item
: `${item.slice(0, stringLength - ellipsis.length)}${ellipsis}`;
```
## Javascript module
Then you can add `{ string: stringFixedFormatter(20) }` to the `formatters` object.
Or `stringFixedFormatter(20, '')` to not use the ellipsis and just clip the text.
As with the sample transform in the previous section, the reason to wrap the actual formatter in a factory function is so it can be parameterized easily.
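Standalone, the clipping formatter behaves like this:

```javascript
// Clip strings longer than stringLength, appending an ellipsis.
const stringFixedFormatter = (stringLength, ellipsis = '...') => (item) =>
  item.length <= stringLength
    ? item
    : `${item.slice(0, stringLength - ellipsis.length)}${ellipsis}`;

const clipAt20 = stringFixedFormatter(20);
console.log(clipAt20('short value'));                       // unchanged
console.log(clipAt20('a very long string that overflows')); // 'a very long strin...'
// Without an ellipsis the text is simply clipped at 20 characters:
console.log(stringFixedFormatter(20, '')('a very long string that overflows'));
```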
`json2csv` can also be used programmatically from your JavaScript codebase.
Keep in mind that the above example doesn't quote or escape the string, which is problematic. A more realistic example could use our built-in string formatter to do the quoting and escaping, like:
### Available Options
```js
const { formatters: { string: defaultStringFormatter } } = require('json2csv');
The programmatic APIs take a configuration object equivalent to the CLI options.
const stringFixedFormatter = (stringLength, ellipsis = '...', stringFormatter = defaultStringFormatter()) => (item) => item.length <= stringLength ? item : stringFormatter(`${item.slice(0, stringLength - ellipsis.length)}${ellipsis}`);
```
- `fields` - Array of Objects/Strings. Defaults to the top-level JSON attributes. See example below.
- `ndjson` - Only effective on the streaming API. Indicates that data coming through the stream is NDJSON.
- `transforms` - Array of transforms to be applied to each data item. A transform is simply a function that receives a data item and returns the transformed item.
- `defaultValue` - String, default value to use when missing data. Defaults to `<empty>` if not specified. (Overridden by `fields[].default`)
- `quote` - String, quote around cell values and column names. Defaults to `"` if not specified.
- `escapedQuote` - String, the value to replace escaped quotes in strings. Defaults to 2x`quotes` (for example `""`) if not specified.
- `delimiter` - String, delimiter of columns. Defaults to `,` if not specified.
- `eol` - String, overrides the default OS line ending (i.e. `\n` on Unix and `\r\n` on Windows).
- `excelStrings` - Boolean, converts string data into normalized Excel style data.
- `header` - Boolean, determines whether or not CSV file will contain a title column. Defaults to `true` if not specified.
- `includeEmptyRows` - Boolean, includes empty rows. Defaults to `false`.
- `withBOM` - Boolean, with BOM character. Defaults to `false`.
#### Built-in Formatters
### json2csv parser (Synchronous API)
There are a number of built-in formatters provided by the library.
`json2csv` can also be used programmatically as a synchronous converter using its `parse` method.
```js
const {
formatters: {
default: defaultFormatter,
number: numberFormatter,
string: stringFormatter,
stringQuoteOnlyIfNecessary: stringQuoteOnlyIfNecessaryFormatter,
stringExcel: stringExcelFormatter,
symbol: symbolFormatter,
object: objectFormatter,
},
} = require('json2csv');
```
##### Default
This formatter just relies on standard JavaScript stringification.
This is the default formatter for `undefined`, `boolean`, `number` and `bigint` elements.
It's not a factory but the formatter itself.
```js
{
undefined: defaultFormatter,
boolean: defaultFormatter,
number: defaultFormatter,
bigint: defaultFormatter,
}
```
##### Number
Formats numbers with a fixed number of decimals.
The formatter needs to be instantiated and takes an options object as an argument containing:
- `separator` - String, separator to use between integer and decimal digits. Defaults to `.`. It's crucial that the decimal separator is not the same character as the CSV delimiter or the result CSV will be incorrect.
- `decimals` - Number, amount of decimals to keep. Defaults to all the available decimals.
```js
{
// All the available decimals
number: numberFormatter(),
// 3 decimals
number: numberFormatter({ decimals: 3 })
}
```
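A hypothetical sketch of such a factory, based on the `separator` and `decimals` options described above (the built-in formatter is the authoritative implementation):

```javascript
// Sketch: format a number, optionally fixing decimals and swapping the separator.
const numberFormatter = ({ separator = '.', decimals } = {}) => (value) => {
  const fixed = decimals === undefined ? String(value) : value.toFixed(decimals);
  return separator === '.' ? fixed : fixed.replace('.', separator);
};

console.log(numberFormatter({ decimals: 2 })(3.14159)); // '3.14'
console.log(numberFormatter({ separator: ',' })(3.5));  // '3,5'
```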
##### String
Formats strings, quoting them and escaping illegal characters if needed.
The formatter needs to be instantiated and takes an options object as an argument containing:
- `quote` - String, quote around cell values and column names. Defaults to `"`.
- `escapedQuote` - String, the value to replace escaped quotes in strings. Defaults to two consecutive `quote` characters (for example `""`).
This is the default for `string` elements.
```js
{
  // Uses '"' as quote and '""' as escaped quote
  string: stringFormatter(),
  // Use single quotes `'` as quotes and `''` as escaped quote
  string: stringFormatter({ quote: '\'' }),
  // Never use quotes
  string: stringFormatter({ quote: '' }),
  // Use '\"' as escaped quote
  string: stringFormatter({ escapedQuote: '\\"' }),
}
```
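The quoting and escaping rules above can be sketched as follows. `formatString` is a hypothetical helper for illustration, not the library's code:

```javascript
// Hypothetical sketch of a CSV string formatter.
function formatString(value, { quote = '"', escapedQuote = quote + quote } = {}) {
  if (quote === '') return value; // quoting disabled
  // Escape any quote characters inside the value, then wrap it.
  return quote + value.split(quote).join(escapedQuote) + quote;
}

console.log(formatString('say "hi"')); // '"say ""hi"""'
console.log(formatString("it's", { quote: "'" })); // "'it''s'"
```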
##### String Quote Only If Necessary

The default string formatter quotes all strings. This is consistent but not mandatory according to the CSV standard. This formatter only quotes strings if they contain quotes (by default `"`), the CSV separator character (by default `,`) or the end-of-line (by default `\n` or `\r\n` depending on your operating system).

The formatter needs to be instantiated and takes an options object as argument containing:
- `quote` - String, quote around cell values and column names. Defaults to `"`.
- `escapedQuote` - String, the value to replace escaped quotes in strings. Defaults to 2x`quote` (for example `""`).
- `eol` - String, overrides the default OS line ending (i.e. `\n` on Unix and `\r\n` on Windows). Ensure that you use the same `eol` here as in the json2csv options.
```js
{
  // Uses '"' as quote, '""' as escaped quote and your OS eol
  string: stringQuoteOnlyIfNecessaryFormatter(),
  // Use single quotes `'` as quotes, `''` as escaped quote and your OS eol
  string: stringQuoteOnlyIfNecessaryFormatter({ quote: '\'' }),
  // Never use quotes
  string: stringQuoteOnlyIfNecessaryFormatter({ quote: '' }),
  // Use '\"' as escaped quote
  string: stringQuoteOnlyIfNecessaryFormatter({ escapedQuote: '\\"' }),
  // Use linux EOL regardless of your OS
  string: stringQuoteOnlyIfNecessaryFormatter({ eol: '\n' }),
}
```
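The "only if necessary" rule can be sketched like this (hypothetical helper; the library's exact checks may differ):

```javascript
// Hypothetical sketch: quote a value only when it contains the quote,
// the separator or the end-of-line.
function quoteIfNecessary(value, { quote = '"', escapedQuote = quote + quote, separator = ',', eol = '\n' } = {}) {
  if (quote === '') return value;
  const needsQuoting = value.includes(quote) || value.includes(separator) || value.includes(eol);
  if (!needsQuoting) return value;
  return quote + value.split(quote).join(escapedQuote) + quote;
}

console.log(quoteIfNecessary('plain')); // 'plain' (left unquoted)
console.log(quoteIfNecessary('a,b')); // '"a,b"'
console.log(quoteIfNecessary('say "hi"')); // '"say ""hi"""'
```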
##### String Excel
Converts string data into normalized Excel-style data.

It's not a factory but the formatter itself.
```js
{
string: stringExcelFormatter,
}
```
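The usual Excel trick is to emit the value as an `="..."` formula so Excel always treats it as text; the exact format below is an assumption for illustration, not taken from the library's source:

```javascript
// Assumed Excel-style normalization: wrap the value in an ="..." formula.
// (Values containing quotes would need additional escaping.)
function toExcelString(value) {
  return `"=""${value}"""`;
}

// Leading zeros survive because Excel treats the cell as text.
console.log(toExcelString('0012345')); // '"=""0012345"""'
```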
##### Symbol
Format the symbol as its string value and then use the given string formatter, i.e. `Symbol('My Symbol')` is formatted as `"My Symbol"`.

The formatter needs to be instantiated and takes the string formatter to use on the symbol's description as its single argument. It defaults to the built-in `stringFormatter`.
This is the default for `symbol` elements.
```js
{
  // Uses the default string formatter
  symbol: symbolFormatter(),
  // Uses a custom string formatter
  // You rarely need to do this since the symbol formatter will use the string formatter that you set.
  symbol: symbolFormatter(myStringFormatter()),
}
```
##### Object
Format the object using `JSON.stringify` and then the given string formatter.
Some object types like `Date` or MongoDB's `ObjectId` are automatically quoted by `JSON.stringify`. This formatter removes those quotes and uses the given string formatter for correct quoting and escaping.

The formatter needs to be instantiated and takes the string formatter to use on the stringified object as its single argument. It defaults to the built-in `stringFormatter`.

This is the default for `function` and `object` elements. Functions are formatted as the empty string.
```js
{
  // Uses the default string formatter
  object: objectFormatter(),
  // Uses a custom string formatter
  // You rarely need to do this since the object formatter will use the string formatter that you set.
  object: objectFormatter(myStringFormatter()),
}
```
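The quote-stripping behaviour can be sketched in plain JavaScript. Both the helper and the simplified string formatter are illustrative, not the library's implementation:

```javascript
// Simplified string formatter: escape quotes by doubling, then wrap.
const simpleStringFormatter = (s) => `"${s.split('"').join('""')}"`;

// Hypothetical sketch of the object formatter.
function formatObject(value, stringFormatter = simpleStringFormatter) {
  let stringified = JSON.stringify(value);
  // Types like Date stringify to a quoted string; strip those quotes so
  // the string formatter can re-quote consistently.
  if (stringified.startsWith('"') && stringified.endsWith('"')) {
    stringified = stringified.slice(1, -1);
  }
  return stringFormatter(stringified);
}

console.log(formatObject(new Date(0))); // '"1970-01-01T00:00:00.000Z"'
console.log(formatObject({ a: 1 })); // '"{""a"":1}"'
```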
### json2csv Parser (Synchronous API)
`json2csv` can also be used programmatically as a synchronous converter using its `parse` method.
```js
const { Parser } = require('json2csv');

const fields = ['field1', 'field2', 'field3'];
const opts = { fields };

try {
  const parser = new Parser(opts);
  const csv = parser.parse(myData);
  console.log(csv);
} catch (err) {
  console.error(err);
}
```

### json2csv Async Parser (Streaming API)

The synchronous API has the downside of loading the entire JSON array in memory and blocking JavaScript's event loop while processing the data. This means that your server won't be able to process more requests or your UI will become unresponsive while data is being processed. For those reasons, it is rarely a good idea to use it unless your data is very small or your application doesn't do anything else.

The async parser processes the data as a non-blocking stream. This approach ensures a consistent memory footprint and avoids blocking JavaScript's event loop. Thus, it's better suited for large datasets or systems with high concurrency.

One very important difference between the asynchronous and the synchronous APIs is that, using the asynchronous API, JSON objects are processed one by one. In practice, this means that only the fields in the first object of the array are automatically detected and other fields are just ignored. To avoid this, it's advisable to ensure that all the objects contain exactly the same fields or provide the list of fields using the `fields` option.

The async API takes a second options argument that is directly passed to the underlying streams and accepts the same options as the standard [Node.js streams](https://nodejs.org/api/stream.html#stream_new_stream_duplex_options).

Instances of `AsyncParser` expose a `parse` method similar to the sync API, which takes both JSON arrays/objects and readable streams as input and returns a stream that produces the CSV.

```js
const { AsyncParser } = require('json2csv');

const fields = ['field1', 'field2', 'field3'];
const opts = { fields };
const transformOpts = { highWaterMark: 8192 };

const asyncParser = new AsyncParser(opts, transformOpts);

let csv = '';
asyncParser
  .parse(data)
  .on('data', (chunk) => (csv += chunk.toString()))
  .on('end', () => console.log(csv))
  .on('error', (err) => console.error(err))
  // You can also listen for events on the conversion and see how the header or the lines are coming out.
  .on('header', (header) => console.log(header))
  .on('line', (line) => console.log(line));
```
Using the async API, you can transform streaming JSON into CSV and output directly to a writable stream.

```js
const { createReadStream, createWriteStream } = require('fs');
const { AsyncParser } = require('json2csv');

const fields = ['field1', 'field2', 'field3'];
const opts = { fields };
const transformOpts = { highWaterMark: 8192 };

const input = createReadStream(inputPath, { encoding: 'utf8' });
const output = createWriteStream(outputPath, { encoding: 'utf8' });
const asyncParser = new AsyncParser(opts, transformOpts);

asyncParser.parse(input).pipe(output);
```
`AsyncParser` also exposes a convenience `promise` method which turns the stream into a promise and resolves to the whole CSV:
```js
const { AsyncParser } = require('json2csv');

const fields = ['field1', 'field2', 'field3'];
const opts = { fields };
const transformOpts = { highWaterMark: 8192 };

const asyncParser = new AsyncParser(opts, transformOpts);
const csv = await asyncParser.parse(data).promise();
```
### json2csv Transform (Streaming API)

json2csv also exposes the raw stream transform so you can pipe your json content into it. This is the same Transform that `AsyncParser` uses under the hood.

```js
const { createReadStream, createWriteStream } = require('fs');
const { Transform } = require('json2csv');

const fields = ['field1', 'field2', 'field3'];
const opts = { fields };
const transformOpts = { highWaterMark: 16384, encoding: 'utf-8' };

const input = createReadStream(inputPath, { encoding: 'utf8' });
const output = createWriteStream(outputPath, { encoding: 'utf8' });
const json2csv = new Transform(opts, transformOpts);

const processor = input.pipe(json2csv).pipe(output);

// You can also listen for events on the conversion and see how the header or the lines are coming out.

json2csv
  .on('header', (header) => console.log(header))
  .on('line', (line) => console.log(line))
  .on('error', (err) => console.log(err));
```
The stream API can also work in object mode. This is useful when you have an input stream in object mode or if you are getting JSON objects one by one and want to convert them to CSV as they come.
```js
const { Transform } = require('json2csv');
const { Readable } = require('stream');

const input = new Readable({ objectMode: true });
input._read = () => {};

// myObjectEmitter is just a fake example representing anything that emits objects.
myObjectEmitter.on('object', (obj) => input.push(obj));
// Pushing a null closes the stream

myObjectEmitter.end(() => input.push(null));
```
## Upgrading

### Upgrading from 5.X to 6.X

The CLI hasn't changed at all.

#### Formatters

In the JavaScript modules, `formatters` are introduced and the `quote`, `escapedQuote` and `excelStrings` options are removed.

Custom `quote` and `escapedQuote` values are applied by setting the properties in the `string` formatter.

What used to be

```js
const { Parser } = require('json2csv');
const json2csvParser = new Parser({ quote: "'", escapedQuote: "\\'" });
const csv = json2csvParser.parse(myData);
```

should be replaced by

```js
const { Parser, formatters: { string: stringFormatter } } = require('json2csv');
const json2csvParser = new Parser({
  formatters: {
    string: stringFormatter({ quote: '\'', escapedQuote: '\\\'' }),
  },
});
const csv = json2csvParser.parse(myData);
```
The `excelStrings` functionality can be replicated with the `stringExcel` formatter.

What used to be

```js
const { Parser } = require('json2csv');
const json2csvParser = new Parser({
  quote: "'",
  escapedQuote: "\\'",
  excelStrings: true,
});
const csv = json2csvParser.parse(myData);
```

should be replaced by

```js
const { Parser, formatters: { stringExcel: stringExcelFormatter } } = require('json2csv');
const json2csvParser = new Parser({
  formatters: {
    string: stringExcelFormatter,
  },
});
const csv = json2csvParser.parse(myData);
```

#### AsyncParser
The async parser has been simplified to be a class with a single `parse` method, which replaces the previous `fromInput` method. `throughTransform` and `toOutput` can be replaced by Node's standard [pipe](https://nodejs.org/api/stream.html#stream_readable_pipe_destination_options) method or the newer [pipeline](https://nodejs.org/api/stream.html#stream_stream_pipeline_source_transforms_destination_callback) utility.

What used to be

```js
const { AsyncParser } = require('json2csv');
const json2csvParser = new AsyncParser();
const csv = await json2csvParser
  .fromInput(myData)
  .throughTransform(myTransform)
  .toOutput(myOutput);
```

should be replaced by

```js
const { AsyncParser } = require('json2csv');
const json2csvParser = new AsyncParser();
json2csvParser.parse(myData.pipe(myTransform)).pipe(myOutput);
```
The `promise` method has been kept but it doesn't take any argument as it used to. Now it always keeps the whole CSV in memory and returns it.

What used to be

```js
const { AsyncParser } = require('json2csv');
const json2csvParser = new AsyncParser();
const csv = await json2csvParser.fromInput(myData).promise();
```

should be replaced by

```js
const { AsyncParser } = require('json2csv');
const json2csvParser = new AsyncParser();
const csv = await json2csvParser.parse(myData).promise();
```

If you want to wait for the stream to finish but not keep the CSV in memory, you can use the [stream.finished](https://nodejs.org/api/stream.html#stream_stream_finished_stream_options_callback) utility from Node's stream module.

Finally, the `input`, `transform` and `processor` properties have been removed.
`input` is just your data stream.
`transform` and `processor` are equivalent to the return value of the `parse` method.
Before, you could instantiate an `AsyncParser` and push data into it. Now you can simply pass the data as the argument to the `parse` method if you have the entire dataset, or you can manually create a readable stream and push data into it.

What used to be

```js
asyncParser.processor
  .on('data', (chunk) => (csv += chunk.toString()))
  .on('end', () => console.log(csv))
  .on('error', (err) => console.error(err));

myData.forEach((item) => asyncParser.input.push(item));
asyncParser.input.push(null); // Sending `null` to a stream signals that no more data is expected and ends it.
```

now can be done as

```js
asyncParser
  .parse(myData)
  .on('data', (chunk) => (csv += chunk.toString()))
  .on('end', () => console.log(csv))
  .on('error', (err) => console.error(err));
```

or done manually as

```js
const { Readable } = require('stream');

const myManualInput = new Readable({ objectMode: true });
myManualInput._read = () => {};

asyncParser
  .parse(myManualInput)
  .on('data', (chunk) => (csv += chunk.toString()))
  .on('end', () => console.log(csv))
  .on('error', (err) => console.error(err));

myData.forEach((item) => myManualInput.push(item)); // This is useful when the data is coming asynchronously from a request or ws for example.
myManualInput.push(null);
```
### Upgrading from 4.X to 5.X

In the CLI, the config file option, `-c`, used to be a list of fields and now it's expected to be a full configuration object.

The `stringify` option has been removed.

`doubleQuote` has been renamed to `escapedQuote`.

In the JavaScript modules, `transforms` are introduced and all the `unwind` and `flatten` related options have been moved to their own transforms.

What used to be
```js
const { Parser } = require('json2csv');
const json2csvParser = new Parser({
  unwind: paths,
  unwindBlank: true,
  flatten: true,
  flattenSeparator: '__',
});
const csv = json2csvParser.parse(myData);
```

should be replaced by

```js
const { Parser, transforms: { unwind, flatten } } = require('json2csv');
const json2csvParser = new Parser({
  transforms: [unwind({ paths, blankOut: true }), flatten({ separator: '__' })],
});
const csv = json2csvParser.parse(myData);
```

You can see the documentation for json2csv v4.X.X [here](https://github.com/zemirco/json2csv/blob/v4/README.md).
### Upgrading from 3.X to 4.X

What in 3.X used to be

```js
const json2csv = require('json2csv');
const csv = json2csv({ data: myData, fields: myFields, unwindPath: paths, ... });
```

should be replaced by

```js
const { Parser } = require('json2csv');
const json2csvParser = new Parser({ fields: myFields, unwind: paths, ... });
const csv = json2csvParser.parse(myData);
```

or the convenience method

```js
const json2csv = require('json2csv');
const csv = json2csv.parse(myData, { fields: myFields, unwind: paths, ... });
```
## Known Gotchas


#### Avoiding Excel auto-formatting

Excel tries to automatically detect the format of every field (number, date, string, etc.) regardless of whether the field is quoted or not.

This might produce a few undesired effects with, for example, serial numbers:

- Large numbers are displayed using scientific notation

- Leading zeros are stripped.

### PowerShell escaping


## Development
### Pulling the repo

After you clone the repository you just need to install the required packages for development by running the following command in the json2csv directory.

```sh
$ npm install
```
### Building
json2csv is packaged using `rollup`. You can generate the packages running:

```sh
$ npm run build
```
which generates 3 files under the `dist` folder:
- `json2csv.umd.js` UMD module transpiled to ES5
- `json2csv.esm.js` ES5 module (import/export)
- `json2csv.cjs.js` CommonJS module
When you use packaging tools like webpack and such, they know which version to use depending on your configuration.
### Linting & Testing
Run the following command to check the code style.

```sh
$ npm run lint
```

Run the following command to run the tests and return coverage:

```sh
$ npm run test-with-coverage
```

### Contributing changes
Before making any pull request please ensure that your code is formatted, tests are passing and test coverage hasn't decreased. (See [Testing](#testing))
## License


[coveralls-badge-url]: https://coveralls.io/r/zemirco/json2csv?branch=master
[changelog]: https://github.com/zemirco/json2csv/blob/master/CHANGELOG.md
[license.md]: https://github.com/zemirco/json2csv/blob/master/LICENSE.md