@json2csv/node
Readme
Fast and highly configurable JSON to CSV converter. It fully supports conversion following the RFC 4180 specification, as well as other similar text-delimited formats such as TSV.
@json2csv/node exposes two modules to integrate json2csv with the Node.js Stream API for stream processing of JSON data.

This package includes two modules:

- Node Transform, which wraps the StreamParser to behave as a Node.js Transform stream.
- Node Async Parser, which wraps the Node Transform to offer a friendly promise-based API.

There are multiple flavours of json2csv:

- Parser API and a new StreamParser API which does the conversion in a streaming fashion in pure js.
- Node Transform and Node Async Parser APIs for Node users.
- WHATWG Transform Stream and WHATWG Async Parser APIs for users of WHATWG streams (browser, Node or Deno).
- CLI interface.

And a couple of libraries that enable additional configurations:

- transforms for json2csv (unwind and flatten), allowing the user to transform data before it is parsed.
- formatters for json2csv (one for each data type, an Excel-specific one, etc.). Formatters convert JSON data types into CSV-compatible strings.

You can install json2csv as a dependency using NPM.
$ npm install --save @json2csv/node
You can install json2csv as a dependency using Yarn.
$ yarn add @json2csv/node
For Node.js users, the Streaming API is wrapped in a Node.js Stream Transform. This approach ensures a consistent memory footprint and avoids blocking JavaScript's event loop.
The async API takes a second options argument that is passed directly to the underlying streams; it accepts the same options as the standard Node.js streams, plus the options supported by the Stream Parser.
This Transform uses the StreamParser under the hood and supports similar events.
import { createReadStream, createWriteStream } from 'fs';
import { Transform } from '@json2csv/node';

// Example paths; replace with your own input JSON and output CSV files.
const inputPath = './data.json';
const outputPath = './data.csv';

const input = createReadStream(inputPath, { encoding: 'utf8' });
const output = createWriteStream(outputPath, { encoding: 'utf8' });

const opts = {};
const transformOpts = {};
const asyncOpts = {};

const parser = new Transform(opts, asyncOpts, transformOpts);
const processor = input.pipe(parser).pipe(output);
// You can also listen for events on the conversion and see how the header or the lines are coming out.
parser
.on('header', (header) => console.log(header))
.on('line', (line) => console.log(line));
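The transform normally consumes a raw JSON text stream, but it can also run in object mode via the standard Node.js transformOpts, so already-parsed records can be written to it directly. A minimal sketch, assuming object-mode input; the sample records are made up for illustration:

import { Readable } from 'stream';
import { Transform } from '@json2csv/node';

// Object-mode input: each chunk is already a parsed record.
const records = Readable.from([
  { car: 'Audi', price: 40000 },
  { car: 'BMW', price: 35000 },
]);

// objectMode tells the transform to expect objects instead of raw JSON text.
const parser = new Transform({}, {}, { objectMode: true });

records.pipe(parser).pipe(process.stdout);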
Options:

- ndjson <Boolean> indicates that the data is in NDJSON format. Only effective when using the streaming API and not in object mode.
- fields <DataSelector[]> Defaults to toplevel JSON attributes.
- transforms <Transform[]> Array of transforms to apply to the data. A transform is a function that receives a data record and returns a transformed record. Transforms are executed in order.
- formatters <Formatters> Object where each key is a JavaScript data type and its associated value is a formatter for the given type.
- defaultValue <Any> value to use when data is missing. Defaults to <empty> if not specified. (Overridden by fields[].default)
- delimiter <String> delimiter of columns. Defaults to , if not specified.
- eol <String> overrides the default OS line ending (i.e. \n on Unix and \r\n on Windows).
- header <Boolean> determines whether or not the CSV will contain a header (title) row. Defaults to true if not specified.
- includeEmptyRows <Boolean> includes empty rows. Defaults to false.
- withBOM <Boolean> prepends the BOM character to the output. Defaults to false.

See the Duplex stream options for more details.
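To make the options concrete, here is a hedged sketch combining a few of them; the field names and sample record are hypothetical, and the transform runs in object mode so plain objects can be written to it:

import { Readable } from 'stream';
import { Transform } from '@json2csv/node';

const opts = {
  fields: ['car', 'price'], // pick the top-level attributes to output
  delimiter: ';',           // use semicolons instead of commas
  defaultValue: 'N/A',      // value used when data is missing
  withBOM: true,            // prepend a BOM, e.g. for Excel compatibility
};

const parser = new Transform(opts, {}, { objectMode: true });

Readable.from([{ car: 'Audi' }]) // price is missing, so it becomes 'N/A'
  .pipe(parser)
  .pipe(process.stdout);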
Options used by the underlying parsing library to process the binary or text stream. Not relevant when running in objectMode.
Buffering is only relevant if you expect very large strings/numbers in your JSON. See @streamparser/json for more details about buffering.

- stringBufferSize <number> Size of the buffer used to parse strings. Defaults to 0, which means no buffering. Minimum valid value is 4.
- numberBufferSize <number> Size of the buffer used to parse numbers. Defaults to 0, which means no buffering.

See https://juanjodiaz.github.io/json2csv/#/parsers/node-transform.
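As an illustration, a sketch passing a buffering option through the second constructor argument; the 64 KB size is an arbitrary assumption, not a recommended value:

import { Transform } from '@json2csv/node';

// Parse very large string values in 64 KB chunks rather than
// accumulating each complete string in memory at once.
const asyncOpts = { stringBufferSize: 65536 };

const parser = new Transform({}, asyncOpts, {});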
To facilitate usage, NodeAsyncParser wraps NodeTransform, exposing a single parse method similar to the sync API. This method accepts JSON arrays/objects, TypedArrays, strings and readable streams as input and returns a stream that produces the CSV.
NodeAsyncParser also exposes a convenience promise method which turns the stream into a promise that resolves to the whole CSV.
import { createWriteStream } from 'fs';
import { AsyncParser } from '@json2csv/node';

const opts = {};
const transformOpts = {};
const asyncOpts = {};
const parser = new AsyncParser(opts, asyncOpts, transformOpts);

// Example input (any JSON array/object, TypedArray, string or readable stream).
const data = [{ car: 'Audi', price: 40000 }];
const csv = await parser.parse(data).promise();

// The parse method returns the transform stream, so the output can also be
// piped to a writable stream (a file, an HTTP request, etc.).
parser.parse(data).pipe(createWriteStream('./out.csv'));
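Because parse also accepts readable streams, the same promise-based API can convert a JSON file directly. A minimal sketch, assuming ./data.json contains a JSON array (the file name is hypothetical):

import { createReadStream } from 'fs';
import { AsyncParser } from '@json2csv/node';

const parser = new AsyncParser({}, {}, {});

// Stream the JSON file through the parser and await the complete CSV string.
const input = createReadStream('./data.json', { encoding: 'utf8' });
const csv = await parser.parse(input).promise();
console.log(csv);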
Options:

- ndjson <Boolean> indicates that the data is in NDJSON format. Only effective when using the streaming API and not in object mode.
- fields <DataSelector[]> Defaults to toplevel JSON attributes.
- transforms <Transform[]> Array of transforms to apply to the data. A transform is a function that receives a data record and returns a transformed record. Transforms are executed in order.
- formatters <Formatters> Object where each key is a JavaScript data type and its associated value is a formatter for the given type.
- defaultValue <Any> value to use when data is missing. Defaults to <empty> if not specified. (Overridden by fields[].default)
- delimiter <String> delimiter of columns. Defaults to , if not specified.
- eol <String> overrides the default OS line ending (i.e. \n on Unix and \r\n on Windows).
- header <Boolean> determines whether or not the CSV will contain a header (title) row. Defaults to true if not specified.
- includeEmptyRows <Boolean> includes empty rows. Defaults to false.
- withBOM <Boolean> prepends the BOM character to the output. Defaults to false.

See the Duplex stream options for more details.
Options used by the underlying parsing library to process the binary or text stream. Not relevant when running in objectMode.
Buffering is only relevant if you expect very large strings/numbers in your JSON. See @streamparser/json for more details about buffering.

- stringBufferSize <number> Size of the buffer used to parse strings. Defaults to 0, which means no buffering. Minimum valid value is 4.
- numberBufferSize <number> Size of the buffer used to parse numbers. Defaults to 0, which means no buffering.

See https://juanjodiaz.github.io/json2csv/#/parsers/node-async-parser.
License: see LICENSE.md.
FAQs
Node.js Transform and Async interface to convert JSON into CSV.
The npm package @json2csv/node receives a total of 45,584 weekly downloads. As such, @json2csv/node was classified as popular.
We found that @json2csv/node demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago. It has one open source maintainer collaborating on the project.