
Security News
Meet Socket at Black Hat and DEF CON 2025 in Las Vegas
Meet Socket at Black Hat & DEF CON 2025 for 1:1s, insider security talks at Allegiant Stadium, and a private dinner with top minds in software supply chain security.
Simple CSV export module that can export a rich JSON array of objects to CSV.
4.0.0
I decided to update this repo and drop unnecessary code. Version 3.0.1 was already constrained to Node v6, but by breaking some eggs and moving to >= v10, I was able to drop some dependencies and remove some unnecessary code (e.g. buffered-reader -> Readable.from). I bumped the major version for this breaking change. The API itself hasn't changed at all and still works as-is.
const jsoncsv = require('json-csv')
let csv = await jsoncsv.buffered(data, options) //returns Promise
//optionally, you can use the callback
jsoncsv.buffered(data, options, (err, csv) => {...})
When using the streaming API, you can pipe data to it in object mode.
const jsoncsv = require('json-csv')
let readable = some_readable_source //<readable source in object mode>
readable
  .pipe(jsoncsv.stream(options)) // transforms to a UTF-8 string and emits lines
  .pipe(something_else_writable)
{
  // field definitions for CSV export
  fields: [
    {
      // required: field name (or dotted path) for the source value
      name: 'string',
      // required: column label for the CSV header
      label: 'string',
      // optional: transform the value before exporting
      transform: function (value) { return value; },
      // optional: always wrap this field's value in quotes
      quoted: false,
    },
  ],
  // other options and their defaults:
  fieldSeparator: ',',
  ignoreHeader: false,
  encoding: 'utf8',
}
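To make the field definitions concrete, here is a hand-rolled sketch of how `name`, `label`, and `transform` map a record onto a row. The helper name `applyFields` is hypothetical; this is an illustration of the options shape, not json-csv's actual internals.

```javascript
// Hypothetical sketch: apply an array of field definitions to one record.
// `name` selects the value, `label` names the column, and `transform`
// (when present) rewrites the value before it is emitted.
function applyFields(record, fields) {
  return fields.map((f) => {
    const value = record[f.name]
    return f.transform ? f.transform(value) : value
  })
}

const fields = [
  { name: 'name', label: 'Name' },
  { name: 'amount', label: 'Amount', transform: (v) => v.toFixed(2) },
]

const header = fields.map((f) => f.label)
const row = applyFields({ name: 'fred', amount: 1.02 }, fields)
// header -> ['Name', 'Amount'], row -> ['fred', '1.02']
```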
let items = [
  {
    name: 'fred',
    email: 'fred@somewhere',
    amount: 1.02,
  },
  {
    name: 'jo',
    email: 'jo@somewhere',
    amount: 1.02,
  },
  {
    name: 'jo with a comma,',
    email: 'jo@somewhere',
    amount: 1.02,
  },
  {
    name: 'jo with a quote"',
    email: 'jo@somewhere',
    amount: 1.02,
  },
]
let options = {
  fields: [
    {
      name: 'name',
      label: 'Name',
      quoted: true,
    },
    {
      name: 'email',
      label: 'Email',
    },
    {
      name: 'amount',
      label: 'Amount',
    },
  ],
}
This method takes an array of data and converts it into a CSV string entirely in memory. It works well for small amounts of data.
const jsoncsv = require('json-csv')

async function writeCsv() {
  try {
    let csv = await jsoncsv.buffered(items, options)
    console.log(csv)
  } catch (err) {
    console.error(err)
  }
}

writeCsv()
Here, we pipe data from a source into the converter, which writes the headers and then streams one row at a time to an output. This works really well for large amounts of data, like exporting directly from a MongoDB query.
const jsoncsv = require('json-csv')
const { Readable } = require('stream')

Readable.from(items)
  .pipe(jsoncsv.stream(options))
  .pipe(process.stdout)
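The row-at-a-time idea can be sketched without the library using a plain generator (a hypothetical helper with simplified quoting, not json-csv's code): the header is produced first, then each object becomes one line, so nothing is buffered into a single large string.

```javascript
// Hypothetical sketch of row-at-a-time conversion: yield the header first,
// then one CSV line per source object, so rows can be consumed lazily
// instead of being buffered into a single string.
function* csvLines(objects, fields) {
  yield fields.map((f) => f.label).join(',') + '\n'
  for (const obj of objects) {
    yield fields.map((f) => obj[f.name]).join(',') + '\n'
  }
}

const fields = [
  { name: 'name', label: 'Name' },
  { name: 'amount', label: 'Amount' },
]
const lines = [...csvLines([{ name: 'fred', amount: 1.02 }], fields)]
// lines -> ['Name,Amount\n', 'fred,1.02\n']
```

Since Readable.from accepts any iterable, a generator like this could be wrapped into a stream; json-csv's stream() does the equivalent with proper quoting and encoding.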
Name,Email,Amount
"fred",fred@somewhere,1.02
"jo",jo@somewhere,1.02
"jo with a comma,",jo@somewhere,1.02
"jo with a quote""",jo@somewhere,1.02
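The quoting in that output follows standard CSV escaping: a value containing the separator or a double quote is wrapped in quotes, and embedded quotes are doubled. A minimal sketch of that rule (the helper name is hypothetical, not the library's code):

```javascript
// Hypothetical sketch of standard CSV escaping: wrap a value in double
// quotes when it contains the separator, a quote, or a newline, and
// double any embedded quotes.
function escapeCsv(value, separator = ',') {
  const s = String(value)
  if (s.includes(separator) || s.includes('"') || s.includes('\n')) {
    return '"' + s.replace(/"/g, '""') + '"'
  }
  return s
}

escapeCsv('jo with a comma,') // -> '"jo with a comma,"'
escapeCsv('jo with a quote"') // -> '"jo with a quote"""'
escapeCsv('plain')            // -> 'plain'
```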
const items = [
  {
    downloaded: false,
    contact: {
      company: 'Widgets, LLC',
      name: 'John Doe',
      email: 'john@widgets.somewhere',
    },
    registration: {
      year: 2013,
      level: 3,
    },
  },
  {
    downloaded: true,
    contact: {
      company: 'Sprockets, LLC',
      name: 'Jane Doe',
      email: 'jane@sprockets.somewhere',
    },
    registration: {
      year: 2013,
      level: 2,
    },
  },
]
const options = {
  fields: [
    {
      name: 'contact.company',
      label: 'Company',
    },
    {
      name: 'contact.name',
      label: 'Name',
    },
    {
      name: 'contact.email',
      label: 'Email',
    },
    {
      name: 'downloaded',
      label: 'Downloaded',
      transform: (v) => v ? 'downloaded' : 'pending',
    },
    {
      name: 'registration.year',
      label: 'Year',
    },
    {
      name: 'registration.level',
      label: 'Level',
      transform: (v) => {
        switch (v) {
          case 1: return 'Test 1'
          case 2: return 'Test 2'
          default: return 'Unknown'
        }
      },
    },
  ],
}
const jsoncsv = require('json-csv')

async function writeCsv() {
  try {
    let result = await jsoncsv.buffered(items, options)
    console.log(result)
  } catch (err) {
    console.error(err)
  }
}

writeCsv()
Company,Name,Email,Downloaded,Year,Level
"Widgets, LLC",John Doe,john@widgets.somewhere,pending,2013,Unknown
"Sprockets, LLC",Jane Doe,jane@sprockets.somewhere,downloaded,2013,Test 2
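Field names like 'contact.company' resolve a nested value by walking the dotted path. A minimal sketch of that lookup (the helper `getPath` is hypothetical, not json-csv's implementation):

```javascript
// Hypothetical sketch of dotted-path lookup: split 'contact.company'
// on '.' and walk one level per segment, returning undefined if any
// intermediate value is missing.
function getPath(obj, path) {
  return path.split('.').reduce(
    (current, key) => (current == null ? undefined : current[key]),
    obj
  )
}

const record = { contact: { company: 'Widgets, LLC' }, downloaded: false }
getPath(record, 'contact.company')   // -> 'Widgets, LLC'
getPath(record, 'registration.year') // -> undefined
```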
FAQs
Easily convert a JSON array to CSV in Node.js via buffered or streaming APIs.
The npm package json-csv receives a total of 5,134 weekly downloads. As such, json-csv's popularity is classified as popular.
We found that json-csv demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago. It has 1 open source maintainer collaborating on the project.