csvwriter
Convert any JSON string to CSV with support for nested objects, filtering, many different CSV variations, CLI, ...
There are already a lot of good JSON to CSV modules in the wild; this one aggregates the features of the other modules and adds many, many more.
npm install csvwriter
var csvwriter = require('csvwriter');

var data = {
    "name": "csvwriter",
    "repository": {
        "type": "git",
        "url": "https://github.com/GazonkFoo/csvwriter"
    }
};

csvwriter(data, function(err, csv) {
    console.log(csv);
});
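Since csvwriter takes a standard Node-style (err, result) callback as its last argument, it can also be wrapped for promise-based use if you prefer that style. A minimal sketch, assuming the callback convention shown above; util.promisify is part of Node, not of csvwriter itself:

var util = require('util');
var csvwriter = require('csvwriter');

// promisify works here because the callback is the last argument
// and follows the (err, csv) convention documented below
var toCsv = util.promisify(csvwriter);

toCsv({name: 'csvwriter'})
    .then(function(csv) { console.log(csv); })
    .catch(function(err) { console.error(err); });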
With configuration parameters:
var csvwriter = require('csvwriter');

var data = [/*...*/];

csvwriter(data, {delimiter: ';', decimalSeparator: ','}, function(err, csv) {
    console.log(csv);
});
Read from stdin and write to stdout:
$ echo '{"name": "csvwriter", "repository": {"type": "git", "url": "https://github.com"}}' | csvwriter
Using files:
$ csvwriter -o converted.csv source.json
Usage: csvwriter [options] [file] - with no file, or when file is -, read standard input
Options:
-h, --help output usage information
-V, --version output the version number
-a, --array-delimiter <delimiter> delimiting character for arrays of primitives (strings, booleans, numbers), set to empty string ("") to disable flattening of primitive arrays (,)
-c, --no-crlf use line feed (\n) instead of carriage return + line feed (\r\n) as line separator
-d, --delimiter <delimiter> delimiting character of the csv (,)
-D, --decimal-separator <separator> the decimal mark to use for numbers (.)
-e, --encoding <encoding> encoding used for the input and output (utf8)
-E, --escape <escape> character used to escape the delimiter, newlines and the escape character itself if quoting is disabled
-f, --fields <fields> specify a comma (,) separated list of fields to convert
-H, --no-header do not include a header row as first line
-p, --path <path> jsonpath to apply on the object
-l, --line-numbers insert a column of line numbers at the front of the output, useful when piping to grep or as a simple primary key
-L, --suppress-line-breaks remove line breaks (\n) from field values
-n, --nesting-delimiter <delimiter> delimiter used for nested fields of the input (.)
-N, --max-depth <depth> maximum depth of the json object, fields below max-depth will not be included in the csv, use -1 to include all fields, 0 will not include nested objects (-1)
-o, --output <file> write to file, use - to write to stdout (default)
-q, --quote <quote> character used to quote strings in the csv (")
-Q, --no-double-quote disable inserting another quote to escape the quote character
-s, --null-string <string> string to use for writing null or undefined values
-t, --tabs specifies that the csv is delimited with tabs, overrides -d
-T, --table create a neat looking table for the console
-C, --header-color <color> color of the table header, one of: black, red, green, yellow, blue, magenta, cyan, white, gray (red)
-u, --quote-mode <0,1,2,3> quoting style used in the csv: 0 = quote minimal (default), 1 = quote all, 2 = quote non-numeric, 3 = quote none
-U, --no-utf-bom do not write utf bom (0xFEFF or 0xEFBBBF) in file if encoding is set to utf
-z, --zero when interpreting or displaying column numbers, use zero-based numbering instead of the default 1-based numbering
For ease of demonstration the examples use the command line interface, but all configuration options except those related to files are also available in the API.
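For example, here is a hedged sketch of calling the API with the equivalents of the -H and -u flags; the parameter names are taken from the csvParameters list further below, and the sample data is made up:

var csvwriter = require('csvwriter');

var data = [{name: 'csvwriter', language: 'JavaScript'}];

// header: false corresponds to -H/--no-header, quoteMode: 1 to -u 1 (quote all)
csvwriter(data, {header: false, quoteMode: 1}, function(err, csv) {
    if (err) { return console.error(err); }
    console.log(csv);
});

Note that the kebab-case CLI flags map to camelCase parameter names in the API.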
Find some of the other JSON to CSV converters on GitHub and create a neat CSV of the result:
$ curl "https://api.github.com/search/repositories?q=json+csv" | csvwriter -p "$.items[*]" -f "name,description,homepage,language,owner.login"
This uses the JSONPath expression "$.items[*]" to select only the items array of the result, and a fixed list of columns.
The resulting CSV looks like this:
name,description,homepage,language,owner.login
json,"A free, in-browser JSON to CSV converter.",http://konklone.io/json/,JavaScript,konklone
json2csv,command line tool to convert json to csv,http://github.com/jehiah/json2csv,Go,jehiah
json2csv,Convert json to csv with column titles,,JavaScript,zemirco
csv-to-json,CSV to JSON,http://www.cparker15.com/code/utilities/csv-to-json/,JavaScript,cparker15
csv2json,A gem useful for converting CSV files to JSON from command-line,,Ruby,darwin
JSON2CSV,A simple PHP script to convert JSON data to CSV,,PHP,danmandle
json2csv,Converts JSON files to CSV (pulling data from nested structures). Useful for Mongo data,,Python,evidens
to_csv,"This Rails plugin gives you the ability to call to_csv to a collection of activerecords. The builder options are the same as to_json / to_xml, except for the :include.",http://www.arydjmal.com/blog/to_csv-plugin-better-excel-compatibility,Ruby,arydjmal
ServiceStack.Text,".NET's fastest JSON, JSV and CSV Text Serializers ",https://servicestack.net/text,C#,ServiceStack
The column order is the same as passed to the command.
By default, minimal quoting is used: only fields containing the column delimiter, the quoting character or newlines are quoted.
Also note the sub-field "login" of the "owner" field which is mapped as "owner.login".
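The same filtering works through the API. A hedged sketch using the documented path and fields parameters on a small made-up object shaped like the GitHub search response:

var csvwriter = require('csvwriter');

var searchResult = {
    total_count: 2,
    items: [
        {name: 'json2csv', language: 'Go', owner: {login: 'jehiah'}},
        {name: 'csv2json', language: 'Ruby', owner: {login: 'darwin'}}
    ]
};

// path selects the items array, fields picks and orders the columns
csvwriter(searchResult, {path: '$.items[*]', fields: 'name,language,owner.login'}, function(err, csv) {
    if (err) { return console.error(err); }
    console.log(csv);
});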
Let's consider the following JSON (stored in a file named in.json):
[
    {
        "description": "Show of some array handling",
        "tags": ["example", "arrays", "json"],
        "meta": [
            {
                "type": "number",
                "value": 12.34
            },
            {
                "type": "boolean",
                "value": false
            }
        ]
    },
    {
        "description": "Just for \"demo\"",
        "tags": ["foo", "bar", "baz"],
        "meta": [
            {
                "type": "array",
                "value": ["another", "array"]
            },
            {
                "type": "wrong field?",
                "wrong": "Where am i?"
            },
            {
                "type": "newline",
                "value": "Think\ni'm Lost!"
            }
        ]
    }
]
Some funny stuff in there, so let's convert this to CSV:
csvwriter -o out.csv in.json
So what do we get (content of out.csv):
description,tags,meta.0.type,meta.0.value,meta.1.type,meta.1.value,meta.1.wrong,meta.2.type,meta.2.value
Show of some array handling,"example,arrays,json",number,12.34,boolean,false,,,
"Just for ""demo""","foo,bar,baz",array,"another,array",wrong field,,Where am i?,newline,"Think
i'm Lost!"
This time all the fields of the input are included automatically.
The interesting part is the handling of arrays. The array of primitive strings "example", "arrays", "json" has been flattened to "example,arrays,json", while the array of complex objects in meta is converted by using the index in the column header (meta.0, meta.1, meta.2).
Also note the automatic escaping of the quoting character ("demo" becomes ""demo"") and that line breaks are preserved by default.
The entire behaviour regarding arrays, quoting and newlines can be configured, as can all the delimiters used. The defaults are chosen to meet the requirements of common spreadsheet applications (LibreOffice/OpenOffice, MS Office, ...).
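As a hedged illustration of those knobs, using only parameters from the csvParameters list further below, the same file could be converted with primitive-array flattening disabled and line breaks stripped from field values:

var csvwriter = require('csvwriter');
var fs = require('fs');

var data = JSON.parse(fs.readFileSync('in.json', 'utf8'));

// arrayDelimiter: '' disables flattening of primitive arrays (see -a above),
// suppressLineBreaks: true removes the \n inside "Think\ni'm Lost!" (see -L above)
csvwriter(data, {arrayDelimiter: '', suppressLineBreaks: true}, function(err, csv) {
    if (err) { return console.error(err); }
    console.log(csv);
});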
Finally a bit of eye candy on the console (Table with Line Numbers):
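The screenshot is not reproduced here, but such a table can also be requested programmatically. A hedged sketch using the documented table, lineNumbers and headerColor parameters (the CLI equivalents are -T, -l and -C) on made-up data:

var csvwriter = require('csvwriter');

var rows = [
    {name: 'json2csv', language: 'Go'},
    {name: 'csv2json', language: 'Ruby'}
];

// table: true renders a console table instead of raw CSV,
// lineNumbers: true adds the numbering column, headerColor is one of the colors listed for -C
csvwriter(rows, {table: true, lineNumbers: true, headerColor: 'cyan'}, function(err, out) {
    if (err) { return console.error(err); }
    console.log(out);
});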
csvwriter(data, [params], callback)
Convert any JSON string to CSV with support for nested objects, filtering, many different CSV variations, CLI, ...
Kind: global function
Param | Type | Description |
---|---|---|
data | string \| object | The source json data which should be converted. Can be a string or a javascript object. |
[params] | csvParameters | Configuration of the CSV generation. |
callback | csvCallback | Callback to handle the generated CSV string. |
csvParameters : object
Kind: global typedef
Properties
Name | Type | Description |
---|---|---|
arrayDelimiter | string | delimiting character for arrays of primitives (strings, booleans, numbers), set to empty string ("") to disable flattening of primitive arrays (,) |
crlf | boolean | use carriage return + line feed (\r\n) as line separator; set to false to use line feed (\n) only (true) |
delimiter | string | delimiting character of the csv (,) |
decimalSeparator | string | the decimal mark to use for numbers (.) |
encoding | string | encoding used for the input and output (utf8) |
escape | string | character used to escape the delimiter, newlines and the escape character itself if quoting is disabled |
fields | string | specify a comma (,) separated list of fields to convert |
header | boolean | include a header as first line (true) |
path | string | jsonpath to apply on the object |
lineNumbers | boolean | insert a column of line numbers at the front of the output, useful when piping to grep or as a simple primary key (false) |
suppressLineBreaks | boolean | remove line breaks (\n) from field values (false) |
nestingDelimiter | string | delimiter used for nested fields of the input (.) |
maxDepth | number | maximum depth of the json object, fields below max-depth will not be included in the csv, use -1 (default) to include all fields, 0 will not include nested objects |
output | string | write to file, use - to write to stdout (default) |
quote | string | character used to quote strings in the csv (") |
doubleQuote | boolean | insert another quote to escape the quote character (true) |
nullString | string | string to use for writing null or undefined values |
table | boolean | create a neat looking table for the console (false) |
headerColor | string | color of the table header, one of: black, red, green, yellow, blue, magenta, cyan, white, gray (red) |
quoteMode | number | quoting style used in the csv: 0 = quote minimal (default), 1 = quote all, 2 = quote non-numeric, 3 = quote none |
utfBom | boolean | write utf bom (0xFEFF or 0xEFBBBF) in file if encoding is set to utf (true) |
zero | boolean | when interpreting or displaying column numbers, use zero-based numbering instead of the default 1-based numbering (false) |
csvCallback : function
Kind: global typedef
Param | Type | Description |
---|---|---|
err | error | Error object or null if no error occurred. |
csv | string | The generated CSV as a string. |
The MIT License (MIT)
Copyright (c) 2016 Sebastian Maurer
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.