What is csv?
The csv npm package is a comprehensive library for parsing and handling CSV data in Node.js. It provides a range of tools for reading, writing, transforming, and streaming CSV data, making it a versatile choice for developers working with CSV files in JavaScript.
What are csv's main functionalities?
Parsing CSV
This feature allows you to parse CSV data into arrays or objects. The code sample demonstrates how to parse a simple CSV string.
"use strict";
const parse = require('csv-parse');
const assert = require('assert');
const input = 'a,b,c\nd,e,f';
parse(input, function(err, output){
assert.deepEqual(
output,
[['a', 'b', 'c'], ['d', 'e', 'f']]
);
});
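The description above also mentions parsing into objects; a minimal sketch of that, using the columns: true option of csv-parse so the first line is read as the header (same require style as the sample above):
"use strict";
const parse = require('csv-parse');
const assert = require('assert');

// With `columns: true` the first line supplies the property names and every
// following line becomes an object.
const input = 'a,b,c\nd,e,f';
parse(input, {columns: true}, function(err, output){
  assert.deepEqual(
    output,
    [{a: 'd', b: 'e', c: 'f'}]
  );
});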
Stringifying CSV
This feature allows you to convert arrays or objects into CSV strings. The code sample shows how to stringify an array of arrays into a CSV string.
"use strict";
const stringify = require('csv-stringify');
const assert = require('assert');
const input = [['a', 'b', 'c'], ['d', 'e', 'f']];
stringify(input, function(err, output){
assert.equal(
output,
'a,b,c\nd,e,f\n'
);
});
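Records can also be given as objects; a minimal sketch, using the header and columns options of csv-stringify to emit a header line and fix the column order:
"use strict";
const stringify = require('csv-stringify');
const assert = require('assert');

// `header: true` writes the column names first; `columns` fixes their order.
const input = [{a: '1', b: '2'}, {a: '3', b: '4'}];
stringify(input, {header: true, columns: ['a', 'b']}, function(err, output){
  assert.equal(
    output,
    'a,b\n1,2\n3,4\n'
  );
});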
Transforming Data
This feature allows you to apply a transformation to the CSV data. The code sample demonstrates how to convert all the values in the CSV to uppercase.
"use strict";
const transform = require('stream-transform');
const assert = require('assert');
const input = [['a', 'b', 'c'], ['d', 'e', 'f']];
const transformer = transform(function(record, callback){
callback(null, record.map(value => value.toUpperCase()));
});
transformer.write(input[0]);
transformer.write(input[1]);
transformer.end();
const output = [];
transformer.on('readable', function(){
let row;
while ((row = transformer.read()) !== null) {
output.push(row);
}
});
transformer.on('end', function(){
assert.deepEqual(
output,
[['A', 'B', 'C'], ['D', 'E', 'F']]
);
});
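For a small in-memory dataset, the same result can be obtained without touching stream events; a minimal sketch, assuming stream-transform's combined transform(records, handler, callback) form with a synchronous handler:
"use strict";
const transform = require('stream-transform');
const assert = require('assert');

const input = [['a', 'b', 'c'], ['d', 'e', 'f']];
// Records, a synchronous handler and a final callback in a single call.
transform(input, function(record){
  return record.map(value => value.toUpperCase());
}, function(err, output){
  assert.deepEqual(
    output,
    [['A', 'B', 'C'], ['D', 'E', 'F']]
  );
});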
Streaming API
This feature provides a streaming API for working with large CSV files without loading the entire file into memory. The code sample demonstrates how to read a CSV file as a stream and parse it.
"use strict";
const fs = require('fs');
const parse = require('csv-parse');
const parser = parse({columns: true});
const input = fs.createReadStream('/path/to/input.csv');
input.pipe(parser);
parser.on('readable', function(){
let record;
while ((record = parser.read()) !== null) {
// Work with each record
}
});
parser.on('end', function(){
// Handle end of parsing
});
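The same approach extends to a full file-to-file pipeline; a sketch that parses an input file, uppercases every value and streams the result to an output file without buffering either file in memory (both paths are placeholders):
"use strict";
const fs = require('fs');
const parse = require('csv-parse');
const transform = require('stream-transform');
const stringify = require('csv-stringify');

fs.createReadStream('/path/to/input.csv')        // placeholder input path
  .pipe(parse())                                 // raw CSV -> records
  .pipe(transform(record =>
    record.map(value => value.toUpperCase())))   // uppercase every value
  .pipe(stringify())                             // records -> CSV text
  .pipe(fs.createWriteStream('/path/to/output.csv')); // placeholder output path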
Other packages similar to csv
papaparse
PapaParse is a robust and powerful CSV parser for JavaScript with a similar feature set to csv. It supports browser and server-side parsing, auto-detection of delimiters, and streaming large files. Compared to csv, PapaParse is known for its ease of use and strong browser-side capabilities.
fast-csv
fast-csv is another popular CSV parsing and formatting library for Node.js. It offers a simple API, flexible parsing options, and support for streams. While csv provides a comprehensive set of tools for various CSV operations, fast-csv focuses on performance and ease of use for common tasks.
CSV for Node.js and the web
The csv project provides CSV generation, parsing, transformation and serialization for Node.js.
It has been tested and used by a large community over the years and should be considered reliable. It provides every option you would expect from an advanced CSV parser and stringifier.
This package exposes 4 packages: csv-generate, csv-parse, csv-stringify and stream-transform.
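As a minimal sketch, the same functions are also reachable as properties of the umbrella csv module itself (CommonJS style, matching the samples above; recent major versions expose them as named exports as well):
"use strict";
const csv = require('csv');

// generate, parse, transform and stringify are all properties of the module.
csv.parse('a,b\nc,d', function(err, records){
  csv.stringify(records, function(err, output){
    // output === 'a,b\nc,d\n'
  });
});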
Documentation
The full documentation for the current version is available at https://csv.js.org.
Usage
The installation command is npm install csv.
Each package is fully compatible with the stream 2 and 3 specifications.
Also, a simple callback-based API is always provided for convenience.
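To illustrate the two styles side by side, a short sketch that parses the same input once through the callback API and once through the stream API:
"use strict";
const parse = require('csv-parse');

// Callback style: the whole output is delivered at once.
parse('a,b\nc,d', function(err, records){
  // records === [['a', 'b'], ['c', 'd']]
});

// Stream style: records are consumed as they become available.
const parser = parse();
parser.on('readable', function(){
  let record;
  while ((record = parser.read()) !== null) {
    // Work with each record as it arrives
  }
});
parser.write('a,b\nc,d');
parser.end();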
Sample
This example uses the Stream API to create a processing pipeline.
import * as csv from 'csv';

csv
  // Generate 20 records of pipe-delimited CSV
  .generate({
    delimiter: '|',
    length: 20
  })
  // Parse the raw CSV into records
  .pipe(csv.parse({
    delimiter: '|'
  }))
  // Uppercase every value of every record
  .pipe(csv.transform((record) => {
    return record.map((value) => {
      return value.toUpperCase();
    });
  }))
  // Serialize the records back to CSV, quoting every field
  .pipe(csv.stringify({
    quoted: true
  }))
  // Print the result to the console
  .pipe(process.stdout);
Development
This parent project doesn't have tests itself but instead delegates the tests to its child projects.
Read the documentation of the child projects for additional information.
Contributors
The project is sponsored by Adaltas, a Big Data consulting firm based in Paris, France.
Related projects
csv-generate, csv-parse, csv-stringify and stream-transform, the child projects bundled by this package.