What is fast-csv?
The fast-csv npm package is a comprehensive library for working with CSV data in Node.js. It provides functionalities for parsing CSV files and strings, formatting data to CSV, and transforming data during the parse and format process.
What are fast-csv's main functionalities?
Parsing CSV
This feature allows you to parse CSV files or strings. The code sample demonstrates how to read a CSV file using a stream, parse it with headers, and log each row to the console.
const fs = require('fs');
const fastcsv = require('fast-csv');
fs.createReadStream('data.csv')
.pipe(fastcsv.parse({ headers: true }))
.on('data', row => console.log(row))
.on('end', () => console.log('CSV file successfully processed'));
Formatting Data to CSV
This feature allows you to format JavaScript data arrays or streams to CSV. The code sample shows how to take an array of objects and write it to a CSV file with headers.
const fs = require('fs');
const fastcsv = require('fast-csv');
const data = [{ id: 1, name: 'John Doe' }, { id: 2, name: 'Jane Doe' }];
fastcsv
.write(data, { headers: true })
.pipe(fs.createWriteStream('out.csv'))
.on('finish', () => console.log('Write to out.csv successfully!'));
Transforming Data
This feature allows you to transform data during the parse or format process. The code sample demonstrates how to read a CSV file, transform each row by combining first and last names into a full name, and then write the transformed data to a new CSV file.
const fs = require('fs');
const fastcsv = require('fast-csv');
fs.createReadStream('data.csv')
.pipe(fastcsv.parse({ headers: true }))
.transform(row => ({ fullName: row.firstName + ' ' + row.lastName }))
.pipe(fastcsv.format({ headers: true }))
.pipe(fs.createWriteStream('out.csv'))
.on('finish', () => console.log('Transformed file successfully written.'));
Other packages similar to fast-csv
papaparse
PapaParse is a robust and powerful CSV parser for JavaScript with a similar feature set to fast-csv. It supports browser and server-side parsing, auto-detection of delimiters, and stream parsing. Compared to fast-csv, PapaParse has a stronger emphasis on browser-side parsing and provides a more user-friendly configuration for handling large files in the browser.
csv-parser
csv-parser is a Node.js module for parsing CSV data. It can handle large datasets and supports stream-based processing. While fast-csv offers both parsing and formatting capabilities, csv-parser focuses primarily on parsing CSV data and may be preferred for its simplicity when only parsing functionality is needed.
csv-writer
csv-writer is a CSV writing library for Node.js that provides functionality to serialize arrays and objects into CSV files. Unlike fast-csv, which offers both parsing and formatting, csv-writer is specialized for CSV output, making it a good choice if the primary requirement is to generate CSV files from data.
Fast-csv
This library is aimed at providing fast CSV parsing. It accomplishes this by not handling some of the more complex
edge cases, such as multi-line rows. However, it does support escaped values, embedded commas, and double and single quotes.
Installation
npm install fast-csv
Usage
To parse a file.
var csv = require("fast-csv");
csv("my.csv")
.on("data", function(data){
console.log(data);
})
.on("end", function(){
console.log("done");
})
.parse();
You may also parse a stream.
var stream = fs.createReadStream("my.csv");
csv(stream)
.on("data", function(data){
console.log(data);
})
.on("end", function(){
console.log("done");
})
.parse();
If you expect the first line of your csv to be headers, you may pass in a headers option. Setting the headers option will
cause each row to be emitted as an object rather than an array.
var stream = fs.createReadStream("my.csv");
csv(stream, {headers : true})
.on("data", function(data){
console.log(data);
})
.on("end", function(){
console.log("done");
})
.parse();
You may alternatively pass an array of header names, which must match the order of the columns in the csv; otherwise
the data columns will not line up.
var stream = fs.createReadStream("my.csv");
csv(stream, {headers : ["firstName", "lastName", "address"]})
.on("data", function(data){
console.log(data);
})
.on("end", function(){
console.log("done");
})
.parse();
If your data may include empty rows (the sort Excel might append at the end of a file, for instance), you can ignore
these by including the ignoreEmpty option.
Any row consisting of nothing but empty strings and/or commas will be skipped, without emitting a 'data' or 'error' event.
var stream = fs.createReadStream("my.csv");
csv(stream, {ignoreEmpty: true})
.on("data", function(data){
console.log(data);
})
.on("end", function(){
console.log("done");
})
.parse();
Validating
You can validate each row in the csv by providing a validate handler. If a row is invalid, a data-invalid event
will be emitted with the row and its index.
var stream = fs.createReadStream("my.csv");
csv(stream, {headers : true})
.validate(function(data){
return data.age < 50;
})
.on("data-invalid", function(data){
    //do something with the invalid row
})
.on("data", function(data){
console.log(data);
})
.on("end", function(){
console.log("done");
})
.parse();
Transforming
You can transform data by providing a transform function. What the transform function returns will
be passed to validate and emitted as the row.
var stream = fs.createReadStream("my.csv");
csv(stream)
.transform(function(data){
return data.reverse();
})
.on("data", function(data){
console.log(data);
})
.on("end", function(){
console.log("done");
})
.parse();
License
MIT https://github.com/C2FO/fast-csv/raw/master/LICENSE