csv-streamify 
Parses CSV files. Accepts options. Handles weird encodings. No CoffeeScript, no weird APIs. Just streams. Tested against csv-spectrum and used in production.
Installation
    npm install csv-streamify
Usage
This module implements a simple node 0.10.x stream.Transform stream.
Note: csv-streamify pulls in the readable-stream module, so it also works on node 0.8.
    var csv = require('csv-streamify'),
        fs = require('fs')

    // `options` is an options object (see Options below) and `callback` is the
    // optional buffering callback defined further down -- both are optional
    var fstream = fs.createReadStream('/path/to/file'),
        parser = csv(options, callback)

    // emits each parsed line; parser.lineNo is the current line number
    parser.on('readable', function () {
      var line = parser.read()
      console.log(parser.lineNo)
    })

    // the optional callback receives every parsed row at once when parsing is done
    function callback(err, doc) {
      if (err) return handleErrorGracefully(err)
      doc.forEach(function (row) { console.log(row) })
    }

    // `nirvana` stands in for any writable destination stream
    fstream.pipe(parser).pipe(nirvana)
Note: If you pass a callback to csv-streamify it will buffer the parsed data for you and pass it to the callback when it's done. This behaviour can obviously lead to out-of-memory errors with very large CSV files.
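As a minimal sketch of that buffered mode (the file name here is just a placeholder), pass a callback and read a small file in one go:

    var csv = require('csv-streamify')
    var fs = require('fs')

    // with objectMode the callback receives an array of row arrays
    var parser = csv({ objectMode: true }, function (err, rows) {
      if (err) return console.error(err)
      console.log('parsed ' + rows.length + ' rows')
    })

    fs.createReadStream('small_file.csv').pipe(parser)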
Options
You can pass some options to the parser. All of them are optional.
The options are also passed to the underlying transform stream, so you can pass in any standard node core stream options.
    {
      delimiter: ',',     // field delimiter, e.g. ',' or ';'
      newline: '\n',      // line ending (use '\r\n' for CRLF files)
      quote: '"',         // quote character
      empty: '',          // empty fields are replaced by this value
      inputEncoding: '',  // input character encoding (needs iconv-lite, see below)
      objectMode: false,  // emit parsed rows instead of stringified rows
      columns: false      // treat the first row as column names
    }
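For example (a sketch; the delimiter and file name are only placeholders), a semicolon-delimited file can be parsed into row arrays like this:

    var csv = require('csv-streamify')
    var fs = require('fs')

    // objectMode makes the parser emit each row as an array of fields
    var parser = csv({ delimiter: ';', objectMode: true })

    parser.on('data', function (row) {
      console.log(row) // e.g. [ 'id', 'name', 'email' ]
    })

    fs.createReadStream('some_data.csv').pipe(parser)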
In order for the inputEncoding option to take effect you need to install iconv-lite (npm install iconv-lite --save).
Take a look at the iconv-lite documentation for the supported encodings.
(iconv-lite provides pure JavaScript character encoding conversion, so there is no native code compilation.)
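For example (a sketch, assuming iconv-lite is installed and the input happens to be Latin-1 encoded; the file name is a placeholder):

    var csv = require('csv-streamify')
    var fs = require('fs')

    // input bytes are decoded from latin1 by iconv-lite before parsing
    var parser = csv({ inputEncoding: 'latin1', objectMode: true })

    parser.on('data', function (row) {
      console.log(row)
    })

    fs.createReadStream('legacy_export.csv').pipe(parser)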
CLI
To use it on the command line, install it globally:
    $ npm install csv-streamify -g
This should add the csv-streamify command to your $PATH.
Then, you either pipe data into it or give it a filename:
    $ cat some_data.csv | csv-streamify
    $ csv-streamify some_data.csv > output.json
    $ csv-streamify -
Wishlist
If you would like to contribute something from the wishlist, just open an issue first so we can discuss it. :)
Contributors
Nicolas Hery (objectMode)