![Build Status](https://secure.travis-ci.org/wdavidw/node-csv-parse.png)
Part of the CSV module, this project is a parser converting CSV text input into arrays or objects. It implements the Node.js `stream.Transform` API. It also provides a simple callback-based API for convenience. It is both extremely easy to use and powerful. It was first released in 2010 and is used against big datasets by a large community.

The full documentation of the CSV parser is available here.
## Note

This module is to be considered in beta stage. It is part of an ongoing effort to split the current CSV module into complementary modules with a cleaner design and the latest stream implementation. However, the code has been imported with very few changes and you should feel confident to use it in your code.
## Usage

Run `npm install csv` to install the full CSV module or run `npm install csv-parse` if you are only interested in the CSV parser.

Use the callback-style API for simplicity or the stream-based API for scalability.

For examples, refer to the "samples" folder, the documentation or the "test" folder.
## Using the callback API

The parser receives a string and returns an array inside a user-provided callback. This example is available with the command `node samples/callback.js`.

```js
var parse = require('csv-parse');

var input = '#Welcome\n"1","2","3","4"\n"a","b","c","d"';
parse(input, {comment: '#'}, function(err, output){
  output.should.eql([ [ '1', '2', '3', '4' ], [ 'a', 'b', 'c', 'd' ] ]);
});
```
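The parser can also return objects instead of arrays through its documented `columns` option, which treats the first row as a header. As a stdlib-only illustration of that header-to-object mapping (plain string splitting here, so real CSV quoting and escaping are deliberately omitted):

```javascript
// Illustration only: how a header row maps records to objects, as the
// parser's `columns: true` option does. Not a real CSV parser.
var input = 'name,shell\nroot,/bin/bash\nsomeone,/bin/bash';
var lines = input.split('\n').map(function(line){
  return line.split(',');
});
var columns = lines.shift(); // the first row becomes the property names
var records = lines.map(function(row){
  var record = {};
  columns.forEach(function(name, i){
    record[name] = row[i];
  });
  return record;
});
console.log(records);
```

With the real parser, the equivalent would be `parse(input, {columns: true}, callback)`; refer to the documentation for the full list of options.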
## Using the stream API

```js
var parse = require('csv-parse');

var output = [];
var parser = parse({delimiter: ':'});
parser.on('readable', function(){
  var row;
  while(row = parser.read()){
    output.push(row);
  }
});
parser.on('error', function(err){
  console.log(err.message);
});
parser.on('finish', function(){
  output.should.eql([
    [ 'root','x','0','0','root','/root','/bin/bash' ],
    [ 'someone','x','1022','1022','a funny cat','/home/someone','/bin/bash' ]
  ]);
});
parser.write("root:x:0:0:root:/root:/bin/bash\n");
parser.write("someone:x:1022:1022:a funny cat:/home/someone:/bin/bash\n");
parser.end();
```
## Using the pipe function

One useful function of the stream API is `pipe`, used to connect multiple streams. You may use this function to pipe a `stream.Readable` string source to a `stream.Writable` object destination. The next example, available as `node samples/pipe.js`, reads a file, parses its content and transforms it.

```js
var fs = require('fs');
var parse = require('csv-parse');
// transform comes from the companion stream-transform project
var transform = require('stream-transform');

var output = [];
var parser = parse({delimiter: ':'});
var input = fs.createReadStream('/etc/passwd');
var transformer = transform(function(row, callback){
  setTimeout(function(){
    callback(null, row.join(' ')+'\n');
  }, 500);
}, {parallel: 10});
input.pipe(parser).pipe(transformer).pipe(process.stdout);
```
## Migration

Most of the parser is imported from its parent project CSV in an effort to split it between the generator, the parser, the transformer and the stringifier.
## Development

Tests are executed with mocha. To install it, simply run `npm install` followed by `npm test`. It will install mocha and its dependencies in your project's "node_modules" directory and run the test suite. The tests run against the CoffeeScript source files.

To generate the JavaScript files, run `make build`.

The test suite is run online with Travis against Node.js versions 0.9, 0.10 and 0.11.
## Contributors
## Related projects