csv-parse


What is csv-parse?

The csv-parse package is a flexible Node.js library providing a parser that converts CSV text input into arrays or objects. It implements the Node.js stream.Transform API, handles large datasets, and supports many advanced features such as streaming and asynchronous processing.

What are csv-parse's main functionalities?

Parsing CSV to Arrays

This feature allows you to parse CSV data into arrays. Each row in the CSV data becomes an array.

const parse = require('csv-parse');
const assert = require('assert');
const input = 'a,b,c\nd,e,f';
parse(input, function(err, output){
  assert.deepEqual(output, [['a', 'b', 'c'], ['d', 'e', 'f']]);
});

Parsing CSV with Column Mapping

This feature allows you to map CSV columns to object properties, so each row in the CSV data becomes an object with named properties.

const parse = require('csv-parse');
const assert = require('assert');
const input = 'a,b,c\nd,e,f';
const parser = parse({columns: true}, function(err, records){
  assert.deepEqual(records, [{a: 'd', b: 'e', c: 'f'}]);
});
parser.write(input);
parser.end();

Asynchronous Iteration

This feature allows for asynchronous iteration over the parsed records, which is useful for handling large CSV files or streams.

const parse = require('csv-parse');
const fs = require('fs');
const parser = fs.createReadStream('/path/to/csv-file.csv').pipe(parse({columns: true}));
(async () => {
  for await (const record of parser) {
    // Work with each record
  }
})();


This project is part of the CSV module. It is a parser converting input CSV text into arrays or objects, implementing the Node.js stream.Transform API, and it also provides a simple callback-based API for convenience. It is both extremely easy to use and powerful. It has been released since 2010 and is tested against very large datasets by a large community.

The full documentation of the CSV parser is available here.

Note

This module should be considered to be in an alpha stage. It is part of an ongoing effort to split the current CSV module into complementary modules with a cleaner design and the latest stream implementation. However, the code has been imported with very few changes and you should feel confident using it in your code.

Usage

Run npm install csv to install the full CSV module, or run npm install csv-parse if you are only interested in the CSV parser.

Use the callback-style API for simplicity or the stream-based API for scalability.

Using the callback API

The parser receives a string and returns an array inside a user-provided callback. This example is available with the command node samples/callback.js.

var parse = require('csv-parse');
var should = require('should');

var input = '#Welcome\n"1","2","3","4"\n"a","b","c","d"';
parse(input, {comment: '#'}, function(err, output){
  output.should.eql([ [ '1', '2', '3', '4' ], [ 'a', 'b', 'c', 'd' ] ]);
});

Using the stream API

// node samples/stream.js
var parse = require('csv-parse');
var should = require('should');

var output = [];
var parser = parse({delimiter: ':'});
parser.on('readable', function(){
  var row;
  while(row = parser.read()){
    output.push(row);
  }
});
parser.on('error', function(err){
  console.log(err.message);
});
parser.on('finish', function(){
  output.should.eql([
    [ 'root','x','0','0','root','/root','/bin/bash' ],
    [ 'someone','x','1022','1022','a funny cat','/home/someone','/bin/bash' ]
  ]);
});
parser.write("root:x:0:0:root:/root:/bin/bash\n");
parser.write("someone:x:1022:1022:a funny cat:/home/someone:/bin/bash\n");
parser.end();

Using the pipe function

One useful function of the stream API is pipe, which connects multiple streams. You may use it to pipe a stream.Readable string source to a stream.Writable object destination. The next example, available as node samples/pipe.js, reads a file, parses its content and transforms it.

var fs = require('fs');
var parse = require('csv-parse');
// transform comes from the companion stream-transform module
var transform = require('stream-transform');

var parser = parse({delimiter: ':'});
var input = fs.createReadStream('/etc/passwd');
var transformer = transform(function(row, callback){
  setTimeout(function(){
    callback(null, row.join(' ')+'\n');
  }, 500);
}, {parallel: 10});
input.pipe(parser).pipe(transformer).pipe(process.stdout);

Migration

Most of the parser is imported from its parent project CSV in an effort to split it between the generator, the parser, the transformer and the stringifier.

Development

Tests are executed with mocha. To install it, simply run npm install followed by npm test. It will install mocha and its dependencies in your project's "node_modules" directory and run the test suite. The tests run against the CoffeeScript source files.

To generate the JavaScript files, run make build.

The test suite is run online with Travis against Node.js versions 0.9, 0.10 and 0.11.


Last updated on 04 Apr 2014
