csv-parse
What is csv-parse?

The csv-parse package is a flexible Node.js library that provides a parser converting CSV text input into arrays or objects. It implements the Node.js stream.Transform API. It can handle large datasets and supports advanced features such as streaming and asynchronous processing.

What are csv-parse's main functionalities?

Parsing CSV to Arrays

This feature allows you to parse CSV data into arrays. Each row in the CSV data becomes an array.

const parse = require('csv-parse');
const assert = require('assert');
const input = 'a,b,c\nd,e,f';
parse(input, function(err, output){
  assert.deepEqual(output, [['a', 'b', 'c'], ['d', 'e', 'f']]);
});

Parsing CSV with Column Mapping

This feature allows you to map CSV columns to object properties, so each row in the CSV data becomes an object with named properties.

const parse = require('csv-parse');
const assert = require('assert');
const input = 'a,b,c\nd,e,f';
const parser = parse({columns: true}, function(err, records){
  assert.deepEqual(records, [{a: 'd', b: 'e', c: 'f'}]);
});
parser.write(input);
parser.end();

Asynchronous Iteration

This feature allows for asynchronous iteration over the parsed records, which is useful for handling large CSV files or streams.

const parse = require('csv-parse');
const fs = require('fs');
const parser = fs.createReadStream('/path/to/csv-file.csv').pipe(parse({columns: true}));
(async () => {
  for await (const record of parser) {
    // Work with each record
  }
})();

Readme

Part of the CSV module, this project is a parser converting CSV text input into arrays or objects. It implements the Node.js stream.Transform API. It also provides a simple callback-based API for convenience. It is both extremely easy to use and powerful. It was first released in 2010 and is used against big datasets by a large community.

The full documentation of the CSV parser is available here.

Features

  • Follows the Node.js streaming API
  • Supports delimiters, quotes, escape characters and comments
  • Line break discovery
  • Supports big datasets
  • Complete test coverage and samples for inspiration
  • No external dependencies
  • Designed to be used together with csv-generate, stream-transform and csv-stringify

Usage

Run npm install csv to install the full CSV package or run npm install csv-parse if you are only interested in the CSV parser.

Use the callback-style API for simplicity or the stream-based API for scalability.

For examples, refer to the "samples" folder, the documentation or the "test" folder.

Using the callback API

The parser receives a string and returns an array of records inside a user-provided callback. This example is available with the command node samples/callback.js.

See the full list of supported parsing options below.

var parse = require('csv-parse');
require('should');

// Lines starting with '#' are treated as comments and skipped
var input = '#Welcome\n"1","2","3","4"\n"a","b","c","d"';
parse(input, {comment: '#'}, function(err, output){
  output.should.eql([ [ '1', '2', '3', '4' ], [ 'a', 'b', 'c', 'd' ] ]);
});

Using the stream API

The CSV parser implements the stream.Transform API.

CSV data is sent through the write function and the resulting records are obtained within the "readable" event by calling the read function. This example is available with the command node samples/stream.js.

See the full list of supported parser options below.

var parse = require('csv-parse');
require('should');

var output = [];
// Create the parser
var parser = parse({delimiter: ':'});
// Use the readable stream API to retrieve parsed records
parser.on('readable', function(){
  var record;
  while((record = parser.read()) !== null){
    output.push(record);
  }
});
// Catch any error
parser.on('error', function(err){
  console.log(err.message);
});
// When we are done, test that the parsed output matches what is expected
parser.on('finish', function(){
  output.should.eql([
    [ 'root','x','0','0','root','/root','/bin/bash' ],
    [ 'someone','x','1022','1022','a funny cat','/home/someone','/bin/bash' ]
  ]);
});
// Now that setup is done, write data to the stream
parser.write("root:x:0:0:root:/root:/bin/bash\n");
parser.write("someone:x:1022:1022:a funny cat:/home/someone:/bin/bash\n");
// Signal that no more data will be written (end the writable side)
parser.end();

Using the pipe function

One useful function of the stream API is pipe, which connects one stream to another. You may use it to pipe a stream.Readable string source into a stream.Writable object destination. This example, available as node samples/pipe.js, reads a file, parses its content and transforms it.

var fs = require('fs');
var parse = require('csv-parse');
var transform = require('stream-transform');

// Create the parser with ':' as the field delimiter
var parser = parse({delimiter: ':'});
// Read the source file as a stream
var input = fs.createReadStream('/etc/passwd');
// Transform each record into a line of text, simulating asynchronous work
var transformer = transform(function(record, callback){
  setTimeout(function(){
    callback(null, record.join(' ')+'\n');
  }, 500);
}, {parallel: 10});
input.pipe(parser).pipe(transformer).pipe(process.stdout);

Parser options

  • delimiter Set the field delimiter. One character only, defaults to comma.
  • rowDelimiter String used to delimit record rows, or a special value; special values are 'auto', 'unix', 'mac', 'windows', 'unicode'; defaults to 'auto' (discovered in the source, or 'unix' if no source is specified).
  • quote Optional character surrounding a field, one character only, defaults to double quote.
  • escape Set the escape character, one character only, defaults to double quote.
  • columns List of field names as an array, a user-defined callback accepting the first line and returning the column names, or true to autodiscover the column names from the first CSV line; defaults to null. When set, records are returned as objects instead of arrays.
  • comment Treat all the characters after this one as a comment, defaults to '#'.
  • objname Name of the header-record title to name objects by.
  • trim If true, ignore whitespace immediately around the delimiter, defaults to false.
  • ltrim If true, ignore whitespace immediately following the delimiter (i.e. left-trim all fields), defaults to false.
  • rtrim If true, ignore whitespace immediately preceding the delimiter (i.e. right-trim all fields), defaults to false.
  • auto_parse If true, the parser will attempt to convert read data types to native types.
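
As an illustration, here is a minimal sketch combining several of these options with the callback API shown above; the input string and the expected records in the comment are illustrative assumptions, not taken from the library documentation.

var parse = require('csv-parse');

// Hypothetical input: a comment line, a header row and two records
var input = '# users\nname;age\n"Alice";42\n"Bob";7';

parse(input, {
  delimiter: ';',    // use ';' instead of the default ','
  quote: '"',        // fields may be surrounded by double quotes
  comment: '#',      // ignore everything after a '#'
  columns: true,     // take column names from the first record
  trim: true,        // drop whitespace around the delimiter
  auto_parse: true   // attempt to cast values such as "42" to native types
}, function(err, records){
  if(err) throw err;
  // Expected shape (illustrative): [ { name: 'Alice', age: 42 }, { name: 'Bob', age: 7 } ]
  console.log(records);
});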

Migration

Most of this parser is imported from its parent project CSV in an effort to split it between the generator, the parser, the transformer and the stringifier.

The "record" has disappeared, you are encouraged to use the "readable" event conjointly with the "read" function as documented above and in the Stream API.

Development

Tests are executed with mocha. To install it, simply run npm install followed by npm test. It will install mocha and its dependencies in your project's "node_modules" directory and run the test suite. The tests run against the CoffeeScript source files.

To generate the JavaScript files, run make build.

The test suite runs on Travis CI against Node.js versions 0.10 and 0.11.
