csv

Comparing version 0.2.1 to 0.2.2

doc/columns.md


doc/changes.md

@@ -5,3 +5,3 @@ ---

title: "Changes in latest versions"
-date: 2012-10-09T16:24:28.045Z
+date: 2012-10-24T16:24:28.045Z
comments: false

@@ -14,2 +14,8 @@ sharing: false

version 0.2.2
-------------
* Function `from.stream` now uses a "pipe" implementation
* Add `setEncoding` to the generator to respect the readable stream API
version 0.2.1

@@ -16,0 +22,0 @@ -------------


doc/from.md

@@ -5,3 +5,3 @@ ---

title: "Reading data from a source"
-date: 2012-10-09T16:24:28.046Z
+date: 2012-11-05T20:57:04.723Z
comments: false

@@ -80,3 +80,6 @@ sharing: false

Additionally, in case you are working with a stream, you can pass all
the options accepted by the `stream.pipe` function.
<a name="from.array"></a>

@@ -83,0 +86,0 @@ `from.array(data, [options])`

@@ -5,3 +5,3 @@ ---

title: "Node CSV"
-date: 2012-10-09T16:24:28.045Z
+date: 2012-11-05T20:57:04.723Z
comments: false

@@ -149,42 +149,4 @@ sharing: false

Thrown whenever an error occurred.
Columns
-------
Column names may be provided or discovered in the first line with
the read options `columns`. If defined as an array, the order must
match the one of the input source. If set to `true`, the fields are
expected to be present in the first line of the input source.
You can define a different order and even different columns in the
read options and in the write options. If `columns` is not defined
in the write options, it will default to the one present in the read options.
When working with fields, the `transform` method and the `data`
events receive their `data` parameter as an object instead of an
array where the keys are the field names.
```javascript
// node samples/column.js
var csv = require('csv');
csv()
.from.path(__dirname+'/columns.in', {
columns: true
})
.to.stream(process.stdout, {
columns: ['id', 'name']
})
.transform(function(data){
data.name = data.firstname + ' ' + data.lastname
return data;
});
// Prints something like:
// 82,Zbigniew Preisner
// 94,Serge Gainsbourg
```
<a name="pause"></a>

@@ -191,0 +153,0 @@ `pause()`

@@ -5,3 +5,3 @@ ---

title: "Parsing"
-date: 2012-10-09T16:24:28.047Z
+date: 2012-11-05T20:57:04.724Z
comments: false

@@ -8,0 +8,0 @@ sharing: false

@@ -5,3 +5,3 @@ ---

title: "Stringifier"
-date: 2012-10-09T16:24:28.047Z
+date: 2012-11-05T20:57:04.724Z
comments: false

@@ -8,0 +8,0 @@ sharing: false

@@ -5,3 +5,3 @@ ---

title: "Writing data to a destination"
-date: 2012-10-09T16:24:28.046Z
+date: 2012-11-05T20:57:04.724Z
comments: false

@@ -91,5 +91,9 @@ sharing: false

.from( '"1","2","3","4"\n"a","b","c","d"' )
-.to( function(data){} )
+.to( function(data, count){} )
```
Callback is called with 2 arguments:
* `data` Stringified CSV string
* `count` Number of stringified records
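A minimal sketch of this two-argument callback contract. `toCallback` below is an illustrative stand-in, not the library's internals: it joins records into CSV text (without quoting) and invokes the callback with the text and the record count.

```javascript
// Hedged sketch of the callback destination described above.
// `toCallback` is hypothetical; fields are joined without quoting.
function toCallback(records, callback) {
  var text = records.map(function(r){ return r.join(','); }).join('\n');
  callback(text, records.length);
}

toCallback([['1', '2', '3', '4'], ['a', 'b', 'c', 'd']], function(data, count){
  console.log(data);  // 1,2,3,4\na,b,c,d
  console.log(count); // 2
});
```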

@@ -96,0 +100,0 @@

@@ -5,3 +5,3 @@ ---

title: "Transforming data"
-date: 2012-10-18T13:11:44.066Z
+date: 2012-11-05T20:57:04.724Z
comments: false

@@ -8,0 +8,0 @@ sharing: false

@@ -125,69 +125,2 @@ // Generated by CoffeeScript 1.3.3

Thrown whenever an error occurred.
Columns
-------
Columns are defined in the `csv.options.from` and `csv.options.to`.
Column names may be provided or discovered in the first line with the
read options `columns`. Most users will define columns as an
array of property names. If defined as an array, the order must match
the one of the input source. If set to `true`, the fields are
expected to be present in the first line of the input source. For greater
flexibility in parallel with the `csv.options.to.header` option,
it is possible to define the "columns" options as an object where keys
are the property names and values are the display names.
You can define a different order and even different columns in the
read options and in the write options. If `columns` is not defined
in the write options, it will default to the one present in the read options.
When working with fields, the `transform` method and the `data`
events receive their `data` parameter as an object instead of an
array where the keys are the field names.
// node samples/column.js
var csv = require('csv');
csv()
.from.path(__dirname+'/columns.in', {
columns: true
})
.to.stream(process.stdout, {
columns: ['id', 'name']
})
.transform(function(data){
data.name = data.firstname + ' ' + data.lastname
return data;
});
// Prints something like:
// 82,Zbigniew Preisner
// 94,Serge Gainsbourg
Columns set to `true`:
var data = 'field1,field2\nval1,val2';
csv()
.from(data, {columns: true})
.to(function(data){
data.should.eql('val1,val2');
});
Columns as an array:
var data = 'field1,field2,field3\nval1,val2,val3';
csv()
.from(data, {columns: true})
.to(function(data){
data.should.eql('val1,val3');
}, {columns: ['field1', 'field3']});
Columns as an object with header option:
var data = 'field1,field2,field3\nval1,val2,val3';
csv()
.from(data, {columns: true})
.to(function(data){
data.should.eql('column1,column3\nval1,val3');
}, {columns: {field1: 'column1', field3: 'column3'}, header: true});
*/
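The columns-as-object behavior documented above can be sketched with plain JavaScript: keys select input fields and values become the display names in the header. `project`, `record`, and `columns` below are illustrative names, not the library's internals.

```javascript
// Hedged sketch of the header mapping: pick fields named by the object's
// keys, and use the object's values as the header row.
function project(record, columns) {
  var keys = Object.keys(columns);
  return {
    header: keys.map(function(k){ return columns[k]; }).join(','),
    row: keys.map(function(k){ return record[k]; }).join(',')
  };
}

var out = project({ field1: 'val1', field2: 'val2', field3: 'val3' },
                  { field1: 'column1', field3: 'column3' });
console.log(out.header + '\n' + out.row); // column1,column3\nval1,val3
```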

@@ -289,2 +222,5 @@

}
if (data instanceof Buffer) {
data = data.toString();
}
if (typeof data === 'string' && !preserve) {

@@ -291,0 +227,0 @@ this.parser.parse(data);

@@ -115,2 +115,5 @@ // Generated by CoffeeScript 1.3.3

* `rtrim` If true, ignore whitespace immediately preceding the delimiter (i.e. right-trim all fields), defaults to false.
Additionally, in case you are working with a stream, you can pass all
the options accepted by the `stream.pipe` function.
*/

@@ -185,4 +188,3 @@

stream = fs.createReadStream(path, csv.from.options());
-stream.setEncoding(csv.from.options().encoding);
-return csv.from.stream(stream, null);
+return csv.from.stream(stream);
};

@@ -199,28 +201,7 @@ /*

from.stream = function(stream, options) {
-var first;
-this.options(options);
-first = true;
-stream.on('data', function(data) {
-var string, strip;
-if (csv.writable) {
-strip = first && typeof data === 'string' && stream.encoding === 'utf8' && 0xFEFF === data.charCodeAt(0);
-string = strip ? data.substring(1) : data.toString();
-if (false === csv.write(string)) {
-stream.pause();
-}
-}
-return first = false;
-});
-stream.on('error', function(e) {
-return csv.error(e);
-});
-stream.on('end', function() {
-return csv.end();
-});
-csv.on('drain', function() {
-if (stream.readable) {
-return stream.resume();
-}
-});
-csv.readStream = stream;
+if (options) {
+this.options(options);
+}
+stream.setEncoding(csv.from.options().encoding);
+stream.pipe(csv, csv.from.options());
+return csv;

@@ -227,0 +208,0 @@ };

@@ -53,3 +53,3 @@ // Generated by CoffeeScript 1.3.3

-util.inherits(Generator, Stream);
+Generator.prototype.__proto__ = Stream.prototype;

@@ -72,3 +72,3 @@ Generator.prototype.resume = function() {

}
-this.emit('data', "" + (line.join(',')) + "\n");
+this.emit('data', new Buffer("" + (line.join(',')) + "\n", this.options.encoding));
}

@@ -87,2 +87,15 @@ };

/*
`setEncoding([encoding])`
Makes the 'data' event emit a string instead of a Buffer.
encoding can be 'utf8', 'utf16le' ('ucs2'), 'ascii', or
'hex'. Defaults to 'utf8'.
*/
Generator.prototype.setEncoding = function(encoding) {
return this.options.encoding = encoding;
};
module.exports = function(options) {

@@ -89,0 +102,0 @@ return new Generator(options);

// Generated by CoffeeScript 1.3.3
-var EventEmitter, Parser, stream;
-stream = require('stream');
+var EventEmitter, Parser;
EventEmitter = require('events').EventEmitter;

@@ -53,2 +51,5 @@

i = 0;
if (this.lines === 0 && csv.options.from.encoding === 'utf8' && 0xFEFF === chars.charCodeAt(0)) {
i++;
}
while (i < l) {

@@ -55,0 +56,0 @@ c = chars.charAt(i);
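The check added above skips a leading byte-order mark before parsing. A minimal sketch of the same test; `stripBom` is illustrative, since the parser does this inline with its index `i`:

```javascript
// Hedged sketch: U+FEFF survives utf8 decoding as the first character,
// so drop it before treating the rest as CSV data.
function stripBom(chars) {
  return chars.charCodeAt(0) === 0xFEFF ? chars.substring(1) : chars;
}

console.log(stripBom('\uFEFFa,b,c')); // a,b,c
```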

{
"name": "csv",
-"version": "0.2.1",
+"version": "0.2.2",
"description": "CSV parser with simple api, full of options and tested against large datasets.",

@@ -9,3 +9,14 @@ "author": "David Worms <david@adaltas.com>",

"Will White <https://github.com/willwhite>",
-"Justin Latimer <https://github.com/justinlatimer>"
+"Justin Latimer <https://github.com/justinlatimer>",
+"jonseymour <https://github.com/jonseymour>",
+"pascalopitz <https://github.com/pascalopitz>",
+"Josh Pschorr <https://github.com/jpschorr>",
+"Elad Ben-Israel <https://github.com/eladb>",
+"Philippe Plantier <https://github.com/phipla>",
+"Tim Oxley <https://github.com/timoxley>",
+"Damon Oehlman <https://github.com/DamonOehlman>",
+"Alexandru Topliceanu <https://github.com/topliceanu>",
+"Visup <https://github.com/visup>",
+"Edmund von der Burg <https://github.com/evdb>",
+"Douglas Christopher Wilson <https://github.com/dougwilson>"
],

@@ -12,0 +23,0 @@ "engines": {

@@ -96,8 +96,8 @@ [![Build Status](https://secure.travis-ci.org/wdavidw/node-csv-parser.png)](http://travis-ci.org/wdavidw/node-csv-parser)

-* David Worms : <https://github.com/wdavidw>
-* Will White : <https://github.com/willwhite>
-* Justin Latimer : <https://github.com/justinlatimer>
-* jonseymour : <https://github.com/jonseymour>
-* pascalopitz : <https://github.com/pascalopitz>
-* Josh Pschorr : <https://github.com/jpschorr>
+* David Worms: <https://github.com/wdavidw>
+* Will White: <https://github.com/willwhite>
+* Justin Latimer: <https://github.com/justinlatimer>
+* jonseymour: <https://github.com/jonseymour>
+* pascalopitz: <https://github.com/pascalopitz>
+* Josh Pschorr: <https://github.com/jpschorr>
+* Elad Ben-Israel: <https://github.com/eladb>

@@ -104,0 +104,0 @@ * Philippe Plantier: <https://github.com/phipla>

Sorry, the diff of this file is not supported yet

