Comparing versions 0.0.16 and 0.0.17
@@ -50,2 +50,3 @@ // Module CSV - Copyright David Worms <open@adaltas.com> (BSD Licensed)
 quote: null,
+quoted: false,
 escape: null,
@@ -446,3 +447,4 @@ columns: null,
 }
-if(containsQuote || containsdelimiter || containsLinebreak){
+console.log('quoted', csv.writeOptions.quoted);
+if(containsQuote || containsdelimiter || containsLinebreak || csv.writeOptions.quoted){
 field = (csv.writeOptions.quote || csv.readOptions.quote) + field + (csv.writeOptions.quote || csv.readOptions.quote);
@@ -449,0 +451,0 @@ }
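The change above adds a `quoted` write option that forces quoting of every field, not only fields that contain a quote, delimiter or line break. A minimal self-contained sketch of that decision (illustrative names, not the module's internals):

```javascript
// Sketch of the quoting rule introduced in this diff: a field is wrapped
// in quotes when it contains the quote character, the delimiter, a line
// break, or when the new `quoted` option is set. Embedded quotes are
// escaped first. `quoteField` and its options are illustrative only.
function quoteField(field, options) {
  const quote = options.quote || '"';
  const delimiter = options.delimiter || ',';
  const containsQuote = field.indexOf(quote) >= 0;
  const containsDelimiter = field.indexOf(delimiter) >= 0;
  const containsLinebreak = /[\r\n]/.test(field);
  if (containsQuote) {
    // Double (or escape) embedded quote characters before wrapping
    field = field.split(quote).join((options.escape || quote) + quote);
  }
  if (containsQuote || containsDelimiter || containsLinebreak || options.quoted) {
    field = quote + field + quote;
  }
  return field;
}

console.log(quoteField('plain', { quoted: false })); // plain
console.log(quoteField('plain', { quoted: true }));  // "plain"
console.log(quoteField('a,b', { quoted: false }));   // "a,b"
```

With `quoted: false` the output is byte-identical to the old behavior; the option only widens the condition, which matches the one-line change in the diff.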
 {
 "name": "csv",
-"version": "0.0.16",
+"version": "0.0.17",
 "description": "CSV parser with simple api, full of options and tested against large datasets.",
@@ -5,0 +5,0 @@ "author": "David Worms <david@adaltas.com>",
@@ -11,3 +11,3 @@ <pre>
-This project provide CSV parsing and has been tested and used on large source file (over 2Gb).
+This project provides CSV parsing and has been tested and used on a large source file (over 2Gb).
@@ -17,3 +17,3 @@ - Support delimiter, quote and escape characters
 - Data transformation
-- Asynch and event based
+- Async and event based
 - Support for large datasets
@@ -78,9 +78,9 @@ - Complete test coverage as sample and inspiration
 - *fromPath(data, options)*
-Take a file path as first argument and optionally on object of options as a second arguments.
+Take a file path as first argument and optionally on object of options as a second argument.
 - *fromStream(readStream, options)*
-Take a readable stream as first argument and optionally on object of options as a second arguments.
+Take a readable stream as first argument and optionally on object of options as a second argument.
 - *from(data, options)*
-Take a string, a buffer, an array or an object as first argument and optionally some options as a second arguments.
+Take a string, a buffer, an array or an object as first argument and optionally some options as a second argument.
@@ -90,24 +90,24 @@ Options are:
 - *delimiter*
-Set the field delimiter, one character only, default to comma.
+Set the field delimiter, one character only, defaults to comma.
 - *quote*
-Set the field delimiter, one character only, default to double quotes.
+Set the field delimiter, one character only, defaults to double quotes.
 - *escape*
-Set the field delimiter, one character only, default to double quotes.
+Set the field delimiter, one character only, defaults to double quotes.
 - *columns*
-List of fields or true if autodiscovered in the first CSV line, impact the `transform` argument and the `data` event by providing an object instead of an array, order matters, see the transform and the columns section below.
+List of fields or true if autodiscovered in the first CSV line, impact the `transform` argument and the `data` event by providing an object instead of an array, order matters, see the transform and the columns sections below.
 - *encoding*
-Default to 'utf8', apply when a readable stream is created.
+Defaults to 'utf8', applied when a readable stream is created.
 - *trim*
-If true, ignore whitespace immediately around the delimiter, default to false.
+If true, ignore whitespace immediately around the delimiter, defaults to false.
 - *ltrim*
-If true, ignore whitespace immediately following the delimiter (i.e. left-trim all fields), default to false.
+If true, ignore whitespace immediately following the delimiter (i.e. left-trim all fields), defaults to false.
 - *rtrim*
-If true, ignore whitespace immediately preceding the delimiter (i.e. right-trim all fields), default to false.
+If true, ignore whitespace immediately preceding the delimiter (i.e. right-trim all fields), defaults to false.
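The trim options described in this hunk can be sketched as a small helper (illustrative code, not the parser's implementation): `trim` strips both sides of each field, while `ltrim` and `rtrim` strip only one side.

```javascript
// Illustrative sketch of the trim / ltrim / rtrim read options:
// whitespace around the delimiter is dropped from a parsed field
// depending on which flags are enabled.
function applyTrim(field, options) {
  if (options.trim || options.ltrim) field = field.replace(/^\s+/, ''); // left-trim
  if (options.trim || options.rtrim) field = field.replace(/\s+$/, ''); // right-trim
  return field;
}

console.log(applyTrim('  a  ', { trim: true }));  // 'a'
console.log(applyTrim('  a  ', { ltrim: true })); // 'a  '
console.log(applyTrim('  a  ', {}));              // '  a  '
```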
@@ -117,3 +117,3 @@ Writing API
-The following method are available:
+The following methods are available:
@@ -127,6 +127,6 @@ - *write(data, preserve)*
 - *toPath(path, options)*
-Take a file path as first argument and optionally on object of options as a second arguments.
+Take a file path as first argument and optionally on object of options as a second argument.
 - *toStream(writeStream, options)*
-Take a readable stream as first argument and optionally on object of options as a second arguments.
+Take a readable stream as first argument and optionally on object of options as a second argument.
@@ -136,15 +136,18 @@ Options are:
 - *delimiter*
-Default to the delimiter read option.
+Defaults to the delimiter read option.
 - *quote*
-Default to the quote read option.
+Defaults to the quote read option.
+- *quoted*
+Boolean, default to false, quote all the fields even if not required.
 - *escape*
-Default to the escape read option.
+Defaults to the escape read option.
 - *columns*
-List of fields, apply when `transform` return an object, order matters, see the transform and the columns sections below.
+List of fields, applied when `transform` returns an object, order matters, see the transform and the columns sections below.
 - *encoding*
-Default to 'utf8', apply when a writable stream is created.
+Defaults to 'utf8', applied when a writable stream is created.
@@ -155,9 +158,9 @@ - *header*
 - *lineBreaks*
-String used to delimit record rows or a special value; special values are 'auto', 'unix', 'mac', 'windows', 'unicode'; default to 'auto' (discovered in source or 'unix' if no source is specified).
+String used to delimit record rows or a special value; special values are 'auto', 'unix', 'mac', 'windows', 'unicode'; defaults to 'auto' (discovered in source or 'unix' if no source is specified).
 - *flags*
-Default to 'w', 'w' to create or overwrite an file, 'a' to append to a file. Apply when using the `toPath` method.
+Defaults to 'w', 'w' to create or overwrite an file, 'a' to append to a file. Applied when using the `toPath` method.
 - *bufferSize*
-Internal buffer holding data before being flush into a stream. Apply when destination is a stream.
+Internal buffer holding data before being flushed into a stream. Applied when destination is a stream.
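The special `lineBreaks` values listed in this hunk suggest a simple mapping; the following is a hypothetical sketch, not the library's code. The `'unicode'` mapping to U+2028 (line separator) is an assumption not confirmed by this README, and `'auto'` falls back to unix when no source line break was discovered.

```javascript
// Plausible resolution of the special lineBreaks values described
// above. The 'unicode' -> U+2028 mapping is an assumption; any other
// string passed as lineBreaks is used literally.
function resolveLineBreak(value, discovered) {
  switch (value) {
    case 'unix':    return '\n';
    case 'mac':     return '\r';
    case 'windows': return '\r\n';
    case 'unicode': return '\u2028';   // assumption: Unicode line separator
    case 'auto':    return discovered || '\n'; // discovered in source, else unix
    default:        return value;      // literal delimiter string
  }
}

console.log(JSON.stringify(resolveLineBreak('windows')));     // "\r\n"
console.log(JSON.stringify(resolveLineBreak('auto')));        // "\n"
console.log(JSON.stringify(resolveLineBreak('auto', '\r\n'))); // "\r\n"
```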
@@ -175,3 +178,3 @@ - *end*
 - *transform(callback)*
-User provided function call on each line to filter, enriche or modify the dataset. The callback is called asynchronously.
+User provided function call on each line to filter, enrich or modify the dataset. The callback is called asynchronously.
@@ -182,3 +185,3 @@ The contract is quite simple, you receive an array of fields for each record and return the transformed record. The return value may be an array, an associative array, a string or null. If null, the record will simply be skipped.
-When the returned value is an array, the fields are merge in order. When the returned value is an object, it will search for the `columns` property in the write or in the read options and smartly order the values. If no `columns` options are found, it will merge the values in their order of appearance. When the returned value is a string, it is directly sent to the destination source and it is your responsibility to delimit, quote, escape or define line breaks. 
+When the returned value is an array, the fields are merged in order. When the returned value is an object, it will search for the `columns` property in the write or in the read options and smartly order the values. If no `columns` options are found, it will merge the values in their order of appearance. When the returned value is a string, it is directly sent to the destination source and it is your responsibility to delimit, quote, escape or define line breaks.
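The transform contract in this hunk can be sketched as follows (illustrative helper, not the module's internals): the return value of `transform` decides how the record is serialized.

```javascript
// Sketch of the return-value handling described above: null skips the
// record, an array is merged in order, an object is ordered by the
// `columns` option (or by key order of appearance when no columns are
// configured), and a string passes through untouched.
function recordToFields(record, columns) {
  if (record === null) return null;          // record skipped
  if (Array.isArray(record)) return record;  // fields merged in order
  if (typeof record === 'object') {
    const keys = columns || Object.keys(record);
    return keys.map(function (k) { return record[k]; });
  }
  return record; // a string is sent as-is to the destination
}

console.log(recordToFields({ b: 2, a: 1 }, ['a', 'b'])); // [ 1, 2 ]
console.log(recordToFields(null));                       // null
```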
@@ -203,6 +206,6 @@ Example of transform returning a string
-By extending the Node `EventEmitter` class, the library provide a few useful events:
+By extending the Node `EventEmitter` class, the library provides a few useful events:
 - *data* (function(data, index){})
-Thrown when a new row is parsed after the `transform` callback and with the data being the value returned by `transform`. Note however that the event won't be call if transform return `null` since the record is skipped.
+Thrown when a new row is parsed after the `transform` callback and with the data being the value returned by `transform`. Note however that the event won't be called if transform return `null` since the record is skipped.
 The callback provide two arguments:
@@ -209,0 +212,0 @@ `data` is the CSV line being processed (by default as an array)
Supply chain risk notices for this package:
- Non-existent author: the package was published by an npm account that no longer exists (found 1 instance in 1 package).
- URL strings: the package contains fragments of external URLs or IP addresses, which the package may be accessing at runtime (found 1 instance in 1 package).