csvtojson


csvtojson - npm Package Compare versions

Comparing version 2.0.8 to 2.0.10

.ts-node/4c960549db1acf61706147e6827e627a7fda2f32ddc1336b4438c12d731ddee1/09978396b133849a0d0f83c9c100b167a1060208a0fdf57d39dc695ae209e0c9.js

bin/options.json

@@ -79,2 +79,10 @@ {

"type": "boolean"
},
"--nullObject":{
"desc":"How to parse if a csv cell contains 'null'. Default false will keep 'null' as string. Change to true if a null object is needed.",
"type":"boolean"
},
"--downstreamFormat":{
"desc":"Option to set what JSON array format is needed by downstream. 'line' is also called ndjson format. This format will write lines of JSON (without square brackets and commas) to downstream. 'array' will write complete JSON array string to downstream (suitable for file writable stream etc). Default 'line'",
"type":"string"
}

@@ -81,0 +89,0 @@ },

package.json

@@ -247,3 +247,3 @@ {

],
"version": "2.0.8",
"version": "2.0.10",
"keywords": [

@@ -301,4 +301,5 @@ "csv",

"dev": "tsc -w",
"build": "rm -Rf ./v2 && tsc && npm run build:browser",
"build": "rm -Rf ./v2 && tsc && npm run build:browser && npm run build:browser:window",
"build:browser": "webpack --config ./webpack.config.js",
"build:browser:window": "webpack --config ./webpack.config.js --output-library-target=window --output-library=csv --output-filename=csvtojson.min.js",
"test": "rm -Rf .ts-node && TS_NODE_CACHE_DIRECTORY=.ts-node mocha -r ts-node/register src/**/*.test.ts ./test/*.ts -R spec",

@@ -305,0 +306,0 @@ "travis": "nyc --reporter lcov mocha -r ts-node/register src/**/*.test.ts ./test/*.ts -R spec",

readme.md

@@ -29,3 +29,3 @@ [![Build Status](https://travis-ci.org/Keyang/node-csvtojson.svg?branch=master)](https://travis-ci.org/Keyang/node-csvtojson)

* To upgrade to v2, please follow [upgrading guide](https://github.com/Keyang/node-csvtojson/blob/master/docs/csvtojson-v2.md)
* If you are looking for documentation for `v1`, open [this page](https://github.com/Keyang/node-csvtojson/blob/master/docs/readme.v1.md).
* If you are looking for documentation for `v1`, open [this page](https://github.com/Keyang/node-csvtojson/blob/master/docs/readme.v1.md)

@@ -47,2 +47,3 @@ It is still able to use v1 with `csvtojson@2.0.0`

* [API](#api)
* [Browser Usage](#browser-usage)
* [Contribution](#contribution)

@@ -199,3 +200,3 @@

* [Parameters](#parameters)
* [Asynchronouse Result Process](#asynchronouse-result-process)
* [Asynchronous Result Process](#asynchronous-result-process)
* [Events](#events)

@@ -205,3 +206,2 @@ * [Hook / Transform](#hook--transform)

* [Header Row](#header-row)
* [Multi CPU Core Support(experimental) ](#multi-cpu-core-support)
* [Column Parser](#column-parser)

@@ -214,3 +214,3 @@

1. parser parameters
1. Parser parameters
2. Stream options

@@ -237,4 +237,4 @@

* **output**: The format to be converted to. "json" (default) -- convert csv to json. "csv" -- convert csv to csv row array. "line" -- convert csv to csv line string
* **delimiter**: delimiter used for seperating columns. Use "auto" if delimiter is unknown in advance, in this case, delimiter will be auto-detected (by best attempt). Use an array to give a list of potential delimiters e.g. [",","|","$"]. default: ","
* **quote**: If a column contains delimiter, it is able to use quote character to surround the column content. e.g. "hello, world" wont be split into two columns while parsing. Set to "off" will ignore all quotes. default: " (double quote)
* **delimiter**: delimiter used for separating columns. Use "auto" if delimiter is unknown in advance, in this case, delimiter will be auto-detected (by best attempt). Use an array to give a list of potential delimiters e.g. [",","|","$"]. default: ","
* **quote**: If a column contains delimiter, it is able to use quote character to surround the column content. e.g. "hello, world" won't be split into two columns while parsing. Set to "off" will ignore all quotes. default: " (double quote)
* **trim**: Indicate if parser trim off spaces surrounding column content. e.g. " content " will be trimmed to "content". Default: true

@@ -247,3 +247,3 @@ * **checkType**: This parameter turns on and off whether check field type. Default is false. (The default is `true` if version < 1.1.4)

* **flatKeys**: Don't interpret dots (.) and square brackets in header fields as nested object or array identifiers at all (treat them like regular characters for JSON field identifiers). Default: false.
* **maxRowLength**: the max character a csv row could have. 0 means infinite. If max number exceeded, parser will emit "error" of "row_exceed". if a possibly corrupted csv data provided, give it a number like 65535 so the parser wont consume memory. default: 0
* **maxRowLength**: the max character a csv row could have. 0 means infinite. If max number exceeded, parser will emit "error" of "row_exceed". if a possibly corrupted csv data provided, give it a number like 65535 so the parser won't consume memory. default: 0
* **checkColumn**: whether check column number of a row is the same as headers. If column number mismatched headers number, an error of "mismatched_column" will be emitted.. default: false

@@ -255,11 +255,13 @@ * **eol**: End of line character. If omitted, parser will attempt to retrieve it from the first chunks of CSV data.

* **colParser**: Allows override parsing logic for a specific column. It accepts a JSON object with fields like: `headName: <String | Function | ColParser>` . e.g. {field1:'number'} will use built-in number parser to convert value of the `field1` column to number. For more information See [details below](#column-parser)
* **alwaysSplitAtEOL**: Always interpret each line (as defined by `eol`) as a row. This will prevent `eol` characters from being used within a row (even inside a quoted field). This ensures that misplaced quotes only break on row, and not all ensuing rows.
* **alwaysSplitAtEOL**: Always interpret each line (as defined by `eol` like `\n`) as a row. This will prevent `eol` characters from being used within a row (even inside a quoted field). Default is false. Change to true if you are confident no inline line breaks (like line break in a cell which has multi line text).
* **nullObject**: How to parse if a csv cell contains "null". Default false will keep "null" as string. Change to true if a null object is needed.
* **downstreamFormat**: Option to set what JSON array format is needed by downstream. "line" is also called ndjson format. This format will write lines of JSON (without square brackets and commas) to downstream. "array" will write complete JSON array string to downstream (suitable for file writable stream etc). Default "line"
* **needEmitAll**: Parser will build JSON result is `.then` is called (or await is used). If this is not desired, set this to false. Default is true.
All parameters can be used in Command Line tool.
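The two options added in this release, `nullObject` and `downstreamFormat`, can be sketched as plain functions. This is an illustrative model of the documented behaviour under stated assumptions, not csvtojson's own code; the helper names `applyNullObject` and `formatDownstream` are made up.

```javascript
// nullObject: with the option on, a cell holding the literal string
// "null" becomes a JSON null instead of staying a string.
function applyNullObject(value, nullObject) {
  return nullObject && value === "null" ? null : value;
}

// downstreamFormat: "line" (ndjson) writes one JSON document per line,
// with no brackets or commas; "array" writes a complete JSON array
// string, suitable for piping straight into a file writable stream.
function formatDownstream(rows, downstreamFormat) {
  const docs = rows.map((row) => JSON.stringify(row));
  if (downstreamFormat === "array") {
    return "[\n" + docs.join(",\n") + "\n]\n";
  }
  return docs.join("\n") + "\n"; // default: "line"
}

console.log(applyNullObject("null", true)); // null
console.log(formatDownstream([{ a: "1" }, { a: "4" }], "array"));
```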
## Asynchronouse Result Process
## Asynchronous Result Process
Since `v2.0.0`, asynchronouse processing has been fully supported.
Since `v2.0.0`, asynchronous processing has been fully supported.
e.g. Process each JSON result asynchronousely.
e.g. Process each JSON result asynchronously.

@@ -271,3 +273,3 @@ ```js

// Async operation on the json
// dont forget to call resolve and reject
// don't forget to call resolve and reject
})

@@ -303,3 +305,3 @@ })

`data` event is emitted for each parsed CSV line. It passes buffer of strigified JSON in [ndjson format](http://ndjson.org/) unless `objectMode` is set true in stream option.
`data` event is emitted for each parsed CSV line. It passes buffer of stringified JSON in [ndjson format](http://ndjson.org/) unless `objectMode` is set true in stream option.

@@ -316,3 +318,3 @@ ```js

### error
`error` event is emitted if there is any errors happened during parsing.
`error` event is emitted if any errors happened during parsing.

@@ -361,3 +363,3 @@ ```js

// asynchronouse
// asynchronous
csv()

@@ -375,3 +377,3 @@ .preRawData((csvRawData)=>{

the function is called each time a file line has been parsed in csv stream. the `lineIdx` is the file line number in the file starting with 0.
The function is called each time a file line has been parsed in csv stream. The `lineIdx` is the file line number in the file starting with 0.

@@ -389,3 +391,3 @@ ```js

// asynchronouse
// asynchronous
csv()

@@ -412,3 +414,3 @@ .preFileLine((fileLineString, lineIdx)=>{

jsonObj.myNewKey='some value'
// OR asynchronousely
// OR asynchronously
return new Promise((resolve,reject)=>{

@@ -504,3 +506,3 @@ jsonObj.myNewKey='some value';

3. If original csv source has no header row but the header definition can be defined. Use `headers:[]` and `noheader:true` parameters.
4. If original csv source has no header row and the header definition is unknow. Use `noheader:true`. This will automatically add `fieldN` header to csv cells
4. If original csv source has no header row and the header definition is unknown. Use `noheader:true`. This will automatically add `fieldN` header to csv cells

@@ -597,3 +599,3 @@

the returned value will be used in result JSON object. returning `undefined` will not change result JSON object.
The returned value will be used in result JSON object. Returning `undefined` will not change result JSON object.
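That contract can be modelled with a small sketch (the helper `applyColParser` is hypothetical, not the library's internal code): the column parser's return value replaces the cell value, and returning `undefined` leaves the row unchanged.

```javascript
// Apply a custom column parser to one cell of a parsed row.
// If the parser returns undefined, the original value is kept.
function applyColParser(row, head, parser) {
  const parsed = parser(row[head]);
  if (parsed !== undefined) {
    row[head] = parsed;
  }
  return row;
}

console.log(applyColParser({ n: "42" }, "n", (v) => Number(v)).n); // 42
console.log(applyColParser({ n: "42" }, "n", () => undefined).n);  // "42"
```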

@@ -634,3 +636,3 @@ ### Flat key column

2. Checkout code from your github repo to your local machine.
3. Make code changes and dont forget add related tests.
3. Make code changes and don't forget add related tests.
4. Run `npm test` locally before pushing code back.

@@ -657,1 +659,41 @@ 5. Create a [Pull Request](https://help.github.com/articles/creating-a-pull-request/) on github.

# Browser Usage
To use `csvtojson` in browser is quite simple. There are two ways:
**1. Embed script directly into script tag**
There is a pre-built script located in `browser/csvtojson.min.js`. Simply include that file in a `script` tag in `index.html` page:
```html
<script src="node_modules/csvtojson/browser/csvtojson.min.js"></script>
<!-- or use cdn -->
<script src="https://cdn.rawgit.com/Keyang/node-csvtojson/d41f44aa/browser/csvtojson.min.js"></script>
```
then use a global `csv` function
```html
<script>
csv({
output: "csv"
})
.fromString("a,b,c\n1,2,3")
.then(function(result){
})
</script>
```
**2. Use webpack or browserify**
If a module packager is preferred, just simply `require("csvtojson")`:
```js
var csv=require("csvtojson");
// or with import
import * as csv from "csvtojson";
//then use csv as normal
```

@@ -19,3 +19,3 @@ import { Transform, TransformOptions, Readable } from "stream";

export class Converter extends Transform implements PromiseLike<Array<any>> {
export class Converter extends Transform implements PromiseLike<any[]> {
preRawData(onRawData: PreRawDataCallback): Converter {

@@ -119,7 +119,7 @@ this.runtime.preRawDataHook = onRawData;

// console.log("BBB");
setTimeout(() => {
//wait for next cycle to emit the errors.
setImmediate(() => {
this.result.processError(err);
this.emit("done", err);
}, 0);
});

@@ -126,0 +126,0 @@ });

@@ -8,4 +8,4 @@ import { TransformOptions } from "stream";

}
helper["csv"] = helper;
helper["Converter"] = Converter;
export =helper;
export =helper;

@@ -117,4 +117,14 @@ import { Converter } from "./Converter";

} else {
if (head.indexOf(".") > -1) {
if (conv.parseParam.colParser[head] && (conv.parseParam.colParser[head] as ColumnParam).flat) {
const headArr=head.split(".");
let jsonHead=true;
while(headArr.length>0){
const headCom=headArr.shift();
if (headCom!.length===0){
jsonHead=false;
break;
}
}
if (!jsonHead || conv.parseParam.colParser[head] && (conv.parseParam.colParser[head] as ColumnParam).flat) {
conv.parseRuntime.columnValueSetter[headIdx] = flatSetter;

@@ -129,2 +139,5 @@ } else {

}
if (conv.parseParam.nullObject ===true && value ==="null"){
value=null;
}
conv.parseRuntime.columnValueSetter[headIdx](resultJson, head, value);

@@ -131,0 +144,0 @@ // flatSetter(resultJson, head, value);
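The dotted-header check added in the hunk above can be summarised as: a header containing `.` is treated as a nested JSON path only if none of its path components is empty; otherwise the flat setter is used. A self-contained sketch of that rule (ignoring the per-column `flat` colParser override, and using a made-up function name):

```javascript
// Decide whether a CSV header should build a nested JSON object.
// "a.b"  -> nested path {a: {b: ...}}
// "a.."  -> contains an empty component, so it stays a flat key "a.."
function isNestedJsonHead(head) {
  if (head.indexOf(".") === -1) {
    return false; // no dots: plain key, nothing to nest
  }
  return head.split(".").every((part) => part.length > 0);
}

console.log(isNestedJsonHead("a.b")); // true
console.log(isNestedJsonHead("a.."));  // false
```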

@@ -69,3 +69,3 @@ export interface CSVParseParam {

/**
* Always interpret each line (as defined by eol) as a row. This will prevent eol characters from being used within a row (even inside a quoted field). This ensures that misplaced quotes only break on row, and not all ensuing rows.
* Always interpret each line (as defined by eol) as a row. This will prevent eol characters from being used within a row (even inside a quoted field). Default is false. Change to true if you are confident no inline line breaks (like line break in a cell which has multi line text)
*/

@@ -77,2 +77,15 @@ alwaysSplitAtEOL: boolean;

output: "json" | "csv" | "line";
/**
* Convert string "null" to null object in JSON outputs. Default is false.
*/
nullObject:boolean;
/**
* Define the format required by downstream (this parameter does not work if objectMode is on). `line` -- json is emitted in a single line separated by a line breake like "json1\njson2" . `array` -- downstream requires array format like "[json1,json2]". Default is line.
*/
downstreamFormat: "line" | "array";
/**
* Define whether .then(callback) returns all JSON data in its callback. Default is true. Change to false to save memory if subscribing json lines.
*/
needEmitAll: boolean;
}

@@ -106,3 +119,6 @@

alwaysSplitAtEOL: false,
output: "json"
output: "json",
nullObject: false,
downstreamFormat:"line",
needEmitAll:true
}

@@ -109,0 +125,0 @@ if (!params) {

@@ -5,3 +5,3 @@ import { Converter } from "./Converter";

import CSVError from "./CSVError";
import { EOL } from "os";
export class Result {

@@ -19,3 +19,4 @@ private get needEmitLine(): boolean {

private get needEmitAll(): boolean {
return !!this.converter.parseRuntime.then;
return !!this.converter.parseRuntime.then && this.converter.parseParam.needEmitAll;
// return !!this.converter.parseRuntime.then;
}

@@ -26,2 +27,7 @@ private finalResult: any[] = [];

const startPos = this.converter.parseRuntime.parsedLineNumber;
if (this.needPushDownstream && this.converter.parseParam.downstreamFormat === "array") {
if (startPos === 0) {
pushDownstream(this.converter, "[" + EOL);
}
}
// let prom: P<any>;

@@ -66,10 +72,16 @@ return new P((resolve, reject) => {

endProcess() {
if (this.needEmitAll) {
if (this.converter.parseRuntime.then && this.converter.parseRuntime.then.onfulfilled) {
this.converter.parseRuntime.then.onfulfilled(this.finalResult);
if (this.needEmitAll) {
this.converter.parseRuntime.then.onfulfilled(this.finalResult);
}else{
this.converter.parseRuntime.then.onfulfilled([]);
}
}
}
if (this.converter.parseRuntime.subscribe && this.converter.parseRuntime.subscribe.onCompleted) {
this.converter.parseRuntime.subscribe.onCompleted();
}
if (this.needPushDownstream && this.converter.parseParam.downstreamFormat === "array") {
pushDownstream(this.converter, "]" + EOL);
}
}

@@ -101,11 +113,11 @@ }

// processRecursive(lines, hook, conv, offset, needPushDownstream, cb, nextLine, false);
if (needPushDownstream){
pushDownstream(conv,nextLine);
if (needPushDownstream) {
pushDownstream(conv, nextLine);
}
while (offset<lines.length){
const line=lines[offset];
while (offset < lines.length) {
const line = lines[offset];
hook(line, conv.parseRuntime.parsedLineNumber + offset);
offset++;
if (needPushDownstream){
pushDownstream(conv,line);
if (needPushDownstream) {
pushDownstream(conv, line);
}

@@ -124,7 +136,7 @@ }

if (needPushDownstream) {
while (offset<lines.length) {
while (offset < lines.length) {
const line = lines[offset++];
pushDownstream(conv, line);
}
}

@@ -153,3 +165,4 @@ cb();

if (typeof res === "object" && !conv.options.objectMode) {
conv.push(JSON.stringify(res) + "\n", "utf8");
const data = JSON.stringify(res);
conv.push(data + (conv.parseParam.downstreamFormat === "array" ? "," + EOL : EOL), "utf8");
} else {

@@ -156,0 +169,0 @@ conv.push(res);

@@ -78,5 +78,11 @@ import { CSVParseParam } from "./Parameters";

let count = 0;
let prev = "";
for (const c of e) {
if (c === quote) {
// count quotes only if previous character is not escape char
if (c === quote && prev !== this.escape) {
count++;
prev = "";
} else {
// save previous char to temp variable
prev = c;
}

@@ -83,0 +89,0 @@ }
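The escape-aware quote counting in the hunk above can be isolated into a standalone sketch: a quote immediately preceded by the escape character is not counted, so a delimiter sitting between escaped quotes no longer splits the cell. The function name is made up; the logic mirrors the diff (and shares its limitation that an escaped escape character before a quote is not special-cased).

```javascript
// Count quote characters in a CSV fragment, skipping quotes that are
// preceded by the escape character (e.g. \" when escape is "\\").
function countUnescapedQuotes(text, quote, escape) {
  let count = 0;
  let prev = "";
  for (const c of text) {
    if (c === quote && prev !== escape) {
      count++;
      prev = "";
    } else {
      prev = c; // remember the previous character for the escape check
    }
  }
  return count;
}

console.log(countUnescapedQuotes('"hello,"world"', '"', "\\")); // 3
console.log(countUnescapedQuotes('"a\\"b"', '"', "\\"));        // 2
```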

@@ -297,3 +297,18 @@ import { Converter } from "../src/Converter";

it("should output ndjson format", function (done) {
it("should process escape chars when delimiter is between escaped quotes", function(done) {
var test_converter = new Converter({
escape: "\\"
});
var testData =
__dirname + "/data/dataWithSlashEscapeAndDelimiterBetweenQuotes";
var rs = fs.createReadStream(testData);
test_converter.then(function(res) {
assert.equal(res[0].raw, '"hello,"world"');
done();
});
rs.pipe(test_converter);
});
it("should output ndjson format", function(done) {
var conv = new Converter();

@@ -300,0 +315,0 @@ conv.fromString("a,b,c\n1,2,3\n4,5,6")

import csv from "../src";
var assert = require("assert");
import assert from "assert";
var fs = require("fs");

@@ -234,11 +234,91 @@ import { sandbox } from "sinon";

return csv({
headers:["exam_date","sample_no","status","sample_type","patient_id","last_name","first_name","gender_of_patient","patient_birth_date","patient_note","patient_department","accession_number","sample_site","physician","operator","department","note","test_order_code","draw_time","approval_status","approval_time","report_layout","patient_account_number","none_1","errors_detected_during_measurement","age","error_code_01","weight","error_code_02","height","error_code_03","hcg_beta_p","error_code_04","troponin_i_p","error_code_05","ck_mb_p","error_code_06","d_dimer_p","error_code_07","hscrp_p","error_code_08","myoglobin_p","error_code_09","nt_probnp","error_code_10","crp","error_code_11","bnp","error_code_12","tnt","error_code_13","demo_p","error_code_14","pct","error_code_15"]
headers: ["exam_date", "sample_no", "status", "sample_type", "patient_id", "last_name", "first_name", "gender_of_patient", "patient_birth_date", "patient_note", "patient_department", "accession_number", "sample_site", "physician", "operator", "department", "note", "test_order_code", "draw_time", "approval_status", "approval_time", "report_layout", "patient_account_number", "none_1", "errors_detected_during_measurement", "age", "error_code_01", "weight", "error_code_02", "height", "error_code_03", "hcg_beta_p", "error_code_04", "troponin_i_p", "error_code_05", "ck_mb_p", "error_code_06", "d_dimer_p", "error_code_07", "hscrp_p", "error_code_08", "myoglobin_p", "error_code_09", "nt_probnp", "error_code_10", "crp", "error_code_11", "bnp", "error_code_12", "tnt", "error_code_13", "demo_p", "error_code_14", "pct", "error_code_15"]
})
.fromFile(testData)
.fromFile(testData)
.then((d) => {
assert.equal(d.length, 2);
assert.equal(d[0].sample_no, "12669");
})
});
it ("should stream json string correctly",function(done){
const data=`a,b,c
1,2,3
4,5,6`
let hasLeftBracket=false;
let hasRightBracket=false;
csv({
downstreamFormat:"array"
})
.fromString(data)
.on("data",(d)=>{
const str=d.toString();
if (str[0]==="[" && str.length ===2){
hasLeftBracket=true;
}else if (str[0]==="]" && str.length===2){
hasRightBracket=true;
}else{
assert.equal(str[str.length-2],",");
}
})
.on("end",()=>{
assert.equal(hasLeftBracket,true);
assert.equal(hasRightBracket,true);
done();
})
})
it ("should stream json line correctly",function(done){
const data=`a,b,c
1,2,3
4,5,6`
csv({
downstreamFormat:"line"
})
.fromString(data)
.on("data",(d)=>{
const str=d.toString();
assert.notEqual(str[str.length-2],",");
})
.on("end",()=>{
done();
})
})
it ("should not send json if needEmitAll is false",async function(){
const data=`a,b,c
1,2,3
4,5,6`
return csv({
needEmitAll:false
})
.fromString(data)
.then((d)=>{
assert.equal(d.length,2);
assert.equal(d[0].sample_no,"12669");
assert(d.length===0);
})
});
})
it ("should convert null to null object",async function(){
const data=`a,b,c
null,2,3
4,5,6`
return csv({
nullObject:true
})
.fromString(data)
.then((d)=>{
assert.equal(d[0].a,null)
})
})
it ("should process period properly",async function(){
const data=`a..,b,c
1,2,3
4,5,6`
return csv({
})
.fromString(data)
.then((d)=>{
assert.equal(d[0]["a.."],1);
assert.equal(d[1]["a.."],4);
})
})
});

@@ -7,3 +7,2 @@ module.exports = workerMgr;

function workerMgr() {
var spawn = require("child_process").spawn;
var exports = {

@@ -10,0 +9,0 @@ initWorker: initWorker,

@@ -6,3 +6,3 @@ /// <reference types="node" />

import CSVError from "./CSVError";
export declare class Converter extends Transform implements PromiseLike<Array<any>> {
export declare class Converter extends Transform implements PromiseLike<any[]> {
options: TransformOptions;

@@ -9,0 +9,0 @@ preRawData(onRawData: PreRawDataCallback): Converter;

@@ -39,6 +39,7 @@ "use strict";

// console.log("BBB");
setTimeout(function () {
//wait for next cycle to emit the errors.
setImmediate(function () {
_this.result.processError(err);
_this.emit("done", err);
}, 0);
});
});

@@ -45,0 +46,0 @@ _this.once("done", function () {

@@ -6,4 +6,5 @@ "use strict";

};
helper["csv"] = helper;
helper["Converter"] = Converter_1.Converter;
module.exports = helper;
//# sourceMappingURL=index.js.map

@@ -117,3 +117,12 @@ "use strict";

if (head.indexOf(".") > -1) {
if (conv.parseParam.colParser[head] && conv.parseParam.colParser[head].flat) {
var headArr = head.split(".");
var jsonHead = true;
while (headArr.length > 0) {
var headCom = headArr.shift();
if (headCom.length === 0) {
jsonHead = false;
break;
}
}
if (!jsonHead || conv.parseParam.colParser[head] && conv.parseParam.colParser[head].flat) {
conv.parseRuntime.columnValueSetter[headIdx] = flatSetter;

@@ -130,2 +139,5 @@ }

}
if (conv.parseParam.nullObject === true && value === "null") {
value = null;
}
conv.parseRuntime.columnValueSetter[headIdx](resultJson, head, value);

@@ -132,0 +144,0 @@ // flatSetter(resultJson, head, value);

@@ -68,3 +68,3 @@ export interface CSVParseParam {

/**
* Always interpret each line (as defined by eol) as a row. This will prevent eol characters from being used within a row (even inside a quoted field). This ensures that misplaced quotes only break on row, and not all ensuing rows.
* Always interpret each line (as defined by eol) as a row. This will prevent eol characters from being used within a row (even inside a quoted field). Default is false. Change to true if you are confident no inline line breaks (like line break in a cell which has multi line text)
*/

@@ -76,2 +76,14 @@ alwaysSplitAtEOL: boolean;

output: "json" | "csv" | "line";
/**
* Convert string "null" to null object in JSON outputs. Default is false.
*/
nullObject: boolean;
/**
* Define the format required by downstream (this parameter does not work if objectMode is on). `line` -- json is emitted in a single line separated by a line breake like "json1\njson2" . `array` -- downstream requires array format like "[json1,json2]". Default is line.
*/
downstreamFormat: "line" | "array";
/**
* Define whether .then(callback) returns all JSON data in its callback. Default is true. Change to false to save memory if subscribing json lines.
*/
needEmitAll: boolean;
}

@@ -78,0 +90,0 @@ export declare type CellParser = (item: string, head: string, resultRow: any, row: string[], columnIndex: number) => any;

@@ -22,3 +22,6 @@ "use strict";

alwaysSplitAtEOL: false,
output: "json"
output: "json",
nullObject: false,
downstreamFormat: "line",
needEmitAll: true
};

@@ -25,0 +28,0 @@ if (!params) {

@@ -7,2 +7,3 @@ "use strict";

var bluebird_1 = __importDefault(require("bluebird"));
var os_1 = require("os");
var Result = /** @class */ (function () {

@@ -32,3 +33,4 @@ function Result(converter) {

get: function () {
return !!this.converter.parseRuntime.then;
return !!this.converter.parseRuntime.then && this.converter.parseParam.needEmitAll;
// return !!this.converter.parseRuntime.then;
},

@@ -41,2 +43,7 @@ enumerable: true,

var startPos = this.converter.parseRuntime.parsedLineNumber;
if (this.needPushDownstream && this.converter.parseParam.downstreamFormat === "array") {
if (startPos === 0) {
pushDownstream(this.converter, "[" + os_1.EOL);
}
}
// let prom: P<any>;

@@ -77,6 +84,9 @@ return new bluebird_1.default(function (resolve, reject) {

Result.prototype.endProcess = function () {
if (this.needEmitAll) {
if (this.converter.parseRuntime.then && this.converter.parseRuntime.then.onfulfilled) {
if (this.converter.parseRuntime.then && this.converter.parseRuntime.then.onfulfilled) {
if (this.needEmitAll) {
this.converter.parseRuntime.then.onfulfilled(this.finalResult);
}
else {
this.converter.parseRuntime.then.onfulfilled([]);
}
}

@@ -86,2 +96,5 @@ if (this.converter.parseRuntime.subscribe && this.converter.parseRuntime.subscribe.onCompleted) {

}
if (this.needPushDownstream && this.converter.parseParam.downstreamFormat === "array") {
pushDownstream(this.converter, "]" + os_1.EOL);
}
};

@@ -149,3 +162,4 @@ return Result;

if (typeof res === "object" && !conv.options.objectMode) {
conv.push(JSON.stringify(res) + "\n", "utf8");
var data = JSON.stringify(res);
conv.push(data + (conv.parseParam.downstreamFormat === "array" ? "," + os_1.EOL : os_1.EOL), "utf8");
}

@@ -152,0 +166,0 @@ else {

@@ -82,7 +82,14 @@ "use strict";

var count = 0;
var prev = "";
for (var _i = 0, e_1 = e; _i < e_1.length; _i++) {
var c = e_1[_i];
if (c === quote) {
// count quotes only if previous character is not escape char
if (c === quote && prev !== this.escape) {
count++;
prev = "";
}
else {
// save previous char to temp variable
prev = c;
}
}

@@ -89,0 +96,0 @@ if (count % 2 === 1) {

Sorry, the diff of this file is not supported yet

Sorry, the diff of this file is too big to display

