csv-string
PARSE and STRINGIFY for CSV strings. It's like JSON.parse and JSON.stringify, but for CSV. It can also work row by row. And since it can parse strings, it can be used to parse files or streams too.
The csv-string npm package provides simple utilities for parsing and stringifying CSV (Comma-Separated Values) data. It is lightweight and easy to use, making it suitable for basic CSV operations.
Stringify
This feature converts an array of arrays into a CSV string. The code sample demonstrates how to convert a 2D array into a CSV formatted string.
const CSVString = require('csv-string');
const data = [['name', 'age'], ['Alice', 30], ['Bob', 25]];
const csv = CSVString.stringify(data);
console.log(csv);
Parse
This feature parses a CSV string into an array of arrays. The code sample shows how to convert a CSV formatted string into a 2D array.
const CSVString = require('csv-string');
const csv = 'name,age\nAlice,30\nBob,25';
const data = CSVString.parse(csv);
console.log(data);
Detect
This feature detects the delimiter used in a CSV string. The code sample demonstrates how to detect the delimiter in a given CSV string.
const CSVString = require('csv-string');
const csv = 'name,age\nAlice,30\nBob,25';
const delimiter = CSVString.detect(csv);
console.log(delimiter);
Encode
This feature encodes an array into a CSV string. The code sample shows how to encode a single array into a CSV formatted string.
const CSVString = require('csv-string');
const data = ['Alice', '30'];
const encoded = CSVString.encode(data);
console.log(encoded);
Decode
This feature decodes a CSV string into an array. The code sample demonstrates how to decode a single CSV formatted string into an array.
const CSVString = require('csv-string');
const csv = 'Alice,30';
const decoded = CSVString.decode(csv);
console.log(decoded);
The csv-parser package is a streaming CSV parser that is more suitable for handling large CSV files. It provides a more robust and efficient way to parse CSV data compared to csv-string, especially for large datasets.
PapaParse is a powerful CSV parser that can handle large files and supports various configurations. It offers more advanced features like asynchronous parsing, web workers, and the ability to handle malformed CSV data, making it more versatile than csv-string.
fast-csv is a comprehensive CSV parser and formatter that supports both reading and writing CSV data. It is designed for high performance and can handle large datasets efficiently, offering more features and better performance compared to csv-string.
Parse and Stringify for CSV strings (CSV.parse and CSV.stringify).
import { CSV } from 'csv-string';
// with String
const arr = CSV.parse('a,b,c\na,b,c');
const str = CSV.stringify(arr);
// with Stream
const stream = CSV.createStream();
stream.on('data', (rows) => {
process.stdout.write(CSV.stringify(rows, ','));
})
process.stdin.pipe(stream);
With npm:
$ npm install csv-string
Use mocha to run the tests.
$ npm install mocha
$ mocha test
Converts a CSV string input to array output.
var CSV = require('csv-string');
var arr = CSV.parse('a,b,c\na,b,c');
console.log(arr);
Output:
[ [ 'a', 'b', 'c' ], [ 'a', 'b', 'c' ] ]
If the separator parameter is not provided, it is automatically detected.
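Conceptually, detection amounts to scanning for the most plausible candidate delimiter. The sketch below is illustrative only, not csv-string's actual implementation:

```javascript
// Illustrative only: pick the candidate separator that occurs most often
// in the first line. csv-string's real detection logic may differ.
function guessSeparator(input, candidates = [',', ';', '|', '\t']) {
  const firstLine = input.split('\n')[0];
  let best = candidates[0];
  let bestCount = -1;
  for (const sep of candidates) {
    const count = firstLine.split(sep).length - 1; // occurrences of sep
    if (count > bestCount) {
      bestCount = count;
      best = sep;
    }
  }
  return best;
}

console.log(guessSeparator('a;b;c\nd;e;f')); // ';'
```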
Converts an object input to a CSV string.
var CSV = require('csv-string');
console.log(CSV.stringify(['a', 'b', 'c']));
console.log(CSV.stringify([['c', 'd', 'e'], ['c','d','e']]));
console.log(CSV.stringify({a:'e', b:'f', c:'g'}));
Output:
a,b,c
c,d,e
c,d,e
e,f,g
Detects the best separator.
var CSV = require('csv-string');
console.log(CSV.detect('a,b,c'));
console.log(CSV.detect('a;b;c'));
console.log(CSV.detect('a|b|c'));
console.log(CSV.detect('a\tb\tc'));
Output:
,
;
|
\t
callback(row : Array, index : Number) : undefined
Calls callback for each CSV row/line. The Array passed to callback contains the fields of the current row.
var CSV = require('csv-string');
var data = 'a,b,c\nd,e,f';
CSV.forEach(data, ',', function(row, index) {
console.log('#' + index + ' : ', row);
});
Output:
#0 : [ 'a', 'b', 'c' ]
#1 : [ 'd', 'e', 'f' ]
callback(row : Array) : undefined
Calls callback when a CSV row is read. The Array passed to callback contains the fields of the row.
Returns the first offset after the row.
var CSV = require('csv-string');
var data = 'a,b,c\nd,e,f';
var index = CSV.read(data, ',', function(row) {
console.log(row);
});
console.log(data.slice(index));
Output:
[ 'a', 'b', 'c' ]
d,e,f
callback(rows : Array) : undefined
Calls callback when all CSV rows are read. The Array passed to callback contains the rows of the file.
Returns the offset of the end of parsing (generally the end of the input string).
var CSV = require('csv-string');
var data = 'a,b,c\nd,e,f';
var index = CSV.readAll(data, function(rows) {
console.log(rows);
});
console.log('-' + data.slice(index) + '-');
Output:
[ [ 'a', 'b', 'c' ], [ 'd', 'e', 'f' ] ]
--
callback(rows : Array) : undefined
Calls callback when all CSV rows are read. The last row could be ignored, because the remainder could be in another chunk.
The Array passed to callback contains the rows of the file.
Returns the offset of the end of parsing. If the last row is ignored, the offset will point to the beginning of that row.
var CSV = require('csv-string');
var data = 'a,b,c\nd,e';
var index = CSV.readChunk(data, function(rows) {
console.log(rows);
});
console.log('-' + data.slice(index) + '-');
Output:
[ [ 'a', 'b', 'c' ] ]
--
Creates a writable stream for CSV chunks. Options are:
Example: read a CSV file from the standard input.
var stream = CSV.createStream();
stream.on('data', function (row) {
console.log(row);
});
process.stdin.resume();
process.stdin.setEncoding('utf8');
process.stdin.pipe(stream);
I made a very basic benchmark to compare this project to other related projects, using file streams as input. See ./bench for the source code.
time node ./SCRIPTNAME.js >/dev/null
| Package | Input equal Output | Time for ~1 200 000 rows |
|---|---|---|
| a-csv | almost | 0m13.903s |
| csv-streamer | yes | 0m15.599s |
| csv-stream | yes | 0m17.265s |
| csv-string | yes | 0m15.432s |
| fast-csv | no | - |
| nodecsv | yes | 0m22.129s |
FAQs
We found that csv-string demonstrated an unhealthy version release cadence and low project activity: the last version was released a year ago. It has 3 open source maintainers collaborating on the project.