csv-string
Parse and stringify CSV strings. It's like the JSON object, but for CSV. It can also work row by row, and since it can parse strings, it can be used to parse files or streams too.
The csv-string npm package provides simple utilities for parsing and stringifying CSV (Comma-Separated Values) data. It is lightweight and easy to use, making it suitable for basic CSV operations.
Stringify
This feature converts an array of arrays into a CSV string. The code sample demonstrates how to convert a 2D array into a CSV formatted string.
const CSVString = require('csv-string');
const data = [['name', 'age'], ['Alice', 30], ['Bob', 25]];
const csv = CSVString.stringify(data);
console.log(csv);
Parse
This feature parses a CSV string into an array of arrays. The code sample shows how to convert a CSV formatted string into a 2D array.
const CSVString = require('csv-string');
const csv = 'name,age\nAlice,30\nBob,25';
const data = CSVString.parse(csv);
console.log(data);
Detect
This feature detects the delimiter used in a CSV string. The code sample demonstrates how to detect the delimiter in a given CSV string.
const CSVString = require('csv-string');
const csv = 'name,age\nAlice,30\nBob,25';
const delimiter = CSVString.detect(csv);
console.log(delimiter);
Encode
This feature encodes an array into a CSV string. The code sample shows how to encode a single array into a CSV formatted string.
const CSVString = require('csv-string');
const data = ['Alice', '30'];
const encoded = CSVString.encode(data);
console.log(encoded);
Decode
This feature decodes a CSV string into an array. The code sample demonstrates how to decode a single CSV formatted string into an array.
const CSVString = require('csv-string');
const csv = 'Alice,30';
const decoded = CSVString.decode(csv);
console.log(decoded);
The csv-parser package is a streaming CSV parser that is more suitable for handling large CSV files. It provides a more robust and efficient way to parse CSV data compared to csv-string, especially for large datasets.
PapaParse is a powerful CSV parser that can handle large files and supports various configurations. It offers more advanced features like asynchronous parsing, web workers, and the ability to handle malformed CSV data, making it more versatile than csv-string.
fast-csv is a comprehensive CSV parser and formatter that supports both reading and writing CSV data. It is designed for high performance and can handle large datasets efficiently, offering more features and better performance compared to csv-string.
With npm do:
$ npm install csv-string
Use mocha to run the tests.
$ npm install mocha
$ mocha test
Parses input and converts it to an array of rows.
var CSV = require('csv-string'),
arr = CSV.parse('a,b,c\na,b,c');
console.log(arr);
Output:
[ [ 'a', 'b', 'c' ], [ 'a', 'b', 'c' ] ]
If the separator parameter is not provided, it is detected automatically.
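The package does not document its detection heuristic, but one simple approach is to count candidate separators in the first line and pick the most frequent. A minimal sketch of that idea (an illustration only, not csv-string's actual implementation):

```javascript
// Delimiter auto-detection sketch: count each candidate separator
// in the first line and pick the most frequent one.
// Illustrative only -- csv-string's real heuristic may differ.
function detectSeparator(input) {
  var firstLine = input.split('\n')[0];
  var candidates = [',', ';', '|', '\t'];
  var best = ',';
  var bestCount = -1;
  for (var i = 0; i < candidates.length; i++) {
    var sep = candidates[i];
    var count = firstLine.split(sep).length - 1;
    if (count > bestCount) {
      bestCount = count;
      best = sep;
    }
  }
  return best;
}

console.log(detectSeparator('a;b;c')); // ';'
console.log(detectSeparator('a\tb\tc')); // '\t'
```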
Converts input to a CSV string.
var CSV = require('csv-string');
console.log(CSV.stringify(['a', 'b', 'c']));
console.log(CSV.stringify([['c', 'd', 'e'], ['c','d','e']]));
console.log(CSV.stringify({a:'e', b:'f', c:'g'}));
Output:
a,b,c
c,d,e
c,d,e
e,f,g
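The examples above contain only plain fields. Per RFC 4180, a field that contains the separator, a double quote, or a newline must be wrapped in double quotes, with embedded quotes doubled. A minimal sketch of that quoting rule (an illustration of the convention, not csv-string's internals):

```javascript
// Quote a single CSV field per RFC 4180: wrap in double quotes when it
// contains the separator, a quote, or a newline; double embedded quotes.
function quoteField(field, sep) {
  var s = String(field);
  if (s.indexOf(sep) !== -1 || s.indexOf('"') !== -1 || s.indexOf('\n') !== -1) {
    return '"' + s.replace(/"/g, '""') + '"';
  }
  return s;
}

// Join one row of fields into a CSV line.
function stringifyRow(row, sep) {
  return row.map(function (f) { return quoteField(f, sep); }).join(sep) + '\n';
}

console.log(stringifyRow(['a', 'b,c', 'he said "hi"'], ','));
// a,"b,c","he said ""hi"""
```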
Detects the best separator.
var CSV = require('csv-string');
console.log(CSV.detect('a,b,c'));
console.log(CSV.detect('a;b;c'));
console.log(CSV.detect('a|b|c'));
console.log(CSV.detect('a\tb\tc'));
Output:
,
;
|
\t
callback(row : Array, index : Number) : undefined
Calls callback for each CSV row/line. The Array passed to callback contains the fields of the current row.
var CSV = require('csv-string');
var data = 'a,b,c\nd,e,f';
CSV.forEach(data, ',', function(row, index) {
console.log('#' + index + ' : ', row);
});
Output:
#0 : [ 'a', 'b', 'c' ]
#1 : [ 'd', 'e', 'f' ]
callback(row : Array) : undefined
Calls callback when a CSV row has been read. The Array passed to callback contains the fields of the row.
Returns the first offset after the row.
var CSV = require('csv-string');
var data = 'a,b,c\nd,e,f';
var index = CSV.read(data, ',', function(row) {
console.log(row);
});
console.log(data.slice(index));
Output:
[ 'a', 'b', 'c' ]
d,e,f
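The offset that read returns is what makes incremental parsing possible: each call consumes one row, and the caller continues from the returned offset. A sketch of that loop, using a simplified line-based reader in place of csv-string's quote-aware one:

```javascript
// Simplified offset-based row reader: returns the parsed row and the
// offset just past it, mimicking the contract of CSV.read
// (but without quote handling, which csv-string's real reader has).
function readRow(data, sep, offset) {
  var end = data.indexOf('\n', offset);
  if (end === -1) end = data.length;
  var row = data.slice(offset, end).split(sep);
  return { row: row, next: Math.min(end + 1, data.length) };
}

// Consume a whole string row by row using the returned offsets.
function readAllRows(data, sep) {
  var rows = [];
  var offset = 0;
  while (offset < data.length) {
    var r = readRow(data, sep, offset);
    rows.push(r.row);
    offset = r.next;
  }
  return rows;
}

console.log(readAllRows('a,b,c\nd,e,f', ','));
// [ [ 'a', 'b', 'c' ], [ 'd', 'e', 'f' ] ]
```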
callback(rows : Array) : undefined
Calls callback when all CSV rows have been read. The Array passed to callback contains the rows of the file.
Returns the offset of the end of parsing (generally the end of the input string).
var CSV = require('csv-string');
var data = 'a,b,c\nd,e,f';
var index = CSV.readAll(data, function(rows) {
  console.log(rows);
});
console.log('-' + data.slice(index) + '-');
Output:
[ [ 'a', 'b', 'c' ], [ 'd', 'e', 'f' ] ]
--
callback(rows : Array) : undefined
Calls callback when all CSV rows in the chunk have been read. The last row may be ignored, because its remainder could be in another chunk.
The Array passed to callback contains the rows of the chunk.
Returns the offset of the end of parsing. When the last row is ignored, the offset points at the beginning of that row.
var CSV = require('csv-string');
var data = 'a,b,c\nd,e';
var index = CSV.readChunk(data, function(rows) {
  console.log(rows);
});
console.log('-' + data.slice(index) + '-');
Output:
[ [ 'a', 'b', 'c' ] ]
--
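This contract (parse only complete rows, report where the incomplete trailing row begins) is what makes chunked input safe: the unconsumed remainder is simply prepended to the next chunk. A sketch of that buffering loop, using a simplified line-based parser (no quote handling) rather than csv-string's readChunk:

```javascript
// Chunked processing sketch: complete rows in each chunk are parsed,
// and the partial trailing row is carried over to the next chunk.
// This mirrors the contract of CSV.readChunk with a simplified parser.
function makeChunkParser(sep) {
  var remainder = '';
  return function (chunk) {
    var data = remainder + chunk;
    var cut = data.lastIndexOf('\n');
    if (cut === -1) {           // no complete row yet; buffer everything
      remainder = data;
      return [];
    }
    remainder = data.slice(cut + 1);
    return data.slice(0, cut).split('\n').map(function (line) {
      return line.split(sep);
    });
  };
}

var parse = makeChunkParser(',');
console.log(parse('a,b,c\nd,'));  // [ [ 'a', 'b', 'c' ] ]
console.log(parse('e,f\n'));      // [ [ 'd', 'e', 'f' ] ]
```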
Creates a writable stream for CSV chunks.
Example: read CSV data from standard input.
var CSV = require('csv-string');
var stream = CSV.createStream();
stream.on('data', function (row) {
  console.log(row);
});
process.stdin.resume();
process.stdin.setEncoding('utf8');
process.stdin.pipe(stream);
As for files and streams, many other packages already exist. To compare them, I made a very basic benchmark (see ./bench for the source code):
time node ./SCRIPTNAME.js >/dev/null
| Package | Input equals output | Time for ~1,200,000 rows |
|---|---|---|
| a-csv | almost | 0m13.903s |
| csv-streamer | yes | 0m15.599s |
| csv-stream | yes | 0m17.265s |
| csv-string | yes | 0m15.432s |
| fast-csv | no | - |
| nodecsv | yes | 0m22.129s |
FAQs
Parse and stringify CSV strings. It's like the JSON object, but for CSV. It can also work row by row, and since it can parse strings, it can be used to parse files or streams too.
The npm package csv-string receives a total of 62,864 weekly downloads. As such, its popularity is classified as popular.
We found that csv-string demonstrated an unhealthy version release cadence and project activity, as the last version was released a year ago. It has 3 open source maintainers collaborating on the project.