
csv-string
PARSE and STRINGIFY for CSV strings. It's like the JSON object, but for CSV. It can also work row by row. And since it can parse strings, it can be used to parse files or streams too.
Parse and Stringify for CSV strings (CSV.parse and CSV.stringify).

import * as CSV from 'csv-string';
// with String
const arr = CSV.parse('a,b,c\na,b,c');
const str = CSV.stringify(arr);
// with Stream
const stream = CSV.createStream();
stream.on('data', rows => {
process.stdout.write(CSV.stringify(rows, ','));
});
process.stdin.pipe(stream);
Install using npm:
npm install csv-string
or yarn:
yarn add csv-string
Converts a CSV string input to an array output.
Options:
comma: string to indicate the CSV separator (optional, default ,)
quote: string to indicate the CSV quote character if needed (optional, default ")
output: 'objects' or 'tuples' to select Object or Array output (optional, default tuples)
Example 1:
const CSV = require('csv-string');
const parsedCsv = CSV.parse('a;b;c\nd;e;f', ';');
console.log(parsedCsv);
Output:
[
["a", "b", "c"],
["d", "e", "f"]
]
Example 2:
const CSV = require('csv-string');
const parsedCsv = CSV.parse('a,b,c\n1,2,3\n4,5,6', { output: 'objects' });
console.log(parsedCsv);
Output:
[
{ a: '1', b: '2', c: '3' },
{ a: '4', b: '5', c: '6' }
]
If the separator parameter is not provided, it is detected automatically.
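For intuition, separator auto-detection can be sketched as picking the candidate delimiter that appears most often in the first line. This is an illustrative standalone sketch, not the library's actual algorithm:

```javascript
// Illustrative sketch of automatic separator detection (not the
// library's actual algorithm): pick the candidate delimiter that
// occurs most often in the first line.
function detectSeparator(input) {
  const firstLine = input.split('\n')[0];
  const candidates = [',', ';', '|', '\t'];
  let best = ',';
  let bestCount = 0;
  for (const sep of candidates) {
    const count = firstLine.split(sep).length - 1;
    if (count > bestCount) {
      bestCount = count;
      best = sep;
    }
  }
  return best;
}

console.log(detectSeparator('a;b;c\nd;e;f')); // ;
```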
Converts an array or object input to a CSV string.
import * as CSV from 'csv-string';
console.log(CSV.stringify(['a', 'b', 'c']));
console.log(
CSV.stringify([
['c', 'd', 'e'],
['c', 'd', 'e']
])
);
console.log(CSV.stringify({ a: 'e', b: 'f', c: 'g' }));
Output:
a,b,c
c,d,e
c,d,e
e,f,g
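Stringify also quotes fields when necessary. As an illustration of the standard CSV quoting rule (assumed behavior, not necessarily the library's exact logic), a field containing the separator, a double quote, or a newline is wrapped in double quotes, with embedded quotes doubled:

```javascript
// Sketch of the standard CSV quoting rule (assumed behavior, not
// necessarily the library's exact logic): a field containing the
// separator, a double quote, or a newline is wrapped in double
// quotes, and embedded quotes are doubled.
function quoteField(field, sep = ',') {
  if (field.includes(sep) || field.includes('"') || field.includes('\n')) {
    return '"' + field.replace(/"/g, '""') + '"';
  }
  return field;
}

console.log(['a,b', 'c"d', 'e'].map(f => quoteField(f)).join(','));
// "a,b","c""d",e
```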
Detects the best separator.
import * as CSV from 'csv-string';
console.log(CSV.detect('a,b,c'));
console.log(CSV.detect('a;b;c'));
console.log(CSV.detect('a|b|c'));
console.log(CSV.detect('a\tb\tc'));
Output:
,
;
|
\t
callback(row: array, index: number): void
Calls callback for each CSV row/line. The array passed to callback contains the fields of the current row.
import * as CSV from 'csv-string';
const data = 'a,b,c\nd,e,f';
CSV.forEach(data, ',', function (row, index) {
console.log('#' + index + ' : ', row);
});
Output:
#0 : [ 'a', 'b', 'c' ]
#1 : [ 'd', 'e', 'f' ]
callback(row: array): void
Calls callback when a CSV row is read. The array passed to callback contains the fields of the row. Returns the first offset after the row.
import * as CSV from 'csv-string';
const data = 'a,b,c\nd,e,f';
const index = CSV.read(data, ',', row => {
console.log(row);
});
console.log(data.slice(index));
Output:
[ 'a', 'b', 'c' ]
d,e,f
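The returned offset makes it possible to consume a document row by row. The loop below sketches that pattern with a simplified standalone readRow helper (hypothetical, handles only unquoted fields) rather than CSV.read itself:

```javascript
// Standalone sketch of offset-based row reading in the spirit of
// CSV.read (hypothetical helper; handles only unquoted fields):
// parse one row starting at `offset` and return the fields plus
// the offset just past the row.
function readRow(data, sep, offset) {
  const end = data.indexOf('\n', offset);
  const line = end === -1 ? data.slice(offset) : data.slice(offset, end);
  const next = end === -1 ? data.length : end + 1;
  return { row: line.split(sep), next };
}

let pos = 0;
const doc = 'a,b,c\nd,e,f\n';
while (pos < doc.length) {
  const { row, next } = readRow(doc, ',', pos);
  console.log(row); // [ 'a', 'b', 'c' ] then [ 'd', 'e', 'f' ]
  pos = next;
}
```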
callback(rows: array): void
Calls callback when all CSV rows are read. The array passed to callback contains the rows of the file. Returns the offset of the end of parsing (generally the end of the input string).
import * as CSV from 'csv-string';
const data = 'a,b,c\nd,e,f';
const index = CSV.readAll(data, row => {
console.log(row);
});
console.log('-' + data.slice(index) + '-');
Output:
[ [ 'a', 'b', 'c' ], [ 'd', 'e', 'f' ] ]
--
callback(rows: array): void
Calls callback when all CSV rows are read. The last row may be ignored, because its remainder could be in another chunk. The array passed to callback contains the rows of the file. Returns the offset of the end of parsing. If the last row is ignored, the offset points to the beginning of that row.
import * as CSV from 'csv-string';
const data = 'a,b,c\nd,e';
const index = CSV.readChunk(data, row => {
console.log(row);
});
console.log('-' + data.slice(index) + '-');
Output:
[ [ 'a', 'b', 'c' ] ]
--
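This is the building block for incremental parsing: keep the unconsumed tail of each chunk and prepend it to the next one. The sketch below uses a simplified stand-in for CSV.readChunk (hypothetical, unquoted fields only) to show the buffering pattern:

```javascript
// Sketch of the buffering pattern readChunk enables: keep the
// unconsumed tail of each chunk and prepend it to the next one.
// readChunkSim is a hypothetical stand-in for CSV.readChunk
// (unquoted fields only): it emits complete rows and returns the
// offset where the incomplete remainder begins.
function readChunkSim(data, cb) {
  const lastNewline = data.lastIndexOf('\n');
  if (lastNewline === -1) return 0; // no complete row yet
  const rows = data.slice(0, lastNewline).split('\n').map(l => l.split(','));
  cb(rows);
  return lastNewline + 1;
}

let buffer = '';
function onChunk(chunk) {
  buffer += chunk;
  const consumed = readChunkSim(buffer, rows => console.log(rows));
  buffer = buffer.slice(consumed); // keep the incomplete remainder
}

onChunk('a,b,c\nd,'); // [ [ 'a', 'b', 'c' ] ]
onChunk('e,f\n');     // [ [ 'd', 'e', 'f' ] ]
```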
Creates a writable stream for CSV chunks.
Example : Read CSV file from the standard input.
const stream = CSV.createStream();
stream.on('data', row => {
console.log(row);
});
process.stdin.resume();
process.stdin.setEncoding('utf8');
process.stdin.pipe(stream);
To contribute: clone the repository, then run
yarn install
yarn test (ensure all tests pass)
yarn bench (to check the performance impact)
There is a fairly basic benchmark comparing this project to related ones, using file streams as input. See ./bench for the source code.
Results of yarn bench for a test file with 949,044 rows:
Package | Time | Output/Input similarity
---|---|---
a-csv | 6.01s | ~99% |
csv-stream | 6.64s | ~73% |
csv-streamer | 7.03s | ~79% |
csv-string | 6.53s | 100% |
fast-csv | 12.33s | 99.99% |
nodecsv | 7.10s | 100% |
FAQs
The npm package csv-string receives a total of 104,941 weekly downloads. As such, csv-string popularity was classified as popular.
We found that csv-string demonstrated an unhealthy version release cadence and project activity because the last version was released a year ago. It has 3 open source maintainers collaborating on the project.