csv-streamify
Parses CSV files. Accepts options. No CoffeeScript, no weird APIs. Just streams. Tested against csv-spectrum and used in production. It is also "fast enough" (around 60,000 rows per second, though that obviously varies with the data).
Works in Node.js 4, 6, 8 and 9. Might work in earlier versions, but is not tested against them.
npm install csv-streamify
This module implements a simple node stream.Transform stream. You can write to it, read from it and use .pipe as you would expect.
const csv = require('csv-streamify')
const fs = require('fs')
const parser = csv()
// emits each line as a buffer or as a string representing an array of fields
parser.on('data', function (line) {
console.log(line)
})
// now pipe some data into it
fs.createReadStream('/path/to/file.csv').pipe(parser)
The first argument can either be an options object (see below) or a callback function.
Note: If you pass a callback to csv-streamify, it will buffer the parsed data for you and pass it to the callback when it's done. This behaviour can obviously lead to out-of-memory errors with very large CSV files.
const csv = require('csv-streamify')
const fs = require('fs')
const parser = csv({ objectMode: true }, function (err, result) {
if (err) throw err
// our csv has been parsed successfully
result.forEach(function (line) { console.log(line) })
})
// now pipe some data into it
fs.createReadStream('/path/to/file.csv').pipe(parser)
You can pass some options to the parser. All of them are optional.
The options are also passed to the underlying transform stream, so you can pass in any standard node core stream options.
{
delimiter: ',', // comma, semicolon, whatever
newline: '\n', // newline character (use \r\n for CRLF files)
quote: '"', // what's considered a quote
empty: '', // empty fields are replaced by this
// if true, emit arrays instead of stringified arrays or buffers
objectMode: false,
// if set to true, uses first row as keys -> [ { column1: value1, column2: value2 }, ...]
columns: false
}
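For example, here is a minimal sketch combining objectMode and columns so that each row is emitted as a plain object keyed by the header row. The semicolon delimiter, the name/age headers and the file path are illustrative assumptions, not part of this package:

const csv = require('csv-streamify')
const fs = require('fs')

// use the first row as keys and emit one plain object per data row
const parser = csv({ objectMode: true, columns: true, delimiter: ';' })

parser.on('data', function (row) {
  // for example { name: 'Ada', age: '36' } (field names depend on your header row)
  console.log(row.name, row.age)
})

fs.createReadStream('/path/to/people.csv').pipe(parser)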
Also, take a look at iconv-lite (npm install iconv-lite --save); it provides pure JavaScript streaming character encoding conversion.
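As a sketch, you could decode a Latin-1 encoded file on the fly before handing it to the parser. The latin1 encoding and the file path below are assumptions for illustration:

const csv = require('csv-streamify')
const fs = require('fs')
const iconv = require('iconv-lite')

const parser = csv({ objectMode: true })

parser.on('data', function (line) {
  console.log(line)
})

// convert the raw latin1 bytes to strings before they reach the parser
fs.createReadStream('/path/to/latin1-file.csv')
  .pipe(iconv.decodeStream('latin1'))
  .pipe(parser)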
To use it on the command line, install it globally:
$ npm install csv-streamify -g
This should add the csv-streamify command to your $PATH.
Then, you either pipe data into it or give it a filename:
# pipe data in
$ cat some_data.csv | csv-streamify
# pass a filename
$ csv-streamify some_data.csv > output.json
# tell csv-streamify to read from + wait on stdin
$ csv-streamify -
If you would like to contribute, just open an issue so we can discuss it further. :)
Contributors: Nicolas Hery (objectMode)
FAQs
Streaming CSV Parser. Made entirely out of streams.
The npm package csv-streamify receives a total of 19,273 weekly downloads. As such, its popularity is classified as popular.
We found that csv-streamify demonstrates an unhealthy version release cadence and project activity because the last version was released a year ago. It has 1 open source maintainer collaborating on the project.