# batch-stream-csv

> Process large CSV files in batches without building up backpressure.

The CSV file is streamed in batches, and the stream is paused while your async handler processes each batch, so rows are never read faster than they can be consumed. Uses `csv-parser` under the hood.
## Install

```sh
$ npm install batch-stream-csv
```
## Usage

```js
const batch = require('batch-stream-csv');

batch('./file.csv', batchHandler, options).then(() => console.log('All done!'));

async function batchHandler (batch, progress) {
  console.log(batch);

  await db.batchInsert('table', batch);

  console.log(`${progress * 100}% complete`);
}
```
## API

### `batch(filePath, batchHandler, options)`

Returns a `Promise` that resolves once every batch has been processed.

## Options