
@humanwhocodes/csv-tools
CSV stream processing tools for counting rows and chunking data
If you find this useful, please consider supporting my work with a donation.
A collection of tools for processing CSV files using streams. This package provides functions to count data rows and split CSV data into smaller chunks while preserving headers.
npm install @humanwhocodes/csv-tools
This package exports two main functions for working with CSV data via ReadableStream objects:
countRows(stream, options)

Counts rows in a CSV file with configurable options.
import { countRows } from "@humanwhocodes/csv-tools";
// From a file in Node.js
import { createReadStream } from "node:fs";
import { ReadableStream } from "node:stream/web";
const fileStream = createReadStream("data.csv");
const webStream = ReadableStream.from(fileStream);
// Count only data rows (exclude header)
const dataRowCount = await countRows(webStream);
console.log(`Found ${dataRowCount} data rows`);
// Count all rows including header
const fileStream2 = createReadStream("data.csv");
const webStream2 = ReadableStream.from(fileStream2);
const totalRowCount = await countRows(webStream2, { countHeaderRow: true });
console.log(`Found ${totalRowCount} total rows`);
Parameters:
stream (ReadableStream&lt;Uint8Array&gt;) - A readable stream containing CSV data
options (Object, optional) - Configuration options
countHeaderRow (boolean, default: false) - Whether to count the header row
countEmptyRows (boolean, default: false) - Whether to count empty rows

Returns: Promise&lt;number&gt; - The count of rows in the CSV file
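To make the option behavior concrete, here is a minimal sketch of how stream-based row counting can work in plain JavaScript (Node 18+, where a web ReadableStream is async-iterable). `countCsvRows` is a hypothetical stand-in written for illustration, not the package's actual implementation:

```javascript
// Minimal sketch of stream-based row counting over newline-delimited
// UTF-8 input. countCsvRows is a hypothetical stand-in, NOT the
// actual @humanwhocodes/csv-tools implementation.
async function countCsvRows(
	stream,
	{ countHeaderRow = false, countEmptyRows = false } = {},
) {
	const decoder = new TextDecoder();
	let buffer = "";
	let count = 0;

	const tally = (line) => {
		if (line.trim() === "" && !countEmptyRows) {
			return; // skip empty rows unless asked to count them
		}
		count++;
	};

	for await (const bytes of stream) {
		// Decode incrementally so multi-byte characters split
		// across chunks are handled correctly.
		buffer += decoder.decode(bytes, { stream: true });
		let newlineIndex;
		while ((newlineIndex = buffer.indexOf("\n")) !== -1) {
			tally(buffer.slice(0, newlineIndex));
			buffer = buffer.slice(newlineIndex + 1);
		}
	}
	buffer += decoder.decode(); // flush any remaining bytes
	if (buffer !== "") {
		tally(buffer); // final line without a trailing newline
	}

	// The first tallied line is the header; exclude it unless requested.
	return countHeaderRow ? count : Math.max(0, count - 1);
}
```

Counting only complete newline-delimited lines (plus a possible final unterminated line) keeps memory usage constant regardless of file size, which is the main reason to count via streams rather than reading the whole file into a string.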
chunk(stream, options)

An async generator function that yields strings of mini CSV files. Each chunk contains the header row followed by up to chunkSize data rows.
import { chunk } from "@humanwhocodes/csv-tools";
// From a file in Node.js
import { createReadStream } from "node:fs";
import { ReadableStream } from "node:stream/web";
const fileStream = createReadStream("data.csv");
const webStream = ReadableStream.from(fileStream);
// Process CSV in chunks of 50 rows
for await (const csvChunk of chunk(webStream, { chunkSize: 50 })) {
	// Each csvChunk is a string with header + up to 50 data rows
	console.log("Processing chunk:");
	console.log(csvChunk);
	// Process the chunk...
}
Parameters:
stream (ReadableStream&lt;Uint8Array&gt;) - A readable stream containing CSV data
options (Object) - Configuration options
chunkSize (number, optional) - Number of data rows per chunk. Default: 100
includeEmptyRows (boolean, optional) - Whether to include empty rows. Default: false

Returns: AsyncGenerator&lt;string&gt; - An async generator yielding CSV chunks as strings
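The header-preserving chunking described above can be sketched as an async generator in plain JavaScript. `chunkCsv` is a hypothetical stand-in for illustration only, not the package's actual implementation:

```javascript
// Minimal sketch of header-preserving CSV chunking (hypothetical
// stand-in, NOT the actual @humanwhocodes/csv-tools implementation).
async function* chunkCsv(
	stream,
	{ chunkSize = 100, includeEmptyRows = false } = {},
) {
	const decoder = new TextDecoder();
	let buffer = "";

	// Inner generator: turn the byte stream into a stream of lines.
	async function* lines() {
		for await (const bytes of stream) {
			buffer += decoder.decode(bytes, { stream: true });
			let i;
			while ((i = buffer.indexOf("\n")) !== -1) {
				yield buffer.slice(0, i);
				buffer = buffer.slice(i + 1);
			}
		}
		buffer += decoder.decode();
		if (buffer !== "") {
			yield buffer; // final line without a trailing newline
		}
	}

	let header = null;
	let rows = [];

	for await (const line of lines()) {
		if (header === null) {
			header = line; // first line is the header, repeated per chunk
			continue;
		}
		if (line.trim() === "" && !includeEmptyRows) {
			continue;
		}
		rows.push(line);
		if (rows.length === chunkSize) {
			yield [header, ...rows].join("\n");
			rows = [];
		}
	}
	if (rows.length > 0) {
		yield [header, ...rows].join("\n"); // final partial chunk
	}
}
```

Because only one chunk's worth of rows is buffered at a time, this pattern stays memory-bounded even for very large files, and each emitted string parses as a complete standalone CSV document.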
import { countRows, chunk } from "@humanwhocodes/csv-tools";
// Fetch CSV from URL
const response = await fetch("https://example.com/data.csv");
const reader = response.body.getReader();
const stream = new ReadableStream({
	start(controller) {
		return pump();
		function pump() {
			return reader.read().then(({ done, value }) => {
				if (done) {
					controller.close();
					return;
				}
				controller.enqueue(value);
				return pump();
			});
		}
	},
});
// Count rows
const rowCount = await countRows(stream);
console.log(`Total rows: ${rowCount}`);
// Or process in chunks
const response2 = await fetch("https://example.com/data.csv");
const reader2 = response2.body.getReader();
const stream2 = new ReadableStream({
	start(controller) {
		return pump();
		function pump() {
			return reader2.read().then(({ done, value }) => {
				if (done) {
					controller.close();
					return;
				}
				controller.enqueue(value);
				return pump();
			});
		}
	},
});
for await (const csvChunk of chunk(stream2, { chunkSize: 100 })) {
	// Process each chunk (processData is an illustrative placeholder)
	await processData(csvChunk);
}
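In runtimes where `fetch()` returns a Response whose body is already a `ReadableStream<Uint8Array>` (Node 18+ and modern browsers), the manual reader/pump wrapper above can often be skipped by passing `response.body` to the stream-consuming function directly. The sketch below demonstrates the idea with a locally constructed Response and a hypothetical inline counter (`countDataRows`), so it stays self-contained without network access or the package installed:

```javascript
// response.body from fetch() is already a ReadableStream<Uint8Array>;
// a locally constructed Response is used here so the sketch needs no
// network access.
const response = new Response("a,b\n1,2\n3,4\n");

// Hypothetical stand-in for a stream consumer such as countRows:
// reads the whole stream and counts non-empty data lines.
async function countDataRows(stream) {
	const decoder = new TextDecoder();
	let text = "";
	for await (const bytes of stream) {
		text += decoder.decode(bytes, { stream: true });
	}
	text += decoder.decode();
	const lines = text.split("\n").filter((line) => line.trim() !== "");
	return Math.max(0, lines.length - 1); // exclude the header row
}

const rowCount = await countDataRows(response.body); // → 2
```

The explicit pump shown in the example above is still useful when you need to transform or observe chunks as they arrive; for straight pass-through consumption, the body stream alone is usually enough.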
Copyright 2025 Nicholas C. Zakas
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.