
node-csv-streamer
Stream large datasets efficiently and export to CSV — framework-agnostic and NestJS-friendly.

npm install node-csv-streamer
node-csv-streamer is designed for large dataset exports where loading everything into memory isn’t feasible.
Instead of fetching all data at once, it streams your data in batches using a fetchFn(skip, limit, query) function.
This approach ensures low memory usage and smooth CSV generation — ideal for analytics dashboards, data exports, or reporting systems.
Your data-fetching method must follow this pattern:
async function fetchFn(skip: number, limit: number, query: any): Promise<any[]> {
  // Fetch data from your source (this is just an example)
  return await dataSource.find(query).skip(skip).limit(limit);
}
skip → starting point for batch retrieval
limit → batch size (default is 1000)
query → optional filtering parameters
Each batch is streamed and written directly to the CSV file.
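The batching behavior can be illustrated with a fetchFn backed by an in-memory array. Everything here (the Employee shape, the fetchEmployees name, the dataset) is hypothetical; in practice you would page through a database or API.

```typescript
// Hypothetical Employee record shape, for illustration only
type Employee = { email: string; firstName: string; lastName: string };

// A fake data source of 2500 records standing in for a database
const employees: Employee[] = Array.from({ length: 2500 }, (_, i) => ({
  email: `user${i}@example.com`,
  firstName: `First${i}`,
  lastName: `Last${i}`,
}));

// A fetchFn following the (skip, limit, query) pattern.
// query is ignored here; a real implementation would filter on it.
async function fetchEmployees(
  skip: number,
  limit: number,
  query: any
): Promise<Employee[]> {
  return employees.slice(skip, skip + limit);
}
```

With the default batch size of 1000, this would be called as (0, 1000), then (1000, 1000), then (2000, 1000), returning 1000, 1000, and 500 records; the next call returns an empty array, which signals the end of the data.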
node-csv-streamer assumes you are working with a large dataset, or that you want a memory-efficient way to fetch data and send it as CSV to a writable stream. To achieve this, your data-fetching method must be set up to retrieve data in batches; you can set the batch size yourself or use the default of 1000.
Your data-fetching method should follow the pattern fetchFn(skip, limit, query). The skip defines where to start fetching from, the limit is the batch size (default 1000) to retrieve on each call to your data source, and the query holds the criteria for selecting the data.
This batched fetching pattern is used instead of a method that loads all the data you need into memory at once.
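The library's internals are not shown here, but the repeated-call pattern it describes can be sketched as a loop that advances skip by the batch size until fetchFn returns an empty batch. The streamAsCsv name and the space-joined handling of hyphenated mapping keys are assumptions for illustration, not the package's actual implementation.

```typescript
import { Writable } from "node:stream";

// Sketch of the batching loop such a streamer likely runs internally:
// call fetchFn repeatedly, advancing skip by batchSize, until a batch is empty.
async function streamAsCsv(
  out: Writable,
  mapping: Record<string, string>,
  fetchFn: (skip: number, limit: number, query: any) => Promise<any[]>,
  query: any = {},
  batchSize = 1000
): Promise<void> {
  const headers = Object.keys(mapping);
  out.write(headers.join(",") + "\n");
  for (let skip = 0; ; skip += batchSize) {
    const batch = await fetchFn(skip, batchSize, query);
    if (batch.length === 0) break; // no more records: stop
    for (const doc of batch) {
      // Assumed behavior: "firstName-lastName" style keys concatenate
      // multiple source fields, joined here with a space
      const row = headers.map((h) =>
        mapping[h].split("-").map((k) => doc[k] ?? "").join(" ")
      );
      out.write(row.join(",") + "\n");
    }
  }
}
```

Because each batch is written to the stream and then discarded, memory usage stays bounded by the batch size rather than the total dataset size.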
import express from "express";
import { NodeCsvStream } from "node-csv-streamer";

const app = express();

app.get("/download-csv", async (req, res) => {
  // Set response headers for CSV download
  res.setHeader("Content-Type", "text/csv");
  res.setHeader(
    "Content-Disposition",
    `attachment; filename="example-csv-file.csv"`
  );

  // Define the mapping between CSV headers and your data source fields.
  // You can combine multiple fields using a hyphen ("-").
  const csvHeaderMapping = {
    Email: "email",
    Name: "firstName-lastName",
    "Phone Number": "phoneNumber",
  };

  // Stream CSV data directly to the HTTP response
  await NodeCsvStream.download(
    res,
    csvHeaderMapping,
    aggregateEmployeeRecords, // or this.aggregateEmployeeRecords.bind(this)
    {}, // optional query parameters
    undefined, // optional formatting function
    100 // batch size
  );
});
NodeCsvStream.download(res, fileMapping, fetchFn, query?, docsFormattingFn?, batchSize?)

| Parameter | Type | Description |
|---|---|---|
| res | Writable | Writable stream (e.g., Express or NestJS Response object). Typically the HTTP response where the CSV will be streamed directly. |
| fileMapping | Record<string, string> | Defines the mapping between CSV column headers and data source keys. Supports concatenation with a hyphen (e.g., "firstName-lastName"). |
| fetchFn | (skip: number, limit: number, query: any) => Promise<any[]> | Function responsible for fetching data in batches. It is called repeatedly until no more records are returned. |
| query (optional) | any | Query object passed to fetchFn for data filtering or scoping. |
| docsFormattingFn (optional) | (doc: any, mapping: Record<string, string>) => any | Optional transformation function to modify each record before converting to CSV. |
| batchSize (optional) | number | Number of records to fetch per batch. Defaults to 1000. |
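The docsFormattingFn parameter can be sketched with a small pure function matching the (doc, mapping) signature in the table above. The formatEmployee name and the specific normalizations are illustrative assumptions, not part of the package's API.

```typescript
// Hypothetical docsFormattingFn: normalizes each record before it is
// converted to a CSV row. The (doc, mapping) signature matches the
// parameter table; the transformations themselves are examples.
function formatEmployee(doc: any, mapping: Record<string, string>): any {
  return {
    ...doc,
    email: String(doc.email ?? "").toLowerCase(), // normalize casing
    firstName: String(doc.firstName ?? "").trim(), // strip stray whitespace
    lastName: String(doc.lastName ?? "").trim(),
  };
}
```

A function like this would be passed in place of the `undefined` docsFormattingFn argument in the download call.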
await NodeCsvStream.download(
  res,
  {
    Email: "email",
    Name: "firstName-lastName",
    "Phone Number": "phoneNumber",
  },
  aggregateEmployeeRecords, // async function (skip, limit, query)
  { active: true }, // optional query
  undefined, // optional transform function
  500 // optional batch size
);
MIT © 2025 — Maintained by Adeleke Bright