
# fast-csv-parser
- 🔧 100% API compatible with the original csv-parser
- 🦀 Rust-powered native performance
- 🌐 Cross-platform support (Windows, macOS, Linux, ARM)
- 📦 Zero dependencies in production
- 🔤 UTF-16 support - handles UTF-16 LE/BE with BOM detection
Performance characteristics vary by file size. Here are real benchmark results:
| Dataset | Rows | Size | csv-parser | fast-csv-parser | Speedup |
|---|---|---|---|---|---|
| large-dataset.csv | 7,268 | 1.1MB | 59ms | 47ms | 🚀 1.26x faster |
| option-maxRowBytes.csv | 4,577 | 700KB | 36ms | 29ms | 🚀 1.24x faster |
Small files show the Node.js ↔ Rust boundary overhead:
```
Performance Ratio (higher = better)

  2x ┤
     │          ╭─ Peak performance zone
1.5x ┤         ╱
     │        ╱
  1x ┼─────────────────────── Break-even point (~1KB)
     │
0.5x ┤  Overhead zone
     └─────────────────────────────────────
     0KB    1KB    10KB   100KB   1MB+
                 File Size
```
💡 Recommendation: Use fast-csv-parser for files >10KB or high-throughput scenarios.
Using npm:

```sh
npm install fast-csv-parser
```

Using yarn:

```sh
yarn add fast-csv-parser
```

Using pnpm:

```sh
pnpm add fast-csv-parser
```
Simply replace your csv-parser import:
```js
// Before
const csv = require('csv-parser')

// After - that's it!
const csv = require('fast-csv-parser')
```

Your existing code works unchanged:

```js
const csv = require('fast-csv-parser')
const fs = require('fs')

const results = []
fs.createReadStream('data.csv')
  .pipe(csv())
  .on('data', (data) => results.push(data))
  .on('end', () => {
    console.log(results)
    // [
    //   { NAME: 'Daffy Duck', AGE: '24' },
    //   { NAME: 'Bugs Bunny', AGE: '22' }
    // ]
  })
```
```js
const csv = require('fast-csv-parser')
const fs = require('fs')

fs.createReadStream('data.csv')
  .pipe(csv({
    headers: ['name', 'age', 'city'],
    skipLines: 1,
    mapHeaders: ({ header }) => header.toUpperCase(),
    mapValues: ({ header, value }) => {
      // mapValues receives the mapped (uppercased) header
      if (header === 'AGE') return parseInt(value)
      return value.trim()
    }
  }))
  .on('data', (row) => console.log(row))
```
```js
const csv = require('fast-csv-parser')
const fs = require('fs')

fs.createReadStream('data.tsv')
  .pipe(csv({ separator: '\t' }))
  .on('data', (row) => console.log(row))
```
```js
const csv = require('fast-csv-parser')
const fs = require('fs')

fs.createReadStream('data.csv')
  .pipe(csv({ strict: true }))
  .on('data', (row) => console.log(row))
  .on('headers', (headers) => console.log('Headers:', headers))
  .on('error', (err) => {
    console.error('Parse error:', err.message)
  })
  .on('end', () => console.log('Parsing complete'))
```
```js
const csv = require('fast-csv-parser')
const fs = require('fs')
const { Transform } = require('stream')

const processor = new Transform({
  objectMode: true,
  transform(row, encoding, callback) {
    // Process each row, then serialize it so the file stream
    // receives strings (one NDJSON line per row)
    row.processed_at = new Date().toISOString()
    this.push(JSON.stringify(row) + '\n')
    callback()
  }
})

fs.createReadStream('input.csv')
  .pipe(csv())
  .pipe(processor)
  .pipe(fs.createWriteStream('output.json'))
```
fast-csv-parser implements the exact same API as the original csv-parser. All options, events, and behaviors are identical.
Returns: Transform Stream
All original csv-parser options are supported:
- `separator` (String, default: `,`) - Column separator
- `quote` (String, default: `"`) - Quote character
- `escape` (String, default: `"`) - Escape character
- `newline` (String, default: `\n`) - Line ending
- `headers` (Array|Boolean) - Custom headers or disable header parsing
- `mapHeaders` (Function) - Transform header names
- `mapValues` (Function) - Transform cell values
- `skipLines` (Number, default: `0`) - Skip initial lines
- `skipComments` (Boolean|String, default: `false`) - Skip comment lines
- `maxRowBytes` (Number) - Maximum bytes per row
- `strict` (Boolean, default: `false`) - Strict column count validation
- `raw` (Boolean, default: `false`) - Disable UTF-8 decoding

```js
const csv = require('fast-csv-parser')
const fs = require('fs')

fs.createReadStream('data.tsv')
  .pipe(csv({
    separator: '\t',
    mapHeaders: ({ header }) => header.toLowerCase(),
    mapValues: ({ value }) => value.trim()
  }))
  .on('data', (row) => console.log(row))
```
- `data` - Emitted for each parsed row (excluding headers).
- `headers` - Emitted after the header row is parsed, with an `Array<string>` of header names.
- `end` - Emitted when parsing is complete.
- `error` - Emitted on parsing errors.
- Use `raw: true` for maximum speed if you don't need UTF-8 processing
- Avoid heavy `mapValues` functions - they can negate performance gains
- Set `maxRowBytes` to avoid memory issues with malformed data

Pre-built binaries are available for:
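As one way to apply the performance tips above, you could centralize a tuned option set in a helper. `buildFastOptions` and its default row limit are hypothetical, not part of the library's API; the options themselves (`raw`, `maxRowBytes`) are the documented ones:

```javascript
// Build a throughput-oriented option object for large, untrusted inputs.
function buildFastOptions({ rowLimitBytes = 64 * 1024 } = {}) {
  return {
    raw: true,                  // skip UTF-8 decoding for maximum speed
    maxRowBytes: rowLimitBytes  // cap per-row memory on malformed data
  }
}

// Usage (assuming fast-csv-parser is installed):
// const csv = require('fast-csv-parser')
// fs.createReadStream('huge.csv').pipe(csv(buildFastOptions()))

console.log(buildFastOptions()) // { raw: true, maxRowBytes: 65536 }
```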
Migration is seamless:
```js
// Old code
const csvParser = require('csv-parser')

// New code - just change the import!
const csvParser = require('fast-csv-parser')

// Everything else stays the same
```
fast-csv-parser automatically detects and handles multiple text encodings:
No configuration needed - encoding is detected automatically from Byte Order Marks (BOM):
```js
const csv = require('fast-csv-parser')
const fs = require('fs')

// Works with UTF-8, UTF-16 LE, UTF-16 BE files automatically
fs.createReadStream('data-utf16.csv') // UTF-16 file
  .pipe(csv())
  .on('data', (row) => console.log(row))
```
| Encoding | BOM | Detection | Status |
|---|---|---|---|
| UTF-8 | EF BB BF | Auto-detected, BOM stripped | ✅ Supported |
| UTF-16 LE | FF FE | Auto-detected | ✅ Supported |
| UTF-16 BE | FE FF | Auto-detected | ✅ Supported |
| ASCII | None | Treated as UTF-8 | ✅ Supported |
| Other | - | Not supported | ❌ |
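The detection in the table can be illustrated with a small sketch. The real library does this in its Rust core; `detectBom` below is only a hypothetical JavaScript rendering of the same rules:

```javascript
// Inspect the first bytes of a buffer and report the encoding implied
// by its Byte Order Mark, mirroring the table above.
function detectBom(buf) {
  if (buf.length >= 3 && buf[0] === 0xEF && buf[1] === 0xBB && buf[2] === 0xBF) {
    return 'utf-8'     // UTF-8 BOM (stripped by the parser)
  }
  if (buf.length >= 2 && buf[0] === 0xFF && buf[1] === 0xFE) return 'utf-16le'
  if (buf.length >= 2 && buf[0] === 0xFE && buf[1] === 0xFF) return 'utf-16be'
  return 'utf-8'       // no BOM: treated as UTF-8 (covers plain ASCII)
}

console.log(detectBom(Buffer.from([0xFF, 0xFE, 0x41, 0x00]))) // 'utf-16le'
console.log(detectBom(Buffer.from('name,age\n')))             // 'utf-8'
```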
Run benchmarks yourself:
```sh
# Clone the repository
git clone https://github.com/jonaylor89/fast-csv-parser
cd fast-csv-parser

# Install dependencies
npm install

# Run performance comparison against original csv-parser
./benchmark-comparison.js

# Run individual benchmarks
npm run bench
```
Sample output:
```
🏁 CSV Parser Performance Comparison
Comparing Rust implementation vs Original JavaScript csv-parser

📊 PERFORMANCE COMPARISON
========================================
File                     Original   Rust     Speedup
large-dataset.csv        59ms       47ms     1.26x ⚡
option-maxRowBytes.csv   36ms       29ms     1.24x ⚡
basic.csv                0.098ms    0.43ms   0.23x
```
Errors are handled identically to the original csv-parser:
```js
const csv = require('fast-csv-parser')
const fs = require('fs')

fs.createReadStream('data.csv')
  .pipe(csv())
  .on('error', (err) => {
    if (err instanceof RangeError) {
      console.log('Row length mismatch')
    } else {
      console.log('Parse error:', err.message)
    }
  })
```
Full TypeScript definitions are included:
```ts
import csv from 'fast-csv-parser'
import fs from 'fs'

interface Row {
  name: string
  age: number
}

fs.createReadStream('data.csv')
  .pipe(csv())
  .on('data', (row: Row) => {
    console.log(row.name, row.age)
  })
```
The CLI is fully compatible with csv-parser:
```sh
# Parse CSV to NDJSON
fast-csv-parser data.csv

# Parse TSV
fast-csv-parser -s $'\t' data.tsv

# Custom options
fast-csv-parser --separator=';' --quote='"' data.csv
```
For most cases, encoding is handled automatically. For advanced scenarios:
```js
const csv = require('fast-csv-parser')
const fs = require('fs')

// Automatic encoding detection (recommended)
fs.createReadStream('data.csv') // Any supported encoding
  .pipe(csv())
  .on('data', (row) => console.log(row))

// For unsupported encodings, use iconv-lite preprocessing
const iconv = require('iconv-lite')
fs.createReadStream('latin1-data.csv')
  .pipe(iconv.decodeStream('latin1'))
  .pipe(csv())
  .on('data', (row) => console.log(row))
```
Use fast-csv-parser when:
Stick with csv-parser when:
Q: Is this a drop-in replacement for csv-parser?
A: Yes! Just change your import from `csv-parser` to `fast-csv-parser`. All options, events, and behaviors are identical.

Q: When should I use fast-csv-parser?
A: Use fast-csv-parser when:

Q: Are there any breaking changes?
A: No breaking changes. The API is 100% compatible. UTF-16 files are now properly supported with automatic encoding detection.

Q: Why is it slower on very small files?
A: There's a ~0.1ms overhead from the Node.js ↔ Rust boundary. For tiny files, this overhead exceeds the parsing time. The crossover point is around 1KB.

Q: Does it work in the browser?
A: Not currently. This is a native Node.js addon. For browser use, stick with the original csv-parser.

Q: How stable is it?
A: Very stable. It passes all csv-parser tests and maintains the same error handling. The Rust core uses battle-tested CSV parsing libraries.
Requirements:
```sh
# Clone the repository
git clone https://github.com/jonaylor89/fast-csv-parser
cd fast-csv-parser

# Install dependencies
npm install

# Build the native addon
npm run build

# Run tests
npm test

# Run benchmarks
npm run bench
```
```
fast-csv-parser/
├── src/              # Rust source code
│   ├── lib.rs        # N-API bindings
│   └── parser.rs     # Core CSV parsing logic
├── __test__/         # Test files and fixtures
├── examples/         # Usage examples
├── bin/              # CLI tools
├── main.js           # Main entry point with Stream API
└── index.js          # Auto-generated native binding loader (build-safe)
```
- Rust core (`src/parser.rs`) - High-performance CSV parsing
- N-API bindings (`src/lib.rs`) - Node.js ↔ Rust interface
- Binding loader (`index.js`) - Auto-generated cross-platform binary loading
- Stream layer (`main.js`) - Stream API compatibility layer (build-safe)

Contributions welcome! This project maintains:
```sh
# Run all tests
npm test

# Run specific test
npm test headers.test.mjs

# Run benchmarks
./benchmark-comparison.js
```
MIT License - same as the original csv-parser.
Built on the shoulders of giants:
- csv-parser by @mafintosh - The original, excellent CSV parser
- napi-rs - Rust N-API framework for Node.js addons

Special thanks to the csv-parser community for creating such a robust and well-designed API that made this drop-in replacement possible.
🚀 Ready to speed up your CSV processing?
```sh
npm install fast-csv-parser
```
Just replace your import and enjoy the performance boost!