fast-csv-parser


🚀 A high-performance, drop-in replacement for csv-parser powered by Rust

Streaming CSV parser that aims for maximum speed while maintaining 100% compatibility with the original csv-parser API. Built with Rust and N-API for blazing-fast performance on large datasets.

fast-csv-parser is a drop-in replacement - simply change your import and enjoy better performance with zero code changes required.

  • ⚡️ Up to 1.3x faster on large datasets (7K+ rows)
  • 🔧 100% API compatible with the original csv-parser
  • 🦀 Rust-powered native performance
  • 🌐 Cross-platform support (Windows, macOS, Linux, ARM)
  • 📦 Zero dependencies in production
  • 🔤 UTF-16 support - handles UTF-16 LE/BE with BOM detection

Performance Comparison

Performance characteristics vary by file size. Here are real benchmark results:

Large Files (Where It Matters)

| Dataset | Rows | Size | csv-parser | fast-csv-parser | Speedup |
|---|---|---|---|---|---|
| large-dataset.csv | 7,268 | 1.1MB | 59ms | 47ms | 🚀 1.26x faster |
| option-maxRowBytes.csv | 4,577 | 700KB | 36ms | 29ms | 🚀 1.24x faster |

Small Files (Startup Overhead)

Small files show the Node.js ↔ Rust boundary overhead:

  • Files <100 rows: ~0.6x performance (0.1ms → 0.2ms)
  • The overhead is constant (~0.1ms) regardless of file size
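The trade-off above can be expressed as a toy cost model. The constants below are assumptions read off the large-file benchmark (1.1MB in 47ms vs 59ms) plus the ~0.1ms boundary cost; real throughput varies, so treat this as an illustration only:

```javascript
// Toy cost model: total time = fixed boundary overhead + per-KB parse cost.
// Constants are assumptions derived from the benchmarks above.
const OVERHEAD_MS = 0.1            // Node ↔ Rust boundary cost
const RUST_MS_PER_KB = 47 / 1100   // ~0.043 ms/KB (large-dataset.csv)
const JS_MS_PER_KB = 59 / 1100     // ~0.054 ms/KB (original csv-parser)

function estimateMs(sizeKb, useRust) {
  return useRust
    ? OVERHEAD_MS + sizeKb * RUST_MS_PER_KB
    : sizeKb * JS_MS_PER_KB
}
```

For large inputs the per-KB savings dominate; for tiny inputs the fixed overhead does, which is exactly the shape of the profile below.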

Performance Profile

          Performance Ratio (higher = better)
     2x  ┤
         │     ╭─ Peak performance zone
   1.5x  ┤   ╱
         │ ╱
     1x  ┼─────────────────────── Break-even point (~1KB)
         │
   0.5x  ┤ Overhead zone
         └─────────────────────────────────────
         0KB    1KB     10KB    100KB   1MB+
                        File Size

💡 Recommendation: Use fast-csv-parser for files >10KB or high-throughput scenarios.

📦 Installation

Using npm:

npm install fast-csv-parser

Using yarn:

yarn add fast-csv-parser

Using pnpm:

pnpm add fast-csv-parser

🚀 Usage Examples

Basic Usage

Simply replace your csv-parser import:

// Before
const csv = require('csv-parser')

// After - that's it!
const csv = require('fast-csv-parser')

// Your existing code works unchanged
const fs = require('fs')
const results = []

fs.createReadStream('data.csv')
  .pipe(csv())
  .on('data', (data) => results.push(data))
  .on('end', () => {
    console.log(results)
    // [
    //   { NAME: 'Daffy Duck', AGE: '24' },
    //   { NAME: 'Bugs Bunny', AGE: '22' }
    // ]
  })

Advanced Usage Examples

Custom Headers and Transformations

const csv = require('fast-csv-parser')

fs.createReadStream('data.csv')
  .pipe(csv({
    headers: ['name', 'age', 'city'],
    skipLines: 1,
    mapHeaders: ({ header }) => header.toUpperCase(),
    mapValues: ({ header, value }) => {
      if (header === 'age') return parseInt(value)
      return value.trim()
    }
  }))
  .on('data', (row) => console.log(row))

Processing TSV Files

const csv = require('fast-csv-parser')

fs.createReadStream('data.tsv')
  .pipe(csv({ separator: '\t' }))
  .on('data', (row) => console.log(row))

Error Handling

const csv = require('fast-csv-parser')

fs.createReadStream('data.csv')
  .pipe(csv({ strict: true }))
  .on('data', (row) => console.log(row))
  .on('headers', (headers) => console.log('Headers:', headers))
  .on('error', (err) => {
    console.error('Parse error:', err.message)
  })
  .on('end', () => console.log('Parsing complete'))

Streaming Transform Pipeline

const csv = require('fast-csv-parser')
const fs = require('fs')
const { Transform } = require('stream')

// Serialize each transformed row to NDJSON so the object stream can be
// piped into a plain file write stream (raw object chunks would throw)
const processor = new Transform({
  objectMode: true,
  transform(row, encoding, callback) {
    row.processed_at = new Date().toISOString()
    callback(null, JSON.stringify(row) + '\n')
  }
})

fs.createReadStream('input.csv')
  .pipe(csv())
  .pipe(processor)
  .pipe(fs.createWriteStream('output.ndjson'))

📚 API Documentation

fast-csv-parser implements the exact same API as the original csv-parser. All options, events, and behaviors are identical.

csv([options | headers])

Returns: Transform Stream

Options

All original csv-parser options are supported:

  • separator (String, default: ,) - Column separator
  • quote (String, default: ") - Quote character
  • escape (String, default: ") - Escape character
  • newline (String, default: \n) - Line ending
  • headers (Array|Boolean) - Custom headers or disable header parsing
  • mapHeaders (Function) - Transform header names
  • mapValues (Function) - Transform cell values
  • skipLines (Number, default: 0) - Skip initial lines
  • skipComments (Boolean|String, default: false) - Skip comment lines
  • maxRowBytes (Number) - Maximum bytes per row
  • strict (Boolean, default: false) - Strict column count validation
  • raw (Boolean, default: false) - Disable UTF-8 decoding

Example with Options

const csv = require('fast-csv-parser')

fs.createReadStream('data.tsv')
  .pipe(csv({
    separator: '\t',
    mapHeaders: ({ header }) => header.toLowerCase(),
    mapValues: ({ value }) => value.trim()
  }))
  .on('data', (row) => console.log(row))

🎯 Events

data

Emitted for each parsed row (excluding headers).

headers

Emitted after the header row is parsed, with an Array<string> of header names.

end

Emitted when parsing is complete.

error

Emitted on parsing errors.

⚡ Performance Tips

  • Use on large files - The performance benefits are most apparent with files >1MB
  • Enable raw: true for maximum speed if you don't need UTF-8 processing
  • Avoid complex mapValues functions - They can negate performance gains
  • Set appropriate maxRowBytes to avoid memory issues with malformed data
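With raw: true, cell values arrive without UTF-8 decoding (as Buffers in the original csv-parser; assumed to match here). One way to keep the fast path fast is to decode only the columns you actually need; decodeColumns below is an illustrative helper, not part of the API:

```javascript
// Illustrative helper for raw: true mode - decode a whitelist of
// columns to strings and drop the rest, avoiding full-row decoding.
function decodeColumns(row, columns) {
  const out = {}
  for (const col of columns) {
    const v = row[col]
    out[col] = Buffer.isBuffer(v) ? v.toString('utf8') : v
  }
  return out
}
```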

🌐 Platform Support

Pre-built binaries are available for:

  • macOS: x64, ARM64 (Apple Silicon)
  • Linux: x64, ARM64, ARM7 (GNU & musl)
  • Windows: x64, x86, ARM64
  • FreeBSD: x64
  • Android: ARM64, ARM7

🔄 Migration from csv-parser

Migration is seamless:

// Old code
const csvParser = require('csv-parser')

// New code - just change the import!
const csvParser = require('fast-csv-parser')

// Everything else stays the same

🔤 Encoding Support

fast-csv-parser automatically detects and handles multiple text encodings:

  • UTF-8 (default) - with automatic BOM stripping
  • UTF-16 LE (Little Endian) - with BOM detection
  • UTF-16 BE (Big Endian) - with BOM detection

No configuration needed - encoding is detected automatically from Byte Order Marks (BOM):

const csv = require('fast-csv-parser')

// Works with UTF-8, UTF-16 LE, UTF-16 BE files automatically
fs.createReadStream('data-utf16.csv')  // UTF-16 file
  .pipe(csv())
  .on('data', (row) => console.log(row))

Encoding Details

| Encoding | BOM | Detection | Status |
|---|---|---|---|
| UTF-8 | EF BB BF | Auto-detected, BOM stripped | ✅ Supported |
| UTF-16 LE | FF FE | Auto-detected | ✅ Supported |
| UTF-16 BE | FE FF | Auto-detected | ✅ Supported |
| ASCII | None | Treated as UTF-8 | ✅ Supported |
| Other | - | - | Not supported |
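The detection rules in the table can be sketched as a plain function. This is an assumption about the behavior described above, not the library's actual Rust implementation:

```javascript
// Sketch of BOM-based encoding detection per the table above.
function detectEncoding(buf) {
  if (buf.length >= 3 && buf[0] === 0xEF && buf[1] === 0xBB && buf[2] === 0xBF) {
    return 'utf-8'     // UTF-8 BOM (stripped before parsing)
  }
  if (buf.length >= 2 && buf[0] === 0xFF && buf[1] === 0xFE) {
    return 'utf-16le'  // UTF-16 Little Endian BOM
  }
  if (buf.length >= 2 && buf[0] === 0xFE && buf[1] === 0xFF) {
    return 'utf-16be'  // UTF-16 Big Endian BOM
  }
  return 'utf-8'       // no BOM: treated as UTF-8 (covers ASCII)
}
```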

📊 Benchmarks

Run benchmarks yourself:

# Clone the repository
git clone https://github.com/jonaylor89/fast-csv-parser
cd fast-csv-parser

# Install dependencies
npm install

# Run performance comparison against original csv-parser
./benchmark-comparison.js

# Run individual benchmarks
npm run bench

Sample output:

🏁 CSV Parser Performance Comparison
Comparing Rust implementation vs Original JavaScript csv-parser

📊 PERFORMANCE COMPARISON
========================================
File                    Original    Rust     Speedup
large-dataset.csv         59ms      47ms     1.26x ⚡
option-maxRowBytes.csv    36ms      29ms     1.24x ⚡
basic.csv               0.098ms    0.43ms    0.23x

🛠️ Error Handling

Errors are handled identically to the original csv-parser:

fs.createReadStream('data.csv')
  .pipe(csv())
  .on('error', (err) => {
    if (err instanceof RangeError) {
      console.log('Row length mismatch')
    } else {
      console.log('Parse error:', err.message)
    }
  })

📝 TypeScript Support

Full TypeScript definitions are included:

import fs from 'fs'
import csv from 'fast-csv-parser'

interface Row {
  name: string
  age: number
}

fs.createReadStream('data.csv')
  .pipe(csv())
  .on('data', (row: Row) => {
    console.log(row.name, row.age)
  })

🖥️ CLI Usage

The CLI is fully compatible with csv-parser:

# Parse CSV to NDJSON
fast-csv-parser data.csv

# Parse TSV
fast-csv-parser -s $'\t' data.tsv

# Custom options
fast-csv-parser --separator=';' --quote='"' data.csv

🔤 Advanced Encoding Usage

For most cases, encoding is handled automatically. For advanced scenarios:

const csv = require('fast-csv-parser')

// Automatic encoding detection (recommended)
fs.createReadStream('data.csv')  // Any supported encoding
  .pipe(csv())
  .on('data', (row) => console.log(row))

// For unsupported encodings, use iconv-lite preprocessing
const iconv = require('iconv-lite')
fs.createReadStream('latin1-data.csv')
  .pipe(iconv.decodeStream('latin1'))
  .pipe(csv())
  .on('data', (row) => console.log(row))

🤔 When to Use

Use fast-csv-parser when:

  • Processing large CSV files (>1MB)
  • Performance is critical
  • You want a drop-in replacement for csv-parser
  • Working with data pipelines or ETL processes

Stick with csv-parser when:

  • Processing many small files (<1KB each) where startup overhead matters
  • Bundle size is extremely critical (csv-parser is pure JS)
  • You need cutting-edge features that haven't been ported yet

❓ FAQ

Q: Is this a drop-in replacement?

A: Yes! Just change your import from csv-parser to fast-csv-parser. All options, events, and behaviors are identical.

Q: When should I use this over the original?

A: Use fast-csv-parser when:

  • Processing files larger than 10KB
  • Performance is critical (ETL pipelines, data processing)
  • You're already using csv-parser and want better performance

Q: Are there any breaking changes?

A: No breaking changes. The API is 100% compatible. UTF-16 files are now properly supported with automatic encoding detection.

Q: Why is it slower on small files?

A: There's a ~0.1ms overhead from the Node.js ↔ Rust boundary. For tiny files, this overhead exceeds the parsing time. The crossover point is around 1KB.

Q: Can I use this in the browser?

A: Not currently. This is a native Node.js addon. For browser use, stick with the original csv-parser.

Q: How stable is this?

A: Very stable. It passes all csv-parser tests and maintains the same error handling. The Rust core uses battle-tested CSV parsing libraries.

🛠️ Development

Building from Source

Requirements:

  • Node.js 16+
  • Rust 1.60+
  • Supported platform (see Platform Support above)

# Clone the repository
git clone https://github.com/jonaylor89/fast-csv-parser
cd fast-csv-parser

# Install dependencies
npm install

# Build the native addon
npm run build

# Run tests
npm test

# Run benchmarks
npm run bench

Project Structure

fast-csv-parser/
├── src/                 # Rust source code
│   ├── lib.rs          # N-API bindings
│   └── parser.rs       # Core CSV parsing logic
├── __test__/           # Test files and fixtures
├── examples/           # Usage examples
├── bin/                # CLI tools
├── main.js             # Main entry point with Stream API
└── index.js            # Auto-generated native binding loader (build-safe)

Architecture

  • Rust Core (src/parser.rs) - High-performance CSV parsing
  • N-API Bridge (src/lib.rs) - Node.js ↔ Rust interface
  • Native Loader (index.js) - Auto-generated cross-platform binary loading
  • Main Entry (main.js) - Stream API compatibility layer (build-safe)

🤝 Contributing

Contributions welcome! This project maintains:

  • 100% API compatibility with csv-parser
  • Comprehensive test coverage (23+ test files)
  • Performance benchmarks
  • Cross-platform CI/CD
  • Rust best practices

Testing

# Run all tests
npm test

# Run specific test
npm test headers.test.mjs

# Run benchmarks
./benchmark-comparison.js

📄 License

MIT License - same as the original csv-parser.

🙏 Acknowledgments

Built on the shoulders of giants: special thanks to the csv-parser community for creating such a robust and well-designed API that made this drop-in replacement possible.

🚀 Ready to speed up your CSV processing?

npm install fast-csv-parser

Just replace your import and enjoy the performance boost!



Package last updated on 03 Jul 2025
