# CSVtoJSON


Convert CSV files to JSON with no dependencies. Supports Node.js (sync and async) and browser environments, with full RFC 4180 compliance and memory-efficient streaming for processing large files without loading them entirely into memory.
## Overview
Transform CSV data into JSON with a simple, chainable API. Choose your implementation style:
- Synchronous API - Blocking operations for simple workflows
- Asynchronous API - Promise-based for modern async/await patterns with memory-efficient streaming for large files
- Browser API - Client-side CSV parsing for web applications
Demo and JSDoc
## Features
- ✅ **RFC 4180 Compliant** - Proper handling of quoted fields, delimiters, newlines, and escape sequences
- ✅ **Zero Dependencies** - No external packages required
- ✅ **Full TypeScript Support** - Type definitions included for all APIs
- ✅ **Flexible Configuration** - Custom delimiters, encoding, trimming, and more
- ✅ **Method Chaining** - Fluent API for readable code
- ✅ **Memory-Efficient Streaming** - Process large files without loading them entirely into memory
- ✅ **Comprehensive Error Handling** - Detailed, actionable error messages with solutions (see ERROR_HANDLING.md)
## RFC 4180 Standard
RFC 4180 is the IETF standard specification for CSV (Comma-Separated Values) files. This library is fully compliant with RFC 4180, ensuring proper handling of:
| Feature | Value |
|---|---|
| Default Delimiter | Comma (`,`) |
| Record Delimiter | CRLF (`\r\n`) or LF (`\n`) |
| Quote Character | Double quote (`"`) |
| Quote Escaping | Doubled double quote (`""`) |
### RFC 4180 Example

```csv
firstName,lastName,email
"Smith, John",Smith,john@example.com
Jane,Doe,jane@example.com
"Cooper, Andy",Cooper,andy@company.com
```
Note that quoted fields containing commas are handled correctly. See RFC4180_MIGRATION_GUIDE.md for breaking changes and migration details.
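To make the quote rules concrete, here is a minimal standalone parser for a single CSV record. This is an illustrative sketch of the RFC 4180 quoting rules only, not the library's implementation:

```js
// Minimal single-record RFC 4180 field parser (illustrative sketch,
// not the library's implementation). Handles quoted fields containing
// delimiters, and "" inside quotes as an escaped literal quote.
function parseRecord(line, delimiter = ',') {
  const fields = [];
  let field = '';
  let inQuotes = false;
  for (let i = 0; i < line.length; i++) {
    const ch = line[i];
    if (inQuotes) {
      if (ch === '"' && line[i + 1] === '"') {
        field += '"'; // "" inside quotes -> one literal quote
        i++;
      } else if (ch === '"') {
        inQuotes = false; // closing quote
      } else {
        field += ch;
      }
    } else if (ch === '"') {
      inQuotes = true; // opening quote
    } else if (ch === delimiter) {
      fields.push(field); // field boundary
      field = '';
    } else {
      field += ch;
    }
  }
  fields.push(field);
  return fields;
}

parseRecord('"Smith, John",Smith,john@example.com');
// -> ['Smith, John', 'Smith', 'john@example.com']
```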
## Quick Start

### Installation

```bash
npm install convert-csv-to-json
```
### Synchronous (Simple)

```js
const csvToJson = require('convert-csv-to-json');
const json = csvToJson.getJsonFromCsv('input.csv');
```
### Asynchronous (Modern)

```js
const csvToJson = require('convert-csv-to-json');
// Inside an async function:
const json = await csvToJson.getJsonFromCsvAsync('input.csv');
```
### Browser

```js
const convert = require('convert-csv-to-json');
const json = await convert.browser.parseFile(file);
```
## Documentation

## Common Tasks
### Parse CSV String

```js
const json = csvToJson.csvStringToJson('name,age\nAlice,30');
```
### Custom Delimiter

```js
const json = csvToJson
  .fieldDelimiter(';')
  .getJsonFromCsv('input.csv');
```
### Format Values

```js
const json = csvToJson
  .formatValueByType()
  .getJsonFromCsv('input.csv');
```
### Handle Quoted Fields

```js
const json = csvToJson
  .supportQuotedField(true)
  .getJsonFromCsv('input.csv');
```
### Batch Process Files (Async)

```js
// Inside an async function:
const files = ['file1.csv', 'file2.csv', 'file3.csv'];
const results = await Promise.all(
  files.map((f) => csvToJson.getJsonFromCsvAsync(f))
);
```
## Configuration Options

All APIs (Sync, Async, and Browser) support the same configuration methods:
- `fieldDelimiter(char)` - Set the field delimiter (default: `,`)
- `formatValueByType()` - Auto-convert numbers and booleans
- `supportQuotedField(bool)` - Handle quoted fields with embedded delimiters
- `indexHeader(num)` - Specify the header row (default: `0`)
- `trimHeaderFieldWhiteSpace(bool)` - Remove whitespace from headers
- `parseSubArray(delim, sep)` - Parse delimited sub-arrays
- `mapRows(fn)` - Transform, filter, or enrich each row
- `getJsonFromStreamAsync(stream)` - Process CSV from Readable streams (Node.js and browser)
- `getJsonFromFileStreamingAsync(filePath)` - Stream processing for large files (Node.js and browser)
- `getJsonFromFileStreamingAsyncWithCallback(filePath, options = {})` - Parse CSV using streaming with progress callbacks for large files
- `utf8Encoding()`, `latin1Encoding()`, etc. - Set file encoding
## Examples
### `fieldDelimiter(char)` - Set field delimiter (default: `,`)

```js
csvToJson.fieldDelimiter(';').getJsonFromCsv('data.csv');
csvToJson.fieldDelimiter('\t').getJsonFromCsv('data.tsv');
csvToJson.fieldDelimiter('|').getJsonFromCsv('data.psv');
```
### `formatValueByType()` - Auto-convert numbers and booleans

```js
csvToJson.formatValueByType().getJsonFromCsv('data.csv');
```
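To illustrate the idea, here is a sketch of the kind of per-value coercion a type-formatting option applies. This is illustrative only, not the library's actual implementation:

```js
// Sketch of value coercion: numeric strings become numbers,
// 'true'/'false' become booleans, everything else stays a string.
// (Illustrative only -- not the library's internals.)
function coerce(value) {
  if (value === 'true') return true;
  if (value === 'false') return false;
  if (value.trim() !== '' && !Number.isNaN(Number(value))) {
    return Number(value);
  }
  return value;
}

coerce('30');    // -> 30 (number)
coerce('true');  // -> true (boolean)
coerce('Alice'); // -> 'Alice' (unchanged string)
```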
### `supportQuotedField(bool)` - Handle quoted fields with embedded delimiters

```js
csvToJson.supportQuotedField(true).getJsonFromCsv('data.csv');
```

### `indexHeader(num)` - Specify the header row (default: `0`)

```js
csvToJson.indexHeader(2).getJsonFromCsv('data.csv');
```

### `trimHeaderFieldWhiteSpace(bool)` - Remove whitespace from headers

```js
csvToJson.trimHeaderFieldWhiteSpace(true).getJsonFromCsv('data.csv');
```
### `parseSubArray(delim, sep)` - Parse delimited sub-arrays

```js
csvToJson.parseSubArray('*', ',').getJsonFromCsv('data.csv');
```
### `mapRows(fn)` - Transform, filter, or enrich each row

```js
// Keep only rows where age >= 30; returning null drops the row.
const result = csvToJson
  .fieldDelimiter(',')
  .mapRows((row) => {
    if (parseInt(row.age, 10) >= 30) {
      return row;
    }
    return null;
  })
  .getJsonFromCsv('input.csv');
```

See the mapRows Feature - Usage Guide.
### `utf8Encoding()`, `latin1Encoding()`, etc. - Set file encoding

```js
csvToJson.utf8Encoding().getJsonFromCsv('data.csv');
csvToJson.latin1Encoding().getJsonFromCsv('data.csv');
csvToJson.customEncoding('ucs2').getJsonFromCsv('data.csv');
```
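The encoding setting matters because the same bytes decode to different text under different encodings. A standalone Node.js illustration (using only `Buffer`, not the library):

```js
// The bytes for "José" encoded as latin1. Decoding them as utf8
// instead yields a replacement character, because a lone 0xE9 byte
// is not valid UTF-8.
const bytes = Buffer.from([0x4a, 0x6f, 0x73, 0xe9]);

bytes.toString('latin1'); // -> 'José' (correct)
bytes.toString('utf8');   // -> 'Jos\uFFFD' (garbled)
```

Pick the encoding that matches how the CSV file was written, or accented characters will be corrupted silently.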
### `getJsonFromStreamAsync(stream)` - Process CSV from Readable streams

```js
const fs = require('fs');
const csvToJson = require('convert-csv-to-json');

async function processLargeCSV() {
  const stream = fs.createReadStream('large-dataset.csv');
  const jsonData = await csvToJson
    .fieldDelimiter(';')
    .supportQuotedField(true)
    .getJsonFromStreamAsync(stream);
  console.log(`Processed ${jsonData.length} records efficiently`);
  return jsonData;
}
```
### `getJsonFromFileStreamingAsync(filePath)` - Stream processing for large files

```js
const csvToJson = require('convert-csv-to-json');

async function processLargeCSV(filePath) {
  const jsonData = await csvToJson
    .fieldDelimiter(',')
    .formatValueByType()
    .getJsonFromFileStreamingAsync(filePath);
  console.log(`Streamed and processed ${jsonData.length} records`);
  return jsonData;
}

// Inside an async context:
const data = await processLargeCSV('massive-dataset.csv');
```
### `getJsonFromFileStreamingAsyncWithCallback(filePath, options = {})` - Parse CSV from a File object using streaming with progress callbacks

```js
const csvToJson = require('convert-csv-to-json');

const fileInput = document.querySelector('#csvfile').files[0];

csvToJson.browser.getJsonFromFileStreamingAsyncWithCallback(fileInput, {
  chunkSize: 500,
  onChunk: (rows, processed, total) => {
    console.log(`Processed ${processed}/${total} rows`);
  },
  onComplete: (allRows) => {
    console.log('Processing complete!');
  },
  onError: (error) => {
    console.error('Error:', error);
  },
});
```
See SYNC.md, ASYNC.md, or BROWSER.md for complete configuration details.
## Example: Complete Workflow

```js
const csvToJson = require('convert-csv-to-json');

async function processCSV() {
  const data = await csvToJson
    .fieldDelimiter(',')
    .formatValueByType()
    .supportQuotedField(true)
    .getJsonFromCsvAsync('data.csv');
  console.log(`Parsed ${data.length} records`);
  return data;
}
```
## Migration Guides
## Development

Install dependencies:

```bash
npm install
```

Run tests:

```bash
npm test
```

Debug tests:

```bash
npm run test-debug
```
### CI/CD GitHub Action

See CI/CD GitHub Action.
## Release

When pushing to the master branch:

- Include `[MAJOR]` in the commit message for a major release (e.g., v1.0.0 → v2.0.0)
- Include `[PATCH]` in the commit message for a patch release (e.g., v1.0.0 → v1.0.1)
- A minor release is applied by default (e.g., v1.0.0 → v1.1.0)
## License

CSVtoJSON is licensed under the MIT License.
## Support

Found a bug or need a feature? Open an issue on GitHub.
Follow me and consider starring the project to show your support ⭐
## Buy Me a Coffee
If you find this project helpful and would like to support its development:
BTC: 37vdjQhbaR7k7XzhMKWzMcnqUxfw1njBNk