
convert-csv-to-json
Convert CSV files to JSON with no dependencies. Supports Node.js (Sync & Async) and Browser environments with full RFC 4180 compliance. Memory-efficient streaming processes large files without loading them entirely into memory.
Transform CSV data into JSON with a simple, chainable API. Choose your implementation style:
✅ RFC 4180 Compliant - Proper handling of quoted fields, delimiters, newlines, and escape sequences
✅ Zero Dependencies - No external packages required
✅ Full TypeScript Support - Included type definitions for all APIs
✅ Flexible Configuration - Custom delimiters, encoding, trimming, and more
✅ Method Chaining - Fluent API for readable code
✅ Memory-Efficient Streaming - Process large files without loading them entirely into memory
✅ Comprehensive Error Handling - Detailed, actionable error messages with solutions (see ERROR_HANDLING.md)
RFC 4180 is the IETF standard specification for CSV (Comma-Separated Values) files. This library is fully compliant with RFC 4180, ensuring proper handling of:
| Aspect | RFC 4180 Specification |
|---|---|
| Default Delimiter | Comma (,) |
| Record Delimiter | CRLF (\r\n) or LF (\n) |
| Quote Character | Double-quote (") |
| Quote Escaping | Double quotes ("") |
```csv
firstName,lastName,email
"Smith, John",Smith,john@example.com
Jane,Doe,jane@example.com
"Cooper, Andy",Cooper,andy@company.com
```
Note that the quoted fields containing commas are handled correctly. See RFC4180_MIGRATION_GUIDE.md for breaking changes and migration details.
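To illustrate the quoting rules in the table above, here is a minimal sketch of RFC 4180 field splitting. It is an illustration only, not the library's internal implementation: it handles quoted fields containing the delimiter and `""` as an escaped double quote.

```javascript
// Split one CSV line into fields per RFC 4180 quoting rules (sketch).
function splitCsvLine(line, delimiter = ',') {
  const fields = [];
  let current = '';
  let inQuotes = false;
  for (let i = 0; i < line.length; i++) {
    const ch = line[i];
    if (inQuotes) {
      if (ch === '"') {
        if (line[i + 1] === '"') {
          current += '"'; // "" inside quotes is an escaped double quote
          i++;
        } else {
          inQuotes = false; // closing quote
        }
      } else {
        current += ch; // delimiters and newlines are literal inside quotes
      }
    } else if (ch === '"') {
      inQuotes = true; // opening quote
    } else if (ch === delimiter) {
      fields.push(current);
      current = '';
    } else {
      current += ch;
    }
  }
  fields.push(current);
  return fields;
}

console.log(splitCsvLine('"Smith, John",Smith,john@example.com'));
// → [ 'Smith, John', 'Smith', 'john@example.com' ]
```

A full parser must also handle quoted fields that span line breaks, which is why a compliant library is preferable to ad-hoc `split(',')` calls.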
```bash
npm install convert-csv-to-json
```
```javascript
// Sync
const csvToJson = require('convert-csv-to-json');
const json = csvToJson.getJsonFromCsv('input.csv');
```

```javascript
// Async
const csvToJson = require('convert-csv-to-json');
const json = await csvToJson.getJsonFromCsvAsync('input.csv');
```

```javascript
// Browser
const convert = require('convert-csv-to-json');
const json = await convert.browser.parseFile(file);
```
| Implementation | Use Case | Learn More |
|---|---|---|
| Sync API | Simple, blocking operations | Read SYNC.md |
| Async API | Concurrent operations, large files | Read ASYNC.md |
| Browser API | Client-side file parsing | Read BROWSER.md |
```javascript
// Parse a CSV string directly
const json = csvToJson.csvStringToJson('name,age\nAlice,30');
```

```javascript
// Custom field delimiter
const json = csvToJson
  .fieldDelimiter(';')
  .getJsonFromCsv('input.csv');
```

```javascript
// Auto-convert value types
const json = csvToJson
  .formatValueByType()
  .getJsonFromCsv('input.csv');
// Converts "30" → 30, "true" → true, etc.
```

```javascript
// Quoted-field support
const json = csvToJson
  .supportQuotedField(true)
  .getJsonFromCsv('input.csv');
```

```javascript
// Process multiple files concurrently
const files = ['file1.csv', 'file2.csv', 'file3.csv'];
const results = await Promise.all(
  files.map((f) => csvToJson.getJsonFromCsvAsync(f))
);
```
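With many files, `Promise.all` starts every conversion at once, which can exhaust file handles. A common alternative is a small concurrency limiter; this is a generic sketch (not part of the library's API), where `worker` stands in for a call like `getJsonFromCsvAsync`:

```javascript
// Run `worker` over `items` with at most `limit` tasks in flight.
async function mapWithLimit(items, limit, worker) {
  const results = new Array(items.length);
  let next = 0;
  async function run() {
    while (next < items.length) {
      const i = next++; // claim the next index before awaiting
      results[i] = await worker(items[i]);
    }
  }
  // Start `limit` workers that drain the shared queue.
  await Promise.all(Array.from({ length: Math.min(limit, items.length) }, run));
  return results;
}

mapWithLimit([1, 2, 3, 4], 2, async (x) => x * 2).then(console.log);
// → [ 2, 4, 6, 8 ]
```

Because JavaScript is single-threaded, the `next++` claim needs no locking; each worker simply pulls the next unclaimed index after its previous `await` resolves.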
All APIs (Sync, Async and Browser) support the same configuration methods:
- `fieldDelimiter(char)` - Set field delimiter (default: `,`)
- `formatValueByType()` - Auto-convert numbers and booleans
- `supportQuotedField(bool)` - Handle quoted fields with embedded delimiters
- `indexHeader(num)` - Specify header row (default: 0)
- `trimHeaderFieldWhiteSpace(bool)` - Remove spaces from headers
- `parseSubArray(delim, sep)` - Parse delimited arrays
- `mapRows(fn)` - Transform, filter, or enrich each row
- `getJsonFromStreamAsync(stream)` - Process CSV from Readable streams (Node.js and Browser)
- `getJsonFromFileStreamingAsync(filePath)` - Stream processing for large files (Node.js and Browser)
- `getJsonFromFileStreamingAsyncWithCallback(filePath, options = {})` - Parse CSV from a file using streaming with progress callbacks for large files
- `utf8Encoding()`, `latin1Encoding()`, etc. - Set file encoding

**`fieldDelimiter(char)`** - Set field delimiter (default: `,`)

```javascript
// Semicolon-delimited
csvToJson.fieldDelimiter(';').getJsonFromCsv('data.csv');

// Tab-delimited
csvToJson.fieldDelimiter('\t').getJsonFromCsv('data.tsv');

// Pipe-delimited
csvToJson.fieldDelimiter('|').getJsonFromCsv('data.psv');
```
**`formatValueByType()`** - Auto-convert numbers and booleans

```javascript
// Input: name,age,active
//        John,30,true
csvToJson.formatValueByType().getJsonFromCsv('data.csv');
// Output: { name: 'John', age: 30, active: true }
```
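The kind of coercion shown above can be sketched as follows. This is an illustration of the documented behavior, not the library's actual rules; consult the docs for edge cases such as empty strings or leading zeros:

```javascript
// Coerce a CSV string value to boolean or number where possible (sketch).
function coerce(value) {
  if (value === 'true') return true;
  if (value === 'false') return false;
  // Non-empty strings that parse as numbers become numbers.
  if (value.trim() !== '' && !Number.isNaN(Number(value))) return Number(value);
  return value;
}

console.log(coerce('30'));   // → 30
console.log(coerce('true')); // → true
console.log(coerce('John')); // → 'John'
```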
**`supportQuotedField(bool)`** - Handle quoted fields with embedded delimiters

```javascript
// Input: name,description
//        "Smith, John","He said ""Hello"""
csvToJson.supportQuotedField(true).getJsonFromCsv('data.csv');
// Output: { name: 'Smith, John', description: 'He said "Hello"' }
```
**`indexHeader(num)`** - Specify header row (default: 0)

```javascript
// If headers are in row 2 (the 3rd line):
csvToJson.indexHeader(2).getJsonFromCsv('data.csv');
```
**`trimHeaderFieldWhiteSpace(bool)`** - Remove spaces from headers

```javascript
// Input: " First Name ", " Last Name "
csvToJson.trimHeaderFieldWhiteSpace(true).getJsonFromCsv('data.csv');
// Output: { FirstName: 'John', LastName: 'Doe' }
```
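Judging from the documented output, the cleanup removes both surrounding and internal whitespace so that `" First Name "` becomes a valid identifier-like key. A sketch of that normalization (an assumption about the behavior, not the library's code):

```javascript
// Strip surrounding and internal whitespace from a header field (sketch).
function normalizeHeader(header) {
  return header.trim().replace(/\s+/g, '');
}

console.log([' First Name ', ' Last Name '].map(normalizeHeader));
// → [ 'FirstName', 'LastName' ]
```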
**`parseSubArray(delim, sep)`** - Parse delimited arrays

```javascript
// Input: name,tags
//        John,*javascript,nodejs,typescript*
csvToJson.parseSubArray('*', ',').getJsonFromCsv('data.csv');
// Output: { name: 'John', tags: ['javascript', 'nodejs', 'typescript'] }
```
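The sub-array convention above can be sketched as a standalone helper: a value wrapped in the array delimiter (`*` here) is split on the separator (`,`) into an array, and anything else is left untouched. This is a hypothetical illustration, not the library's internals:

```javascript
// Turn "*a,b,c*" into ['a', 'b', 'c']; pass other values through (sketch).
function parseSubArrayValue(value, delim = '*', sep = ',') {
  if (value.length > 2 * delim.length &&
      value.startsWith(delim) && value.endsWith(delim)) {
    return value.slice(delim.length, -delim.length).split(sep);
  }
  return value;
}

console.log(parseSubArrayValue('*javascript,nodejs,typescript*'));
// → [ 'javascript', 'nodejs', 'typescript' ]
console.log(parseSubArrayValue('plain'));
// → 'plain'
```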
**`mapRows(fn)`** - Transform, filter, or enrich each row

```javascript
// Filter out rows that don't match a condition
const result = csvToJson
  .fieldDelimiter(',')
  .mapRows((row) => {
    // Only keep rows where age >= 30
    if (parseInt(row.age) >= 30) {
      return row;
    }
    return null; // Filters out this row
  })
  .getJsonFromCsv('input.csv');
```

See mapRows Feature - Usage Guide.
**`utf8Encoding()`, `latin1Encoding()`, etc.** - Set file encoding

```javascript
// UTF-8 encoding
csvToJson.utf8Encoding().getJsonFromCsv('data.csv');

// Latin-1 encoding
csvToJson.latin1Encoding().getJsonFromCsv('data.csv');

// Custom encoding
csvToJson.customEncoding('ucs2').getJsonFromCsv('data.csv');
```
**`getJsonFromStreamAsync(stream)`** - Process CSV from Readable streams

```javascript
const fs = require('fs');
const csvToJson = require('convert-csv-to-json');

// Process large files without loading them entirely into memory
async function processLargeCSV() {
  const stream = fs.createReadStream('large-dataset.csv');
  const jsonData = await csvToJson
    .fieldDelimiter(';')
    .supportQuotedField(true)
    .getJsonFromStreamAsync(stream);
  console.log(`Processed ${jsonData.length} records efficiently`);
  return jsonData;
}
```
**`getJsonFromFileStreamingAsync(filePath)`** - Stream processing for large files

```javascript
const csvToJson = require('convert-csv-to-json');

// Most efficient way to process large CSV files
async function processLargeCSV(filePath) {
  const jsonData = await csvToJson
    .fieldDelimiter(',')
    .formatValueByType()
    .getJsonFromFileStreamingAsync(filePath);
  console.log(`Streamed and processed ${jsonData.length} records`);
  return jsonData;
}

// Usage - handles files of any size without memory constraints
const data = await processLargeCSV('massive-dataset.csv');
```
**`getJsonFromFileStreamingAsyncWithCallback(filePath, options = {})`** - Parse CSV from a File object using streaming with progress callbacks for large files

```javascript
const csvToJson = require('convert-csv-to-json');

const fileInput = document.querySelector('#csvfile').files[0];

csvToJson.browser.getJsonFromFileStreamingAsyncWithCallback(fileInput, {
  chunkSize: 500,
  onChunk: (rows, processed, total) => {
    console.log(`Processed ${processed}/${total} rows`);
    // Handle chunk of rows here
  },
  onComplete: (allRows) => {
    console.log('Processing complete!');
  },
  onError: (error) => {
    console.error('Error:', error);
  }
});
```
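The chunk-and-callback pattern itself is easy to demonstrate in isolation. This standalone sketch mirrors the shape of the options object above (the names are borrowed from the documented API, but the function is a hypothetical illustration, not the library's code):

```javascript
// Deliver rows in fixed-size batches, reporting progress after each batch.
function processInChunks(rows, { chunkSize = 500, onChunk, onComplete } = {}) {
  for (let start = 0; start < rows.length; start += chunkSize) {
    const chunk = rows.slice(start, start + chunkSize);
    const processed = Math.min(start + chunkSize, rows.length);
    if (onChunk) onChunk(chunk, processed, rows.length);
  }
  if (onComplete) onComplete(rows);
}

processInChunks([1, 2, 3, 4, 5], {
  chunkSize: 2,
  onChunk: (chunk, processed, total) =>
    console.log(`Processed ${processed}/${total} rows`),
  onComplete: () => console.log('Processing complete!'),
});
```

Batching keeps the UI responsive in the browser: work arrives in small pieces, so a progress bar can update between chunks instead of freezing until the whole file is parsed.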
See SYNC.md, ASYNC.md or BROWSER.md for complete configuration details.
```javascript
const csvToJson = require('convert-csv-to-json');

async function processCSV() {
  const data = await csvToJson
    .fieldDelimiter(',')
    .formatValueByType()
    .supportQuotedField(true)
    .getJsonFromCsvAsync('data.csv');
  console.log(`Parsed ${data.length} records`);
  return data;
}
```
Install dependencies:

```bash
npm install
```

Run tests:

```bash
npm test
```

Debug tests:

```bash
npm run test-debug
```
See CI/CD GitHub Action.
When pushing to the master branch:
- `[MAJOR]` in the commit message for a major release (e.g., v1.0.0 → v2.0.0)
- `[PATCH]` in the commit message for a patch release (e.g., v1.0.0 → v1.0.1)

CSVtoJSON is licensed under the MIT License.
Found a bug or need a feature? Open an issue on GitHub.
Follow me and consider starring the project to show your support ⭐
If you find this project helpful and would like to support its development:
BTC: 37vdjQhbaR7k7XzhMKWzMcnqUxfw1njBNk