csv-to-array-browser

Converts CSV data to a JavaScript array.

Latest version: 2.2.0 (npm)
Maintainers: 1

csv-to-array-browser

Parse CSV files into JavaScript objects — in the browser and Node.js — with zero dependencies.

  • ✅ Works with File, Blob, ArrayBuffer, Buffer, or plain string
  • ✅ Handles quotes ("a, b"), escaped quotes (""), CR/LF line endings
  • ✅ Auto-detects common delimiters (`,`, `;`, `\t`, `|`), or you can set one
  • ✅ Optional type coercion ("42" → 42, "true" → true, "null" → null)
  • ✅ Strips UTF-8 BOM so your first header isn’t corrupted
  • ✅ Tiny surface area, TypeScript-ready

📦 Installation

npm i csv-to-array-browser

🚀 Usage

Browser (with file input)

import { parseCSV } from "csv-to-array-browser";

document.querySelector("#file").addEventListener("change", async (e) => {
  const file = e.target.files[0];
  const rows = await parseCSV(file); // all options are optional; see the API section
  console.log(rows);
});

<input id="file" type="file" accept=".csv" />

Node.js (with Buffer/ArrayBuffer)

import { readFileSync } from "node:fs";
import { parseCSV } from "csv-to-array-browser";

const buf = readFileSync("./example.csv"); // Buffer
const rows = await parseCSV(buf, { inferTypes: true });

console.log(rows);

📖 API

parseCSV(input, options?)

  • input

    File | Blob | ArrayBuffer | Buffer | string

    (Works with browser files, Node Buffers, raw ArrayBuffers, or plain strings)

  • options (optional)

  • delimiter (string, default: auto): Field delimiter. If omitted, tries `,`, `;`, `\t`, and `|`, picking the best fit.
  • headers (boolean | string[], default: true): Header handling. true: first row is headers. false: synthesize headers (c0, c1, …). string[]: use these headers and skip the first CSV row.
  • preserveHeaderCase (boolean, default: false): When false, headers are trimmed and lowercased. When true, original casing is kept.
  • comment (string, default: undefined): Lines starting with this prefix (e.g. "#") are skipped.
  • skipEmptyLines (boolean, default: true): Drop empty rows.
  • defval (unknown, default: null): Value used when a row has fewer cells than headers.
  • inferTypes (boolean, default: false): Coerce "true"/"false" → boolean, "null" → null, and number-like strings → numbers; otherwise leave cells as strings.
  • trimCells (boolean, default: true): Trim whitespace around each cell.

The parser automatically strips a UTF-8 BOM at the start of the file.
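To see why that matters, here is a standalone sketch (not the library's internals) of what a UTF-8 BOM does to the first header when it is not stripped:

```javascript
// Standalone sketch (not the library's code): a UTF-8 BOM sticks to the
// first header name unless it is stripped before the header row is split.
const raw = "\uFEFF" + "name,age\nAlice,30";

// Without stripping, the first header is "\uFEFFname", not "name":
const dirtyHeader = raw.split("\n")[0].split(",")[0];

// Stripping the BOM first recovers the expected header:
const text = raw.charCodeAt(0) === 0xfeff ? raw.slice(1) : raw;
const cleanHeader = text.split("\n")[0].split(",")[0]; // "name"
```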

Examples

1) Basic CSV

const csv = `name,age,city
Alice,30,Paris
Bob,28,Berlin`;

await parseCSV(csv);
// [
//   { name: "Alice", age: "30", city: "Paris" },
//   { name: "Bob",   age: "28",  city: "Berlin" }
// ]

2) With type coercion

const csv = `name,age,active
Alice,30,true
Bob,28,false`;

await parseCSV(csv, { inferTypes: true });
// [
//   { name: "Alice", age: 30, active: true },
//   { name: "Bob",   age: 28, active: false }
// ]

3) Custom delimiter (semicolon)

const csv = `name;age;city
Alice;30;Paris`;

await parseCSV(csv, { delimiter: ";" });
// [{ name: "Alice", age: "30", city: "Paris" }]

4) Custom headers (replace the file’s header row)

const csv = `a,b
1,2`;

await parseCSV(csv, { headers: ["X", "Y"] });
// (headers are lowercased because preserveHeaderCase defaults to false)
// [{ x: "1", y: "2" }]

5) No headers (synthesize c0, c1, …)

const csv = `a,b
1,2`;

await parseCSV(csv, { headers: false });
// [
//   { c0: "a", c1: "b" },
//   { c0: "1", c1: "2" }
// ]

6) Quoted fields + escaped quotes

const csv = `name,quote
"Alice, Jr.","She said ""Hi"""`;

await parseCSV(csv);
// [{ name: "Alice, Jr.", quote: 'She said "Hi"' }]

7) Comments & CRLF

const csv = `# header
name,age\r\nAlice,30\r\nBob,28\r\n`;

await parseCSV(csv, { comment: "#" });
// [
//   { name: "Alice", age: "30" },
//   { name: "Bob", age: "28" }
// ]

8) Trim cells

const csv = `name,age\r\n Alice ,30\r\nBob,28\r\n`;

await parseCSV(csv, { trimCells: false });
// [
//   { name: " Alice ", age: "30" },
//   { name: "Bob", age: "28" }
// ]

FAQ

Q: How big a CSV can it handle?

For typical UX (uploads < ~10–20 MB) it’s great. For very large files or progressive processing, consider a streaming approach (Node: csv-parse; browser: Web Streams + Worker).
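As a rough illustration of the streaming route in Node, here is a hypothetical `streamRows` helper (not part of this package) that processes one line at a time instead of buffering the whole file. It uses a naive split with no quoted-field handling, so it only sketches the approach:

```javascript
import { createInterface } from "node:readline";
import { Readable } from "node:stream";

// Hypothetical helper (not part of this package): yield one row object per
// CSV line without loading the whole file into memory. Naive comma split:
// does NOT handle quoted fields.
async function* streamRows(input) {
  const rl = createInterface({ input, crlfDelay: Infinity });
  let headers;
  for await (const line of rl) {
    if (!line) continue;
    const cells = line.split(",");
    if (!headers) { headers = cells; continue; } // first line = headers
    yield Object.fromEntries(headers.map((h, i) => [h, cells[i]]));
  }
}

// Usage with any Readable, e.g. createReadStream("./big.csv"):
for await (const row of streamRows(Readable.from("name,age\nAlice,30\n"))) {
  console.log(row); // { name: "Alice", age: "30" }
}
```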

Q: How does auto delimiter detection work?

It samples the first lines and tries common delimiters (, ; \t |), picking the one that yields the most consistent, widest table.
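A simplified version of that heuristic might look like this (an illustration only, not the package's actual implementation):

```javascript
// Illustration only (not the package's actual code): sample the first lines,
// split on each candidate delimiter, and prefer the one that yields a
// consistent column count greater than 1, breaking ties by width.
function detectDelimiter(text, candidates = [",", ";", "\t", "|"]) {
  const lines = text.split(/\r?\n/).filter(Boolean).slice(0, 10);
  let best = ",";
  let bestScore = -1;
  for (const d of candidates) {
    const counts = lines.map((line) => line.split(d).length);
    const width = Math.min(...counts);
    if (width < 2) continue; // delimiter missing from at least one line
    const consistent = counts.every((c) => c === counts[0]);
    const score = (consistent ? 100 : 0) + width; // consistency first, then width
    if (score > bestScore) { bestScore = score; best = d; }
  }
  return best;
}

detectDelimiter("name;age\nAlice;30"); // ";"
```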

Q: Does it parse dates?

Not automatically. You can enable inferTypes for booleans/numbers/nulls and then post-process date fields as needed.
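For example, a date column can be converted in one pass after parsing (the rows and the `joined` field here are hypothetical):

```javascript
// Hypothetical rows, shaped like parseCSV output with inferTypes enabled:
const rows = [
  { name: "Alice", age: 30, joined: "2024-01-15" },
  { name: "Bob", age: 28, joined: "2023-11-02" },
];

// Post-process the date column into Date objects:
const withDates = rows.map((r) => ({ ...r, joined: new Date(r.joined) }));
```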

🛠 Development

Clone the repo and run:

npm install
npm run build
npm test

npm run build → Build dist files

npm test → Run unit tests (Vitest)

npm run pack:check → Preview files that will publish

Keywords

csv


Package last updated on 31 Aug 2025
