
csv-to-array-browser
Parse CSV files into JavaScript objects — in the browser and Node.js — with zero dependencies.
- Accepts File, Blob, ArrayBuffer, Buffer, or a plain string
- Handles quoted fields (`"a, b"`), escaped quotes (`""`), and CR/LF line endings
- Auto-detects the delimiter (`,`, `;`, `\t`, `|`), or you can set one explicitly
- Optional type inference (`"42"` → `42`, `"true"` → `true`, `"null"` → `null`)

Install:

```bash
npm i csv-to-array-browser
```
In the browser:

```html
<input id="file" type="file" accept=".csv" />
```

```js
import { parseCSV } from "csv-to-array-browser";

document.querySelector("#file").addEventListener("change", async (e) => {
  const file = e.target.files[0];
  const rows = await parseCSV(file); // see the options table below
  console.log(rows);
});
```
In Node.js:

```js
import { readFileSync } from "node:fs";
import { parseCSV } from "csv-to-array-browser";

const buf = readFileSync("./example.csv"); // Buffer
const rows = await parseCSV(buf, { inferTypes: true });
console.log(rows);
```
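Remote CSV can be handled the same way, since plain strings and ArrayBuffers are accepted inputs. A minimal sketch (the URL is a placeholder):

```js
import { parseCSV } from "csv-to-array-browser";

// Placeholder URL, for illustration only.
const res = await fetch("https://example.com/data.csv");
const rows = await parseCSV(await res.arrayBuffer()); // or: await res.text()
console.log(rows);
```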
`parseCSV(input, options?)`

**input**: `File | Blob | ArrayBuffer | Buffer | string`

Works with browser files, Node Buffers, raw ArrayBuffers, or plain CSV strings.

**options** (optional)
| Option | Type | Default | Description |
|---|---|---|---|
| `delimiter` | `string` | auto | Field delimiter. If omitted, the parser tries `,`, `;`, `\t`, and `\|`, picking the best fit. |
| `headers` | `boolean \| string[]` | `true` | Header handling. `true`: first row is headers. `false`: synthesize headers (`c0`, `c1`, …). `string[]`: use these headers and skip the first CSV row. |
| `preserveHeaderCase` | `boolean` | `false` | When `false`, headers are trimmed and lowercased. When `true`, original casing is kept. |
| `comment` | `string` | `undefined` | Lines starting with this string (e.g. `"#"`) are skipped. |
| `skipEmptyLines` | `boolean` | `true` | Drop empty rows. |
| `defval` | `unknown` | `null` | Value used when a row has fewer cells than headers. |
| `inferTypes` | `boolean` | `false` | Coerce `"true"`/`"false"` → booleans, `"null"` → `null`, number-like strings → numbers; otherwise leave values as strings. |
| `trimCells` | `boolean` | `true` | Trim whitespace from each cell. |
The parser automatically strips a UTF-8 BOM at the start of the file.
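A combined sketch of options that don't have dedicated examples below (`preserveHeaderCase`, `skipEmptyLines`, `defval`); the expected output assumes the option semantics described in the table above:

```js
import { parseCSV } from "csv-to-array-browser";

const csv = `Name,Team
Alice,Platform

Bob`;

await parseCSV(csv, {
  preserveHeaderCase: true, // keep "Name"/"Team" instead of lowercasing
  skipEmptyLines: true,     // drop the blank line
  defval: "n/a",            // Bob's missing Team cell becomes "n/a"
});
// Expected:
// [
//   { Name: "Alice", Team: "Platform" },
//   { Name: "Bob", Team: "n/a" }
// ]
```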
```js
// Basic parsing: the header row becomes object keys, values stay strings.
const csv = `name,age,city
Alice,30,Paris
Bob,28,Berlin`;

await parseCSV(csv);
// [
//   { name: "Alice", age: "30", city: "Paris" },
//   { name: "Bob", age: "28", city: "Berlin" }
// ]
```
```js
// Type inference: "30" → 30, "true"/"false" → booleans.
const csv = `name,age,active
Alice,30,true
Bob,28,false`;

await parseCSV(csv, { inferTypes: true });
// [
//   { name: "Alice", age: 30, active: true },
//   { name: "Bob", age: 28, active: false }
// ]
```
```js
// Explicit delimiter.
const csv = `name;age;city
Alice;30;Paris`;

await parseCSV(csv, { delimiter: ";" });
// [{ name: "Alice", age: "30", city: "Paris" }]
```
```js
// Custom headers: the provided names replace the first CSV row
// (and are lowercased unless preserveHeaderCase is set).
const csv = `a,b
1,2`;

await parseCSV(csv, { headers: ["X", "Y"] });
// [{ x: "1", y: "2" }]
```
```js
// No header row: keys are synthesized as c0, c1, …
const csv = `a,b
1,2`;

await parseCSV(csv, { headers: false });
// [
//   { c0: "a", c1: "b" },
//   { c0: "1", c1: "2" }
// ]
```
```js
// Quoted fields and escaped quotes.
const csv = `name,quote
"Alice, Jr.","She said ""Hi"""`;

await parseCSV(csv);
// [{ name: "Alice, Jr.", quote: 'She said "Hi"' }]
```
```js
// Comment lines and CR/LF line endings.
const csv = `# header
name,age\r\nAlice,30\r\nBob,28\r\n`;

await parseCSV(csv, { comment: "#" });
// [
//   { name: "Alice", age: "30" },
//   { name: "Bob", age: "28" }
// ]
```
```js
// Disable cell trimming: surrounding whitespace is kept.
const csv = `# header
name,age\r\n Alice ,30\r\nBob,28\r\n`;

await parseCSV(csv, { comment: "#", trimCells: false });
// [
//   { name: " Alice ", age: "30" },
//   { name: "Bob", age: "28" }
// ]
```
**How large a file can it handle?** For typical UX (uploads under roughly 10–20 MB) it works well. For very large files or progressive processing, consider a streaming approach (Node: csv-parse; browser: Web Streams + a Worker), as sketched below.
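Assuming the separate csv-parse package (not part of this library), a minimal Node streaming sketch:

```js
import { createReadStream } from "node:fs";
import { parse } from "csv-parse";

// Stream rows one at a time instead of buffering the whole file in memory.
const parser = createReadStream("./big.csv").pipe(
  parse({ columns: true, skip_empty_lines: true })
);

for await (const record of parser) {
  console.log(record); // one object per row, keyed by the header row
}
```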
**How is the delimiter auto-detected?** The parser samples the first lines and tries the common delimiters (`,`, `;`, `\t`, `|`), picking the one that yields the most consistent, widest table.
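For intuition only (this is not the library's actual implementation), detection along those lines can be sketched as:

```js
// Simplified sketch: ignores quoted fields and other edge cases.
function guessDelimiter(text, candidates = [",", ";", "\t", "|"]) {
  const lines = text.split(/\r?\n/).filter((l) => l.trim() !== "").slice(0, 10);
  if (lines.length === 0) return ",";
  let best = ",";
  let bestScore = -1;
  for (const d of candidates) {
    const counts = lines.map((l) => l.split(d).length - 1);
    if (Math.min(...counts) === 0) continue; // must appear on every sampled line
    const consistent = counts.every((c) => c === counts[0]);
    const score = (consistent ? 1000 : 0) + counts[0]; // consistency first, then width
    if (score > bestScore) {
      bestScore = score;
      best = d;
    }
  }
  return best;
}
```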
**Are dates parsed?** Not automatically. You can enable `inferTypes` for booleans/numbers/nulls and then post-process date fields as needed.
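For example, a minimal sketch with a hypothetical `joined` date column:

```js
import { parseCSV } from "csv-to-array-browser";

// "joined" is a made-up column name, used only for illustration.
const csv = `name,joined
Alice,2024-03-01`;

const rows = await parseCSV(csv, { inferTypes: true });
const withDates = rows.map((r) => ({ ...r, joined: new Date(r.joined) }));
// withDates[0].joined is now a Date instance
```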
```bash
npm install
npm run build
npm test
```

- `npm run build` → build dist files
- `npm test` → run unit tests (Vitest)
- `npm run pack:check` → preview the files that will be published