@fast-csv/parse
@fast-csv/parse is a powerful and flexible CSV parsing library for Node.js. It allows you to read and parse CSV files or strings with ease, providing a variety of options to handle different CSV formats and use cases.
Parsing CSV from a file
This feature allows you to parse a CSV file from the file system. The `headers: true` option indicates that the first row of the CSV file contains headers.
const fs = require('fs');
const { parse } = require('@fast-csv/parse');

fs.createReadStream('path/to/your.csv')
  .pipe(parse({ headers: true }))
  .on('data', row => console.log(row))
  .on('end', rowCount => console.log(`Parsed ${rowCount} rows`));
Parsing CSV from a string
This feature allows you to parse a CSV string directly. The `headers: true` option indicates that the first row of the CSV string contains headers.
const { parseString } = require('@fast-csv/parse');
const csvString = 'header1,header2\nvalue1,value2\nvalue3,value4';
parseString(csvString, { headers: true })
  .on('data', row => console.log(row))
  .on('end', rowCount => console.log(`Parsed ${rowCount} rows`));
Handling different delimiters
This feature allows you to parse CSV data with a custom delimiter. In this example, the delimiter is set to a semicolon (`;`).
const { parseString } = require('@fast-csv/parse');
const csvString = 'header1;header2\nvalue1;value2\nvalue3;value4';
parseString(csvString, { headers: true, delimiter: ';' })
  .on('data', row => console.log(row))
  .on('end', rowCount => console.log(`Parsed ${rowCount} rows`));
Transforming data during parsing
This feature allows you to transform data during parsing. The transform function is passed to the stream's `.transform()` method and applied to each row, allowing you to modify the data as it is being parsed.
const fs = require('fs');
const { parse } = require('@fast-csv/parse');

const transform = (row) => ({
  ...row,
  transformed: true
});

fs.createReadStream('path/to/your.csv')
  .pipe(parse({ headers: true }).transform(transform))
  .on('data', row => console.log(row))
  .on('end', rowCount => console.log(`Parsed ${rowCount} rows`));
csv-parser is a simple and fast CSV parsing library for Node.js. It is similar to @fast-csv/parse in terms of functionality but is often praised for its simplicity and ease of use. However, it may not offer as many advanced features and customization options as @fast-csv/parse.
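For comparison, here is a minimal csv-parser sketch of the same file-reading task; the `data.csv` path is illustrative. csv-parser reads headers from the first row by default and its `end` event carries no row count.
const fs = require('fs');
const csv = require('csv-parser');

fs.createReadStream('data.csv')
  .pipe(csv())
  .on('data', row => console.log(row))
  .on('end', () => console.log('Done'));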
PapaParse is a powerful CSV parsing library that works in both Node.js and the browser. It offers a wide range of features, including support for large files, web workers, and various parsing options. Compared to @fast-csv/parse, PapaParse is more versatile in terms of environment support but may have a steeper learning curve.
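A rough PapaParse equivalent of the string-parsing example above; note that PapaParse exposes a callback/return-value API rather than a Node stream, and returns results synchronously when given a string.
const Papa = require('papaparse');

const csvString = 'header1,header2\nvalue1,value2\nvalue3,value4';
const result = Papa.parse(csvString, { header: true });
console.log(result.data);   // array of row objects
console.log(result.errors); // any parse errors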
csv-parse is a part of the CSV module suite from the Node.js CSV project. It provides robust CSV parsing capabilities with a focus on performance and flexibility. While it offers similar functionality to @fast-csv/parse, it is part of a larger suite of CSV-related tools, which can be advantageous for more complex CSV handling needs.
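A minimal csv-parse sketch for the same task, using its callback API; note csv-parse uses `columns: true` where @fast-csv/parse uses `headers: true`.
const { parse } = require('csv-parse');

parse('header1,header2\nvalue1,value2', { columns: true }, (err, records) => {
  if (err) throw err;
  console.log(records); // [{ header1: 'value1', header2: 'value2' }]
});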
@fast-csv/parse
fast-csv package to parse CSVs.

npm i -S @fast-csv/parse

To use fast-csv in JavaScript you can require the module:
const csv = require('@fast-csv/parse');

To import with TypeScript:
import * as csv from '@fast-csv/parse';
Options
objectMode: {boolean} = true: Ensure that data events have an object emitted rather than the stringified version. Set to false to have a stringified buffer.
delimiter: {string} = ',': If your data uses an alternate delimiter such as ; or \t. NOTE: you may only pass in a single-character delimiter.
quote: {string} = '"': The character to use to quote fields that contain a delimiter, e.g. "first,name",last name. If you set this to null then all quoting will be ignored.
escape: {string} = '"': The character used to escape quotes inside of a quoted field, i.e. 'First,"Name"' => '"First,""Name"""'.
headers: {boolean|string[]|(string[]) => string[]} = false:
  If true, the first row will be parsed as headers.
  If a string[], the provided array will be used as the headers. If the file also contains a header row, set the renameHeaders option to true to replace it.
  If a function, the parsed headers will be passed to it and its return value used as the headers.
  NOTE: rows whose field count does not match the headers are emitted as data-invalid when strictColumnHandling is set to true.
renameHeaders: {boolean} = false: If you want the first line of the file to be removed and replaced by the one provided in the headers option. NOTE: only valid when the headers option is a string[]; if the headers option is a function then this option is always set to true.
ignoreEmpty: {boolean} = false: If you wish to ignore empty rows.
comment: {string} = null: If your CSV contains comments you can use this option to ignore lines that begin with the specified character (e.g. #).
discardUnmappedColumns: {boolean} = false: If you want to discard columns that do not map to a header.
strictColumnHandling: {boolean} = false: If you want to consider empty lines/lines with too few fields as invalid and emit a data-invalid event. NOTE: this option is only considered when headers are present.
trim: {boolean} = false: Set to true to trim all fields.
rtrim: {boolean} = false: Set to true to right trim all fields.
ltrim: {boolean} = false: Set to true to left trim all fields.
encoding: {string} = 'utf8': Passed to StringDecoder when decoding incoming buffers. Change if incoming content is not 'utf8' encoded.
maxRows: {number} = 0: If number is > 0, only the specified number of rows will be parsed (e.g. 100 would return the first 100 rows of data).
skipRows: {number} = 0: If number is > 0, the specified number of parsed rows will be skipped.
skipLines: {number} = 0: If number is > 0, the specified number of lines will be skipped.
Events
headers: Emitted when the headers are parsed. If the headers option is true, the emitted headers will be the parsed headers from the csv. If the headers option is false or the csv has no rows then the event WILL NOT be emitted.
data: Emitted when a record is parsed. NOTE: if objectMode is set to false then all rows will be a buffer with a JSON row.
data-invalid: Emitted when an invalid row is encountered: when a validate function is provided and an invalid row is encountered, or when strictColumnHandling is true and a row with a different number of fields than headers is encountered.
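To make these options and events concrete, here is a small sketch combining a few of them; the sample data and header names are made up for illustration.
const { parseString } = require('@fast-csv/parse');

const CSV_STRING = [
  '# nightly export',  // skipped via the comment option
  'first,last',
  '  bob , yukon ',    // surrounding whitespace removed via trim
  'only-one-field',    // too few fields: data-invalid via strictColumnHandling
].join('\n');

parseString(CSV_STRING, { headers: true, comment: '#', trim: true, strictColumnHandling: true })
  .on('headers', headers => console.log(`HEADERS=${headers}`))
  .on('data-invalid', row => console.warn(`INVALID=${JSON.stringify(row)}`))
  .on('data', row => console.log(row))
  .on('end', rowCount => console.log(`Parsed ${rowCount} rows`));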
csv.parse([options]): CsvParserStream
Creates a Csv Parsing Stream that can be piped or written to.
This is the main entrypoint and is used by all the other parsing helpers.
// creates a stream you can pipe
const stream = csv.parse();

stream
  .on('error', error => console.error(error))
  .on('data', row => console.log(row))
  .on('end', rowCount => console.log(`Parsed ${rowCount} rows`));
To pipe to the stream from a file you can do the following.
const fs = require('fs');
const csv = require('@fast-csv/parse');

fs.createReadStream('my.csv')
  .pipe(csv.parse())
  .on('error', error => console.error(error))
  .on('data', row => console.log(`ROW=${JSON.stringify(row)}`))
  .on('end', rowCount => console.log(`Parsed ${rowCount} rows`));
You can also use the readable event to pull rows from the parser instead of listening for data.
const fs = require('fs');
const csv = require('@fast-csv/parse');

const fileStream = fs.createReadStream('my.csv');
const parser = csv.parse();

fileStream
  .pipe(parser)
  .on('error', error => console.error(error))
  .on('readable', () => {
    for (let row = parser.read(); row; row = parser.read()) {
      console.log(`ROW=${JSON.stringify(row)}`);
    }
  })
  .on('end', (rowCount) => console.log(`Parsed ${rowCount} rows`));
csv.parseStream(readableStream[, options]): CsvParserStream
Accepts a readable stream and pipes it to a CsvParserStream.
const fs = require('fs');
const csv = require('@fast-csv/parse');

const stream = fs.createReadStream('./path/to/my.csv');

csv
  .parseStream(stream)
  .on('error', error => console.error(error))
  .on('data', row => console.log(row))
  .on('end', rowCount => console.log(`Parsed ${rowCount} rows`));
csv.parseFile(path[, options]): CsvParserStream
Parses a file from the specified path and returns the CsvParserStream.
const csv = require('@fast-csv/parse');

csv
  .parseFile('./path/to/my.csv')
  .on('error', error => console.error(error))
  .on('data', row => console.log(row))
  .on('end', rowCount => console.log(`Parsed ${rowCount} rows`));
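parseFile accepts the same options described above. For example, assuming the file has a header row, you could cap the read at the first 100 rows with maxRows:
const csv = require('@fast-csv/parse');

csv
  .parseFile('./path/to/my.csv', { headers: true, maxRows: 100 })
  .on('error', error => console.error(error))
  .on('data', row => console.log(row))
  .on('end', rowCount => console.log(`Parsed ${rowCount} rows`));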
csv.parseString(string[, options]): CsvParserStream
This method parses a string and returns the CsvParserStream.
const { EOL } = require('os');
const csv = require('@fast-csv/parse');

const CSV_STRING = [
  'a,b',
  'a1,b1',
  'a2,b2',
].join(EOL);

csv
  .parseString(CSV_STRING, { headers: true })
  .on('error', error => console.error(error))
  .on('data', row => console.log(row))
  .on('end', rowCount => console.log(`Parsed ${rowCount} rows`));
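The data-invalid event described above pairs with the stream's validate method. A minimal sketch; the rule rejecting rows whose a field equals 'a1' is made up for illustration.
const csv = require('@fast-csv/parse');

csv
  .parseString('a,b\na1,b1\na2,b2', { headers: true })
  .validate(row => row.a !== 'a1') // rows failing this predicate are emitted as data-invalid
  .on('error', error => console.error(error))
  .on('data-invalid', (row, rowNumber) => console.warn(`Invalid row #${rowNumber}: ${JSON.stringify(row)}`))
  .on('data', row => console.log(row))
  .on('end', rowCount => console.log(`Parsed ${rowCount} rows`));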
FAQs
fast-csv parsing package
We found that @fast-csv/parse demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago. It has 4 open source maintainers collaborating on the project.