The readdirp npm package is a Node.js module that allows for reading the contents of directories recursively with a stream API. It provides a flexible and powerful way to filter, map, and reduce directory contents in an efficient manner. This package is particularly useful for tasks that involve file system operations, such as building file trees, searching for files with specific patterns, or processing files in batches.
Streaming directory contents
This feature allows you to stream the contents of a directory, filtering for specific file types (in this case, JavaScript files). It's useful for processing files as they are found.
const readdirp = require('readdirp');

readdirp('.', { fileFilter: '*.js' })
  .on('data', (entry) => {
    console.log(entry.path);
  })
  .on('end', () => console.log('Done'));
Promise API for directory reading
This feature provides a Promise-based API for reading directories, allowing for asynchronous file processing with better error handling and integration with async/await syntax.
const readdirp = require('readdirp');

readdirp.promise('.', { depth: 1, fileFilter: '*.js' })
  .then(files => {
    files.forEach(file => console.log(file.path));
  })
  .catch(error => console.error('Error:', error));
Custom filter and entry formatting
This feature demonstrates how to use custom filters for file selection and how to format the entries returned by readdirp. It's useful for more complex file selection criteria and custom output formatting.
const readdirp = require('readdirp');

const options = {
  fileFilter: (entry) => entry.basename.startsWith('test'),
  type: 'files',
  alwaysStat: true,
  depth: 2
};

readdirp('.', options)
  .on('data', (entry) => {
    console.log(`${entry.path} - ${entry.stats.size} bytes`);
  });
The 'glob' package provides pattern matching functionality to select files in directories. It's similar to readdirp in that it can be used to search for files, but it uses glob patterns instead of providing a stream API or Promise-based interface.
The 'fs-extra' package extends the built-in Node.js 'fs' module with additional file system methods, including recursive directory reading. While it offers similar functionality to readdirp, fs-extra provides a broader set of file system operations, making it more of a general-purpose library.
The 'walk' package is another Node.js module for recursively reading directory contents. It is similar to readdirp but focuses more on event-based directory walking. Compared to readdirp, 'walk' might offer a simpler API for some use cases but lacks the advanced filtering and mapping capabilities.
Recursive version of fs.readdir. Exposes a stream API.

var readdirp = require('readdirp')
  , path = require('path')
  , es = require('event-stream');
// print out all JavaScript files along with their size
var stream = readdirp({ root: path.join(__dirname), fileFilter: '*.js' });
stream
.on('warn', function (err) {
console.error('non-fatal error', err);
// optionally call stream.destroy() here in order to abort and cause 'close' to be emitted
})
.on('error', function (err) { console.error('fatal error', err); })
.pipe(es.mapSync(function (entry) {
return { path: entry.path, size: entry.stat.size };
}))
.pipe(es.stringify())
.pipe(process.stdout);
Meant to be one of the recursive versions of fs functions, e.g., like mkdirp.
npm install readdirp
var entryStream = readdirp(options)

Reads the given root recursively and returns a stream of entry infos.
Behaves as follows:

- emit('data'): passes an entry info whenever one is found
- emit('warn'): passes a non-fatal Error that prevents a file/directory from being processed (i.e., if it is inaccessible to the user)
- emit('error'): passes a fatal Error which also ends the stream (i.e., when illegal options were passed)
- emit('end'): called when all entries were found and no more will be emitted (i.e., we are done)
- emit('close'): called when the stream is destroyed via stream.destroy() (which could be useful if you want to manually abort even on a non-fatal error); at that point the stream is no longer readable and no more entries, warnings, or errors are emitted
- paused: initially, in order to allow pipe and on handlers to be connected before data or errors are emitted
- resumed: automatically during the next event loop

The following options are supported:

- root: path in which to start reading and recursing into subdirectories
- fileFilter: filter to include/exclude files found (see Filters for more)
- directoryFilter: filter to include/exclude directories found and to recurse into (see Filters for more)
- depth: depth at which to stop recursing even if more subdirectories are found
Has the following properties:

- parentDir: directory in which entry was found (relative to given root)
- fullParentDir: full path to parent directory
- name: name of the file/directory
- path: path to the file/directory (relative to given root)
- fullPath: full path to the file/directory found
- stat: built-in stat object

Example (assuming root was /User/dev/readdirp):

{
  parentDir    : 'test/bed/root_dir1',
  fullParentDir: '/User/dev/readdirp/test/bed/root_dir1',
  name         : 'root_dir1_subdir1',
  path         : 'test/bed/root_dir1/root_dir1_subdir1',
  fullPath     : '/User/dev/readdirp/test/bed/root_dir1/root_dir1_subdir1',
  stat         : [ ... ]
}
There are three different ways to specify filters for files and directories respectively:

- function: a function that takes an entry info as a parameter and returns true to include or false to exclude the entry
- glob string: a string (e.g., '*.js') which is matched using minimatch, so see its documentation for more information. Globstars (**) are not supported, since specifying a recursive pattern for an already recursive function doesn't make sense. Negated globs (as explained in the minimatch documentation) are allowed, e.g., '!*.txt' matches everything but text files.
- array of glob strings: patterns either need to be all inclusive or all exclusive (negated), otherwise an error is thrown. [ '*.json', '*.js' ] includes all JavaScript and JSON files. [ '!.git', '!node_modules' ] includes all directories except '.git' and 'node_modules'.

Directories that do not pass a filter will not be recursed into.
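The all-inclusive-or-all-exclusive rule for filter arrays can be modeled with a small predicate builder. This is a simplified sketch: real readdirp delegates pattern matching to minimatch, whereas here a pattern only supports a leading '*' (suffix match) so the example stays dependency-free, and buildFilter is an illustrative name, not readdirp's API:

```javascript
// Simplified model of readdirp's array-filter semantics.
// A pattern supports only a leading '*' (suffix match) here;
// real readdirp uses minimatch for full glob matching.
function matches(pattern, name) {
  return pattern.startsWith('*') ? name.endsWith(pattern.slice(1)) : name === pattern;
}

function buildFilter(patterns) {
  const negated = patterns.filter(p => p.startsWith('!'));
  if (negated.length !== 0 && negated.length !== patterns.length) {
    // Mixing inclusive and negated patterns is ambiguous, so it is rejected.
    throw new Error('Cannot mix negated with non-negated glob filters');
  }
  if (negated.length > 0) {
    // All exclusive: keep entries matching none of the negated patterns.
    return name => negated.every(p => !matches(p.slice(1), name));
  }
  // All inclusive: keep entries matching at least one pattern.
  return name => patterns.some(p => matches(p, name));
}
```

With this model, [ '*.js', '*.json' ] keeps only names ending in those suffixes, while [ '!.git', '!node_modules' ] keeps everything except those two names, matching the behavior described above.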
Although the stream API is recommended, readdirp also exposes a callback-based API:

readdirp(options, callback1 [, callback2])

If callback2 is given, callback1 functions as the fileProcessed callback, and callback2 as the allProcessed callback.
If only callback1 is given, it functions as the allProcessed callback.
The allProcessed callback has the signature function (err, res) { ... }; the fileProcessed callback has the signature function (entryInfo) { ... }.

// on('error', ..), on('warn', ..) and on('end', ..) handling omitted for brevity
var readdirp = require('readdirp');
// Glob file filter
readdirp({ root: './test/bed', fileFilter: '*.js' })
.on('data', function (entry) {
// do something with each JavaScript file entry
});
// Combined glob file filters
readdirp({ root: './test/bed', fileFilter: [ '*.js', '*.json' ] })
.on('data', function (entry) {
// do something with each JavaScript and Json file entry
});
// Combined negated directory filters
readdirp({ root: './test/bed', directoryFilter: [ '!.git', '!*modules' ] })
.on('data', function (entry) {
// do something with each file entry found outside '.git' or any modules directory
});
// Function directory filter
readdirp({ root: './test/bed', directoryFilter: function (di) { return di.name.length === 9; } })
.on('data', function (entry) {
// do something with each file entry found inside directories whose name has length 9
});
// Limiting depth
readdirp({ root: './test/bed', depth: 1 })
.on('data', function (entry) {
// do something with each file entry found up to 1 subdirectory deep
});
// callback api
readdirp(
{ root: '.' }
, function(fileInfo) {
// do something with file entry here
}
, function (err, res) {
// all done, move on or do final step for all file entries here
}
);
Try more examples by following the instructions on how to get going.
Demonstrates error and data handling by listening to events emitted from the readdirp stream.
Demonstrates error handling by listening to events emitted from the readdirp stream and how to pipe the data stream into another destination stream.
Very naive implementation of grep, for demonstration purposes only.
Shows how to pass callbacks in order to handle errors and/or data.
The readdirp tests will also give you a good idea of how things work.
FAQs
Recursive version of fs.readdir with streaming API.
The npm package readdirp receives 29,216,411 weekly downloads, classifying it as a popular package.
readdirp shows a healthy release cadence and project activity: its last version was released less than a year ago, and 2 open source maintainers collaborate on the project.