
Recursive version of fs.readdir. Exposes a stream API.
var readdirp = require('readdirp')
  , path = require('path')
  , es = require('event-stream');

// print out all JavaScript files along with their size
var stream = readdirp({ root: path.join(__dirname), fileFilter: '*.js' });

stream
  .on('warn', function (err) {
    console.error('non-fatal error', err);
    // optionally call stream.destroy() here in order to abort and cause 'close' to be emitted
  })
  .on('error', function (err) { console.error('fatal error', err); })
  .pipe(es.mapSync(function (entry) {
    return { path: entry.path, size: entry.stat.size };
  }))
  .pipe(es.stringify())
  .pipe(process.stdout);
Meant to be one of the recursive versions of the fs functions, similar to mkdirp.
npm install readdirp
var entryStream = readdirp (options)
Reads given root recursively and returns a stream of entry infos.
Behaves as follows:
emit('data') passes an entry info whenever one is found
emit('warn') passes a non-fatal Error that prevents a file/directory from being processed (i.e., if it is inaccessible to the user)
emit('error') passes a fatal Error which also ends the stream (i.e., when illegal options were passed)
emit('end') called when all entries were found and no more will be emitted (i.e., we are done)
emit('close') called when the stream is destroyed via stream.destroy() (which could be useful if you want to manually abort even on a non-fatal error) - at that point the stream is no longer readable and no more entries, warnings or errors are emitted
paused initially in order to allow pipe and on handlers to be connected before data or errors are emitted
resumed automatically during the next event loop

The given options are:
root: path in which to start reading and recursing into subdirectories
fileFilter: filter to include/exclude files found (see Filters for more)
directoryFilter: filter to include/exclude directories found and to recurse into (see Filters for more)
depth: depth at which to stop recursing even if more subdirectories are found
Each entry info has the following properties:
parentDir : directory in which entry was found (relative to given root)
fullParentDir : full path to parent directory
name : name of the file/directory
path : path to the file/directory (relative to given root)
fullPath : full path to the file/directory found
stat : built-in stat object
Example: (assuming root was /User/dev/readdirp)

{
  parentDir     : 'test/bed/root_dir1',
  fullParentDir : '/User/dev/readdirp/test/bed/root_dir1',
  name          : 'root_dir1_subdir1',
  path          : 'test/bed/root_dir1/root_dir1_subdir1',
  fullPath      : '/User/dev/readdirp/test/bed/root_dir1/root_dir1_subdir1',
  stat          : [ ... ]
}
There are three different ways to specify filters for files and directories respectively.
function: a function that takes an entry info as a parameter and returns true to include or false to exclude the entry
glob string: a string (e.g., *.js) which is matched using minimatch, so see the minimatch documentation for more information.
Globstars (**) are not supported since specifying a recursive pattern for an already recursive function doesn't make sense.
Negated globs (as explained in the minimatch documentation) are allowed, e.g., !*.txt matches everything but text files.
array of glob strings: the patterns either need to be all inclusive or all exclusive (negated); otherwise an error is thrown.
[ '*.json', '*.js' ] includes all JavaScript and JSON files.
[ '!.git', '!node_modules' ] includes all directories except '.git' and 'node_modules'.
Directories that do not pass a filter will not be recursed into.
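The three filter forms above can be illustrated with a hypothetical normalizeFilter helper that turns any of them into a single predicate. Note that readdirp delegates real glob matching to minimatch; the toy globToRegExp below handles only * and exists purely for illustration:

```javascript
// Illustration only: convert a simple glob (supporting just '*') to a RegExp.
function globToRegExp(glob) {
  var escaped = glob.replace(/[.+^${}()|[\]\\]/g, '\\$&').replace(/\*/g, '.*');
  return new RegExp('^' + escaped + '$');
}

// Normalize a function, glob string, or array of glob strings into a
// predicate over entry infos, mirroring the rules described above.
function normalizeFilter(filter) {
  if (typeof filter === 'function') return filter;
  if (typeof filter === 'string') filter = [filter];
  var negated = filter.every(function (g) { return g[0] === '!'; });
  var plain = filter.every(function (g) { return g[0] !== '!'; });
  // mixed inclusive and exclusive patterns are an error
  if (!negated && !plain) throw new Error('Cannot mix negated and non-negated glob filters');
  var regexes = filter.map(function (g) { return globToRegExp(negated ? g.slice(1) : g); });
  return function (entry) {
    var matched = regexes.some(function (re) { return re.test(entry.name); });
    return negated ? !matched : matched;
  };
}
```

For example, `normalizeFilter([ '*.json', '*.js' ])` returns a predicate that accepts `{ name: 'app.js' }` and rejects `{ name: 'readme.md' }`.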
Although the stream API is recommended, readdirp also exposes a callback-based API.
readdirp (options, callback1 [, callback2])
If callback2 is given, callback1 functions as the fileProcessed callback, and callback2 as the allProcessed callback.
If only callback1 is given, it functions as the allProcessed callback.
allProcessed: function (err, res) { ... }
fileProcessed: function (entryInfo) { ... }

on('error', ..), on('warn', ..) and on('end', ..) handling is omitted for brevity in the examples below.
var readdirp = require('readdirp');

// Glob file filter
readdirp({ root: './test/bed', fileFilter: '*.js' })
  .on('data', function (entry) {
    // do something with each JavaScript file entry
  });

// Combined glob file filters
readdirp({ root: './test/bed', fileFilter: [ '*.js', '*.json' ] })
  .on('data', function (entry) {
    // do something with each JavaScript and JSON file entry
  });

// Combined negated directory filters
readdirp({ root: './test/bed', directoryFilter: [ '!.git', '!*modules' ] })
  .on('data', function (entry) {
    // do something with each file entry found outside '.git' or any modules directory
  });

// Function directory filter
readdirp({ root: './test/bed', directoryFilter: function (di) { return di.name.length === 9; } })
  .on('data', function (entry) {
    // do something with each file entry found inside directories whose name has length 9
  });

// Limiting depth
readdirp({ root: './test/bed', depth: 1 })
  .on('data', function (entry) {
    // do something with each file entry found up to 1 subdirectory deep
  });

// Callback API
readdirp(
    { root: '.' }
  , function (fileInfo) {
      // do something with file entry here
    }
  , function (err, res) {
      // all done, move on or do final step for all file entries here
    }
);
Try more examples by following the instructions on how to get going.
Demonstrates error and data handling by listening to events emitted from the readdirp stream.
Demonstrates error handling by listening to events emitted from the readdirp stream and how to pipe the data stream into another destination stream.
Very naive implementation of grep, for demonstration purposes only.
Shows how to pass callbacks in order to handle errors and/or data.
The readdirp tests will also give you a good idea of how things work.
The 'glob' package provides pattern matching functionality to select files in directories. It's similar to readdirp in that it can be used to search for files, but it uses glob patterns instead of providing a stream API or Promise-based interface.
The 'fs-extra' package extends the built-in Node.js 'fs' module with additional file system methods, including recursive directory reading. While it offers similar functionality to readdirp, fs-extra provides a broader set of file system operations, making it more of a general-purpose library.
The 'walk' package is another Node.js module for recursively reading directory contents. It is similar to readdirp but focuses more on event-based directory walking. Compared to readdirp, 'walk' might offer a simpler API for some use cases but lacks the advanced filtering and mapping capabilities.