folder-hash
Create a hash checksum over a folder and its content - its children and their content
The folder-hash npm package is used to generate hash values for the contents of a folder. This can be useful for verifying the integrity of files, detecting changes, and ensuring consistency across different environments.
Generate Hash for a Folder
This feature allows you to generate a hash for the contents of a specified folder. The hash can be used to verify the integrity of the folder's contents.
const folderHash = require('folder-hash');

const options = { folders: { include: ['*'] } };
folderHash.hashElement('path/to/folder', options)
  .then(hash => console.log(hash))
  .catch(error => console.error('Error:', error));
Customizing Hash Options
This feature allows you to customize the hashing options, such as the algorithm, encoding, and which files or folders to include or exclude.
const folderHash = require('folder-hash');

const options = {
  algo: 'sha256',
  encoding: 'hex',
  folders: { exclude: ['node_modules', 'test'] },
  files: { include: ['*.js', '*.json'] }
};
folderHash.hashElement('path/to/folder', options)
  .then(hash => console.log(hash))
  .catch(error => console.error('Error:', error));
The hasha package is a versatile hashing library that can hash strings, buffers, and streams. It supports multiple algorithms and is highly configurable. Unlike folder-hash, hasha does not specifically target folder contents but can be used in conjunction with other tools to achieve similar results.
The built-in Node.js crypto module provides cryptographic functionality, including hashing. While it does not directly support hashing entire folders, it can be used to hash individual files and combined with file system operations to hash folder contents. It offers more control but requires more manual setup compared to folder-hash.
The checksum package is designed to generate checksums for files and directories. It offers similar functionality to folder-hash but with a focus on simplicity and ease of use. It supports various algorithms and can be a good alternative for basic hashing needs.
Create a hash checksum over a folder or a file.
The hashes are propagated upwards: the hash returned for a folder is generated over all the hashes of its children.
The hashes are generated with the sha1 algorithm and returned in base64 encoding by default.
Each file returns a name and a hash, and each folder returns additionally an array of children (file or folder elements).
First, install folder-hash with npm install --save folder-hash or yarn add folder-hash.
To see differences from the last version of this package, I would create hashes over all .js and .json files, but ignore everything inside folders starting with a dot, as well as the folders node_modules and test_coverage. The structure of the options object is documented on this page.
This example is also stored in ./examples/readme-example1.js.
const { hashElement } = require('folder-hash');

const options = {
  folders: { exclude: ['.*', 'node_modules', 'test_coverage'] },
  files: { include: ['*.js', '*.json'] }
};

console.log('Creating a hash over the current folder:');
hashElement('.', options)
  .then(hash => {
    console.log(hash.toString());
  })
  .catch(error => {
    return console.error('hashing failed:', error);
  });
For example, the returned information looks like this:
Creating a hash over the current folder:
{ name: '.', hash: 'YZOrKDx9LCLd8X39PoFTflXGpRU=',
  children: [
    { name: 'examples', hash: 'aG8wg8np5SGddTnw1ex74PC9EnM=',
      children: [
        { name: 'readme-example1.js', hash: 'Xlw8S2iomJWbxOJmmDBnKcauyQ8=' },
        { name: 'readme-with-callbacks.js', hash: 'ybvTHLCQBvWHeKZtGYZK7+6VPUw=' },
        { name: 'readme-with-promises.js', hash: '43i9tE0kSFyJYd9J2O0nkKC+tmI=' },
        { name: 'sample.js', hash: 'PRTD9nsZw3l73O/w5B2FH2qniFk=' }
      ]},
    { name: 'index.js', hash: 'kQQWXdgKuGfBf7ND3rxjThTLVNA=' },
    { name: 'package.json', hash: 'w7F0S11l6VefDknvmIy8jmKx+Ng=' },
    { name: 'test', hash: 'H5x0JDoV7dEGxI65e8IsencDZ1A=',
      children: [
        { name: 'parameters.js', hash: '3gCEobqzHGzQiHmCDe5yX8weq7M=' },
        { name: 'test.js', hash: 'kg7p8lbaVf1CPtWLAIvkHkdu1oo=' }
      ]}
  ]}
And the structure may be traversed to e.g. create incremental backups.
It is also possible to match only the full path and not the basename. The same configuration could look like this (be aware that *nix and Windows behave differently here, so use caution):
const options = {
  folders: {
    exclude: ['.*', '**.*', '**node_modules', '**test_coverage'],
    matchBasename: false,
    matchPath: true
  },
  files: {
    //include: ['**.js', '**.json' ],                    // Windows
    include: ['*.js', '**/*.js', '*.json', '**/*.json'], // *nix
    matchBasename: false,
    matchPath: true
  }
};
Name | Type | Attributes | Description
---|---|---|---
name | string | | element name or an element's path
dir | string | &lt;optional&gt; | directory that contains the element (generated from name if omitted)
options | Object | &lt;optional&gt; | Options object (see below)
callback | fn | &lt;optional&gt; | Error-first callback function
{
  algo: 'sha1',       // see crypto.getHashes() for options
  encoding: 'base64', // 'base64', 'hex' or 'binary'
  files: {
    exclude: [],
    include: [],
    matchBasename: true,
    matchPath: false,
    ignoreBasename: false,
    ignoreRootName: false
  },
  folders: {
    exclude: [],
    include: [],
    matchBasename: true,
    matchPath: false,
    ignoreRootName: false
  }
}
Name | Type | Attributes | Default | Description
---|---|---|---|---
algo | string | &lt;optional&gt; | 'sha1' | checksum algorithm, see options in crypto.getHashes()
encoding | string | &lt;optional&gt; | 'base64' | encoding of the resulting hash. One of 'base64', 'hex' or 'binary'
files | Object | &lt;optional&gt; | | Rules object (see below)
folders | Object | &lt;optional&gt; | | Rules object (see below)
Name | Type | Attributes | Default | Description
---|---|---|---|---
exclude | Array.&lt;string&gt; or Function | &lt;optional&gt; | [] | Array of optional exclude glob patterns, see minimatch doc. Can also be a function which returns true if the passed file is excluded.
include | Array.&lt;string&gt; or Function | &lt;optional&gt; | [] | Array of optional include glob patterns, see minimatch doc. Can also be a function which returns true if the passed file is included.
matchBasename | bool | &lt;optional&gt; | true | Match the glob patterns to the file/folder name
matchPath | bool | &lt;optional&gt; | false | Match the glob patterns to the file/folder path
ignoreBasename | bool | &lt;optional&gt; | false | Set to true to calculate the hash without the basename element
ignoreRootName | bool | &lt;optional&gt; | false | Set to true to calculate the hash without the basename of the root (first) element
After installing it globally via
$ npm install -g folder-hash
You can use it like this:
# local folder
$ folder-hash -c config.json .
# local folder
$ folder-hash
# global folder
$ folder-hash /user/bin
It also allows passing an optional JSON configuration file with the -c or --config flag, which should contain the same configuration as when using the JavaScript API.
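For example, a config.json mirroring the options used earlier might look like this (our own sketch; since JSON cannot express functions, only glob-pattern arrays apply here):

```json
{
  "algo": "sha256",
  "encoding": "hex",
  "folders": { "exclude": [".*", "node_modules", "test_coverage"] },
  "files": { "include": ["*.js", "*.json"] }
}
```

It would then be passed as folder-hash -c config.json . as shown above.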
You can also use a local version of folder-hash like this:
$ npx folder-hash --help
Use folder-hash on the CLI like this:
folder-hash [--config <json-file>] <file-or-folder>
See file ./examples/readme-with-promises.js
const path = require('path');
const { hashElement } = require('folder-hash');

// pass element name and folder path separately
hashElement('test', path.join(__dirname, '..'))
  .then(hash => {
    console.log('Result for folder "../test":', hash.toString(), '\n');
  })
  .catch(error => {
    return console.error('hashing failed:', error);
  });

// pass element path directly
hashElement(__dirname)
  .then(hash => {
    console.log(`Result for folder "${__dirname}":`);
    console.log(hash.toString(), '\n');
  })
  .catch(error => {
    return console.error('hashing failed:', error);
  });

// pass options (example: exclude dotFolders)
const options = { encoding: 'hex', folders: { exclude: ['.*'] } };
hashElement(__dirname, options)
  .then(hash => {
    console.log('Result for folder "' + __dirname + '" (with options):');
    console.log(hash.toString(), '\n');
  })
  .catch(error => {
    return console.error('hashing failed:', error);
  });
See ./examples/readme-with-callbacks.js
const path = require('path');
const { hashElement } = require('folder-hash');

// pass element name and folder path separately
hashElement('test', path.join(__dirname, '..'), (error, hash) => {
  if (error) {
    return console.error('hashing failed:', error);
  } else {
    console.log('Result for folder "../test":', hash.toString(), '\n');
  }
});

// pass element path directly
hashElement(__dirname, (error, hash) => {
  if (error) {
    return console.error('hashing failed:', error);
  } else {
    console.log('Result for folder "' + __dirname + '":');
    console.log(hash.toString(), '\n');
  }
});

// pass options (example: exclude dotFiles)
const options = { algo: 'md5', files: { exclude: ['.*'], matchBasename: true } };
hashElement(__dirname, options, (error, hash) => {
  if (error) {
    return console.error('hashing failed:', error);
  } else {
    console.log('Result for folder "' + __dirname + '":');
    console.log(hash.toString());
  }
});
The behavior is documented and verified in the unit tests. Execute npm test or mocha test, and have a look at the test subfolder.
You can also have a look at the CircleCI report.
The hashes are the same if:
The hashes are different if:
Content in this case means a folder's children: both the files and the subfolders with their own children.
The hashes are the same if:
The hashes are different if:
MIT, see LICENSE.txt
The npm package folder-hash receives a total of 167,531 weekly downloads. As such, folder-hash popularity was classified as popular.
We found that folder-hash demonstrates an unhealthy version release cadence and project activity because the last version was released a year ago. It has 1 open source maintainer collaborating on the project.