Implementation of the Node.js FS interface using Amazon Simple Storage Service (S3) for storage.
Lead Maintainer: David Pate
S3FS provides a drop-in replacement for the File System (FS) implementation that ships with Node.js, allowing Node.js applications to use a distributed file system through the well-known FS interface.
Below is a policy for AWS Identity and Access Management (IAM) that grants the minimum privileges needed to use S3FS.
{
    "Statement": [
        {
            "Action": [
                "s3:ListBucket"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::your-bucket"
            ]
        },
        {
            "Action": [
                "s3:AbortMultipartUpload",
                "s3:CreateBucket",
                "s3:DeleteBucket",
                "s3:DeleteBucketPolicy",
                "s3:DeleteObject",
                "s3:GetBucketPolicy",
                "s3:GetLifecycleConfiguration",
                "s3:GetObject",
                "s3:ListBucket",
                "s3:ListBucketMultipartUploads",
                "s3:ListMultipartUploadParts",
                "s3:PutBucketPolicy",
                "s3:PutLifecycleConfiguration",
                "s3:PutObject"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::your-bucket/*"
            ]
        }
    ]
}
The methods below from Node.js's FS interface are the only methods currently supported, matching the signature and functionality of the fs module. All of the methods support usage through either callbacks or promises. There is currently no support for synchronous actions, as there hasn't been a need for them.
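The callback-or-promise convention can be sketched in plain JavaScript. This is only an illustration of the pattern, not s3fs's actual implementation; `makeGreeting` is a hypothetical stand-in for an s3fs method.

```javascript
// Sketch of the dual callback/promise convention: if the final
// callback argument is omitted, the method returns a Promise instead.
function makeGreeting(name, callback) {
    var result = 'Hello, ' + name + '!';
    if (typeof callback === 'function') {
        // Callback style: error-first, per Node convention.
        process.nextTick(callback, null, result);
        return;
    }
    // Promise style: no callback supplied, so return a Promise.
    return Promise.resolve(result);
}

// Callback usage
makeGreeting('Node', function (err, msg) {
    if (err) throw err;
    console.log(msg); // Hello, Node!
});

// Promise usage
makeGreeting('Node').then(function (msg) {
    console.log(msg); // Hello, Node!
});
```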
Creating an instance of S3FS takes in the bucketPath and options, which are passed on to the S3 constructor.
var S3FS = require('s3fs');
var bucketPath = 'mySuperCoolBucket';
var s3Options = {
    region: 'us-east-1'
};
var fsImpl = new S3FS(bucketPath, s3Options);
var S3FS = require('s3fs');
var fsImpl = new S3FS('test-bucket', options);
fsImpl.writeFile('message.txt', 'Hello Node', function (err) {
if (err) throw err;
console.log('It\'s saved!');
});
var S3FS = require('s3fs');
var fsImpl = new S3FS('test-bucket', options);
fsImpl.writeFile('message.txt', 'Hello Node').then(function() {
console.log('It\'s saved!');
}, function(reason) {
throw reason;
});
Besides the methods from Node.js's FS interface, we also support some custom extensions to the interface, such as recursive methods and S3-specific methods. They are described below.
Provides a location by concatenating the bucket with path(s).
String. Optional. The relative path to the working directory or file

// Create an instance of S3FS which has a current working directory of `test-folder` within the S3 bucket `test-bucket`
var fsImpl = new S3FS('test-bucket/test-folder', options);
// Returns location of directory `test-bucket/test-folder/styles`
var fsImplStyles = fsImpl.getPath('styles');
// Returns location of file `test-bucket/test-folder/styles/main.css`
var fsImplStyles = fsImpl.getPath('styles/main.css');
// Returns location of directory `test-bucket/test-folder`
var fsImplStyles = fsImpl.getPath();
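The concatenation behavior above can be sketched in plain JavaScript. This is a hypothetical illustration of the documented behavior, not s3fs's actual source.

```javascript
// Illustration of getPath's documented concatenation: the instance's
// working path is joined with the relative path, and the working path
// alone is returned when no argument is given.
function getPath(workingPath, relativePath) {
    if (!relativePath) {
        return workingPath;
    }
    return workingPath + '/' + relativePath;
}

console.log(getPath('test-bucket/test-folder', 'styles'));
// test-bucket/test-folder/styles
console.log(getPath('test-bucket/test-folder'));
// test-bucket/test-folder
```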
Provides a clone of the instance of S3FS which has relative access to the specified directory.
String. Optional. The relative path to extend the current working directory

// Create an instance of S3FS which has a current working directory of `test-folder` within the S3 bucket `test-bucket`
var fsImpl = new S3FS('test-bucket/test-folder', options);
// Creates a copy (which uses the same instance of S3FS) which has a current working directory of `test-folder/styles`
var fsImplStyles = fsImpl.clone('styles');
Allows a file to be copied from one path to another path within the same bucket. Paths are relative to the bucket originally provided.
String. Required. Relative path to the source file
String. Required. Relative path to the destination file
Object. Optional. The options to be used when copying the file. See AWS SDK
Function. Optional. Callback to be used; if not provided, a Promise will be returned

var fsImpl = new S3FS('test-bucket', options);
fsImpl.copyFile('test-folder/test-file.txt', 'other-folder/test-file.txt').then(function(data) {
// File was successfully copied
// Data contains details such as the `ETag` about the object. See [AWS SDK](http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#copyObject-property) for details.
}, function(reason) {
// Something went wrong
});
Recursively copies a directory from the source path to the destination path.
String. Required. The source directory to be copied
String. Required. The destination directory to be copied to
Function. Optional. Callback to be used; if not provided, a Promise will be returned

var fsImpl = new S3FS('test-bucket', options);
fsImpl.copyDir('test-folder', 'other-folder').then(function() {
// Directory was successfully copied
}, function(reason) {
// Something went wrong
});
Creates a new bucket on S3.
Object. Optional. The options to be used when creating the bucket. See AWS SDK
Function. Optional. Callback to be used; if not provided, a Promise will be returned

var fsImpl = new S3FS('test-bucket', options);
fsImpl.create().then(function() {
// Bucket was successfully created
}, function(reason) {
// Something went wrong
});
Deletes a bucket on S3; a bucket can only be deleted when it is empty. If you need to delete a bucket that isn't empty, use destroy([callback]) instead.

Function. Optional. Callback to be used; if not provided, a Promise will be returned

var fsImpl = new S3FS('test-bucket', options);
fsImpl.delete().then(function() {
// Bucket was successfully deleted
}, function(reason) {
// Something went wrong
});
Recursively deletes all files within the bucket and then deletes the bucket.
Function. Optional. Callback to be used; if not provided, a Promise will be returned

var fsImpl = new S3FS('test-bucket', options);
fsImpl.destroy().then(function() {
// Bucket was successfully destroyed
}, function(reason) {
// Something went wrong
});
Retrieves the details about an object, but not the contents.
String. Required. Path to the object to retrieve the head for
Function. Optional. Callback to be used; if not provided, a Promise will be returned

var fsImpl = new S3FS('test-bucket', options);
fsImpl.headObject('test-file.txt').then(function(details) {
// Details contains details such as the `ETag` about the object. See [AWS SDK](http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#headObject-property) for details.
}, function(reason) {
// Something went wrong
});
Retrieves a list of all objects within the specified path. The result is similar to that of headObject(path[, callback]) except that it contains an array of objects.

String. Required. The path to list all of the objects for
String. Required. The key to start with when listing objects
Function. Optional. Callback to be used; if not provided, a Promise will be returned

var fsImpl = new S3FS('test-bucket', options);
fsImpl.listContents('/', '/').then(function(data) {
// Data.Contents contains details such as the `ETag` about the object. See [AWS SDK](http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#headObject-property) for details.
}, function(reason) {
// Something went wrong
});
Adds/Updates a lifecycle on a bucket.
String. Required. The name of the lifecycle. The value cannot be longer than 255 characters.
String. Required. Prefix identifying one or more objects to which the rule applies.
Number. Required. The number of days after which objects matching the prefix expire.
Function. Optional. Callback to be used; if not provided, a Promise will be returned

var fsImpl = new S3FS('test-bucket', options);
// Remove the Cached contents in the `/cache` directory each day.
fsImpl.putBucketLifecycle('expire cache', 'cache', 1).then(function() {
// Bucket Lifecycle was successfully added/updated
}, function(reason) {
// Something went wrong
});
Recursively reads a directory.
String. Required. The path to the directory to read from

var fsImpl = new S3FS('test-bucket', options);
fsImpl.readdirp('test-folder').then(function(files) {
// Files contains a list of all of the files similar to [`fs.readdir(path, callback)`](http://nodejs.org/api/fs.html#fs_fs_readdir_path_callback) but with recursive contents
}, function(reason) {
// Something went wrong
});
Recursively creates a directory.
String. Required. The path to the directory to create
Function. Optional. Callback to be used; if not provided, a Promise will be returned

var fsImpl = new S3FS('test-bucket', options);
fsImpl.mkdirp('test-folder').then(function() {
// Directory has been recursively created
}, function(reason) {
// Something went wrong
});
Recursively deletes a directory.
String. Required. The path to the directory to delete
Function. Optional. Callback to be used; if not provided, a Promise will be returned

var fsImpl = new S3FS('test-bucket', options);
fsImpl.rmdirp('test-folder').then(function() {
// Directory has been recursively deleted
}, function(reason) {
// Something went wrong
});
This repository uses Mocha as its test runner. Tests can be run by executing the following command:
npm test
This will run all tests and report their success or failure in the console; it will also include our code coverage.
This repository uses Istanbul as its code coverage tool. Code Coverage will be calculated when executing the following command:
npm test
This will report the Code Coverage to the console similar to the following:
=============================== Coverage summary ===============================
Statements : 78.07% ( 356/456 )
Branches : 50.23% ( 107/213 )
Functions : 74.77% ( 83/111 )
Lines : 78.07% ( 356/456 )
================================================================================
Additionally, an interactive HTML report will be generated in ./coverage/lcov-report/index.html
which allows browsing the coverage by file.
Copyright (c) 2015 Riptide Software Inc.