
A Node.js library for S3 redundancy, making sure your calls to S3 keep working even if there is an issue with one S3 location. This library is intended to be used with two buckets set up with cross-region replication.
This library tries to look like a subset of the AWS.S3 API for ease of use. Normally, calls made with this library are sent to the primary S3 location. If an S3 call runs into an unexpected issue, however, the call is retried against a secondary, failover S3 location.
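Conceptually, the failover behavior works like the sketch below. This is a simplified illustration, not the library's actual internals; the client and bucket arguments mirror the constructor shown later in this README:

// Illustrative sketch only: send to the primary bucket, and on an
// unexpected error retry the same call once against the secondary.
function putObjectWithFailover(primary, primaryBucket, secondary, secondaryBucket, params, callback) {
    params.Bucket = primaryBucket;
    primary.putObject(params, function (err, data) {
        if (!err) {
            return callback(null, data);
        }
        params.Bucket = secondaryBucket;
        secondary.putObject(params, callback);
    });
}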
Before using this library, you should have two buckets set up for cross-region replication, each replicating to the other. See Amazon's guide for setup. A few additional tips on setup:
Once you have two buckets to use, you can set up s3-s3 in your code by first creating the two bucket clients with aws-sdk in the normal way. Something like:
var AWS = require('aws-sdk'),
    // your location for the AWS config of the primary bucket
    awsConfig = require('./config.json'),
    // your location for the AWS config of the secondary bucket
    awsSecondaryConfig = require('./secondary-config.json'),
    // primary bucket S3 setup
    s3Primary = new AWS.S3(awsConfig),
    // secondary bucket S3 setup
    s3Secondary = new AWS.S3(awsSecondaryConfig);
With the above, you can then set up the s3-s3 object:
var S3S3 = require('s3-s3'),
    s3 = new S3S3(s3Primary, primaryBucketName, s3Secondary, secondaryBucketName);
You can then use s3 to make many of the same calls that you would make with AWS.S3:
var request = s3.putObject();
request.params = {
    'Key': key,
    'Body': cmdStream,
    'ACL': 'public-read'
};
request.on('success', function (response) {
    console.log('success!');
    callback();
}).on('error', function (err, response) {
    console.log('error!');
    callback(err);
}).on('failover', function (err) {
    // if you are streaming data in a Body param, you will need to reinitialize
    // request.params.Body here for it to work properly in failover
    console.log('failover!');
    // no callback, as we will still get an error or success
}).send();
While the API attempts to mimic AWS.S3, it's not exactly the same. Some differences:
Whenever you have a stream as part of your parameters, whether as the Body or elsewhere, you need to make sure the stream is reinitialized in the failover handler for failover to work properly. For example:
var child_process = require('child_process'),
    request = s3.putObject(),
    setupBody = function () {
        // just pretend doing this makes sense
        var getFile = child_process.spawn('cat', ['myfile.txt']);
        return getFile.stdout;
    };
request.params = {
    'Key': key,
    'Body': setupBody(),
    'ACL': 'public-read'
};
request.on('success', function (response) {
    console.log('success!');
    callback();
}).on('error', function (err, response) {
    console.log('error!');
    callback(err);
}).on('failover', function (err, response) {
    // reinitialize Body as needed during failover
    request.params.Body = setupBody();
    console.log('failover!');
    // no callback, as we will still get an error or success
}).send();
New s3-s3 object:
var S3S3 = require('s3-s3'),
    s3 = new S3S3(new AWS.S3(awsConfig), primaryBucketName, new AWS.S3(awsSecondaryConfig), secondaryBucketName);
S3 APIs:
request = s3.putObject();
request = s3.deleteObject();
request = s3.deleteObjects();
request = s3.listObjects();
request = s3.getObject();
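Each of these returns a request object that follows the same pattern shown above. For example, a hypothetical deleteObject call (the key and callback here are placeholders):

var request = s3.deleteObject();
request.params = {
    'Key': key
};
request.on('success', function (response) {
    console.log('deleted!');
    callback();
}).on('error', function (err, response) {
    callback(err);
}).send();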
request events and send:
request.on('send', function(response) {})
.on('retry', function(response) {})
.on('extractError', function(response) {})
.on('extractData', function(response) {})
.on('success', function(response) {})
.on('httpData', function(chunk, response) {})
.on('complete', function(response) {})
.on('error', function(error, response) {})
.on('failover', function(error, response) {})
.send();
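These events mirror the aws-sdk request events. As a sketch, assuming the aws-sdk event signatures carry over unchanged, httpData can be used to consume a getObject response in chunks:

var request = s3.getObject();
request.params = {
    'Key': key
};
request.on('httpData', function (chunk, response) {
    // each chunk is a Buffer containing part of the object body
    process.stdout.write(chunk);
}).on('complete', function (response) {
    console.log('complete!');
}).on('error', function (err, response) {
    console.log('error!');
}).send();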
Thanks for considering making any updates to this project! Here are the steps to take in your fork: