
Simple AWS S3 Wrapper 🔥
$ npm install s3-bucket
S3_BUCKET_ACCESS_KEY_ID=value
S3_BUCKET_SECRET_ACCESS_KEY=value
S3_BUCKET_NAME=value
S3_BUCKET_REGION=value
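The wrapper picks up its configuration from these environment variables. As a minimal sketch (assuming the package reads them from process.env; missingConfig is a hypothetical helper, not part of s3-bucket), you can fail fast if any of them are unset:

```javascript
// Hypothetical helper (not part of s3-bucket): report which of the
// four expected variables are missing from the environment.
const REQUIRED_VARS = [
  'S3_BUCKET_ACCESS_KEY_ID',
  'S3_BUCKET_SECRET_ACCESS_KEY',
  'S3_BUCKET_NAME',
  'S3_BUCKET_REGION',
];

function missingConfig(env = process.env) {
  return REQUIRED_VARS.filter((name) => !env[name]);
}

// Example: check before requiring the package.
const missing = missingConfig();
if (missing.length) {
  console.error(`Missing env vars: ${missing.join(', ')}`);
}
```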
// Don't forget to import the functions 😊
const {
  updateCredentials,
  updateRegion,
  updateBucketName,
  getAllBuckets,
  getUploadUrl,
  uploadFile,
  listFiles,
  deleteFiles,
} = require('s3-bucket');
Yep! Just like you've guessed, it'll list all the buckets in your AWS account.
// Request
getAllBuckets()
.then(buckets => console.log(buckets));
// Response
{
  Buckets: [
    { Name: 'your-bucket-name', CreationDate: '2017-09-14T13:14:01.000Z' },
  ],
  Owner: { ID: 'your-id-here' }
}
Get a signed upload URL, then use it to upload files directly to S3 without sending them through your server.
ContentType → content type of the file
Key → path of that file within your S3 bucket
Bucket → your bucket name (defaults to S3_BUCKET_NAME)
ACL → public-read by default
Expires → 60 seconds
// Request
getUploadUrl({
  ContentType: 'application/javascript',
  Key: 'your-dir/test.js'
}).then(res => console.log(res))
// Response
{ signedUrl: 'https://s3.ap-south-1.amazonaws.com/your-bucket-name/your-dir/test.js?all-query-strings' }
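The signed URL can then be used with any HTTP client. As a hedged sketch (uploadToSignedUrl is a hypothetical helper, not part of s3-bucket), a direct PUT with fetch looks like this:

```javascript
// Hypothetical helper (not part of s3-bucket): PUT the file body to the
// signed URL returned by getUploadUrl(). Works in browsers and Node 18+.
async function uploadToSignedUrl(signedUrl, body, contentType) {
  const res = await fetch(signedUrl, {
    method: 'PUT',
    // Content-Type must match the ContentType the URL was signed with.
    headers: { 'Content-Type': contentType },
    body,
  });
  if (!res.ok) throw new Error(`Upload failed with status ${res.status}`);
  return res;
}

// Usage sketch:
// getUploadUrl({ ContentType: 'application/javascript', Key: 'your-dir/test.js' })
//   .then(({ signedUrl }) =>
//     uploadToSignedUrl(signedUrl, fileContents, 'application/javascript'));
```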
Upload files to your S3 bucket.
filePath → absolute path to the file
Key → path of that file within your S3 bucket
Bucket → your bucket name (defaults to S3_BUCKET_NAME)
ACL → public-read by default
Expires → 60 seconds
// Request
uploadFile({
  filePath: 'path/to/your/file.js',
  Key: 'your-dir/test.js'
}).then(res => console.log(res));
// Response
{
  ETag: '"9184ea01719a9444c823f1cb797529c9"',
  url: 'https://your-bucket-name.s3.amazonaws.com/your-dir/test.js'
}
Lists all the files (objects) in your bucket.
Bucket → your bucket name (defaults to S3_BUCKET_NAME)
// Request
listFiles({}).then(files => console.log(files))
// Response
{
  IsTruncated: false,
  Contents: [
    { Key: 'your-dir/test.js',
      LastModified: '2017-12-18T09:58:09.000Z',
      ETag: '"fd131f0975cdb3b6422290261866bf01"',
      Size: 383,
      StorageClass: 'STANDARD' },
  ],
  Name: 'your-bucket-name',
  Prefix: '',
  MaxKeys: 1000,
  CommonPrefixes: [],
  KeyCount: 31
}
Let's delete files 🗑️
files → array of file paths (Keys)
Bucket → your bucket name (defaults to S3_BUCKET_NAME)
// Request
deleteFiles({
  files: ['your-dir/test.js']
}).then(res => console.log(res));
// Response
{ Deleted: [ { Key: '/your-dir/test.js' } ], Errors: [] }
Sometimes we want to set our AWS credentials dynamically.
In that scenario we can use updateCredentials() to set the credentials on the fly.
const credentials = {
  accessKeyId: 'your-aws-access-key',
  secretAccessKey: 'your-aws-secret-key'
};
updateCredentials(credentials)
Setting our S3 region on the fly
updateRegion('ap-south-1')
Setting our S3 bucket name on the fly
updateBucketName('new-bucket-name')
MIT © Ashik Nesin
FAQs
Simple AWS S3 Wrapper 🔥
We found that s3-bucket demonstrated an unhealthy version release cadence and project activity because the last version was released a year ago. It has 1 open source maintainer collaborating on the project.