s3-bucket

Version: 1.0.3
Simple AWS S3 Wrapper 🔥

Install

$ npm install s3-bucket

Environment Variables

S3_BUCKET_ACCESS_KEY_ID=value
S3_BUCKET_SECRET_ACCESS_KEY=value

S3_BUCKET_NAME=value
S3_BUCKET_REGION=value
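If any of these are missing at startup, later S3 calls tend to fail in confusing ways, so a quick boot-time check can help. A plain Node sketch (not part of the package); the variable names are the ones listed above:

```javascript
// Fail fast if any of the s3-bucket environment variables are missing.
const required = [
	'S3_BUCKET_ACCESS_KEY_ID',
	'S3_BUCKET_SECRET_ACCESS_KEY',
	'S3_BUCKET_NAME',
	'S3_BUCKET_REGION',
];

const missing = required.filter((name) => !process.env[name]);
if (missing.length > 0) {
	console.error(`Missing environment variables: ${missing.join(', ')}`);
}
```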

Usage

// Don't forget to import the functions 😊
const {
	updateCredentials,
	updateRegion,
	getAllBuckets,
	getUploadUrl,
	uploadFile,
	listFiles,
	deleteFiles,
} = require('s3-bucket');

getAllBuckets()

Yep! Just like you've guessed, it'll list all the buckets in your AWS account.

// Request
getAllBuckets()
	.then(buckets => console.log(buckets));
// Response
{
	Buckets:
	[
		 { Name: 'your-bucket-name', CreationDate: '2017-09-14T13:14:01.000Z' },
	],
  Owner:{ ID: 'your-id-here' }
}

getUploadUrl(customParams)

Get a signed upload URL, then use it to upload files directly to S3 without sending them through your server.

Required Params

ContentType → content type of the file
Key → path of the file within your S3 bucket

Optional Params

Bucket
ACL → public-read by default
Expires → 60 seconds

// Request
getUploadUrl({
	ContentType: 'application/javascript',
	Key: 'your-dir/test.js'
}).then(res => console.log(res))

// Response
{ signedUrl: 'https://s3.ap-south-1.amazonaws.com/your-bucket-name/your-dir/test.js?all-query-strings' }
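The signed URL can then be consumed by a browser (or any HTTP client) with a plain PUT request; no AWS SDK is needed on that side. A sketch assuming a `File`/`Blob` object and a `fetch`-capable runtime (browser or Node 18+); note that the Content-Type header must match the ContentType used when signing:

```javascript
// Upload a file straight to S3 using a signed URL from getUploadUrl().
async function uploadToSignedUrl(signedUrl, file) {
	const res = await fetch(signedUrl, {
		method: 'PUT',
		headers: { 'Content-Type': file.type }, // must match the signed ContentType
		body: file,
	});
	if (!res.ok) {
		throw new Error(`Upload failed with status ${res.status}`);
	}
	// The object's URL is the signed URL minus its query string.
	return signedUrl.split('?')[0];
}
```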

uploadFile(customParams)

Upload files to your S3 bucket.

Required Params

filePath → absolute path to the file
Key → path of the file within your S3 bucket

Optional Params

Bucket
ACL → public-read by default
Expires → 60 seconds

// Request
uploadFile({
	filePath: 'path/to/your/file.js',
	Key: 'your-dir/test.js'})
.then(res => console.log(res));

// Response
{ ETag: '"9184ea01719a9444c823f1cb797529c9"',
	url: 'https://your-bucket-name.s3.amazonaws.com/your-dir/test.js'
}

listFiles(customParams)

Just lists all the files (objects) in your bucket.

Optional Params

Bucket

// Request
listFiles({}).then(files => console.log(files))

// Response
{ IsTruncated: false,
  Contents:
   [ { Key: 'your-dir/test.js',
       LastModified: '2017-12-18T09:58:09.000Z',
       ETag: '"fd131f0975cdb3b6422290261866bf01"',
       Size: 383,
       StorageClass: 'STANDARD' },
	],
  Name: 'your-bucket-name',
  Prefix: '',
  MaxKeys: 1000,
  CommonPrefixes: [],
  KeyCount: 31 }
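Usually only the object keys inside `Contents` matter; a tiny helper can pull them out of the response shape shown above (a sketch, not part of the package):

```javascript
// Extract just the object keys from a listFiles() response.
function toKeys(listing) {
	return (listing.Contents || []).map((obj) => obj.Key);
}

const sample = {
	IsTruncated: false,
	Contents: [{ Key: 'your-dir/test.js', Size: 383 }],
};
console.log(toKeys(sample)); // [ 'your-dir/test.js' ]
```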

deleteFiles(customParams)

Let's delete files 🗑️

Required Params

files → array of file paths (Keys)

Optional Params

Bucket

// Request
deleteFiles({
	files: ['your-dir/test.js']
})
.then(res => console.log(res));

// Response
{ Deleted: [ { Key: '/your-dir/test.js' } ], Errors: [] }

updateCredentials(credentials)

Sometimes we want to set our AWS credentials dynamically.

In that scenario, we can use updateCredentials() to set the credentials on the fly:

	const credentials = {
		accessKeyId:'your-aws-access-key',
		secretAccessKey:'your-aws-secret-key'
	};
	updateCredentials(credentials)

updateRegion(region)

Setting our S3 region on the fly

	updateRegion('ap-south-1')

updateBucketName(bucketName)

Setting our S3 bucket name on the fly

	updateBucketName('new-bucket-name')

TODO

  • Handle Missing credentials errors
  • File Upload Progress

License

MIT © Ashik Nesin

Package last updated on 22 May 2018