
aws-s3-upload-ash
Open-source module to upload your media and files to an AWS S3 bucket directly from the front end
AWSS3UploadAsh - A JavaScript Library for AWS S3 File Upload
http://bit.ly/doeismaelnascimento
Using npm
npm install aws-s3-upload-ash
Using Yarn
yarn add aws-s3-upload-ash
import AWSS3UploadAsh from 'aws-s3-upload-ash';
const config = {
bucketName: 'bucketName',
dirName: 'media', /* optional - when set, objects are stored as BUCKET_ROOT/dirName/fileName.extension */
region: 'us-east-1',
accessKeyId: process.env.accessKeyId,
secretAccessKey: process.env.secretAccessKey,
s3Url: 'https://bucketName.s3.amazonaws.com/'
}
// If your bucket is public, you need this config:
const S3CustomClient = new AWSS3UploadAsh(config);
const newFileNameWithExtension = 'myPdf.pdf';
// file: File - required, e.g. from an <input type="file"> element
// contentType: string - required, e.g. application/pdf
// presignedURL: string - optional
// newFileName: string - optional, e.g. myImage.png
// acl: string - optional, defaults to public-read
// If you use presignedURL, the newFileName and acl parameters are not needed and can be undefined.
S3CustomClient
    .uploadFile(file, "application/pdf", undefined, newFileNameWithExtension, undefined)
    .then(data => console.log(data))
    .catch(err => console.error(err))
/**
 * {
 *   bucket: "bucketName",
 *   key: "media/myPdf.pdf",
 *   location: "https://bucketName.s3.amazonaws.com/media/myPdf.pdf",
 *   status: 204
 * }
 */
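The response above shows how the object key and public URL follow from the config: when dirName is set, it is prefixed to the file name under the bucket root. A minimal sketch of that derivation (buildObjectUrl is an illustrative helper, not part of the library's API):

```javascript
// Illustrative helper (not part of aws-s3-upload-ash): derives the key
// and location fields of the response from the config's s3Url/dirName
// and the uploaded file name.
function buildObjectUrl(s3Url, dirName, fileName) {
  const key = dirName ? `${dirName}/${fileName}` : fileName;
  const base = s3Url.endsWith('/') ? s3Url : `${s3Url}/`; // normalize trailing slash
  return { key, location: base + key };
}

console.log(buildObjectUrl('https://bucketName.s3.amazonaws.com/', 'media', 'myPdf.pdf'));
// → { key: 'media/myPdf.pdf', location: 'https://bucketName.s3.amazonaws.com/media/myPdf.pdf' }
```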
import AWSS3UploadAsh from 'aws-s3-upload-ash';
// If you use presignedURL, you don't need to pass a config to the AWSS3UploadAsh constructor.
const S3CustomClient = new AWSS3UploadAsh();
// file: File - required, e.g. from an <input type="file"> element
// contentType: string - required, e.g. image/png
// presignedURL: string - optional
// newFileName: string - optional, e.g. myImage.png
// acl: string - optional, defaults to public-read
// If you use presignedURL, the newFileName and acl parameters are not needed and can be undefined.
S3CustomClient
    .uploadFile(file, "image/png", "presignedURL", undefined, undefined)
    .then(data => console.log(data))
    .catch(err => console.error(err))
/**
 * Response if you use the presignedURL parameter:
 * {
 *   Response: {
 *     status: 200,
 *     body: "Upload complete"
 *   }
 * }
 */
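uploadFile takes the MIME type as its second argument (application/pdf, image/png, video/mp4 in the examples). A sketch of deriving it from the file name so callers don't hard-code it (contentTypeFor and its lookup table are illustrative, not part of the library):

```javascript
// Illustrative helper (not part of the library): derives the contentType
// argument for uploadFile() from the file name's extension.
const MIME_TYPES = {
  pdf: 'application/pdf',
  png: 'image/png',
  jpg: 'image/jpeg',
  jpeg: 'image/jpeg',
  gif: 'image/gif',
  mp4: 'video/mp4',
};

function contentTypeFor(fileName) {
  const ext = fileName.split('.').pop().toLowerCase();
  return MIME_TYPES[ext] || 'application/octet-stream'; // generic fallback
}

console.log(contentTypeFor('myImage.PNG')); // → image/png
```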
import AWSS3UploadAsh from 'aws-s3-upload-ash';
const config = {
bucketName: 'bucketName',
region: 'us-east-1',
accessKeyId: process.env.accessKeyId,
secretAccessKey: process.env.secretAccessKey,
s3Url: 'https://bucketName.s3.amazonaws.com/'
}
// If your bucket is public, you need this config:
const S3CustomClient = new AWSS3UploadAsh(config);
const newFileNameWithExtension = 'myVideo.mp4';
// file: File - required, e.g. from an <input type="file"> element
// contentType: string - required, e.g. video/mp4
// presignedURL: string - optional
// newFileName: string - optional, e.g. myImage.png
// acl: string - optional, defaults to public-read
// If you use presignedURL, the newFileName and acl parameters are not needed and can be undefined.
S3CustomClient
    .uploadFile(file, "video/mp4", undefined, newFileNameWithExtension, undefined)
    .then(data => console.log(data))
    .catch(err => console.error(err))
/**
 * {
 *   bucket: "bucketName",
 *   key: "myVideo.mp4",
 *   location: "https://bucketName.s3.amazonaws.com/myVideo.mp4",
 *   status: 204
 * }
 */
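uploadFile returns a promise, so it also composes with async/await. A sketch of a small wrapper that awaits the upload and checks the status field from the response shape shown above (stubClient stands in for a real AWSS3UploadAsh instance so the snippet runs without AWS credentials; uploadAndGetLocation is an illustrative helper, not part of the library):

```javascript
// Illustrative async/await wrapper around client.uploadFile().
async function uploadAndGetLocation(client, file, contentType, fileName) {
  const data = await client.uploadFile(file, contentType, undefined, fileName, undefined);
  if (data.status !== 204) {
    throw new Error(`Upload failed with status ${data.status}`);
  }
  return data.location;
}

// Stub standing in for `new AWSS3UploadAsh(config)`; it mimics the
// response shape documented in the examples above.
const stubClient = {
  uploadFile: async (_file, _contentType, _presignedURL, newFileName) => ({
    bucket: 'bucketName',
    key: newFileName,
    location: `https://bucketName.s3.amazonaws.com/${newFileName}`,
    status: 204,
  }),
};

uploadAndGetLocation(stubClient, null, 'video/mp4', 'myVideo.mp4')
  .then(location => console.log(location));
// → https://bucketName.s3.amazonaws.com/myVideo.mp4
```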
In this case the file we want to delete is in the 'media' folder, set by dirName:
import AWSS3UploadAsh from 'aws-s3-upload-ash';
const config = {
bucketName: 'bucketName',
dirName: 'media',
region: 'us-east-1',
accessKeyId: process.env.accessKeyId,
secretAccessKey: process.env.secretAccessKey,
s3Url: 'https://bucketName.s3.amazonaws.com/'
}
const S3CustomClient = new AWSS3UploadAsh(config);
const fileNameWithExtension = 'fileName.extension';
S3CustomClient
    .deleteFile(fileNameWithExtension)
    .then(response => console.log(response))
    .catch(err => console.error(err))
/**
 * {
 *   Response: {
 *     ok: true,
 *     status: 204,
 *     message: 'File deleted',
 *     fileName: 'media/fileName.extension'
 *   }
 * }
 */
import AWSS3UploadAsh from 'aws-s3-upload-ash';
const config = {
bucketName: 'bucketName',
region: 'us-east-1',
accessKeyId: process.env.accessKeyId,
secretAccessKey: process.env.secretAccessKey,
s3Url: 'https://bucketName.s3.amazonaws.com/'
}
const S3CustomClient = new AWSS3UploadAsh(config);
const fileNameWithExtension = 'fileName.extension';
S3CustomClient
    .deleteFile(fileNameWithExtension)
    .then(response => console.log(response))
    .catch(err => console.error(err))
/**
 * {
 *   Response: {
 *     ok: true,
 *     status: 204,
 *     message: 'File deleted',
 *     fileName: 'fileName.extension'
 *   }
 * }
 */
MIT
FAQs
The npm package aws-s3-upload-ash receives a total of 165 weekly downloads, so it is classified as not popular.
We found that aws-s3-upload-ash demonstrated an unhealthy release cadence and low project activity because the last version was released a year ago. It has 1 open-source maintainer collaborating on the project.