# aws-s3-uploaders

An S3-compatible file uploader plugin for Vite.
## 🚀 Features

- ⚡ S3 compatible: supports any S3-compatible provider (AWS, DO Spaces, ...)
- ✨ Uploads any files: can upload any files or directories, not just the build folder
## 📦 Install

```sh
$ npm i aws-s3-uploaders
```
## 🦄 Usage

`uploadOptions` defaults to `ACL: 'public-read'`, so you may need to override it if you have other needs.
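For instance, to keep uploaded objects private rather than public (an illustrative fragment; the bucket name is a placeholder):

```javascript
// Illustrative fragment: overriding the plugin's default ACL inside uploadOptions.
const options = {
  uploadOptions: {
    Bucket: 'my-bucket',
    ACL: 'private', // the plugin's default is 'public-read'
  },
};
```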
Add the `aws-s3-uploaders` plugin to `vite.config.js` / `vite.config.ts` and configure it:

```js
import { AwsS3Uploader } from 'aws-s3-uploaders';

const uploader = AwsS3Uploader({
  basePath: '/build',
  clientConfig: {
    credentials: {
      accessKeyId: '',
      secretAccessKey: '',
    },
    region: 'eu-west-2',
  },
  uploadOptions: {
    Bucket: 'my-bucket',
  },
});

uploader.apply();
```
## 👀 Options

| Option | Description | Type | Default |
|---|---|---|---|
| `exclude` | A pattern matching content to exclude | `string \| RegExp \| Function \| Array` | `null` |
| `include` | A pattern matching content to include | `string \| RegExp \| Function \| Array` | `null` |
| `clientConfig` | Configuration for the `S3Client` class constructor | `S3ClientConfig` | required |
| `uploadOptions` | `PutObjectRequest` options, except `Body` and `Key` | `PutObjectRequest` | required |
| `basePath` | Namespace (key prefix) of uploaded files on S3 | `string` | `null` |
| `directory` | Directory to upload | `string` | `null` |
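For example, to upload a local `dist` directory under a `static` prefix in the bucket (illustrative values throughout; bucket name and region are placeholders):

```javascript
import { AwsS3Uploader } from 'aws-s3-uploaders';

// Upload everything in ./dist, keyed under "/static" in the bucket.
const uploader = AwsS3Uploader({
  directory: './dist',
  basePath: '/static',
  clientConfig: {
    credentials: {
      accessKeyId: process.env.AWS_ACCESS_KEY_ID,
      secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
    },
    region: 'eu-west-2',
  },
  uploadOptions: {
    Bucket: 'my-bucket',
  },
});

uploader.apply();
```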
### Advanced `include` and `exclude` rules

In addition to a `RegExp`, `include` and `exclude` accept a function that is called with the file path as its first argument; returning a truthy value matches the rule. You can also pass an array of rules, all of which must pass for the file to be included or excluded.
```js
import { AwsS3Uploader } from 'aws-s3-uploaders';
import isGitIgnored from 'is-gitignored';

const isPathOkToUpload = function (path) {
  return require('my-projects-publishing-rules').checkFile(path);
};

const uploader = AwsS3Uploader({
  include: [
    /.*\.(css|js)/,
    function (path) {
      return isPathOkToUpload(path);
    },
  ],
  exclude: isGitIgnored,
  clientConfig: {
    credentials: {
      accessKeyId: process.env.AWS_ACCESS_KEY_ID,
      secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
    },
    region: 'eu-west-2',
  },
  uploadOptions: {
    Bucket: 'my-bucket',
  },
});

uploader.apply();
```
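The matching semantics described above can be sketched as follows. This is a hypothetical helper, not the library's actual implementation, and it assumes a plain string rule means substring matching:

```javascript
// Hypothetical sketch: evaluate an include/exclude rule of type
// string | RegExp | Function | Array against a file path.
// Array rules require every entry to pass.
function matchesRule(rule, path) {
  if (rule == null) return false;
  if (Array.isArray(rule)) return rule.every((r) => matchesRule(r, path));
  if (rule instanceof RegExp) return rule.test(path);
  if (typeof rule === 'function') return Boolean(rule(path));
  return path.includes(rule); // plain string: substring match (assumption)
}
```

So `matchesRule([/\.js$/, (p) => !p.includes('test')], 'app.js')` passes only because both the regex and the function rule match.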
## 💧 DigitalOcean Spaces Object Storage example

```js
import { AwsS3Uploader } from 'aws-s3-uploaders';

const uploader = AwsS3Uploader({
  clientConfig: {
    credentials: {
      accessKeyId: process.env.DO_ACCESS_KEY_ID,
      secretAccessKey: process.env.DO_SECRET_ACCESS_KEY,
    },
    endpoint: 'https://fra1.digitaloceanspaces.com',
    region: 'fra1',
  },
  uploadOptions: {
    Bucket: 'my-bucket',
  },
});

uploader.apply();
```