# S3 Plugin

This plugin uploads all built assets to S3.
## Install Instructions

```bash
$ npm i @mxmul/webpack-s3-plugin
```
Note: This plugin requires Node.js > 0.12.0.
## Usage Instructions

A lot of people set the `directory` option when the files they want to upload are already part of their build. Don't set `directory` if you're uploading your build output: the `directory` option reads files from disk after compilation instead of taking them straight from the build process.
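`directory` is intended for publishing files that webpack did not emit. A minimal sketch (the `./static` folder name is just an illustration):

```js
var S3Plugin = require('@mxmul/webpack-s3-plugin')

var config = {
  plugins: [
    new S3Plugin({
      // Reads this folder from disk after compilation; omit `directory`
      // entirely if your assets come from the webpack build itself
      directory: './static',
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      }
    })
  ]
}
```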
You can also use a credentials file from AWS. To use a named profile, set your `s3Options` to the following:

```js
// `AWS` here comes from require('aws-sdk')
s3Options: {
  credentials: new AWS.SharedIniFileCredentials({profile: 'PROFILE_NAME'})
}
```
`s3UploadOptions` defaults to `ACL: 'public-read'`, so you may need to override it if you have other needs. See #28.
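For example, to keep uploaded objects private instead (a minimal sketch using the standard S3 canned ACL `'private'`):

```js
s3UploadOptions: {
  Bucket: 'MyBucket',
  ACL: 'private' // overrides the plugin's 'public-read' default
}
```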
### Require `@mxmul/webpack-s3-plugin`

```js
var S3Plugin = require('@mxmul/webpack-s3-plugin')
```
### With `exclude`

```js
var config = {
  plugins: [
    new S3Plugin({
      exclude: /.*\.html$/,
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
        region: 'us-west-1'
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      }
    })
  ]
}
```
### With `include`

```js
var config = {
  plugins: [
    new S3Plugin({
      include: /.*\.(css|js)/,
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      }
    })
  ]
}
```
### Advanced `include` and `exclude` rules

`include` and `exclude` rules behave similarly to webpack's loader options. In addition to a RegExp, you can pass a function that will be called with the path as its first argument; returning a truthy value matches the rule. You can also pass an Array of rules, all of which must pass for the file to be included or excluded.
```js
import isGitIgnored from 'is-gitignored'

// 'my-projects-publishing-rules' stands in for your own module
var isPathOkToUpload = function(path) {
  return require('my-projects-publishing-rules').checkFile(path)
}

var config = {
  plugins: [
    new S3Plugin({
      include: [
        /.*\.(css|js)/,
        function(path) { return isPathOkToUpload(path) }
      ],
      exclude: isGitIgnored,
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      }
    })
  ]
}
```
### With `basePathTransform`

```js
import gitsha from 'gitsha'

var addSha = function() {
  return new Promise(function(resolve, reject) {
    gitsha(__dirname, function(error, output) {
      if (error)
        reject(error)
      else
        resolve(output.slice(0, 5))
    })
  })
}

var config = {
  plugins: [
    new S3Plugin({
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      },
      basePathTransform: addSha
    })
  ]
}
```
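`basePathTransform` can also return a plain string instead of a Promise (see Options below). A minimal sketch; the `addTimestamp` name is just an illustration:

```js
// Synchronous transform: returning a string works too
var addTimestamp = function() {
  return 'build-' + Date.now()
}

// Then pass it the same way: basePathTransform: addTimestamp
```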
### With CloudFront invalidation

```js
var config = {
  plugins: [
    new S3Plugin({
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
        sessionToken: 'a234jasd' // only needed for temporary credentials
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      },
      cloudfrontInvalidateOptions: {
        DistributionId: process.env.CLOUDFRONT_DISTRIBUTION_ID,
        Items: ["/*"]
      }
    })
  ]
}
```
### With Dynamic Upload Options

Values in `s3UploadOptions` can be functions, which receive the file name and return the value to use for that key:

```js
var config = {
  plugins: [
    new S3Plugin({
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
      },
      s3UploadOptions: {
        Bucket: 'MyBucket',
        ContentEncoding(fileName) {
          if (/\.gz/.test(fileName))
            return 'gzip'
        },
        ContentType(fileName) {
          if (/\.js/.test(fileName))
            return 'application/javascript'
          else
            return 'text/plain'
        }
      }
    })
  ]
}
```
## Options

- `exclude`: A Pattern to match for excluded content. Behaves similarly to webpack's loader configuration.
- `include`: A Pattern to match for included content. Behaves the same as `exclude`.
- `s3Options`: Provide keys for the S3 client configuration (credentials, region, etc.).
- `s3UploadOptions`: Provide upload options for `putObject`.
- `basePath`: Provide the namespace of uploaded files on S3.
- `directory`: Provide a directory to upload (if not supplied, will upload js/css from the compilation).
- `htmlFiles`: HTML files to cdnize (defaults to all in the output directory).
- `basePathTransform`: Transform the base path to add a folder name. Can return a promise or a string.
- `progress`: Enable the progress bar (defaults to true).
- `priority`: Priority order for your files, as an array of RegExps. Files not matched by any regex are uploaded first. This is useful for avoiding S3 eventual-consistency issues (see the sketch below).
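For example, a sketch that uploads HTML last so pages never reference assets that aren't on S3 yet (the patterns, and the assumption that matched files upload in array order, are illustrative):

```js
new S3Plugin({
  // Files matching no pattern upload first, then .js, then .html last
  priority: [/\.js$/, /\.html$/],
  s3Options: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
  },
  s3UploadOptions: {
    Bucket: 'MyBucket'
  }
})
```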
## Contributing

All contributions are welcome. Please make a pull request and make sure things still pass after running `npm run test`.

For tests you will need to either have the environment variables set or set up a `.env` file. There's a `.env.sample`, so you can `cp .env.sample .env` and fill it in. Make sure to add any new environment variables you introduce.
### Commands to be aware of

WARNING: The test suite generates random files for certain checks. Ensure you delete any leftover files on your bucket.

- `npm run test` - Run the test suite (you must have the `.env` file set up)
- `npm run build` - Run the build
## Thanks

- Thanks to @Omer for fixing credentials from `~/.aws/credentials`
- Thanks to @lostjimmy for pointing out `path.sep` for Windows compatibility