# gulp-awspublish

awspublish plugin for gulp
## Usage

First, install `gulp-awspublish` as a development dependency:

```bash
npm install --save-dev gulp-awspublish
```

Then, add it to your `gulpfile.js`:
```js
var gulp = require("gulp");
var awspublish = require("gulp-awspublish");

gulp.task("publish", function() {
  // create a new publisher from S3 options and cache options
  var publisher = awspublish.create(
    {
      region: "your-region-id",
      params: {
        Bucket: "..."
      }
    },
    {
      cacheFileName: "your-cache-location"
    }
  );

  // define custom headers to set on uploaded files
  var headers = {
    "Cache-Control": "max-age=315360000, no-transform, public"
  };

  return (
    gulp
      .src("./public/*.js")
      // gzip the files and add a .gz extension
      .pipe(awspublish.gzip({ ext: ".gz" }))
      // upload to S3 with the headers defined above
      .pipe(publisher.publish(headers))
      // create a cache file to speed up consecutive uploads
      .pipe(publisher.cache())
      // print upload updates to the console
      .pipe(awspublish.reporter())
  );
});
```
- Note: If you follow the aws-sdk suggestions for providing your credentials, you don't need to pass them in when creating the publisher.
- Note: For `publish` to work on S3, your policy has to allow the following S3 actions:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": ["arn:aws:s3:::BUCKETNAME"]
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:PutObjectAcl",
        "s3:GetObject",
        "s3:GetObjectAcl",
        "s3:DeleteObject",
        "s3:ListMultipartUploadParts",
        "s3:AbortMultipartUpload"
      ],
      "Resource": ["arn:aws:s3:::BUCKETNAME/*"]
    }
  ]
}
```
## Testing

- Create an S3 bucket to use for the tests. Optionally create an IAM user for running the tests.
- Set the bucket's permissions so it can be edited by the IAM user who will run the tests.
- Add an `aws-credentials.json` file to the project directory with the name of your testing bucket and the credentials of the user who will run the tests (see below).
- Run `npm test`
```json
{
  "params": {
    "Bucket": "<test-bucket-name>"
  },
  "credentials": {
    "accessKeyId": "<your-access-key-id>",
    "secretAccessKey": "<your-secret-access-key>",
    "signatureVersion": "v3"
  }
}
```
## API

### awspublish.gzip(options)

Create a through stream that gzips files and adds a `Content-Encoding: gzip` header.

- Note: Node version 0.12.x or later is required in order to use `awspublish.gzip`. If you need gzipping to work on an older Node engine, use v2.0.2.

Available options:

- `ext`: file extension to add to the gzipped file (e.g. `{ ext: '.gz' }`)
- `smaller`: gzip files only when the result is smaller
- any options that can be passed to `zlib.gzip`
### awspublish.create(AWSConfig, cacheOptions)

Create a Publisher.

The `AWSConfig` object is used to create an `aws-sdk` S3 client. At a minimum you must pass a `Bucket` key to define the site bucket. You can find all the available options in the AWS SDK documentation.

The `cacheOptions` object lets you define the location of the cached hash digests. By default, they are saved in your project's root folder in a hidden file named `.awspublish-` + the name of your bucket.
#### Adjusting upload timeout

The AWS client has a default timeout which may be too low when pushing large files (> 50 MB). To adjust the timeout, add `httpOptions: { timeout: 300000 }` to the `AWSConfig` object.
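Put together, a config with a raised timeout might look like this (the five-minute value is just the one mentioned above; the region and bucket are placeholders):

```js
var awspublish = require("gulp-awspublish");

var publisher = awspublish.create({
  region: "your-region-id",
  params: { Bucket: "..." },
  httpOptions: { timeout: 300000 } // 5 minutes, for large uploads
});
```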
#### Credentials

By default, gulp-awspublish uses the credential chain specified in the AWS docs.

Here are some example credential configurations:

Hardcoded credentials (note: we recommend that you do not hard-code credentials inside an application; use this method only for small personal scripts or for testing purposes):
```js
var publisher = awspublish.create({
  region: "your-region-id",
  params: {
    Bucket: "..."
  },
  accessKeyId: "akid",
  secretAccessKey: "secret"
});
```
Using a profile by name from `~/.aws/credentials`:
```js
var AWS = require("aws-sdk");

var publisher = awspublish.create({
  region: "your-region-id",
  params: {
    Bucket: "..."
  },
  credentials: new AWS.SharedIniFileCredentials({ profile: "myprofile" })
});
```
Instead of putting anything in the configuration object, you can also provide the following environment variables: `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, `AWS_SESSION_TOKEN`, `AWS_PROFILE`. You can also define a `[default]` profile in `~/.aws/credentials`, which the SDK will use transparently without you needing to set anything.
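For example, exporting credentials in the shell before running the `publish` task defined earlier (the values are placeholders):

```bash
export AWS_ACCESS_KEY_ID=...
export AWS_SECRET_ACCESS_KEY=...
gulp publish
```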
### publisher.publish([headers], [options])

Create a through stream that pushes files to S3.

- `headers`: hash of headers to add to, or override in, the existing S3 headers
- `options`: optional additional publishing options
  - `force`: bypass the cache / skip
  - `noAcl`: do not set `x-amz-acl` by default
  - `simulate`: debugging option to simulate an S3 upload
  - `createOnly`: skip file updates
Files that go through the stream receive extra properties:

- `s3.path`: S3 path
- `s3.etag`: file etag
- `s3.date`: file last-modified date
- `s3.state`: publication state (`create`, `update`, `delete`, `cache` or `skip`)
- `s3.headers`: S3 headers for this file. Default headers are:
  - `x-amz-acl`: `public-read`
  - `Content-Type`
  - `Content-Length`
Note: `publish` will never delete files remotely. To clean up unused remote files, use `sync`.
### publisher.cache()

Create a through stream that creates or updates a cache file using the file's S3 path and etag. Consecutive runs of `publish` will use this file to avoid re-uploading identical files.

The cache file is saved in the current working directory and is named `.awspublish-<bucket>`. The cache file is flushed to disk every 10 files just to be safe.
### publisher.sync([prefix], [whitelistedFiles])

Create a transform stream that deletes old files from the bucket.

- `prefix`: prefix to sync a specific directory
- `whitelistedFiles`: array of strings and regular expressions matched against filenames that should never be deleted from the bucket
e.g.

```js
gulp
  .src("./public/*")
  .pipe(publisher.publish())
  .pipe(publisher.sync("bar", [/^foo\/bar/, "baz.txt"]))
  .pipe(awspublish.reporter());
```
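The whitelist check can be sketched as a small predicate (an assumption about the semantics: string entries match the key exactly, regex entries via `.test()` — check the plugin source if exact-match vs. substring matching matters for you):

```javascript
// Hypothetical sketch of the whitelist test applied before deleting a remote key.
function isWhitelisted(key, whitelist) {
  return whitelist.some(function(entry) {
    return entry instanceof RegExp ? entry.test(key) : entry === key;
  });
}

console.log(isWhitelisted("foo/bar/a.js", [/^foo\/bar/, "baz.txt"])); // true
console.log(isWhitelisted("old/page.html", [/^foo\/bar/, "baz.txt"])); // false
```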
**Warning:** `sync` will delete files in your bucket that are not in your local folder unless they're whitelisted.

```js
gulp
  .src("./public/*")
  .pipe(publisher.publish())
  .pipe(publisher.sync())
  .pipe(awspublish.reporter());
```
### publisher.client

The `aws-sdk` S3 client is exposed to let you do other S3 operations.
### awspublish.reporter([options])

Create a reporter that logs `s3.path` and `s3.state` (`delete`, `create`, `update`, `cache`, `skip`).

Available options:

- `states`: list of states to log (defaults to all)
```js
gulp
  .src("./public/*")
  .pipe(publisher.publish())
  .pipe(publisher.sync())
  .pipe(
    awspublish.reporter({
      states: ["create", "update", "delete"]
    })
  );
```
## Examples

You can use `gulp-rename` to rename your files on S3:
```js
gulp
  .src("examples/fixtures/*.js")
  .pipe(
    rename(function(path) {
      path.dirname += "/s3-examples";
      path.basename += "-s3";
    })
  )
  .pipe(publisher.publish())
  .pipe(awspublish.reporter());
```
You can use `concurrent-transform` to upload files in parallel to your Amazon bucket:

```js
var parallelize = require("concurrent-transform");

gulp
  .src("examples/fixtures/*.js")
  .pipe(parallelize(publisher.publish(), 10))
  .pipe(awspublish.reporter());
```
### Upload both gzipped and plain files in one stream

You can use the `merge-stream` plugin to upload two streams in parallel, allowing `sync` to work with mixed file types:

```js
var merge = require("merge-stream");

var gzip = gulp.src("public/**/*.js").pipe(awspublish.gzip());
var plain = gulp.src(["public/**/*", "!public/**/*.js"]);

merge(gzip, plain)
  .pipe(publisher.publish())
  .pipe(publisher.sync())
  .pipe(awspublish.reporter());
```
## Plugins

### gulp-awspublish-router

A router for defining file-specific rules.

https://www.npmjs.org/package/gulp-awspublish-router

### gulp-cloudfront-invalidate-aws-publish

Invalidate the CloudFront cache based on output from awspublish.

https://www.npmjs.com/package/gulp-cloudfront-invalidate-aws-publish
## License

MIT License