# gulp-awspublish

awspublish plugin for gulp
## Usage

First, install `gulp-awspublish` as a development dependency:

```sh
npm install --save-dev gulp-awspublish
```

Then, add it to your `gulpfile.js`:
```js
var gulp = require('gulp');
var awspublish = require('gulp-awspublish');

gulp.task('publish', function() {

  // create a new publisher
  var publisher = awspublish.create({ key: '...', secret: '...', bucket: '...' });

  // define custom headers
  var headers = {
    'Cache-Control': 'max-age=315360000, no-transform, public'
  };

  return gulp.src('./public/*.js')
    // gzip the files, set the Content-Encoding header and add a .gz extension
    .pipe(awspublish.gzip({ ext: '.gz' }))
    // publisher will add Content-Length, Content-Type and the headers above
    .pipe(publisher.publish(headers))
    // create a cache file to speed up consecutive uploads
    .pipe(publisher.cache())
    // print upload updates to the console
    .pipe(awspublish.reporter());
});
```
## Testing

Add an `aws-credentials.json` file to the project directory with your bucket credentials, then run `mocha`:
```json
{
  "key": "...",
  "secret": "...",
  "bucket": "..."
}
```
## API
### awspublish.gzip(options)

Create a through stream that gzips files and adds a `Content-Encoding: gzip` header.

Available options:
- ext: file extension to append to the gzipped file (e.g. `{ ext: '.gz' }`)
### awspublish.create(options)

Create a Publisher. Options are passed to knox to create an s3 client.

### publisher.publish([headers], [options])

Create a through stream that pushes files to s3.
- headers: hash of headers to add to, or override on, the existing s3 headers
- options: optional additional publishing options
  - force: bypass the cache / skip check and upload unconditionally
Files that go through the stream receive extra properties:
- s3.path: s3 path
- s3.etag: file etag
- s3.date: file last modified date
- s3.state: publication state (create, update, cache or skip)
- s3.headers: s3 headers for this file. Default headers are:
  - x-amz-acl: public-read
  - Content-Type
  - Content-Length
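To make the `s3.state` values concrete, here is a hypothetical sketch of the decision (not the plugin's actual code): a file is compared against the local cache first, then against what s3 already has:

```js
// Hypothetical illustration of how a publication state could be derived.
// fileEtag: etag of the local file; remoteEtag: etag reported by s3 (null if absent);
// cachedEtag: etag recorded by publisher.cache() on a previous run.
function publicationState(fileEtag, remoteEtag, cachedEtag, force) {
  if (!force) {
    if (fileEtag === cachedEtag) return 'cache'; // known identical: no request made
    if (remoteEtag === fileEtag) return 'skip';  // already up to date on s3
  }
  if (remoteEtag === null) return 'create';      // not on s3 yet
  return 'update';                               // contents changed (or force): upload
}

console.log(publicationState('abc', null, null, false));   // → create
console.log(publicationState('abc', 'abc', null, false));  // → skip
console.log(publicationState('abc', 'old', 'abc', false)); // → cache
```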
### publisher.cache()

Create a through stream that creates or updates a cache file using the file's s3 path and etag.
Consecutive runs of publish will use this file to avoid re-uploading identical files.

The cache file is saved in the current working directory and is named `.awspublish-<bucket>`. It is flushed to disk every 10 files, just to be safe :).
### publisher.sync([prefix])

Create a transform stream that deletes old files from the bucket.
You can specify a prefix to sync a specific directory.
Both new and deleted files are written to the stream; deleted files have their s3.state property set to delete.

**Warning:** `sync` will delete files in your bucket that are not in your local folder.
```js
gulp.src('./public/*')
  .pipe(publisher.publish())
  .pipe(publisher.sync())
  .pipe(awspublish.reporter());
```
### publisher.client

The knox client object, exposed to let you perform other s3 operations.
### awspublish.reporter([options])

Create a reporter that logs s3.path and s3.state (delete, create, update, cache, skip).

Available options:
- states: list of states to log (defaults to all)
```js
// publish and sync bucket files,
// logging created, updated and deleted files
gulp.src('./public/*')
  .pipe(publisher.publish())
  .pipe(publisher.sync())
  .pipe(awspublish.reporter({
    states: ['create', 'update', 'delete']
  }));
```
## Examples

### Rename files & directories

You can use `gulp-rename` to rename your files on s3:
```js
gulp.src('examples/fixtures/*.js')
  .pipe(rename(function (path) {
    path.dirname += '/s3-examples';
    path.basename += '-s3';
  }))
  .pipe(publisher.publish())
  .pipe(awspublish.reporter());
```
### Upload files in parallel

You can use `concurrent-transform` to upload files in parallel to your amazon bucket:
```js
gulp.src('examples/fixtures/*.js')
  .pipe(parallelize(publisher.publish(), 10))
  .pipe(awspublish.reporter());
```
## Plugins

### gulp-awspublish-router

A router for defining file-specific rules:
https://www.npmjs.org/package/gulp-awspublish-router
## License

MIT License