gulp-awspublish

awspublish plugin for gulp

Usage

First, install gulp-awspublish as a development dependency:

npm install --save-dev gulp-awspublish

Then, add it to your gulpfile.js:

var gulp = require('gulp');
var awspublish = require('gulp-awspublish');

gulp.task('publish', function() {

  // create a new publisher
  var publisher = awspublish.create({ key: '...', secret: '...', bucket: '...' });

  // define custom headers
  var headers = {
    'Cache-Control': 'max-age=315360000, no-transform, public'
    // ...
  };

  return gulp.src('./public/*.js')

    // gzip, set the Content-Encoding header and add a .gz extension
    .pipe(awspublish.gzip({ ext: '.gz' }))

    // publisher will add Content-Length, Content-Type and the headers specified above
    // if not specified it will set x-amz-acl to public-read by default
    .pipe(publisher.publish(headers))

    // create a cache file to speed up consecutive uploads
    .pipe(publisher.cache())

    // print upload updates to console
    .pipe(awspublish.reporter());
});

// output
// [gulp] [create] file1.js.gz
// [gulp] [create] file2.js.gz
// [gulp] [update] file3.js.gz
// [gulp] [cache]  file3.js.gz
// ...

Testing

Add an aws-credentials.json file to the project directory with your bucket credentials, then run mocha.

{
  "key": "...",
  "secret": "...",
  "bucket": "..."
}

API

awspublish.gzip(options)

Create a through stream that gzips files and adds a Content-Encoding header.

Available options:

  • ext: file extension to add to the gzipped file (e.g. { ext: '.gz' })
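
You can also gzip files without renaming them, so they keep their original names but are served with a Content-Encoding: gzip header. This is a minimal sketch, assuming an empty ext leaves the file path unchanged, and reusing the publisher from the usage example:

// sketch: { ext: '' } is assumed to leave file names unchanged
gulp.src('./public/**/*.html')
  .pipe(awspublish.gzip({ ext: '' }))
  .pipe(publisher.publish())
  .pipe(awspublish.reporter());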

awspublish.create(options)

Create a Publisher. Options are passed to knox to create an S3 client.
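
A minimal sketch of creating a publisher; the credential values are placeholders, and region is included only as an example of a knox option being passed through (an assumption, see the knox documentation for the full list):

// key, secret and bucket are read by knox; region is assumed to be passed straight through
var publisher = awspublish.create({
  key: process.env.AWS_ACCESS_KEY_ID,
  secret: process.env.AWS_SECRET_ACCESS_KEY,
  bucket: 'my-bucket',
  region: 'us-west-2'
});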

Publisher.publish([headers], [options])

Create a through stream that pushes files to S3.

  • headers: hash of headers to add to or override existing S3 headers.
  • options: optional additional publishing options
    • force: bypass the cache / skip detection and force upload
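
For example, to bypass the cache and force every file to be uploaded again (a sketch reusing the publisher and headers from the usage example):

// force: true re-uploads files even when the cache would report them as unchanged
gulp.src('./public/*.js')
  .pipe(publisher.publish(headers, { force: true }))
  .pipe(awspublish.reporter());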

Files that go through the stream receive extra properties:

  • s3.path: S3 path
  • s3.etag: file ETag
  • s3.date: file last modified date
  • s3.state: publication state (create, update, cache or skip)
  • s3.headers: S3 headers for this file. Default headers are:
    • x-amz-acl: public-read
    • Content-Type
    • Content-Length

Publisher.cache()

Create a through stream that creates or updates a cache file using the file's S3 path and ETag. Consecutive runs of publish will use this file to avoid re-uploading identical files.

The cache file is saved in the current working directory and is named .awspublish-<bucket>. It is flushed to disk every 10 files, just to be safe :).

Publisher.sync([prefix])

Create a transform stream that deletes old files from the bucket. You can specify a prefix to sync a specific directory. Both new and deleted files are written to the stream. Deleted files will have their s3.state property set to delete.

Warning: sync will delete files in your bucket that are not in your local folder.

// this will publish and sync bucket files with the ones in your public directory
gulp.src('./public/*')
  .pipe(publisher.publish())
  .pipe(publisher.sync())
  .pipe(awspublish.reporter());

// output
// [gulp] [create] file1.js
// [gulp] [update] file2.js
// [gulp] [delete] file3.js
// ...
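
To restrict the sync to part of the bucket, pass a prefix. This sketch assumes a hypothetical assets directory and uses gulp's base option so the published keys actually start with that prefix:

// { base: './public' } keeps 'assets/' in each file's relative path, so keys are
// published under assets/ and only keys under that prefix are candidates for deletion
gulp.src('./public/assets/**', { base: './public' })
  .pipe(publisher.publish())
  .pipe(publisher.sync('assets'))
  .pipe(awspublish.reporter());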

Publisher.client

The knox client object is exposed to let you perform other S3 operations.
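
For instance, you could list objects directly with the knox client (a sketch; the images/ prefix is hypothetical):

// list keys under a hypothetical images/ prefix using the underlying knox client
publisher.client.list({ prefix: 'images/' }, function(err, data) {
  if (err) throw err;
  data.Contents.forEach(function(item) {
    console.log(item.Key, item.Size);
  });
});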

awspublish.reporter([options])

Create a reporter that logs s3.path and s3.state (delete, create, update, cache, skip).

Available options:

  • states: list of states to log (defaults to all)

// this will publish and sync bucket files and print created, updated and deleted files
gulp.src('./public/*')
  .pipe(publisher.publish())
  .pipe(publisher.sync())
  .pipe(awspublish.reporter({
    states: ['create', 'update', 'delete']
  }));

Example

rename file & directory

You can use gulp-rename to rename your files on S3:

// see examples/rename.js
var rename = require('gulp-rename');

gulp.src('examples/fixtures/*.js')
  .pipe(rename(function (path) {
    path.dirname += '/s3-examples';
    path.basename += '-s3';
  }))
  .pipe(publisher.publish())
  .pipe(awspublish.reporter());

// output
// [gulp] [create] s3-examples/bar-s3.js
// [gulp] [create] s3-examples/foo-s3.js

License

MIT License
