gulp-awspublish


awspublish plugin for gulp

Usage

First, install gulp-awspublish as a development dependency:

npm install --save-dev gulp-awspublish

Then, add it to your gulpfile.js:

var gulp = require('gulp'),
    es = require('event-stream'),
    awspublish = require('gulp-awspublish'),
    publisher = awspublish.create({ key: '...', secret: '...', bucket: '...' }),
    headers = { 'Cache-Control': 'max-age=315360000, no-transform, public' };

// publish all js files
// Set Content-Length, Content-Type and Cache-Control headers
// Set x-amz-acl to public-read by default
var js = gulp.src('./public/*.js')
  .pipe(publisher.publish(headers));

// gzip and publish all js files
// Content-Encoding headers will be added on top of other headers
// uploaded files will have a jsgz extension
var jsgz = gulp.src('./public/*.js')
  .pipe(awspublish.gzip({ ext: '.gz' }))
  .pipe(publisher.publish(headers));

// sync content of s3 bucket with files in the stream
// cache s3 etags locally to avoid unnecessary request next time
// print progress with reporter
es.merge(js, jsgz)
  .pipe(publisher.sync())
  .pipe(publisher.cache())
  .pipe(awspublish.reporter());

Testing

Add an aws-credentials.json file to the project directory with your bucket credentials, then run mocha.

{
  "key": "...",
  "secret": "...",
  "bucket": "..."
}

API

awspublish.gzip()

Creates a through stream that gzips files and adds a Content-Encoding header.
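For example, a minimal sketch reusing the publisher and headers from the usage section (the css glob is hypothetical):

// gzip css files before upload; ext is appended to the uploaded file name
gulp.src('./public/*.css')
  .pipe(awspublish.gzip({ ext: '.gz' }))
  .pipe(publisher.publish(headers))
  .pipe(awspublish.reporter());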

awspublish.create(options)

Creates a Publisher. Options are passed to knox to create an S3 client.
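For example, anything knox's createClient accepts can go here; the region value below is an assumption about where your bucket lives:

// a minimal sketch: key, secret and bucket are required,
// extra options such as region are forwarded to knox (assumption)
var publisher = awspublish.create({
  key: process.env.AWS_KEY,
  secret: process.env.AWS_SECRET,
  bucket: 'my-bucket',   // hypothetical bucket name
  region: 'us-west-2'    // knox region option
});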

publisher.publish(headers)

Creates a through stream that pushes files to S3. publish takes a headers object whose entries add to or override the default S3 headers.

Files that go through the stream get extra properties:

  • s3.path: the S3 path
  • s3.etag: the file etag
  • s3.state: the publication state (create, update, cache or skip)
  • s3.headers: the S3 headers for this file
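These properties can be inspected downstream; a minimal sketch using event-stream's mapSync (event-stream is already required in the usage section):

// log every file that was actually uploaded
gulp.src('./public/*.js')
  .pipe(publisher.publish(headers))
  .pipe(es.mapSync(function (file) {
    if (file.s3.state === 'create' || file.s3.state === 'update') {
      console.log(file.s3.path, file.s3.etag);
    }
    return file;
  }));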

Default headers are:

  • x-amz-acl: public-read
  • Content-Type
  • Content-Length

publisher.cache()

Creates a through stream that creates or updates a cache file using the file's S3 path and etag. Consecutive runs of publish use this file to avoid re-uploading identical files.

The cache file is saved in the current working directory and is named .awspublish-bucket (where bucket is your bucket name). The cache file is flushed to disk every 10 files just to be safe :)
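For example, a publish task that relies on the cache instead of a full sync (the html glob is hypothetical):

// on a second run, files whose etag matches the cache are skipped
gulp.src('./public/**/*.html')
  .pipe(publisher.publish(headers))
  .pipe(publisher.cache())
  .pipe(awspublish.reporter());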

publisher.sync()

Creates a transform stream that deletes old files from the bucket. Both new and deleted files are written to the stream; deleted files have s3.state set to delete.
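As an illustration, deletions performed by sync can be counted downstream, a sketch again using es.mapSync from event-stream:

// count files removed from the bucket during this run
var deleted = 0;
gulp.src('./public/**/*')
  .pipe(publisher.publish(headers))
  .pipe(publisher.sync())
  .pipe(es.mapSync(function (file) {
    if (file.s3.state === 'delete') deleted++;
    return file;
  }))
  .pipe(awspublish.reporter());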

publisher.client

The underlying knox client, exposed so you can perform other S3 operations.
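For example, a sketch using knox's deleteFile (a knox API, not part of this plugin; the path is hypothetical):

// delete a single object directly, outside the gulp pipeline
publisher.client.deleteFile('/old/app.js', function (err, res) {
  if (err) throw err;
  console.log('deleted, status', res.statusCode);
});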

awspublish.reporter([options])

Creates a reporter that logs s3.path and s3.state (delete, create, update, cache, skip).

Available options:

  • states: list of states to log (defaults to all); see the example below
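For instance, to log only files that actually changed in the bucket:

// skip 'cache' and 'skip' entries in the report
gulp.src('./public/*.js')
  .pipe(publisher.publish(headers))
  .pipe(publisher.sync())
  .pipe(awspublish.reporter({ states: ['create', 'update', 'delete'] }));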

License

MIT License
