
gulp-awspublish

awspublish plugin for gulp

Usage

First, install gulp-awspublish as a development dependency:

npm install --save-dev gulp-awspublish

Then, add it to your gulpfile.js:

var gulp = require('gulp'),
    es = require('event-stream'),
    awspublish = require('gulp-awspublish'),
    publisher = awspublish({ key: '...', secret: '...', bucket: '...' }),
    headers = { 'Cache-Control': 'max-age=315360000, no-transform, public' };

// publish all js files
// sets Content-Length, Content-Type and Cache-Control headers
// sets x-amz-acl to public-read by default
var js = gulp.src('./public/*.js')
  .pipe(publisher.publish(headers));

// gzip and publish all js files
// a Content-Encoding header is added on top of the other headers
// uploaded files will have a jsgz extension
var jsgz = gulp.src('./public/*.js')
  .pipe(awspublish.gzip())
  .pipe(publisher.publish(headers));

// sync the content of the s3 bucket with the files in the stream
// cache s3 etags locally to avoid unnecessary requests next time
// print progress with the reporter
publisher
  .sync(es.merge(js, jsgz))
  .pipe(awspublish.cache())
  .pipe(publisher.reporter());

API

awspublish.gzip()

Creates a through stream that gzips files and adds a Content-Encoding header.

awspublish.cache()

Creates a through stream that creates or updates a .awspublish cache file holding the list of s3.path/s3.etag key/value pairs.

awspublish.create(options)

Creates a Publisher. The options are passed to knox to create an S3 client.
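
A minimal sketch of creating a publisher; the credentials and bucket name are placeholders, and region is an assumption here (one of knox's client options, see the knox docs for the full list):

var awspublish = require('gulp-awspublish');

// the options object is handed to knox to create the S3 client;
// key, secret and bucket values below are placeholders
var publisher = awspublish.create({
  key: 'AKIA...',
  secret: '...',
  bucket: 'my-bucket',
  region: 'us-west-2' // assumption: knox region option
});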

Publisher.publish(headers)

Creates a through stream that pushes files to S3. publish takes a header hash whose entries are added to, or override, the existing S3 headers.

If there is a .awspublish cache file, we first compare the disk file's etag with the one in the cache; if the etags match, we don't query Amazon and file.s3.state is set to 'cache'.

Otherwise we issue a HEAD request and compare the remote etag with the local one; if the etags match, we don't upload the file and file.s3.state is set to 'skip'.

If a remote file exists, file.s3.state is set to 'update'; otherwise file.s3.state is set to 'create'.

Files that go through the stream get extra properties:

  • s3.path: the S3 path
  • s3.etag: the file's etag
  • s3.state: the publication state (create, update, cache or skip)
  • s3.headers: the S3 headers for this file
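
For instance, a minimal sketch that inspects these properties with event-stream (the glob, publisher and headers are the ones from the usage example above):

var es = require('event-stream');

gulp.src('./public/*.js')
  .pipe(publisher.publish(headers))
  .pipe(es.mapSync(function (file) {
    // each file now carries an s3 object with path, etag, state and headers
    console.log(file.s3.state + ' ' + file.s3.path);
    return file;
  }));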

Default headers are:

  • x-amz-acl (defaults to public-read)
  • Content-Type
  • Content-Length

Publisher.sync(stream)

Takes a stream of files and syncs the content of the S3 bucket with these files. It returns a readable stream containing both the input files and the deleted files; deleted files have file.s3.state set to 'delete'.
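
A minimal sketch that singles out the files sync marked for deletion (js, jsgz and es are the streams and module from the usage example above):

publisher
  .sync(es.merge(js, jsgz))
  .pipe(es.mapSync(function (file) {
    // files removed from the bucket only appear after sync
    if (file.s3.state === 'delete') {
      console.log('deleted from bucket: ' + file.s3.path);
    }
    return file;
  }))
  .pipe(publisher.reporter());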

awspublish.reporter()

Creates a reporter that logs s3.path and s3.state (delete, create, update, cache or skip).

License

MIT License
