gulp-awspublish
awspublish plugin for gulp
First, install gulp-awspublish as a development dependency:

npm install --save-dev gulp-awspublish

Then, add it to your gulpfile.js:
var gulp = require('gulp');
var awspublish = require('gulp-awspublish');

gulp.task('publish', function() {
  // create a new publisher using S3 options
  // http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#constructor-property
  var publisher = awspublish.create({
    params: {
      Bucket: '...'
    }
  });

  // define custom headers
  var headers = {
    'Cache-Control': 'max-age=315360000, no-transform, public'
    // ...
  };

  return gulp.src('./public/*.js')
    // gzip, Set Content-Encoding headers and add .gz extension
    .pipe(awspublish.gzip({ ext: '.gz' }))

    // publisher will add Content-Length, Content-Type and headers specified above
    // If not specified it will set x-amz-acl to public-read by default
    .pipe(publisher.publish(headers))

    // create a cache file to speed up consecutive uploads
    .pipe(publisher.cache())

    // print upload updates to console
    .pipe(awspublish.reporter());
});

// output
// [gulp] [create] file1.js.gz
// [gulp] [create] file2.js.gz
// [gulp] [update] file3.js.gz
// [gulp] [cache] file3.js.gz
// ...
Note: If you follow the aws-sdk suggestions for providing your credentials, you don't need to pass them in when creating the publisher.
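For example, when credentials come from the SDK's default chain (environment variables or the shared credentials file), the publisher only needs the bucket. A minimal sketch, assuming a hypothetical bucket name:

```javascript
// credentials are resolved by the aws-sdk default chain
// (e.g. AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment variables
// or ~/.aws/credentials), so none are passed here
var awspublish = require('gulp-awspublish');

var publisher = awspublish.create({
  params: {
    Bucket: 'my-site-bucket' // hypothetical bucket name
  }
});
```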
Note: In order for publish to work on S3, your policy has to allow the required S3 actions.
To run the tests, add an aws-credentials.json file to the project directory with your bucket credentials, then run mocha:
{
  "params": {
    "Bucket": "..."
  },
  "accessKeyId": "...",
  "secretAccessKey": "..."
}
awspublish.gzip(options) creates a through stream that gzips files and adds a Content-Encoding header.
Available options include ext, a file extension suffix appended to gzipped files (e.g. { ext: '.gz' }, as in the example above).
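As a standalone sketch of gzip with the ext option, mirroring the main example above (it assumes a publisher already created with awspublish.create):

```javascript
// gzip .js files, upload them with a .gz suffix;
// gzip() also sets the Content-Encoding: gzip header on each file
gulp.src('./public/*.js')
  .pipe(awspublish.gzip({ ext: '.gz' }))
  .pipe(publisher.publish())
  .pipe(awspublish.reporter());
```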
awspublish.create(options) creates a publisher. Options are used to create an aws-sdk S3 client. At a minimum you must pass a bucket option to define the site bucket. If you are using the aws-sdk suggestions for credentials, you do not need to provide anything else. Credentials specified in the old knox format are also supported, as is a profile property for choosing a specific set of shared AWS credentials, or an accessKeyId and secretAccessKey provided explicitly.
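A sketch of the explicit forms mentioned above; all key, secret, and profile values here are placeholders:

```javascript
var awspublish = require('gulp-awspublish');

// explicit credentials passed directly to the S3 client
var publisher = awspublish.create({
  params: { Bucket: '...' },
  accessKeyId: 'AKIA...',       // placeholder
  secretAccessKey: 'secret...'  // placeholder
});

// or pick a named profile from the shared AWS credentials file
var profilePublisher = awspublish.create({
  params: { Bucket: '...' },
  profile: 'my-profile' // hypothetical profile name
});
```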
publisher.publish([headers]) creates a through stream that pushes files to S3. Files that go through the stream receive extra properties, such as s3.path and s3.state (see the reporter below).
Note: publish will never delete files remotely. To clean up unused remote files, use sync.
publisher.cache() creates a through stream that creates or updates a cache file using the file's S3 path and ETag. Consecutive runs of publish will use this file to avoid re-uploading identical files.
The cache file is saved in the current working directory and is named .awspublish-<bucket>. The cache file is flushed to disk every 10 files just to be safe.
publisher.sync() creates a transform stream that deletes old files from the bucket. You can specify a prefix to sync a specific directory.
Warning: sync will delete files in your bucket that are not in your local folder.
// this will publish and sync bucket files with the one in your public directory
gulp.src('./public/*')
  .pipe(publisher.publish())
  .pipe(publisher.sync())
  .pipe(awspublish.reporter());

// output
// [gulp] [create] file1.js
// [gulp] [update] file2.js
// [gulp] [delete] file3.js
// ...
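The docs note that sync can be scoped with a prefix to a specific directory. Assuming the prefix is passed as the first argument to sync (check the package documentation for the exact signature), restricting deletion to one remote directory might look like:

```javascript
// hedged sketch: scope sync to the 'js' prefix so only remote files
// under js/ that are missing locally get deleted (signature assumed)
gulp.src('./public/js/**/*.js')
  .pipe(publisher.publish())
  .pipe(publisher.sync('js'))
  .pipe(awspublish.reporter());
```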
The aws-sdk S3 client is exposed to let you do other S3 operations.
awspublish.reporter([options]) creates a reporter that logs s3.path and s3.state (delete, create, update, cache, skip).
Available options include states, a list of the states to report:
// this will publish, sync bucket files and print created, updated and deleted files
gulp.src('./public/*')
  .pipe(publisher.publish())
  .pipe(publisher.sync())
  .pipe(awspublish.reporter({
    states: ['create', 'update', 'delete']
  }));
You can use gulp-rename to rename your files on S3:

// see examples/rename.js
gulp.src('examples/fixtures/*.js')
  .pipe(rename(function (path) {
    path.dirname += '/s3-examples';
    path.basename += '-s3';
  }))
  .pipe(publisher.publish())
  .pipe(awspublish.reporter());

// output
// [gulp] [create] s3-examples/bar-s3.js
// [gulp] [create] s3-examples/foo-s3.js
You can use concurrent-transform to upload files in parallel to your Amazon bucket:

var parallelize = require('concurrent-transform');

gulp.src('examples/fixtures/*.js')
  .pipe(parallelize(publisher.publish(), 10))
  .pipe(awspublish.reporter());
You can use the merge-stream plugin to upload two streams in parallel, allowing sync to work with mixed file types:

var merge = require('merge-stream');
var gzip = gulp.src('public/**/*.js').pipe(awspublish.gzip());
var plain = gulp.src(['public/**/*', '!public/**/*.js']);

merge(gzip, plain)
  .pipe(publisher.publish())
  .pipe(publisher.sync())
  .pipe(awspublish.reporter());
gulp-awspublish-router, a router for defining file-specific rules: https://www.npmjs.org/package/gulp-awspublish-router
FAQs
gulp-awspublish is a gulp plugin to publish files to Amazon S3.
The npm package gulp-awspublish receives a total of 21,424 weekly downloads, which classifies it as popular.
We found that gulp-awspublish demonstrates a healthy version release cadence and project activity: the last version was released less than a year ago, and 4 open source maintainers collaborate on the project.