s3-multipart-stream

Build Status codecov.io NPM version

s3-multipart-stream uploads files to Amazon S3 using the multipart upload API. It uploads a stream by splitting it into chunks and concurrently uploading those chunks to S3. The chunk size and the maximum number of concurrent uploads are configurable via the options hash.

Each chunk upload is recorded in a working file whose location is specified by the workingDirectory option. Successful chunk uploads write their ETag, upload number, and chunk size to the file, while chunks that error out additionally write their data and error message into the file.
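
For a rough sense of how the two options interact, the short sketch below (an illustration only, not part of the library) works out the part count for a hypothetical 100 MB input using the 5 MB chunk size from the example that follows.

/* Illustration only: part math for a hypothetical 100 MB input.
   Amazon S3 requires every part except the last to be at least 5 MB. */
var chunkUploadSize = 5 * 1024 * 1024;                        // 5242880 bytes, as in the example below
var fileSize        = 100 * 1024 * 1024;                      // hypothetical 100 MB file
var partCount       = Math.ceil(fileSize / chunkUploadSize);  // 20 parts; the last part may be smaller
/* With maxConcurrentUploads set to 10, at most 10 of these 20 parts are in flight at once. */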

var AWS               = require("aws-sdk");
var S3MultipartStream = require("s3-multipart-stream");
var fs                = require("fs");

/* Configure the AWS client with your credentials */
var s3 = new AWS.S3({
  accessKeyId: "myAccessKey",
  secretAccessKey: "mySecretAccessKey",
  region: "us-east-1"
});

var options = {
  /* Upload at most 10 concurrent chunks of 5 MB each */
  chunkUploadSize:      5242880,
  maxConcurrentUploads: 10,
  multipartCreationParams: {
    Bucket: "myBucket",
    Key: "myKey"
    /* Any params accepted by the S3 multipart creation API */
  },
  workingDirectory: "/tmp"
};

S3MultipartStream.create(s3, options, function(err, s3Stream) {
  if (err) {
    console.error(err);
    process.exit(1);
  }
  var fileStream = fs.createReadStream("someFile.txt");
  fileStream.pipe(s3Stream);
});
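
The example above only handles a creation error, so a minimal error-handling sketch follows. It assumes the stream handed to the callback behaves like a standard Node.js writable stream; the event the library emits on completion may differ from the plain "finish" shown here.

S3MultipartStream.create(s3, options, function(err, s3Stream) {
  if (err) {
    console.error(err);
    process.exit(1);
  }

  var fileStream = fs.createReadStream("someFile.txt");

  /* Surface read errors from the local file */
  fileStream.on("error", function(readErr) {
    console.error("read error:", readErr);
  });

  /* Surface upload errors from the S3 stream */
  s3Stream.on("error", function(uploadErr) {
    console.error("upload error:", uploadErr);
  });

  /* "finish" fires once all data has been flushed to the writable stream */
  s3Stream.on("finish", function() {
    console.log("upload stream finished");
  });

  fileStream.pipe(s3Stream);
});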

TODO

  • Add a retryUpload(dataFile, journalFile) method that resumes a failed upload by re-uploading the failed chunks (see the sketch below).
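
The sketch below is purely hypothetical: retryUpload does not exist yet, and the file paths are invented placeholders. It only illustrates how the planned two-argument signature from the TODO might eventually be called.

/* Hypothetical: retryUpload is not implemented yet. The paths are made-up
   placeholders for a data file and journal file written to workingDirectory. */
S3MultipartStream.retryUpload("/tmp/upload.data", "/tmp/upload.journal");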

Keywords

s3
