
s3-multipart
Features:
For files larger than 5 GB (the limit for a single PUT), S3 offers multipart uploads: the file is divided into smaller parts of at most 5 GB each, and each part is transferred individually. The concept is well known and well documented, but doing it directly from a browser is significantly less so.
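The part-splitting arithmetic itself is simple. As a rough sketch (partRanges is a hypothetical helper, not part of this module), the byte ranges for each part could be computed like this:

```javascript
// Hypothetical helper: compute the byte ranges for each part of a
// multipart upload. Not part of s3-multipart's API.
const FIVE_MB = 5 * 1024 * 1024;

function partRanges(totalBytes, partSize = FIVE_MB) {
  const ranges = [];
  for (let start = 0; start < totalBytes; start += partSize) {
    // S3 part numbers are 1-based; `end` is exclusive, suitable for File.slice().
    ranges.push({
      partNumber: ranges.length + 1,
      start,
      end: Math.min(start + partSize, totalBytes),
    });
  }
  return ranges;
}

// A 12 MB file with 5 MB parts yields three parts: 5 MB, 5 MB, and 2 MB.
// In a browser you would upload file.slice(r.start, r.end) for each range r.
```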
To my knowledge there are other modules that do this, but they had the wrong kind of dependencies for my taste or were hard to configure.
s3-multipart aims to be very small and very easy to configure to your needs.
Note: To do multipart uploads from the browser, you need to use presigned URLs. These URLs will most likely have to be presigned by some kind of backend that you control. You need to set this up yourself; it is not part of this module.
The module exports a single class, S3Multipart:
import S3Multipart from "s3-multipart";

const s3multipart = new S3Multipart({
  createUpload: (file) => {
    /* ...return promise... */
  },
  getPartUrl: (file, uploadId, partNumber, partSize) => {
    /* ...return promise... */
  },
  completeUpload: (file, uploadId, etags) => {
    /* ...return promise... */
  },
  onProgress: (uploadedBytes, totalBytes) => {
    /* ... */
  },
});
s3multipart.upload(myFile).then(() => console.log("success!"));
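For instance, the three mandatory callbacks could be wired to a backend you control roughly like this; the /api/uploads routes and their JSON responses are hypothetical, so adjust them to whatever your backend exposes:

```javascript
// Sketch of the three mandatory callbacks, assuming a hypothetical backend
// exposing /api/uploads endpoints that return JSON. Adjust to your own API.
const callbacks = {
  // Ask the backend to call S3's CreateMultipartUpload and hand back the id.
  createUpload: async (file) => {
    const res = await fetch("/api/uploads", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ name: file.name }),
    });
    const { uploadId } = await res.json();
    return uploadId;
  },

  // Ask the backend for a presigned PUT URL for one part.
  getPartUrl: async (file, uploadId, partNumber, partSize) => {
    const res = await fetch(
      `/api/uploads/${uploadId}/parts/${partNumber}?size=${partSize}`
    );
    const { url } = await res.json();
    return url;
  },

  // Tell the backend to call S3's CompleteMultipartUpload with the ETags.
  completeUpload: (file, uploadId, etags) =>
    fetch(`/api/uploads/${uploadId}/complete`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ etags }),
    }),
};
```

An object like this can then be passed straight to the constructor, as in new S3Multipart(callbacks).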
The class has three mandatory options: createUpload, getPartUrl and completeUpload. Each of them is expected to return a promise.
createUpload: creates a multipart upload on S3; see CreateMultipartUpload for details. The function should return a promise that resolves to the newly created upload's id.
getPartUrl: returns a promise that resolves to a presigned URL for a PUT request for the given file and part number.
completeUpload: completes the multipart upload after every part has been transferred; see CompleteMultipartUpload for details. In addition to the file and upload id for the current transfer, the method also receives an array of the parts' ETag headers, which S3 uses to verify the integrity of each uploaded part.
Optional configuration:
onProgress: callback reporting how much of the file has been transferred.
partSize: number of bytes per part; must be at least 5 MB and at most 5 GB; defaults to 5 GB.
parallelism: number of parts to transfer simultaneously; defaults to 3.
retries: number of times a failed part will be retried; defaults to 3 (one initial attempt plus three retries, four in total).
retryBackoffTimeMs: time to wait after a failed upload attempt before retrying; can be a number or a function that takes the current attempt number and returns the delay; defaults to a function that waits 1, 4 and 9 seconds (quadratic backoff).
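For example, assuming from the option's Ms suffix that delays are expressed in milliseconds, the default quadratic backoff could be supplied explicitly as a function:

```javascript
// Sketch of a custom retryBackoffTimeMs function reproducing the documented
// default quadratic backoff: 1 s, 4 s and 9 s for attempts 1, 2 and 3.
const quadraticBackoffMs = (attempt) => attempt * attempt * 1000;

// e.g. new S3Multipart({ ...callbacks, retryBackoffTimeMs: quadraticBackoffMs })
```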