
gcs-resumable-upload
Upload a file to Google Cloud Storage with built-in resumable behavior
This repository has been deprecated. Support will end on November 1, 2023.
$ npm install gcs-resumable-upload
const {upload} = require('gcs-resumable-upload');
const fs = require('fs');

fs.createReadStream('titanic.mov')
  .pipe(upload({ bucket: 'legally-owned-movies', file: 'titanic.mov' }))
  .on('progress', (progress) => {
    console.log('Progress event:');
    console.log('\t bytes: ', progress.bytesWritten);
  })
  .on('finish', () => {
    // Uploaded!
  });
Or from the command line:
$ npm install -g gcs-resumable-upload
$ cat titanic.mov | gcs-upload legally-owned-movies titanic.mov
If somewhere during the operation you lose your connection to the internet, or your tough-guy brother slams your laptop shut when he sees what you were uploading, the next time you try to upload to that file it will resume automatically from where you left off.
This module stores a small entry using configstore that is written when you first start an upload. The entry is keyed by the file name you are uploading to and holds the first 16 KiB chunk of data* as well as the unique resumable upload URI. (Resumable uploads are complicated.)
If your upload was interrupted, the next time you run the code we ask the API how many bytes it has already received, then simply discard that many bytes from the data coming through the pipe.
After the upload completes, the entry in the config file is removed. Done!
* The first 16 KiB chunk is stored to validate that you are sending the same data when you resume the upload. If not, a new resumable upload is started with the new data.
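Concretely, resuming requires no extra code: re-run the same pipeline and the module picks up from the saved state. A minimal sketch, reusing the placeholder names from above:

const {upload} = require('gcs-resumable-upload');
const fs = require('fs');

// Running this again after an interruption does not start over: the module
// finds the saved resumable URI (keyed by the destination file name), asks
// the API how many bytes it already received, and discards that many bytes
// from the incoming stream before sending the rest.
fs.createReadStream('titanic.mov')
  .pipe(upload({ bucket: 'legally-owned-movies', file: 'titanic.mov' }))
  .on('finish', () => console.log('Resumed and finished!'));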
Oh, right. This module uses google-auth-library and accepts all of the configuration that module does to strike up a connection as config.authConfig. See authConfig.
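For example, a minimal sketch pointing the auth layer at a service account key file; the path is a placeholder, and any option accepted by google-auth-library should work here:

const {upload} = require('gcs-resumable-upload');

const stream = upload({
  bucket: 'legally-owned-movies',
  file: 'titanic.mov',
  authConfig: {
    // Passed through to google-auth-library; keyFilename is one common
    // way to point at a service account JSON key (placeholder path).
    keyFilename: '/path/to/service-account.json',
  },
});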
API

const {upload} = require('gcs-resumable-upload');
const stream = upload(config);

stream is an instance of Duplexify.
stream.createURI(callback)

err (Error): Invoked if the authorization failed or the request to start a resumable session failed.
resumableURI (String): The resumable upload session URI.
stream.deleteConfig()

This will remove the config data associated with the provided file.
config (object): Configuration object.

authClient (GoogleAuth): If you want to re-use an auth client from google-auth-library, pass an instance here.
authConfig (object): See the authConfig note above.
bucket (string): The name of the destination bucket.
configPath (string): Where the gcs-resumable-upload configuration file should be stored on your system. This maps to the configstore option by the same name.
customRequestOptions (object): For each API request we send, you may specify custom request options that we'll add onto the request. The request options follow the gaxios API: https://github.com/googleapis/gaxios#request-options.
For example, to set your own HTTP headers:
const stream = upload({
  customRequestOptions: {
    headers: {
      'X-My-Header': 'My custom value',
    },
  },
});
file (string): The name of the destination file.
generation (number): This will cause the upload to fail if the current generation of the remote object does not match the one provided here.
key (string|Buffer): A customer-supplied encryption key.
kmsKeyName (string): Resource name of the Cloud KMS key, of the form projects/my-project/locations/global/keyRings/my-kr/cryptoKeys/my-key, that will be used to encrypt the object. Overrides the object metadata's kms_key_name value, if any.
metadata (object): Any metadata you wish to set on the object.
metadata.contentLength (number): Set the length of the file being uploaded.
metadata.contentType (string): Set the content type of the incoming data.
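For example, to declare the content type and length up front (the values are placeholders, mirroring the headers example above):

const {upload} = require('gcs-resumable-upload');

const stream = upload({
  bucket: 'legally-owned-movies',
  file: 'titanic.mov',
  metadata: {
    contentType: 'video/quicktime', // type of the incoming data
    contentLength: 524288000,       // total size of the file, in bytes
  },
});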
offset (number): The starting byte of the upload stream, for resuming an interrupted upload.
origin (string): Set an Origin header when creating the resumable upload URI.
predefinedAcl (string): Apply a predefined set of access controls to the created file.

Acceptable values are:

authenticatedRead - Object owner gets OWNER access, and allAuthenticatedUsers get READER access.
bucketOwnerFullControl - Object owner gets OWNER access, and project team owners get OWNER access.
bucketOwnerRead - Object owner gets OWNER access, and project team owners get READER access.
private - Object owner gets OWNER access.
projectPrivate - Object owner gets OWNER access, and project team members get access according to their roles.
publicRead - Object owner gets OWNER access, and allUsers get READER access.

private (boolean): Make the uploaded file private. (Alias for config.predefinedAcl = 'private')
public (boolean): Make the uploaded file public. (Alias for config.predefinedAcl = 'publicRead')
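For example, using the boolean alias to make the uploaded file publicly readable:

const {upload} = require('gcs-resumable-upload');

const stream = upload({
  bucket: 'legally-owned-movies',
  file: 'titanic.mov',
  public: true, // equivalent to predefinedAcl: 'publicRead'
});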
uri (string): If you already have a resumable URI from a previously-created resumable upload, just pass it in here and we'll use that.
userProject (string): If the bucket being accessed has requesterPays functionality enabled, this can be set to control which project is billed for the access of this file.
retryOptions (object): Parameters used to control retrying operations.
interface RetryOptions {
  retryDelayMultiplier?: number;
  totalTimeout?: number;
  maxRetryDelay?: number;
  autoRetry?: boolean;
  maxRetries?: number;
  retryableErrorFn?: (err: ApiError) => boolean;
}
retryDelayMultiplier (number): Base number used for exponential backoff. Default: 2.
totalTimeout (number): Upper bound on the total amount of time to attempt retrying, in seconds. Default: 600.
maxRetryDelay (number): The maximum time to delay between retries, in seconds. Default: 64.
autoRetry (boolean): Whether or not errors should be retried. Default: true.
maxRetries (number): The maximum number of retries to attempt. Default: 5.
retryableErrorFn (function): Custom function returning a boolean indicating whether or not to retry an error.
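A minimal sketch wiring these options into an upload; the numbers are illustrative, not recommendations:

const {upload} = require('gcs-resumable-upload');

const stream = upload({
  bucket: 'legally-owned-movies',
  file: 'titanic.mov',
  retryOptions: {
    autoRetry: true,         // retry retryable errors
    maxRetries: 3,           // give up after 3 attempts
    retryDelayMultiplier: 2, // exponential backoff base
    maxRetryDelay: 32,       // cap each delay at 32 seconds
    totalTimeout: 300,       // stop retrying after 5 minutes
  },
});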
chunkSize (number): Enables multiple-chunk upload mode and sets each request size to this amount.

This only makes sense to use for larger files. The chunk size should be a multiple of 256 KiB (256 x 1024 bytes). Larger chunk sizes typically make uploads more efficient. We recommend using at least 8 MiB for the chunk size.

Review the documentation for guidance and best practices.
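For example, a minimal sketch using the recommended 8 MiB minimum, which is a multiple of 256 KiB:

const {upload} = require('gcs-resumable-upload');

const MiB = 1024 * 1024; // 1 MiB; 256 KiB = 256 * 1024 bytes

const stream = upload({
  bucket: 'legally-owned-movies',
  file: 'titanic.mov',
  chunkSize: 8 * MiB, // each request uploads at most 8 MiB
});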
Event: 'error'

error (Error): Invoked if the authorization failed, the request failed, or the file wasn't successfully uploaded.

Event: 'response'

resp (Object): The response object from Gaxios.

Event: 'metadata'

metadata (Object): The file's new metadata.

Event: 'progress'

progress (Object): Progress event providing upload stats: bytesWritten (number) and contentLength (number).

Event: 'finish'

The file was uploaded successfully.
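Putting the events together, a sketch of an upload with every handler attached; the event and field names mirror the descriptions above:

const {upload} = require('gcs-resumable-upload');
const fs = require('fs');

fs.createReadStream('titanic.mov')
  .pipe(upload({ bucket: 'legally-owned-movies', file: 'titanic.mov' }))
  .on('error', (err) => {
    console.error('Upload failed:', err);
  })
  .on('response', (resp) => {
    console.log('HTTP status:', resp.status);
  })
  .on('progress', (progress) => {
    console.log(`Sent ${progress.bytesWritten} bytes`);
  })
  .on('metadata', (metadata) => {
    console.log('New file metadata:', metadata);
  })
  .on('finish', () => {
    console.log('Uploaded!');
  });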
createURI

const {createURI} = require('gcs-resumable-upload');

createURI(config, callback)

err (Error): Invoked if the authorization failed or the request to start a resumable session failed.
resumableURI (String): The resumable upload session URI.
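A minimal sketch, assuming createURI accepts the same bucket/file config as upload plus a callback: create the session URI up front, then hand it back through the uri option described above:

const {upload, createURI} = require('gcs-resumable-upload');
const fs = require('fs');

createURI({ bucket: 'legally-owned-movies', file: 'titanic.mov' }, (err, uri) => {
  if (err) {
    console.error(err);
    return;
  }

  // Reusing the session URI skips the session-creation request.
  fs.createReadStream('titanic.mov')
    .pipe(upload({ bucket: 'legally-owned-movies', file: 'titanic.mov', uri }))
    .on('finish', () => console.log('Uploaded!'));
});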
The gcs-upload package is another library for uploading files to Google Cloud Storage. It supports both simple and resumable uploads. While it offers similar functionality to gcs-resumable-upload, it may not be as widely used or maintained.
FAQs
What is gcs-resumable-upload?

Upload a file to Google Cloud Storage with built-in resumable behavior.

Is gcs-resumable-upload popular?

The npm package gcs-resumable-upload receives a total of 175,809 weekly downloads. As such, gcs-resumable-upload's popularity is classified as popular.

Is gcs-resumable-upload well maintained?

We found that gcs-resumable-upload demonstrates an unhealthy version release cadence and low project activity because the last version was released a year ago. It has 6 open source maintainers collaborating on the project.