gcs-resumable-upload
Upload a file to Google Cloud Storage with built-in resumable behavior
The gcs-resumable-upload npm package is designed to facilitate resumable uploads to Google Cloud Storage. It allows users to upload large files in chunks, which can be particularly useful for handling network interruptions and ensuring that uploads can be resumed from the point of failure.
Initiate a Resumable Upload
This creates a resumable upload session and returns its unique session URI. The URI can later be handed back to the module (via the uri config option) to continue the same upload. Note that the module exports a function, not an Upload class, and auth options are passed as an object under authConfig:

const upload = require('gcs-resumable-upload');

upload.createURI({
  bucket: 'my-bucket',
  file: 'my-file.txt',
  authConfig: { keyFilename: 'path/to/keyfile.json' }
}, (err, uri) => {
  if (err) {
    console.error('Error creating URI:', err);
  } else {
    console.log('Upload URI:', uri);
  }
});
Upload a File
The upload itself is stream-based: you pipe a readable stream into the writable stream returned by the exported function, and the module handles the data transfer to Google Cloud Storage (there is no separate per-chunk upload method).

const fs = require('fs');
const upload = require('gcs-resumable-upload');

fs.createReadStream('path/to/local/file')
  .pipe(upload({
    bucket: 'my-bucket',
    file: 'my-file.txt',
    authConfig: { keyFilename: 'path/to/keyfile.json' }
  }))
  .on('error', (err) => {
    console.error('Error uploading file:', err);
  })
  .on('finish', () => {
    console.log('File uploaded successfully');
  });
Resume an Interrupted Upload
Resuming is automatic: the module remembers the session in a local config file and, on the next run, continues from the last byte Google Cloud Storage reports having received, so the upload can complete even after a network failure. If you created the session URI yourself, pass it in via the uri config option:

const fs = require('fs');
const upload = require('gcs-resumable-upload');

fs.createReadStream('path/to/local/file')
  .pipe(upload({
    bucket: 'my-bucket',
    file: 'my-file.txt',
    authConfig: { keyFilename: 'path/to/keyfile.json' },
    uri: 'your-upload-uri' // a previously created resumable session URI
  }))
  .on('error', (err) => {
    console.error('Error resuming upload:', err);
  })
  .on('finish', () => {
    console.log('Upload resumed and completed successfully');
  });
The gcs-upload package is another library for uploading files to Google Cloud Storage. It supports both simple and resumable uploads. While it offers similar functionality to gcs-resumable-upload, it may not be as widely used or maintained.
$ npm install --save gcs-resumable-upload
var upload = require('gcs-resumable-upload');
var fs = require('fs');
fs.createReadStream('titanic.mov')
.pipe(upload({ bucket: 'legally-owned-movies', file: 'titanic.mov' }))
.on('finish', function () {
// Uploaded!
});
Or from the command line:
$ npm install -g gcs-resumable-upload
$ cat titanic.mov | gcs-upload legally-owned-movies titanic.mov
If somewhere during the operation, you lose your connection to the internet or your tough-guy brother slammed your laptop shut when he saw what you were uploading, the next time you try to upload to that file, it will resume automatically from where you left off.
This module stores a file using ConfigStore that is written to when you first start an upload. It is aliased by the file name you are uploading to and holds the first 16kb chunk of data* as well as the unique resumable upload URI. (Resumable uploads are complicated)
If your upload was interrupted, next time you run the code, we ask the API how much data it has already, then simply dump all of the data coming through the pipe that it already has.
After the upload completes, the entry in the config file is removed. Done!
* The first 16kb chunk is stored to validate if you are sending the same data when you resume the upload. If not, a new resumable upload is started with the new data.
Oh, right. This module uses google-auto-auth and accepts all of the configuration that module does to strike up a connection, as config.authConfig. See authConfig.
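For example, a config that points google-auto-auth at a service-account key file might look like the sketch below. The keyFilename path and scopes values are placeholder assumptions, not required settings:

```javascript
// Sketch of a config object; everything under authConfig is passed
// through to google-auto-auth unchanged. Values here are placeholders.
const config = {
  bucket: 'my-bucket',
  file: 'my-file.txt',
  authConfig: {
    keyFilename: '/path/to/keyfile.json',
    scopes: ['https://www.googleapis.com/auth/devstorage.read_write']
  }
};
```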
upload(config)

Returns: Duplexify

- config (object): Configuration object.
- config.authClient (GoogleAutoAuth): If you want to re-use an auth client from google-auto-auth, pass an instance here.
- config.authConfig (object): See authConfig.
- config.bucket (string): The name of the destination bucket.
- config.file (string): The name of the destination file.
- config.generation (number): This will cause the upload to fail if the current generation of the remote object does not match the one provided here.
- config.key (string|buffer): A customer-supplied encryption key.
- config.kmsKeyName (string): Resource name of the Cloud KMS key, of the form projects/my-project/locations/global/keyRings/my-kr/cryptoKeys/my-key, that will be used to encrypt the object. Overrides the object metadata's kms_key_name value, if any.
- config.metadata (object): Any metadata you wish to set on the object.
- config.metadata.contentLength: Set the length of the file being uploaded.
- config.metadata.contentType: Set the content type of the incoming data.
- config.offset (number): The starting byte of the upload stream, for resuming an interrupted upload.
- config.origin (string): Set an Origin header when creating the resumable upload URI.
- config.predefinedAcl (string): Apply a predefined set of access controls to the created file. Acceptable values are:
  - authenticatedRead: Object owner gets OWNER access, and allAuthenticatedUsers get READER access.
  - bucketOwnerFullControl: Object owner gets OWNER access, and project team owners get OWNER access.
  - bucketOwnerRead: Object owner gets OWNER access, and project team owners get READER access.
  - private: Object owner gets OWNER access.
  - projectPrivate: Object owner gets OWNER access, and project team members get access according to their roles.
  - publicRead: Object owner gets OWNER access, and allUsers get READER access.
- config.private (boolean): Make the uploaded file private. (Alias for config.predefinedAcl = 'private')
- config.public (boolean): Make the uploaded file public. (Alias for config.predefinedAcl = 'publicRead')
- config.uri (string): If you already have a resumable URI from a previously-created resumable upload, just pass it in here and we'll use that.
- config.userProject (string): If the bucket being accessed has requesterPays functionality enabled, this can be set to control which project is billed for the access of this file.

Events

- error (Error): Invoked if the authorization failed, the request failed, or the file wasn't successfully uploaded.
- response (Object, Object): The HTTP response from request, and the file's new metadata.
- finish: The file was uploaded successfully.

upload.createURI(config, callback)

- callback.err (Error): Invoked if the authorization failed or the request to start a resumable session failed.
- callback.uri (String): The resumable upload session URI.
FAQs
The npm package gcs-resumable-upload receives a total of 211,146 weekly downloads. As such, its popularity was classified as popular.
We found that gcs-resumable-upload demonstrated an unhealthy version release cadence and project activity because the last version was released a year ago. It has 6 open source maintainers collaborating on the project.