
@google-cloud/storage
The @google-cloud/storage npm package is a client library for using Google Cloud Storage, which is a service for storing and accessing data on Google's infrastructure. The package allows Node.js developers to interact with Google Cloud Storage in a server-side application.
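Before running any of the samples below, the client needs credentials. On Google Cloud infrastructure the plain new Storage() call used in the samples picks up credentials automatically; elsewhere you can pass them explicitly. A minimal sketch, assuming a placeholder project ID and service account key path:
const { Storage } = require('@google-cloud/storage');

// Explicit credentials; the project ID and key file path below are
// placeholders for your own values.
const storage = new Storage({
  projectId: 'your-project-id',
  keyFilename: '/path/to/service-account-key.json',
});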
Uploading files
This feature allows users to upload files to a Google Cloud Storage bucket. The code sample demonstrates how to upload a local file to a specified bucket.
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

async function uploadFile() {
  await storage.bucket('my-bucket').upload('local-file-path', {
    destination: 'destination-file-path',
  });
  console.log('File uploaded.');
}

uploadFile().catch(console.error);
Downloading files
This feature enables users to download files from a Google Cloud Storage bucket. The code sample shows how to download a file from a bucket to the local file system.
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

async function downloadFile() {
  const options = {
    destination: 'local-file-path',
  };
  await storage.bucket('my-bucket').file('remote-file-path').download(options);
  console.log('File downloaded.');
}

downloadFile().catch(console.error);
Listing files
This feature provides the ability to list all files in a Google Cloud Storage bucket. The code sample lists the names of all files in a specified bucket.
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

async function listFiles() {
  const [files] = await storage.bucket('my-bucket').getFiles();
  files.forEach(file => console.log(file.name));
}

listFiles().catch(console.error);
Deleting files
This feature allows users to delete files from a Google Cloud Storage bucket. The code sample demonstrates how to delete a specific file from a bucket.
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

async function deleteFile() {
  await storage.bucket('my-bucket').file('file-to-delete').delete();
  console.log('File deleted.');
}

deleteFile().catch(console.error);
Managing buckets
This feature enables users to manage buckets, including creating and deleting buckets. The code sample shows how to create a new bucket in Google Cloud Storage.
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

async function createBucket() {
  await storage.createBucket('new-bucket');
  console.log('Bucket created.');
}

createBucket().catch(console.error);
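The sample above covers bucket creation; deleting a bucket works through the same Bucket object. A minimal sketch (the bucket name is a placeholder, and a bucket must be empty before it can be deleted):
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

async function deleteBucket() {
  // A bucket must be empty before delete() will succeed.
  await storage.bucket('new-bucket').delete();
  console.log('Bucket deleted.');
}

deleteBucket().catch(console.error);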
The aws-sdk package is a client library for Amazon Web Services (AWS), including Amazon S3, which is a similar object storage service to Google Cloud Storage. It provides a wide range of functionalities to interact with various AWS services.
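For comparison, a minimal sketch of the same upload operation with the aws-sdk v2-style S3 client (the bucket name, object key, and file path are placeholders):
const fs = require('fs');
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

// Upload a local file to an S3 bucket (names and paths are placeholders).
s3.upload(
  {
    Bucket: 'my-bucket',
    Key: 'destination-key',
    Body: fs.createReadStream('local-file-path'),
  },
  (err, data) => {
    if (err) return console.error(err);
    console.log('File uploaded to', data.Location);
  }
);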
The azure-storage package is a client library for Microsoft Azure Storage. Like Google Cloud Storage, Azure Storage offers blob storage for unstructured data, and this package allows Node.js developers to work with Azure blobs, files, queues, and tables.
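A rough equivalent with azure-storage, assuming the connection string is supplied through the AZURE_STORAGE_CONNECTION_STRING environment variable (container, blob, and path names are placeholders):
const azure = require('azure-storage');
const blobService = azure.createBlobService();

// Upload a local file as a block blob (names and paths are placeholders).
blobService.createBlockBlobFromLocalFile(
  'my-container',
  'destination-blob',
  'local-file-path',
  err => {
    if (err) return console.error(err);
    console.log('Blob uploaded.');
  }
);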
The pkgcloud package is a multi-cloud provisioning library for Node.js that abstracts away differences among multiple cloud providers. It supports various cloud services, including storage services like Amazon S3 and Rackspace Cloud Files, and can be used as an alternative to provider-specific packages like @google-cloud/storage.
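A sketch of pkgcloud's provider-agnostic storage API (credentials, region, and names are placeholders; swapping the provider block switches clouds without changing the rest of the code):
const fs = require('fs');
const pkgcloud = require('pkgcloud');

// Create a storage client for one of the supported providers.
const client = pkgcloud.storage.createClient({
  provider: 'amazon',
  keyId: 'your-access-key-id',
  key: 'your-secret-access-key',
  region: 'us-east-1',
});

// upload() returns a writable stream; pipe a local file into it.
const writeStream = client.upload({
  container: 'my-container',
  remote: 'destination-file',
});
writeStream.on('error', console.error);
writeStream.on('success', file => console.log('Uploaded', file.name));
fs.createReadStream('local-file-path').pipe(writeStream);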
Google Cloud Storage Client Library for Node.js
Looking for more Google APIs than just Storage? You might want to check out google-cloud.
$ npm install --save @google-cloud/storage
var fs = require('fs');

var gcs = require('@google-cloud/storage')({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Create a new bucket.
gcs.createBucket('my-new-bucket', function(err, bucket) {
  if (!err) {
    // "my-new-bucket" was successfully created.
  }
});

// Reference an existing bucket.
var bucket = gcs.bucket('my-existing-bucket');

// Upload a local file to a new file to be created in your bucket.
bucket.upload('/photos/zoo/zebra.jpg', function(err, file) {
  if (!err) {
    // "zebra.jpg" is now in your bucket.
  }
});

// Download a file from your bucket.
bucket.file('giraffe.jpg').download({
  destination: '/photos/zoo/giraffe.jpg'
}, function(err) {});

// Streams are also supported for reading and writing files.
var remoteReadStream = bucket.file('giraffe.jpg').createReadStream();
var localWriteStream = fs.createWriteStream('/photos/zoo/giraffe.jpg');
remoteReadStream.pipe(localWriteStream);

var localReadStream = fs.createReadStream('/photos/zoo/zebra.jpg');
var remoteWriteStream = bucket.file('zebra.jpg').createWriteStream();
localReadStream.pipe(remoteWriteStream);

// Promises are also supported by omitting callbacks.
bucket.upload('/photos/zoo/zebra.jpg').then(function(data) {
  var file = data[0];
});

// It's also possible to integrate with third-party Promise libraries.
var gcs = require('@google-cloud/storage')({
  promise: require('bluebird')
});
It's incredibly easy to get authenticated and start using Google's APIs. You can set your credentials on a global basis as well as on a per-API basis; see each individual API section below for how to authenticate on a per-API basis. This is useful if you want to use different accounts for different Google Cloud services.
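For example, a minimal sketch of passing credentials per client (the key file paths are placeholders), creating two clients that authenticate as different service accounts:
// Each client uses its own service account key (paths are placeholders).
var gcsAccountA = require('@google-cloud/storage')({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/account-a-keyfile.json'
});

var gcsAccountB = require('@google-cloud/storage')({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/account-b-keyfile.json'
});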
If you are running this client on Google Cloud Platform, we handle authentication for you with no configuration. You just need to make sure that when you set up the GCE instance, you add the correct scopes for the APIs you want to access.
var gcs = require('@google-cloud/storage')();
// ...you're good to go!
If you are not running this client on Google Cloud Platform, you need a Google Developers service account. Once you have created a service account and downloaded its JSON key file, pass it to the client:
var projectId = process.env.GCLOUD_PROJECT; // E.g. 'grape-spaceship-123'

var gcs = require('@google-cloud/storage')({
  projectId: projectId,

  // The path to your key file:
  keyFilename: '/path/to/keyfile.json'

  // Or, instead of keyFilename, the contents of the key file:
  // credentials: require('./path/to/keyfile.json')
});
// ...you're good to go!
FAQs
@google-cloud/storage is the Cloud Storage client library for Node.js.
The npm package @google-cloud/storage receives a total of 4,030,049 weekly downloads; as such, its popularity is classified as popular.
We found that @google-cloud/storage demonstrates a healthy version release cadence and project activity: the last version was released less than a year ago, and it has 1 open source maintainer collaborating on the project.