# Google Cloud Bucket · ![Neap](https://neap.co/img/made_by_neap.svg)

Google Cloud Bucket is a node.js package to add objects to a Google Cloud Bucket.
# Install

```
npm i google-cloud-bucket --save
```
# How To Use It
## Prerequisite

Before using this package, you must first:

- Have a Google Cloud Account.
- Have a Service Account set up with the following roles:
  - `roles/storage.objectCreator`
  - `roles/storage.objectAdmin` (only if you want to update access to an object or create/delete buckets)
  - `roles/storage.admin` (only if you want to update access to an entire bucket)
- Get the JSON keys file for that Service Account above.
- Save that JSON key into a `service-account.json` file. Make sure it is located under a path that is accessible to your app (usually the root folder).
## Basics

```js
const { join } = require('path')
const { client } = require('google-cloud-bucket')

const storage = client.new({
  jsonKeyFile: join(__dirname, './service-account.json')
})

// Listing all the buckets in your project
storage.list().then(console.log)

const someObject = {
  firstname: 'Nicolas',
  lastname: 'Dao',
  company: 'Neap Pty Ltd',
  city: 'Sydney'
}

// Creating a bucket (the name must be globally unique)
storage.bucket('your-globally-unique-bucket-name').create()
  .then(data => console.log(data))

// Creating a bucket in a specific location
storage.bucket('your-globally-unique-bucket-name').create({ location: 'australia-southeast1' })
  .then(data => console.log(data))

// Deleting a bucket
storage.bucket('your-globally-unique-bucket-name').delete()
  .then(data => console.log(data))

// Getting a bucket's details
storage.bucket('your-globally-unique-bucket-name').get()
  .then(data => console.log(data))

// Adding an object to a bucket, then reading it back
storage.insert(someObject, 'your-bucket/a-path/filename.json')
  .then(() => storage.get('your-bucket/a-path/filename.json'))
  .then(res => console.log(JSON.stringify(res, null, ' ')))

// Adding an HTML page
const html = `
<!doctype html>
<html>
  <body>
    <h1>Hello Giiiiirls</h1>
  </body>
</html>`
storage.insert(html, 'your-bucket/a-path/index.html')

// Adding an image, where imgBuffer is a Buffer
// (e.g., const imgBuffer = require('fs').readFileSync('image.jpg'))
storage.insert(imgBuffer, 'your-bucket/a-path/image.jpg')

// Getting objects back
storage.get('your-bucket/a-path/filename.json').then(obj => console.log(obj))
storage.get('your-bucket/a-path/index.html').then(htmlString => console.log(htmlString))
storage.get('your-bucket/a-path/image.jpg').then(imgBuffer => console.log(imgBuffer))

// Downloading an object to your local machine
storage.get('your-bucket/a-path/image.jpg', { dst: 'some-path/image.jpg' })
  .then(() => console.log(`Image successfully downloaded.`))

// Piping an object's content into a custom stream reader
const { Writable } = require('stream')
const customReader = new Writable({
  write(chunk, encoding, callback) {
    console.log('Hello chunk of image')
    callback()
  }
})
storage.get('your-bucket/a-path/image.jpg', { streamReader: customReader })
  .then(() => console.log(`Image successfully downloaded.`))

// Testing whether a file exists
storage.exists('your-bucket/a-path/image.jpg')
  .then(fileExists => fileExists ? console.log('File exists.') : console.log('File does not exist.'))

// Listing all the files under a path
storage.list('your-bucket/a-path/')
  .then(files => console.log(files))
```
## Bucket API

The examples above demonstrate how to insert and query objects using their full path. We've also included a variant of those APIs that is scoped to a bucket. Each pair below is equivalent:

```js
// Inserting an object
storage.insert(someObject, 'your-bucket/a-path/filename.json')
storage.bucket('your-bucket').object('a-path/filename.json').insert(someObject)

// Getting an object
storage.get('your-bucket/a-path/filename.json').then(obj => console.log(obj))
storage.bucket('your-bucket').object('a-path/filename.json').get().then(obj => console.log(obj))

// Testing whether a file exists
storage.exists('your-bucket/a-path/image.jpg')
  .then(fileExists => fileExists ? console.log('File exists.') : console.log('File does not exist.'))
storage.bucket('your-bucket').object('a-path/image.jpg').exists()
  .then(fileExists => fileExists ? console.log('File exists.') : console.log('File does not exist.'))

// Listing files under a path
storage.list('your-bucket/a-path/')
  .then(files => console.log(files))
storage.bucket('your-bucket').object('a-path/').list()
  .then(files => console.log(files))
```
## Configuring Your Bucket Or Your File
### Publicly Readable Config

This lets you make any file publicly readable by anybody on the web. That's useful if you want to host a website, or publish data (e.g., an RSS feed).

Once your bucket is publicly readable, everyone can access it at this url: https://storage.googleapis.com/your-bucket/some-path/index.html

> WARNING: If that bucket hosts files that should be accessible cross domain (e.g., an RSS feed), don't forget to also set up CORS (next section: Configuring CORS On a Bucket).

```js
const bucket = storage.bucket('your-bucket')

// Checking whether the bucket is public
bucket.isPublic().then(isPublic => isPublic ? console.log(`Bucket '${bucket.name}' is public`) : console.log(`Bucket '${bucket.name}' is not public`))

// Making the entire bucket publicly readable
bucket.addPublicAccess()
  .then(({ publicUri }) => console.log(`Your web page is publicly available at: ${publicUri}`))

// Removing public access from the bucket
bucket.removePublicAccess()

// Making a single existing object publicly readable
bucket.object('a-path/private.html').addPublicAccess()
  .then(({ publicUri }) => console.log(`Your web page is publicly available at: ${publicUri}`))

// Removing public access from a single object
bucket.object('a-path/private.html').removePublicAccess()
```
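The public URL pattern above is simply the fixed Google Storage host followed by the bucket name and the object path. A hypothetical helper (not part of this library) that builds it:

```javascript
// Hypothetical helper, NOT part of google-cloud-bucket: builds the
// public URL at which a publicly readable object is served.
const publicUrl = (bucket, objectPath) =>
  `https://storage.googleapis.com/${bucket}/${objectPath}`

console.log(publicUrl('your-bucket', 'some-path/index.html'))
// → https://storage.googleapis.com/your-bucket/some-path/index.html
```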
### Making A Single File Publicly Readable At Creation Time

It is also possible to make a single file publicly readable with a single command when the file is created:

```js
storage.insert(html, 'your-bucket/a-path/index.html', { public: true })
  .then(({ publicUri }) => console.log(`Your web page is publicly available at: ${publicUri}`))
```

Once your file is publicly readable, everyone can access it at this url: https://storage.googleapis.com/your-bucket/a-path/index.html

> WARNING: If that bucket hosts files that should be accessible cross domain (e.g., an RSS feed), don't forget to also set up CORS (next section: Configuring CORS On a Bucket).
### Setting Single File Content Encoding At Creation Time

It is also possible to set a file's content encoding with a single command when the file is created:

```js
storage.insert(html, 'your-bucket/a-path/index.html', { contentEncoding: 'gzip' })
  .then(({ publicUri }) => console.log(`Your gzipped file is available at: ${publicUri}`))
```
### Configuring CORS On a Bucket

If your files are publicly readable on the web, they might not be accessible when referenced from other websites. To enable other websites to access your files, you will have to configure CORS on your bucket:

```js
bucket.cors.setup({
  origin: ['*'],
  method: ['GET', 'OPTIONS', 'HEAD', 'POST'],
  responseHeader: ['Authorization', 'Origin', 'X-Requested-With', 'Content-Type', 'Accept'],
  maxAgeSeconds: 3600
})
.then(() => console.log(`CORS successfully set up on your bucket.`))
```
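To make the meaning of that config concrete, here is a small standalone sketch (illustrative only, not the library's or Google's actual matching logic) of how a cross-origin request is checked against such a rule: its origin and HTTP method must both be allowed.

```javascript
// Illustrative only: a request is allowed by a CORS rule like the one
// above when its origin and its method are both covered by the rule.
const corsAllows = (rule, { origin, method }) => {
  const originOk = rule.origin.includes('*') || rule.origin.includes(origin)
  const methodOk = rule.method.includes(method)
  return originOk && methodOk
}

const rule = {
  origin: ['*'],
  method: ['GET', 'OPTIONS', 'HEAD', 'POST'],
  responseHeader: ['Authorization', 'Origin', 'X-Requested-With', 'Content-Type', 'Accept'],
  maxAgeSeconds: 3600
}

console.log(corsAllows(rule, { origin: 'https://example.com', method: 'GET' }))    // true
console.log(corsAllows(rule, { origin: 'https://example.com', method: 'DELETE' })) // false
```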
If you want to check whether CORS has already been set up on a bucket:

```js
bucket.cors.exists().then(yes => yes
  ? console.log(`CORS already set up on bucket '${bucket.name}'.`)
  : console.log(`CORS not set up yet on bucket '${bucket.name}'.`))
```

You can also check whether a specific CORS config exists:

```js
bucket.cors.exists({
  origin: ['*'],
  method: ['GET', 'OPTIONS', 'HEAD', 'POST'],
  responseHeader: ['Authorization', 'Origin', 'X-Requested-With', 'Content-Type', 'Accept'],
  maxAgeSeconds: 3600
}).then(yes => yes
  ? console.log(`CORS already set up on bucket '${bucket.name}'.`)
  : console.log(`CORS not set up yet on bucket '${bucket.name}'.`))
```

To remove CORS from a bucket:

```js
bucket.cors.disable().then(() => console.log(`CORS successfully disabled on bucket '${bucket.name}'.`))
```
### Configuring A Bucket As A Static Website

To achieve this, you need to set up 5 things:

1. Set up the service account that you've been using to manage your bucket (defined in your `service-account.json`) as a domain owner. To achieve that, the first step is to prove your ownership using https://search.google.com/search-console/welcome. When that's done, open the settings and select User and permissions. There, you'll be able to add a new owner, which will allow you to add the email of your service account.
2. Create a bucket with a name matching your domain (e.g., `www.your-domain-name.com`).
3. Make that bucket public. Refer to the section Publicly Readable Config above.
4. Add a new CNAME record in your DNS similar to this:

   | Type  | Name | Value                    |
   |-------|------|--------------------------|
   | CNAME | www  | c.storage.googleapis.com |

5. Configure the bucket so that the index.html and the 404.html pages are the default pages (otherwise, you'll have to explicitly enter http://www.your-domain-name.com/index.html to reach your website instead of simply entering http://www.your-domain-name.com):

   ```js
   bucket.website.setup({
     mainPageSuffix: 'index.html',
     notFoundPage: '404.html'
   }).then(console.log)
   ```
## Zipping Files

```js
const bucket = storage.bucket('your-bucket-name')

bucket.object('some-folder-path').zip({
  to: {
    // Destination on your local machine
    local: 'some-path-on-your-local-machine',
    // Destination in another bucket
    bucket: {
      name: 'another-existing-bucket-name',
      path: 'some-folder-path.zip'
    }
  },
  // Files matching any of these patterns are excluded from the zip
  ignore: [/\.png$/, /\.jpg$/, /\.html$/]
})
.then(({ count, data }) => {
  console.log(`${count} files have been zipped`)
  if (data)
    console.log(`The zip file's size is: ${data.length/1024} KB`)
})
```
### Extra Options

You can also track the various steps of the zipping process with the optional `on` object:

```js
const bucket = storage.bucket('your-bucket-name')

bucket.object('some-folder-path').zip({
  to: {
    local: 'some-path-on-your-local-machine',
    bucket: {
      name: 'another-existing-bucket-name',
      path: 'some-folder-path.zip'
    }
  },
  on: {
    'files-listed': (files) => {
      console.log(`Total number of files to be zipped: ${files.count}`)
      console.log(`Raw size: ${(files.size/1024/1024).toFixed(1)} MB`)
    },
    'file-received': ({ file, size }) => {
      console.log(`File ${file} (byte size: ${size}) is being zipped`)
    },
    'finished': ({ size }) => {
      console.log(`Zip process completed. The zip file's size is ${size} bytes`)
    },
    'saved': () => {
      console.log('The zipped file has been saved')
    },
    'error': err => {
      console.log(`${err.message}\n${err.stack}`)
    }
  }
})
.then(({ count, data }) => {
  console.log(`${count} files have been zipped`)
  if (data)
    console.log(`The zip file's size is: ${data.length/1024} KB`)
})
```
# 3 Ways To Create a Client
## 1. Using A service-account.json

We assume that you have created a Service Account in your Google Cloud Account (using IAM) and that you've downloaded a `service-account.json` (the name of the file does not matter as long as it is a valid json file). The first way to create a client is to provide the path to that `service-account.json` as shown in the following example:

```js
const storage = client.new({
  jsonKeyFile: join(__dirname, './service-account.json')
})
```
## 2. Using a ClientEmail, PrivateKey & ProjectId

This method is similar to the previous one. You should have downloaded a `service-account.json`, but instead of providing its path, you provide some of its details explicitly:

```js
const storage = client.new({
  clientEmail: 'some-client-email',
  privateKey: 'some-secret-private-key',
  projectId: 'your-project-id'
})
```
## 3. Using a ProjectId

If you're managing a Google Cloud OAuth2 token yourself (most likely using the google-auto-auth library), you are not required to explicitly pass account details as in the previous 2 approaches. You can simply specify the `projectId`:

```js
const storage = client.new({ projectId: 'your-project-id' })
```

Refer to the next section to see how to pass an OAuth2 token.
# Avoiding Network Errors

Network errors (e.g., socket hang up, connect ECONNREFUSED) are a fact of life. To deal with those nondeterministic errors, this library uses a simple exponential back off retry strategy, which will reprocess your read or write request for 10 seconds by default. You can increase that retry period as follows:

```js
// Retry for up to 30 seconds instead of the default 10
storage.exists('your-bucket/a-path/image.jpg', { timeout: 30000 })
storage.insert(someObject, 'your-bucket/a-path/filename.json', { timeout: 30000 })
storage.get('your-bucket/a-path/filename.json', { timeout: 30000 })
```
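For intuition, the retry strategy described above amounts to something like the following standalone sketch (an illustration of the general technique, not this library's actual implementation): the operation is retried with exponentially growing delays until it succeeds or the timeout window is exhausted.

```javascript
// Illustrative sketch of an exponential back off retry loop (NOT the
// library's actual code). Retries `fn` with doubling delays until it
// succeeds or `timeout` milliseconds have elapsed.
const retryWithBackoff = async (fn, { timeout = 10000, initialDelay = 100 } = {}) => {
  const start = Date.now()
  let delay = initialDelay
  for (;;) {
    try {
      return await fn()
    } catch (err) {
      // Give up when the next wait would exceed the timeout window
      if (Date.now() - start + delay > timeout)
        throw err
      await new Promise(resolve => setTimeout(resolve, delay))
      delay *= 2 // exponential back off
    }
  }
}

// Usage: a flaky operation that fails twice before succeeding
let attempts = 0
retryWithBackoff(() => {
  attempts++
  if (attempts < 3) return Promise.reject(new Error('socket hang up'))
  return Promise.resolve('ok')
}, { initialDelay: 10 }).then(res => console.log(`Succeeded after ${attempts} attempts: ${res}`))
```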
# Using An External OAuth2 Token

If you've used the 3rd method to create a client (i.e., 3. Using a ProjectId), then every method requires an explicit OAuth2 token:

```js
storage.list({ token }).then(console.log)
```

All methods accept an optional options object as their last argument; that's where the token is passed.
# Full API Doc
## Storage API

`storage.get(filePath[, options]): <Promise>`

Gets an object located under the `filePath` path in a bucket.
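Note that `filePath` bundles the bucket name and the object path into one string (e.g., `'your-bucket/a-path/filename.json'`): the first path segment is the bucket, the rest is the object's path. A hypothetical helper (for illustration only, not part of the API) showing the split:

```javascript
// Illustration only: how a `filePath` such as 'your-bucket/a-path/filename.json'
// decomposes into a bucket name and an object path.
const parseFilePath = filePath => {
  const [bucket, ...rest] = filePath.split('/')
  return { bucket, file: rest.join('/') }
}

console.log(parseFilePath('your-bucket/a-path/filename.json'))
// → { bucket: 'your-bucket', file: 'a-path/filename.json' }
```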