Google Cloud Bucket
Google Cloud Bucket is a Node.js package to add objects to a Google Cloud Storage bucket.
Install
npm i google-cloud-bucket --save
How To Use It
Prerequisite
Before using this package, you must first:
- Have a Google Cloud Account.
- Have a bucket in that Google Cloud Account.
- Have a Service Account set up with the following roles:
  - roles/storage.objectCreator
  - roles/storage.objectAdmin (only if you want to update access to objects or create/delete buckets)
  - roles/storage.admin (only if you want to update access to an entire bucket)
- Get the JSON keys file for that Service Account.
- Save that JSON key into a service-account.json file. Make sure it is located under a path that is accessible to your app (usually the root folder).
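If you want to confirm the key file is wired up correctly before going further, here is a minimal sanity-check sketch. It only assumes the standard fields found in every Google service account key JSON file; the check itself is not part of this package:
const { join } = require('path')
const key = require(join(__dirname, './service-account.json'))

// These fields are present in every Google service account key JSON file.
const requiredFields = ['project_id', 'client_email', 'private_key']
requiredFields.forEach(field => {
	if (!key[field])
		throw new Error(`service-account.json is missing the '${field}' field.`)
})
console.log(`Using service account ${key.client_email} for project ${key.project_id}.`)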
Show Me The Code
Basics
const { join } = require('path')
const { client } = require('google-cloud-bucket')

const storage = client.new({
	jsonKeyFile: join(__dirname, './service-account.json')
})

const someObject = {
	firstname: 'Nicolas',
	lastname: 'Dao',
	company: 'Neap Pty Ltd',
	city: 'Sydney'
}

// Creating a bucket (the name must be globally unique)
storage.bucket('your-globally-unique-bucket-name').create()
	.then(data => console.log(data))

// Creating a bucket in a specific location
storage.bucket('your-globally-unique-bucket-name').create({ location: 'australia-southeast1' })
	.then(data => console.log(data))

// Deleting a bucket
storage.bucket('your-globally-unique-bucket-name').delete()
	.then(data => console.log(data))

// Adding a JSON object to a bucket, then reading it back
storage.insert(someObject, 'your-bucket/a-path/filename.json')
	.then(() => storage.get('your-bucket/a-path/filename.json'))
	.then(res => console.log(JSON.stringify(res, null, ' ')))

// Adding HTML content
const html = `
<!doctype html>
<html>
	<body>
		<h1>Hello Giiiiirls</h1>
	</body>
</html>`
storage.insert(html, 'your-bucket/a-path/index.html')

// Adding an image ('imgBuffer' is a Buffer, e.g. loaded with fs.readFileSync)
storage.insert(imgBuffer, 'your-bucket/a-path/image.jpg')

// Reading objects back
storage.get('your-bucket/a-path/filename.json').then(obj => console.log(obj))
storage.get('your-bucket/a-path/index.html').then(htmlString => console.log(htmlString))
storage.get('your-bucket/a-path/image.jpg').then(imgBuffer => console.log(imgBuffer))

// Downloading a file to the local filesystem
storage.get('your-bucket/a-path/image.jpg', { dst: 'some-path/image.jpg' })
	.then(() => console.log(`Image successfully downloaded.`))

// Piping the downloaded content into a custom stream reader
const { Writable } = require('stream')
const customReader = new Writable({
	write(chunk, encoding, callback) {
		console.log('Hello chunk of image')
		callback()
	}
})
storage.get('your-bucket/a-path/image.jpg', { streamReader: customReader })
	.then(() => console.log(`Image successfully downloaded.`))

// Checking whether a file exists
storage.exists('your-bucket/a-path/image.jpg')
	.then(fileExists => fileExists ? console.log('File exists.') : console.log('File does not exist.'))

// Listing all the files under a path
storage.list('your-bucket/a-path/')
	.then(files => console.log(files))
Bucket API
The examples above demonstrate how to insert and query objects using their full storage path. We've also included a variant of those APIs that is focused on the bucket:
// Inserting an object (these two calls are equivalent)
storage.insert(someObject, 'your-bucket/a-path/filename.json')
storage.bucket('your-bucket').object('a-path/filename.json').insert(someObject)

// Getting an object (these two calls are equivalent)
storage.get('your-bucket/a-path/filename.json').then(obj => console.log(obj))
storage.bucket('your-bucket').object('a-path/filename.json').get().then(obj => console.log(obj))

// Checking whether a file exists (these two calls are equivalent)
storage.exists('your-bucket/a-path/image.jpg')
	.then(fileExists => fileExists ? console.log('File exists.') : console.log('File does not exist.'))
storage.bucket('your-bucket').object('a-path/image.jpg').exists()
	.then(fileExists => fileExists ? console.log('File exists.') : console.log('File does not exist.'))

// Listing files under a path (these two calls are equivalent)
storage.list('your-bucket/a-path/')
	.then(files => console.log(files))
storage.bucket('your-bucket').object('a-path/').list()
	.then(files => console.log(files))
Buckets & Files Configuration
Publicly Readable Config
This allows you to make any file publicly readable by anybody on the web. That's useful if you want to host a website or publish data (e.g., an RSS feed).
Once your bucket is publicly readable, everyone can access it at this url: https://storage.googleapis.com/your-bucket/some-path/index.html
WARNING: If that bucket hosts files that should be accessible cross-domain (e.g., an RSS feed), don't forget to also set up CORS (see the next section, Configuring CORS On a Bucket).
const bucket = storage.bucket('your-bucket')
// Checking whether the bucket is publicly readable
bucket.isPublic().then(isPublic => isPublic ? console.log(`Bucket '${bucket.name}' is public`) : console.log(`Bucket '${bucket.name}' is not public`))

// Making the entire bucket publicly readable
bucket.addPublicAccess()
	.then(({ publicUri }) => console.log(`Your web page is publicly available at: ${publicUri}`))

// Removing public access from the bucket
bucket.removePublicAccess()

// Making a single object publicly readable, then removing that access
bucket.object('a-path/private.html').addPublicAccess()
	.then(({ publicUri }) => console.log(`Your web page is publicly available at: ${publicUri}`))
bucket.object('a-path/private.html').removePublicAccess()
Making A Single File Publicly Readable At Creation Time
It is also possible to make a single file publicly readable in a single command when the file is created:
storage.insert(html, 'your-bucket/a-path/index.html', { public: true })
.then(({ publicUri }) => console.log(`Your web page is publicly available at: ${publicUri}`))
Once your file is publicly readable, everyone can access it at this url: https://storage.googleapis.com/your-bucket/a-path/index.html
WARNING: If that bucket hosts files that should be accessible cross-domain (e.g., an RSS feed), don't forget to also set up CORS (see the next section, Configuring CORS On a Bucket).
Configuring CORS On a Bucket
If your files are publicly readable on the web, they might not be accessible when referenced from other websites. To enable other websites to access your files, you will have to configure CORS on your bucket:
bucket.cors.setup({
origin: ['*'],
method: ['GET', 'OPTIONS', 'HEAD', 'POST'],
responseHeader: ['Authorization', 'Origin', 'X-Requested-With', 'Content-Type', 'Accept'],
maxAgeSeconds: 3600
})
.then(() => console.log(`CORS successfully set up on your bucket.`))
If you want to check if CORS has already been set up on a bucket:
bucket.cors.exists().then(yes => yes
? console.log(`CORS already set up on bucket '${bucket.name}'.`)
: console.log(`CORS not set up yet on bucket '${bucket.name}'.`))
You can also check if a specific CORS config exists:
bucket.cors.exists({
origin: ['*'],
method: ['GET', 'OPTIONS', 'HEAD', 'POST'],
responseHeader: ['Authorization', 'Origin', 'X-Requested-With', 'Content-Type', 'Accept'],
maxAgeSeconds: 3600
}).then(yes => yes
? console.log(`CORS already set up on bucket '${bucket.name}'.`)
: console.log(`CORS not set up yet on bucket '${bucket.name}'.`))
To remove CORS from a bucket:
bucket.cors.disable().then(() => console.log(`CORS successfully disabled on bucket '${bucket.name}'.`))
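As a quick illustration of why CORS matters here, once your bucket allows the requesting origin, a page served from another domain can fetch your publicly readable file directly in the browser. This is a sketch of browser-side code (not part of this package), and the URL is a placeholder reusing the bucket and path from the earlier examples:
fetch('https://storage.googleapis.com/your-bucket/a-path/filename.json')
	.then(res => res.json())
	.then(data => console.log(data))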
Avoiding Network Errors
Network errors (e.g., socket hang up, connect ECONNREFUSED) are a fact of life. To deal with those nondeterministic errors, this library uses a simple exponential back-off retry strategy, which will retry your read or write request for up to 10 seconds by default. You can increase that retry period as follows:
storage.exists('your-bucket/a-path/image.jpg', { timeout: 30000 })
storage.insert(someObject, 'your-bucket/a-path/filename.json', { timeout: 30000 })
storage.get('your-bucket/a-path/filename.json', { timeout: 30000 })
Annex
List Of All Google Cloud Platform Locations
Single Regions
Single regions are bucket locations indicating that your data are replicated across multiple servers within that single region. Though it is unlikely that you would lose your data because all servers fail, it is possible that a network failure makes that region inaccessible. In that case, your data would not be lost, but they would be unavailable for the duration of the outage. This type of storage is the cheapest.
Use this type of location if your data:
- Are not highly critical.
- Do not have to be quickly delivered wherever your clients are located (choosing us-west2 means that the data access will be fast for clients in Los Angeles, but slower for clients in Belgium).
- Are so big (multiple terabytes) that the storage cost is a primary concern.
If the above constraints are too strict for your use case, then you should probably use a multi-region location (see the bucket-creation sketch after the tables below).
Location | Description |
---|---|
northamerica-northeast1 | Canada - Montréal |
us-central1 | US - Iowa |
us-east1 | US - South Carolina |
us-east4 | US - Northern Virginia |
us-west1 | US - Oregon |
us-west2 | US - Los Angeles |
southamerica-east1 | South America - Brazil |
europe-north1 | Europe - Finland |
europe-west1 | Europe - Belgium |
europe-west2 | Europe - England |
europe-west3 | Europe - Germany |
europe-west4 | Europe - Netherlands |
asia-east1 | Asia - Taiwan |
asia-east2 | Asia - Hong Kong |
asia-northeast1 | Asia - Japan |
asia-south1 | Asia - Mumbai |
asia-southeast1 | Asia - Singapore |
australia-southeast1 | Australia - Sydney |
Multi Regions
Multi regions are bucket locations where your data are not only replicated across multiple servers in the same region, but also across multiple regions (e.g., asia replicates your data across Taiwan, Hong Kong, Japan, Mumbai, Singapore, and Australia). That means that your data are:
- Quickly accessible wherever your clients are in that continent.
- Highly available. Even if a region goes down, the others will carry on serving your data.
- A bit more expensive to store.
Location | Description |
---|---|
asia | Asia |
us | US |
eu | Europe |
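As a recap, here is a small sketch showing how to pick one of these locations at bucket-creation time, using the create option from the Basics section (the bucket names below are placeholders):
// Single region: data replicated across servers in Sydney only.
storage.bucket('some-sydney-bucket').create({ location: 'australia-southeast1' })
	.then(() => console.log('Single-region bucket created.'))

// Multi region: data replicated across the Asia locations listed above.
storage.bucket('some-asia-bucket').create({ location: 'asia' })
	.then(() => console.log('Multi-region bucket created.'))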
This Is What We're Up To
We are Neap, an Australian Technology consultancy powering the startup ecosystem in Sydney. We simply love building Tech and also meeting new people, so don't hesitate to connect with us at https://neap.co.
Our other open-sourced projects:
GraphQL
- graphql-serverless: GraphQL (incl. a GraphiQL interface) middleware for webfunc.
- schemaglue: Naturally breaks down your monolithic graphql schema into bits and pieces and then glues them back together.
- graphql-s2s: Add GraphQL Schema support for type inheritance, generic typing, metadata decoration. Transpile the enriched GraphQL string schema into the standard string schema understood by graphql.js and the Apollo server client.
- graphql-authorize: Authorization middleware for graphql-serverless. Add inline authorization straight into your GraphQl schema to restrict access to certain fields based on your user's rights.
React & React Native
Tools
License
Copyright (c) 2018, Neap Pty Ltd.
All rights reserved.
Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
- Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
- Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
- Neither the name of Neap Pty Ltd nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL NEAP PTY LTD BE LIABLE FOR ANY
DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.