google-cloud-bigquery
Google Cloud BigQuery is a Node.js package to maintain BigQuery tables, either explicitly or from data stored in Google Cloud Storage (including automatically updating the tables' schema).
npm i google-cloud-bigquery --save
Before using this package, you must first:

- Have a Google Cloud Account.
- Have both a BigQuery DB and a Bucket in the same region (the bucket is only needed if you wish to maintain the BigQuery schema using data stored in Google Cloud Storage). As of December 2018, BigQuery is only supported in a limited set of locations.
- Have a Service Account set up with the following 2 roles:
  - roles/bigquery.admin
  - roles/storage.objectAdmin (only needed if you wish to maintain the BigQuery schema using data stored in Google Cloud Storage)
- Get the JSON keys file for that Service Account.
- Save that JSON key into a service-account.json file. Make sure it is located under a path that is accessible to your app (usually the root folder).
const { join } = require('path')
const { client } = require('google-cloud-bigquery')

const bigQuery = client.new({ jsonKeyFile: join(__dirname, './service-account.json') })

// Assumes that 'your-dataset-id' already exists
const db = bigQuery.db.get('your-dataset-id')
const userTbl = db.table('user')

userTbl.exists()
  .then(yes => yes
    ? console.log(`Table '${userTbl.name}' already exists in DB '${db.name}'`)
    : userTbl.create.new({
      schema: {
        id: 'integer',
        username: 'string',
        friends: [{
          id: 'integer',
          username: 'string',
          score: 'float'
        }],
        country: {
          code: 'string',
          name: 'string'
        },
        married: 'boolean',
        tags: ['string'],
        inserted_date: 'timestamp'
      }
    }).then(() => console.log(`Table '${userTbl.name}' successfully added to DB '${db.name}'`)))
userTbl.insert.values({ data: [{
  id: 1,
  username: 'Nicolas',
  inserted_date: new Date()
}, {
  id: 2,
  username: 'Brendan',
  country: {
    code: 'AU',
    name: 'Australia'
  },
  friends: [{
    id: 1,
    username: 'Nicolas',
    score: 0.87
  }, {
    id: 3,
    username: 'Boris',
    score: 0.9
  }],
  inserted_date: new Date()
}, {
  id: '3', // notice the loose typing: see the section on BigQuery casting below
  username: 'Boris',
  tags: ['admin', 1], // mixed types: see the section on BigQuery casting below
  inserted_date: Date.now()/1000 // timestamps can also be passed as epoch seconds
}]
})
Notice that the data input accepts both a single object and an array of objects. Though BigQuery can ingest up to 10,000 rows per request and 100,000 rows per second, it is recommended to keep the maximum number of rows per request to 500. You can read more about the quotas and limits at https://cloud.google.com/bigquery/quotas#streaming_inserts.

To prevent inserting more than 500 rows per request, you can either code it yourself (a sketch follows the example below), or rely on our own implementation using the safeMode flag as follows:
userTbl.insert.values({ data: lotsOfUsers, safeMode: true })
.then(() => console.log(`All users inserted`))
This safeMode flag will check whether there are fewer than 500 items in the lotsOfUsers array. If there are more than 500 items, the array is broken down into batches of 500 items, which are then inserted sequentially. That means that if you're inserting 5,000 users, there will be 10 sequential requests of 500 users each.
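If you'd rather code the batching yourself, here is a minimal sketch (the chunk and insertInBatches helpers below are hypothetical, not part of this package):

// Hypothetical helper (not part of this package): splits an array into
// batches of at most 'size' items.
const chunk = (arr, size) => arr.reduce((batches, item, i) => {
  if (i % size === 0)
    batches.push([])
  batches[batches.length - 1].push(item)
  return batches
}, [])

// Inserts each batch sequentially, similar to what 'safeMode' does under the hood.
const insertInBatches = (tbl, rows, size = 500) => chunk(rows, size).reduce(
  (job, batch) => job.then(() => tbl.insert.values({ data: batch })),
  Promise.resolve(null))

insertInBatches(userTbl, lotsOfUsers)
  .then(() => console.log(`All users inserted`))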
db.query.execute({
  sql: `select * from ${db.name}.${userTbl.name} where id = @id`,
  params: { id: 2 }
})
  .then(({ data }) => console.log(JSON.stringify(data, null, ' ')))
// Query Output
// ============
//
// [
// {
// "id": 2,
// "username": "Brendan",
// "friends": [
// {
// "id": 1,
// "username": "Nicolas",
// "score": 0.87
// },
// {
// "id": 3,
// "username": "Boris",
// "score": 0.9
// }
// ],
// "country": {
// "code": "AU",
// "name": "Australia"
// },
// "married": null,
// "tags": [],
// "inserted_date": "2018-11-14T03:17:16.830Z"
// }
// ]
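The params object accepts any number of named parameters. For instance, a hypothetical aggregation over the same table could look as follows (standard BigQuery SQL, assuming the schema created above):

db.query.execute({
  sql: `select country.code as country_code, count(*) as user_count
    from ${db.name}.${userTbl.name}
    where id > @minId
    group by country.code`,
  params: { minId: 0 }
})
  .then(({ data }) => console.log(JSON.stringify(data, null, ' ')))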
With BigQuery, only 2 types of schema updates are possible:

- Adding new fields
- Relaxing a field's mode from REQUIRED to NULLABLE

The second type of update is not useful here as this project always creates nullable fields. The following example shows how to perform a schema update if the local schema is different from the current BigQuery schema:
// Let's add a new 'deleted_date' field to our local schema
const newSchema = {
  id: 'integer',
  username: 'string',
  friends: [{
    id: 'integer',
    username: 'string',
    score: 'float'
  }],
  country: {
    code: 'string',
    name: 'string'
  },
  married: 'boolean',
  tags: ['string'],
  inserted_date: 'timestamp',
  deleted_date: 'timestamp'
}
userTbl.schema.isDiff(newSchema)
  .then(yes => yes
    ? Promise.resolve(console.log(`Schema changes detected. Updating now...`))
      .then(() => userTbl.schema.update(newSchema))
      .then(() => console.log(`Schema successfully updated.`))
    : console.log(`No schema updates found`)
  )
BigQuery's casting capabilities are quite limited. When a value does not fit the table's type, that row will either crash the entire insert or be completely ignored (we use the latter setting). To make sure that as much data as possible is inserted, we've added an option called forcedSchema in the db.table('some-table').insert.values API:
userTbl.insert.values({
  data: {
    id: '123.34',
    username: { hello: 'world' },
    inserted_date: new Date(2018,10,14)
  },
  forcedSchema: {
    id: 'integer',
    username: 'string',
    inserted_date: 'timestamp'
  }
})
Under the hood, this code will transform the data payload to the following:
{
id: 123,
username: 'Object',
inserted_date: '2018-11-13T13:00:00.000Z'
}
This object is guaranteed to comply with the schema, which in turn guarantees that all the data is inserted.
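For illustration only, the coercion above behaves roughly like the following sketch (this is not the package's actual implementation):

// Illustrative sketch of schema-driven coercion. Not the package's actual code.
const coerce = (val, type) => {
  if (val === null || val === undefined)
    return null
  switch (type) {
  case 'integer':
    return Math.round(Number(val)) // '123.34' -> 123
  case 'string':
    return typeof val === 'object' ? val.constructor.name : String(val) // { hello: 'world' } -> 'Object'
  case 'timestamp':
    return new Date(val).toISOString() // Date -> '2018-11-13T13:00:00.000Z' (timezone dependent)
  default:
    return val
  }
}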
You can use bigQuery.job.get to check the status of a job. The signature of that API is as follows:

bigQuery.job.get({ projectId: 'your-project-id', location: 'asia-northeast1', jobId: 'a-job-id' })
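Building on that signature, a job could be polled until completion along these lines (the status.state field is an assumption based on the BigQuery job resource format and may differ in this package):

// Polls a job every 'delayMs' milliseconds until it reports a 'DONE' state.
// The 'status.state' field is assumed from the BigQuery job resource format.
const waitForJob = (jobInfo, delayMs = 2000) => bigQuery.job.get(jobInfo)
  .then(job => job && job.status && job.status.state === 'DONE'
    ? job
    : new Promise(resolve => setTimeout(resolve, delayMs)).then(() => waitForJob(jobInfo, delayMs)))

waitForJob({ projectId: 'your-project-id', location: 'asia-northeast1', jobId: 'a-job-id' })
  .then(() => console.log(`Job done`))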
The code snippet below shows how you can create new tables if they don't exist yet, and update their schema if it has changed compared with the local version.
const { join } = require('path')
const { client } = require('google-cloud-bigquery')

// The line below assumes you have a file 'schema.js' located under 'path-to-your-schema-file'
// organised in a way where the 'schema' object below is structured as follows:
//   schema.table_01 -> the schema of 'table_01'
//   schema.table_02 -> the schema of 'table_02'
const schema = require('path-to-your-schema-file/schema.js')

const bigQuery = client.new({ jsonKeyFile: join(__dirname, './service-account.json') })
const db = bigQuery.db.get('your-dataset-id')
const tbl_01 = db.table('table_01')
const tbl_02 = db.table('table_02')

const maintainTablesScript = () => {
  console.log('\nChecking for BigQuery tables updates...')
  return [tbl_01, tbl_02]
    .map(table => ({ table, schema: schema[table.name] }))
    .reduce((job, { table, schema }) => job
      .then(() => table.exists()
        .then(tableExists => tableExists
          ? console.log(` - Table '${table.name}': Table already exists in DB '${db.name}'.`)
          : Promise.resolve(console.log(` - Table '${table.name}': Table not found. Creating it now...`))
            .then(() => table.create.new({ schema }))
            .then(() => console.log(` - Table '${table.name}': Table successfully created.`))
        )
        .then(() => table.schema.isDiff(schema))
        .then(schemaHasChanged => schemaHasChanged
          ? Promise.resolve(console.log(` - Table '${table.name}': Schema changes detected in table. Updating now...`))
            .then(() => table.schema.update(schema))
            .then(() => console.log(` - Table '${table.name}': Schema successfully updated.`))
          : console.log(` - Table '${table.name}': No schema updates found.`)
        )
      )
      .catch(err => {
        console.log(` - Table '${table.name}': Oops... An error occurred: ${err.message}`)
      }),
    Promise.resolve(null))
}
maintainTablesScript()
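A natural place to run this script is at boot time, before your app starts serving traffic. A minimal sketch, assuming an Express server:

const express = require('express')
const app = express()

// Make sure all BigQuery tables exist and are up-to-date before accepting traffic.
maintainTablesScript()
  .then(() => app.listen(3000, () => console.log('Server ready on port 3000')))
  .catch(err => {
    console.error(`Could not maintain BigQuery tables: ${err.message}`)
    process.exit(1)
  })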
We are Neap, an Australian technology consultancy powering the startup ecosystem in Sydney. We simply love building tech and meeting new people, so don't hesitate to connect with us at https://neap.co.
Copyright (c) 2018, Neap Pty Ltd. All rights reserved.
Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL NEAP PTY LTD BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.