What is @google-cloud/bigquery?
@google-cloud/bigquery is the official Node.js client library for Google BigQuery, a fully managed, serverless data warehouse that enables scalable analysis over petabytes of data. The package lets you run queries, manage datasets, tables, and jobs, and load data into BigQuery tables.
What are @google-cloud/bigquery's main functionalities?
Running Queries
This feature allows you to run SQL queries against your BigQuery datasets. The code sample demonstrates how to run a simple query to select names and ages from a table where the age is greater than 30.
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function query() {
  const query = 'SELECT name, age FROM `my-dataset.my-table` WHERE age > 30';
  const options = { query: query, location: 'US' };
  const [rows] = await bigquery.query(options);
  console.log('Rows:', rows);
}

query();
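When the filter value is not known ahead of time, the same query method also accepts named query parameters, which avoids building SQL strings by hand. A minimal sketch, reusing the placeholder dataset and table from above; the @min_age parameter name is illustrative:
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function parameterizedQuery() {
  const options = {
    // @min_age is bound from the params object below instead of being concatenated into the SQL.
    query: 'SELECT name, age FROM `my-dataset.my-table` WHERE age > @min_age',
    params: { min_age: 30 },
    location: 'US',
  };
  const [rows] = await bigquery.query(options);
  console.log('Rows:', rows);
}

parameterizedQuery();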
Managing Datasets
This feature allows you to create and manage datasets in BigQuery. The code sample demonstrates how to create a new dataset named 'my_new_dataset'.
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function createDataset() {
  const [dataset] = await bigquery.createDataset('my_new_dataset');
  console.log(`Dataset ${dataset.id} created.`);
}

createDataset();
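Datasets can also be listed and removed through the same client. A brief sketch, reusing the placeholder dataset name from above; the force option also removes any tables the dataset still contains:
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function listAndDeleteDatasets() {
  // List every dataset in the current project.
  const [datasets] = await bigquery.getDatasets();
  datasets.forEach(dataset => console.log(dataset.id));

  // Delete the example dataset, including any tables inside it.
  await bigquery.dataset('my_new_dataset').delete({ force: true });
  console.log('Dataset my_new_dataset deleted.');
}

listAndDeleteDatasets();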
Managing Tables
This feature allows you to create and manage tables within datasets. The code sample demonstrates how to create a new table named 'my_new_table' within the 'my_new_dataset' dataset.
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function createTable() {
  const dataset = bigquery.dataset('my_new_dataset');
  const [table] = await dataset.createTable('my_new_table');
  console.log(`Table ${table.id} created.`);
}

createTable();
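Creating a table without a schema is rarely useful on its own; createTable also accepts an options object that defines one. A short sketch, assuming the same placeholder dataset and the name/age columns used in the query example:
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function createTableWithSchema() {
  const options = {
    // The schema can be given as an array of field definitions (or as a comma-separated string).
    schema: [
      { name: 'name', type: 'STRING' },
      { name: 'age', type: 'INTEGER' },
    ],
  };
  const [table] = await bigquery
    .dataset('my_new_dataset')
    .createTable('my_new_table', options);
  console.log(`Table ${table.id} created.`);
}

createTableWithSchema();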
Loading Data
This feature allows you to load data from various sources into BigQuery tables. The code sample demonstrates how to load data from a local CSV file into a table named 'my_new_table' within the 'my_new_dataset' dataset.
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function loadData() {
  const dataset = bigquery.dataset('my_new_dataset');
  const table = dataset.table('my_new_table');
  const [job] = await table.load('local-file.csv');
  console.log(`Job ${job.id} completed.`);
}

loadData();
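Load jobs also take an optional metadata object describing the source file and how to write it. A minimal sketch for a CSV with a header row; the file name is a placeholder and the option values are just one reasonable configuration:
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function loadCsvWithOptions() {
  const metadata = {
    sourceFormat: 'CSV',
    skipLeadingRows: 1,               // skip the header row
    autodetect: true,                 // infer the schema from the file contents
    writeDisposition: 'WRITE_APPEND', // append to the table instead of replacing it
  };
  const [job] = await bigquery
    .dataset('my_new_dataset')
    .table('my_new_table')
    .load('local-file.csv', metadata);
  console.log(`Job ${job.id} completed.`);
}

loadCsvWithOptions();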
Other packages similar to @google-cloud/bigquery
aws-sdk
The aws-sdk package is the official AWS SDK for JavaScript, which lets you interact with a wide range of AWS services, including Amazon Redshift, a data warehouse service comparable to Google BigQuery. While aws-sdk covers far more than data warehousing, it can be used for tasks similar to those @google-cloud/bigquery provides for BigQuery.
azure-sdk-for-js
The azure-sdk-for-js package is the official Azure SDK for JavaScript, which allows you to interact with various Azure services, including Azure Synapse Analytics, a data warehousing service similar to Google BigQuery. Like aws-sdk, azure-sdk-for-js offers a wide range of functionalities across different Azure services, including data warehousing capabilities.
snowflake-sdk
The snowflake-sdk package is the official Node.js driver for Snowflake, a cloud-based data warehousing service. It provides functionalities to connect to Snowflake, run queries, and manage data, similar to what @google-cloud/bigquery offers for Google BigQuery.
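For comparison, a minimal snowflake-sdk sketch that connects and runs an equivalent query; the account identifier, credentials, and table name are placeholders:
const snowflake = require('snowflake-sdk');

const connection = snowflake.createConnection({
  account: 'my_account',   // placeholder Snowflake account identifier
  username: 'my_user',     // placeholder credentials
  password: 'my_password',
});

connection.connect(err => {
  if (err) throw err;
  connection.execute({
    sqlText: 'SELECT name, age FROM my_table WHERE age > 30',
    complete: (err, stmt, rows) => {
      if (err) throw err;
      console.log('Rows:', rows);
    },
  });
});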
@google-cloud/bigquery
Google BigQuery Client Library for Node.js
Looking for more Google APIs than just BigQuery? You might want to check out google-cloud.
$ npm install --save @google-cloud/bigquery
var bigquery = require('@google-cloud/bigquery')({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Access an existing dataset and table.
var schoolsDataset = bigquery.dataset('schools');
var schoolsTable = schoolsDataset.table('schoolsData');

// Import data from a local file into the table.
schoolsTable.import('/local/file.json', function(err, job) {});

// Get the results of a previously-run query job, using a callback...
var job = bigquery.job('job-id');
job.getQueryResults(function(err, rows) {});

// ...or consume the same results as a readable object stream.
job.getQueryResults().on('data', function(row) {});
Authentication
It's incredibly easy to get authenticated and start using Google's APIs. You can set your credentials on a global basis as well as on a per-API basis. See each individual API section below for how to authenticate on a per-API basis. This is useful if you want to use different accounts for different Google Cloud services.
On Google Compute Engine
If you are running this client on Google Compute Engine, we handle authentication for you with no configuration. You just need to make sure that when you set up the GCE instance, you add the correct scopes for the APIs you want to access.
var projectId = process.env.GCLOUD_PROJECT;

var bigQuery = require('@google-cloud/bigquery')({
  projectId: projectId
});
Elsewhere
If you are not running this client on Google Compute Engine, you need a Google Developers service account. To create a service account:
- Visit the Google Developers Console.
- Create a new project or click on an existing project.
- Navigate to APIs & auth > APIs section and turn on the following API (you may need to enable billing in order to use this service):
  - Google BigQuery API
- Navigate to APIs & auth > Credentials and then:
  - If you want to use a new service account, click on Create new Client ID and select Service account. After the account is created, you will be prompted to download the JSON key file that the library uses to authenticate your requests.
  - If you want to generate a new key for an existing service account, click on Generate new JSON key and download the JSON key file.
var projectId = process.env.GCLOUD_PROJECT;

var bigQuery = require('@google-cloud/bigquery')({
  projectId: projectId,

  // The path to your key file:
  keyFilename: '/path/to/keyfile.json'

  // Or, instead of the path, the contents of the key file:
  // credentials: require('./path/to/keyfile.json')
});