
firestore-to-bigquery-export
NPM package for copying and converting Cloud Firestore data to BigQuery.
Firestore is awesome. BigQuery is awesome. But transferring data from Firestore to BigQuery sucks. This package lets you plug and play your way out of config hell.
This package doesn't write anything to Firestore.
npm i firestore-to-bigquery-export
import bigExport from 'firestore-to-bigquery-export'
// or
const bigExport = require('firestore-to-bigquery-export')
// then
const GCPSA = require('./Your-Service-Account-File.json')
bigExport.setBigQueryConfig(GCPSA)
bigExport.setFirebaseConfig(GCPSA)
bigExport.setBigQueryConfig(
serviceAccountFile // JSON
)
bigExport.setFirebaseConfig(
serviceAccountFile // JSON
)
bigExport.createBigQueryTable(
datasetID, // String
collectionName, // String
verbose // boolean
)
// returns Promise<Array>
bigExport.copyToBigQuery(
datasetID, // String
collectionName, // String
snapshot // firebase.firestore.QuerySnapshot
)
// returns Promise<number>
bigExport.deleteBigQueryTable(
datasetID, // String
tableName // String
)
// returns Promise<Array>
/* Create table 'accounts' in BigQuery dataset 'firestore'. You have to create the dataset beforehand.
 * The given table name has to match the Firestore collection name.
 * The table schema will be autogenerated from the datatypes found in the collection's documents.
 */
await bigExport.createBigQueryTable('firestore', 'accounts')
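The dataset itself isn't created by this package. If you don't have one yet, a minimal sketch using the official @google-cloud/bigquery client (a separate dependency, not part of this package) could look like this:
const { BigQuery } = require('@google-cloud/bigquery')
const bigquery = new BigQuery({ keyFilename: './Your-Service-Account-File.json' })
// Create the 'firestore' dataset that the exported tables will live in.
await bigquery.createDataset('firestore')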
Then, you can transport your data:
/* Copy and convert all documents in the given Firestore collection snapshot.
 * Each document is inserted as a row in the table with the same name as the collection, in the dataset named 'firestore'.
 * Cells (document properties) that don't match the table schema will be rejected.
 */
const snapshot = await firestore.collection('payments').get()
const result = await bigExport.copyToBigQuery('firestore', 'payments', snapshot)
console.log('Copied ' + result + ' documents to BigQuery.')
/*
 * You can copy multiple collections like this, one after another.
 * If you get error messages, you should probably copy fewer collections at a time.
 */
const collectionNames = ['payments', 'profiles', 'ratings', 'users']
for (const name of collectionNames) {
const snapshot = await firestore.collection(name).get()
await bigExport.copyToBigQuery('firestore', name, snapshot)
}
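If a large batch fails partway, a simple refinement (a sketch, not part of this package's API) is to wrap each collection in its own try/catch so one failure doesn't abort the rest:
// Isolate failures per collection so one bad collection
// doesn't stop the remaining copies.
for (const name of collectionNames) {
  try {
    const snapshot = await firestore.collection(name).get()
    const rows = await bigExport.copyToBigQuery('firestore', name, snapshot)
    console.log(`Copied ${rows} documents from '${name}'.`)
  } catch (err) {
    console.error(`Failed to copy '${name}':`, err.message)
  }
}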
After that, you may want to refresh your data. For the time being, the quick and dirty way is to delete your tables and make new ones:
// Deleting the given BigQuery table.
await bigExport.deleteBigQueryTable('firestore', 'accounts')
To refresh, run deleteBigQueryTable(), then createBigQueryTable(), and then copyToBigQuery(). Run createBigQueryTable() again whenever your collection's schema has changed, so the table is recreated with the new schema. You can test the functions locally with firebase serve --only functions. Found a bug? Please use the issue tracker.
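Put together, a refresh might look like this (a minimal sketch using the functions documented above, with 'firestore' and 'accounts' as the example dataset and collection names):
// Quick-and-dirty refresh: drop the table, recreate it with the
// current schema, then copy the collection over again.
await bigExport.deleteBigQueryTable('firestore', 'accounts')
await bigExport.createBigQueryTable('firestore', 'accounts')
const snapshot = await firestore.collection('accounts').get()
await bigExport.copyToBigQuery('firestore', 'accounts', snapshot)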