
Firestore to BigQuery export

NPM package for copying and converting Cloud Firestore data to BigQuery.


Firestore is awesome. BigQuery is awesome. But transferring data from Firestore to BigQuery sucks. This package lets you plug and play your way out of config hell.

  • Create a BigQuery dataset with tables corresponding to your Firestore collections.
  • Table schemas are automatically generated based on your document property data types.
  • Convert and copy your Firestore collections to BigQuery.

This package doesn't write anything to Firestore.


Installation

npm i firestore-to-bigquery-export

import bigExport from 'firestore-to-bigquery-export'

// or

const bigExport = require('firestore-to-bigquery-export')

// then

const GCPSA = require('./Your-Service-Account-File.json')
bigExport.setBigQueryConfig(GCPSA)
bigExport.setFirebaseConfig(GCPSA)

How to

API

bigExport.setBigQueryConfig(
  serviceAccountFile // JSON
)
bigExport.setFirebaseConfig(
  serviceAccountFile // JSON
)
bigExport.createBigQueryTable(
  datasetID, // String
  collectionName, // String
  verbose // Boolean
)
// returns Promise<Array>
bigExport.copyToBigQuery(
  datasetID, // String
  collectionName, // String
  snapshot // firebase.firestore.QuerySnapshot
)
// returns Promise<number>
bigExport.deleteBigQueryTable(
  datasetID, // String
  tableName // String
)
// returns Promise<Array>

Examples

/* Create the table 'accounts' in the BigQuery dataset 'firestore'. You have to create the dataset beforehand.
 * The given table name has to match the Firestore collection name.
 * The table schema will be autogenerated from the data types found in the collection's documents.
 */
await bigExport.createBigQueryTable('firestore', 'accounts')

Then, you can transport your data:

/* Copy and convert all documents in the given Firestore collection snapshot.
 * Each document is inserted as a row in the table with the same name as the collection, in the dataset named 'firestore'.
 * Cells (document properties) that don't match the table schema will be rejected.
 */
const snapshot = await firestore.collection('payments').get()
const result = await bigExport.copyToBigQuery('firestore', 'payments', snapshot)
console.log('Copied ' + result + ' documents to BigQuery.')

/*
 * You can copy multiple collections in sequence, like this.
 * If you get error messages, you should probably copy fewer collections at a time.
 */
const collectionNames = ['payments', 'profiles', 'ratings', 'users']

for (const name of collectionNames) {
  const snapshot = await firestore.collection(name).get()
  await bigExport.copyToBigQuery('firestore', name, snapshot)
}

After that, you may want to refresh your data. For the time being, the quick and dirty way is to delete your tables and make new ones:

// Deleting the given BigQuery table.
await bigExport.deleteBigQueryTable('firestore', 'accounts')

Keep in mind

  • If even one property value in the collection is a FLOAT during schema generation, the column type will be set to FLOAT.
  • If all values are INTs, the column type will be set to INTEGER.
  • All columns will be NULLABLE.

Limitations

  • Your Firestore data model should be consistent. If the same property has different data types across documents in a collection, you'll get errors.
  • Patching existing BigQuery tables isn't supported (yet). To refresh a dataset, call deleteBigQueryTable(), then createBigQueryTable(), then copyToBigQuery().
  • Changed your Firestore data model? Delete the corresponding BigQuery table and run createBigQueryTable() to create a table with the new schema.
  • When running this package from a Cloud Function, the function may time out (Deadline Exceeded) if your Firestore is large. If that happens, copy fewer collections per invocation.
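One way to stay under the timeout is to spread the work across invocations, so each scheduled run copies only a few collections. A minimal sketch of that idea, assuming a trigger that processes one batch per run (the `chunk` helper and batch size are illustrative, not part of this package's API):

```javascript
// Split the list of collection names into small batches so each
// invocation only copies a few collections. Illustrative helper,
// not part of firestore-to-bigquery-export.
function chunk (items, size) {
  const batches = []
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size))
  }
  return batches
}

// Each invocation would then handle one batch, e.g. batches[invocationIndex]:
const batches = chunk(['payments', 'profiles', 'ratings', 'users'], 2)
// batches[0] = ['payments', 'profiles'], batches[1] = ['ratings', 'users']
```

Inside each invocation you would then loop over just that batch with copyToBigQuery(), as in the Examples section above.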

Issues

Please use the issue tracker.

To-do

  • Improve the handling of arrays.
  • Implement patching of tables.

firestore-to-bigquery-export's People

Contributors

dependabot[bot], johannes-berggren, keito5656


firestore-to-bigquery-export's Issues

firebase is not defined

firebase.collection('users').get()
^

ReferenceError: firebase is not defined
    at Object.<anonymous> (/var/www/Big-query/bigqu

Column data type set to INT even though documents with float property exists

When running createBigQueryTables:

If the first document in the collection contains a non-float number, the column data type for that property is set to INT, even if there are float values in later documents.

Solution:
Change getSchemaField so that if any float value is found, the column data type is always set to FLOAT, and INT is used only when no float values are found.
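The proposed fix amounts to inferring the column type from every document's value, not just the first. A sketch of that inference rule (the function name and shape are illustrative, not the package's actual getSchemaField):

```javascript
// Infer a BigQuery numeric column type from all observed values of a property.
// FLOAT wins if any value is fractional; INTEGER only if every value is whole.
// Illustrative sketch of the fix proposed above, not the package's code.
function inferNumberType (values) {
  const anyFloat = values.some(v => !Number.isInteger(v))
  return anyFloat ? 'FLOAT' : 'INTEGER'
}

inferNumberType([1, 2, 3])   // 'INTEGER'
inferNumberType([1, 2.5, 3]) // 'FLOAT' — a single float forces FLOAT
```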

How to run this from a cloud function?

Hi,

I would like to schedule the bigquery data import using a cloud scheduler triggering a cloud function.

The problem is that your library (and other solutions I've encountered) require me to specify service credentials. I do not know how to access the credentials that Google uses when running the cloud function.

I could attach the credentials to the runtime myself as part of a functions.config secret, but that seems like a very odd thing to do.

Do you have an idea how to approach this?

Request payload size exceeds the limit: 10485760 bytes

First of all, great effort in making such a useful plugin.

However, as the data grows, the plugin returns an error like this:

"Request payload size exceeds the limit: 10485760 bytes."

I'm running this in App Engine (on an F4 instance) instead of a Cloud Function.

refresh the lockfile to automatically remove the vulnerability introduced in firestore-to-bigquery-export

Hi, @Johannes-Berggren, I have reported a vulnerability issue in package google-gax.

As far as I am aware, the high-severity vulnerability CVE-2020-7768, detected in package @grpc/grpc-js (<1.1.8), is directly referenced by google-gax, on which your package firestore-to-bigquery-export transitively depends. As such, this vulnerability can also affect firestore-to-bigquery-export via the following path:
firestore-to-bigquery-export ➔ firebase-admin ➔ @google-cloud/firestore ➔ google-gax ➔ @grpc/grpc-js (vulnerable version)

Since google-gax has released a new patched version that resolves this issue (google-gax ➔ @grpc/grpc-js, fixed version), the vulnerability patch can be propagated into your project if you update your lockfile. The new dependency path would be:
firestore-to-bigquery-export ➔ firebase-admin ➔ @google-cloud/firestore ➔ google-gax ➔ @grpc/grpc-js (fixed version)

A warm tip. ^_^
