:py:mod:`tests.system.providers.google.cloud.gcs.example_firestore`
===================================================================
.. py:module:: tests.system.providers.google.cloud.gcs.example_firestore
.. autoapi-nested-parse::
Example Airflow DAG that shows interactions with Google Cloud Firestore.
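Below is a minimal sketch of the kind of task this DAG builds, assuming the ``CloudFirestoreExportDatabaseOperator`` from the Google provider package; the project, bucket, and collection names are placeholders rather than the example's actual values:

.. code-block:: python

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.firebase.operators.firestore import (
        CloudFirestoreExportDatabaseOperator,
    )

    with DAG(
        dag_id="example_google_firestore",
        start_date=datetime(2021, 1, 1),
        schedule=None,  # assumes Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        # Ask the Firestore export API to dump one collection into a GCS bucket.
        export_database_to_gcs = CloudFirestoreExportDatabaseOperator(
            task_id="export_database_to_gcs",
            project_id="my-firebase-project",  # placeholder
            body={
                "outputUriPrefix": "gs://my-export-bucket/namespace",  # placeholder
                "collectionIds": ["my-collection"],  # placeholder
            },
        )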
Prerequisites
=============
This example uses two Google Cloud projects:
* ``GCP_PROJECT_ID`` - contains a bucket and a Firestore database.
* ``G_FIRESTORE_PROJECT_ID`` - contains the data warehouse based on the BigQuery service.
Saving to the bucket should be possible from the ``G_FIRESTORE_PROJECT_ID`` project.
Reading from a bucket should be possible from the ``GCP_PROJECT_ID`` project.
The bucket and dataset should be located in the same region.
If you want to run this example, you must do the following (a setup sketch in Python follows the list):
1. Create a Google Cloud project and enable the BigQuery API.
2. Create a Firebase project.
3. Create a bucket in the same location as the Firebase project.
4. Grant the Firebase admin account permissions to manage BigQuery. This is required to create a dataset.
5. Create a bucket in the Firebase project.
6. Grant the Firebase admin account read/write access to the bucket from step 5.
7. Create a collection in the Firestore database.
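A minimal sketch of steps 3 to 6 using the ``google-cloud-storage`` client library; the project, bucket, and service-account names are assumed placeholders, and the same steps can be done in the Cloud Console instead:

.. code-block:: python

    from google.cloud import storage

    # Placeholder for the Firebase admin service account from step 4.
    FIREBASE_ADMIN_SA = "firebase-adminsdk@my-firebase-project.iam.gserviceaccount.com"

    client = storage.Client(project="my-gcp-project")  # placeholder project
    # Steps 3 and 5: create the bucket in the same location as the Firebase project.
    bucket = client.create_bucket("my-export-bucket", location="EU")  # placeholders

    # Step 6: give the Firebase admin account read/write access to the bucket.
    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append(
        {
            "role": "roles/storage.objectAdmin",
            "members": {f"serviceAccount:{FIREBASE_ADMIN_SA}"},
        }
    )
    bucket.set_iam_policy(policy)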
Module Contents
---------------
.. py:data:: ENV_ID
.. py:data:: DAG_ID
:annotation: = example_google_firestore
.. py:data:: GCP_PROJECT_ID
.. py:data:: FIRESTORE_PROJECT_ID
.. py:data:: BUCKET_NAME
.. py:data:: DATASET_NAME
.. py:data:: EXPORT_DESTINATION_URL
.. py:data:: EXPORT_PREFIX
.. py:data:: EXPORT_COLLECTION_ID
.. py:data:: DATASET_LOCATION
.. py:data:: create_bucket
.. py:data:: test_run
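The data values above are typically derived from environment variables in Airflow system tests. A plausible sketch of how the module might define them; the environment variable names and value formats are assumptions, not confirmed by this page:

.. code-block:: python

    import os

    ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")
    DAG_ID = "example_google_firestore"

    GCP_PROJECT_ID = os.environ.get("SYSTEM_TESTS_GCP_PROJECT", "default")
    FIRESTORE_PROJECT_ID = os.environ.get("G_FIRESTORE_PROJECT_ID", "default")

    BUCKET_NAME = f"bucket_{DAG_ID}_{ENV_ID}"
    DATASET_NAME = f"dataset_{DAG_ID}_{ENV_ID}"
    EXPORT_DESTINATION_URL = f"gs://{BUCKET_NAME}/namespace"
    EXPORT_PREFIX = f"gs://{BUCKET_NAME}"
    EXPORT_COLLECTION_ID = f"collection_{DAG_ID}_{ENV_ID}"
    DATASET_LOCATION = "EU"

    # In the common system-test layout, create_bucket would be a bucket-creation
    # task (e.g. GCSCreateBucketOperator) and test_run the pytest entry point
    # returned by get_test_run(dag); both are assumed here, not read from this page.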