This directory holds Terraform code to provision resources that `org.apache.beam.testinfra.pipelines.ReadDataflowApiWriteBigQuery` reads from and writes to.
The following table lists all provisioned resources, organized by GCP service, and the rationale for each.
| resource | reason |
|---|---|
| Eventarc Workflow | Intended to listen for Dataflow job status changes |
| Pub/Sub topic | Required by the Eventarc Workflow |
| Pub/Sub subscription | Intended as a source; subscribes to the Eventarc Workflow |
| Google Cloud Storage bucket | Intended for temporary storage |
| BigQuery datasets | Intended as a sink |
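For orientation, the sketch below shows roughly what the Pub/Sub pair looks like in Terraform. The resource and name values here are hypothetical; the module's own `.tf` files are the authoritative definitions.

```
# Illustrative sketch only: hypothetical names, not this module's actual config.
resource "google_pubsub_topic" "dataflow_status_changes" {
  name = "dataflow-status-changes"
}

# The subscription the pipeline reads from; it receives the events
# the Eventarc Workflow publishes to the topic above.
resource "google_pubsub_subscription" "dataflow_status_changes" {
  name  = "dataflow-status-changes"
  topic = google_pubsub_topic.dataflow_status_changes.name
}
```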
Follow the conventional Terraform workflow to apply this module. All commands below assume the working directory is `.test-infra/pipelines`.
This module uses a Google Cloud Storage bucket as its Terraform state backend.
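The bucket name is supplied via `-backend-config` at init time rather than hard-coded. A minimal sketch of what such a partial backend declaration looks like (the real block lives in this module's `.tf` files):

```
terraform {
  # Partial configuration: the bucket is provided by -backend-config at init.
  backend "gcs" {}
}
```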
Initialize the Terraform workspace for the apache-beam-testing project:

```
DIR=infrastructure/03.io/dataflow-to-bigquery
terraform -chdir=$DIR init -backend-config=apache-beam-testing.tfbackend
```
or for your own Google Cloud project:
```
DIR=infrastructure/03.io/dataflow-to-bigquery
terraform -chdir=$DIR init -backend-config=path/to/your/backend-config-file.tfbackend
```
where your `backend-config-file.tfbackend` contains:

```
bucket = <Google Cloud Storage Bucket Name>
```
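For example, assuming a hypothetical state bucket named `my-beam-terraform-state`, the file would contain the single quoted assignment:

```
# Hypothetical bucket name, for illustration only.
bucket = "my-beam-terraform-state"
```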
Apply the module, noting the `-var-file` flag referencing `common.tfvars`, which provides opinionated variable defaults.
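For illustration only, `common.tfvars` holds assignments of this shape; the variable names below are hypothetical, and the actual ones are defined in this module's Terraform source:

```
# Hypothetical variable names, for illustration; see this module's variables.
project = "my-gcp-project"
region  = "us-central1"
```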
For apache-beam-testing:
```
DIR=infrastructure/03.io/dataflow-to-bigquery
terraform -chdir=$DIR apply -var-file=common.tfvars -var-file=apache-beam-testing.tfvars
```
or for your own Google Cloud project:
```
DIR=infrastructure/03.io/dataflow-to-bigquery
terraform -chdir=$DIR apply -var-file=common.tfvars
```