.test-infra/pipelines/infrastructure/04.template/dataflow-to-bigquery/README.md

## Overview

This directory holds Terraform code that builds the org.apache.beam.testinfra.pipelines.ReadDataflowApiWriteBigQuery pipeline as a Dataflow Flex Template.

## Why Terraform?

As of this README's writing, the Google Cloud Terraform provider has no resource block for provisioning Dataflow Templates. Therefore, this solution uses the null_resource along with the local-exec provisioner.
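A minimal sketch of this pattern follows. The resource and variable names (`var.bucket`, `var.image`) are illustrative assumptions, not the module's actual contents; the real definitions live in template.tf and variables.tf.

```hcl
# Hypothetical sketch of a null_resource + local-exec template build.
resource "null_resource" "dataflow_flex_template" {
  # Rebuild the template whenever the metadata file changes.
  triggers = {
    template_json = filesha256("${path.module}/dataflow-template.json")
  }

  provisioner "local-exec" {
    command = <<-EOT
      gcloud dataflow flex-template build \
        "gs://${var.bucket}/templates/dataflow-to-bigquery.json" \
        --image "${var.image}" \
        --sdk-language JAVA \
        --metadata-file "${path.module}/dataflow-template.json"
    EOT
  }
}
```

Because null_resource tracks no real infrastructure, the triggers block is what ties re-provisioning to changes in the template spec.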

The benefit of using Terraform is that its workflow provides clean resource lookups, which are otherwise cumbersome when submitting the Dataflow job through other means, such as a Gradle command or bash script.

## Usage

Follow the conventional Terraform workflow to apply this module. The commands below assume the working directory is .test-infra/pipelines.

This module does not use a state backend.

Note the -var-file flag referencing common.tfvars, which provides opinionated variable defaults.
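For illustration, a var file of this kind might look like the following. These variable names and values are hypothetical; consult variables.tf and the actual common.tfvars and apache-beam-testing.tfvars for the real definitions.

```hcl
# Hypothetical var-file contents; the real files may differ.
project = "apache-beam-testing"
region  = "us-central1"
```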

For apache-beam-testing:

```
DIR=infrastructure/04.template/dataflow-to-bigquery
terraform -chdir=$DIR init
terraform -chdir=$DIR apply -var-file=common.tfvars -var-file=apache-beam-testing.tfvars
```

or for your own Google Cloud project:

```
DIR=infrastructure/04.template/dataflow-to-bigquery
terraform -chdir=$DIR init
terraform -chdir=$DIR apply -var-file=common.tfvars
```