This directory provisions the Google Cloud project environment required to run Dataflow pipelines.
The following table lists all provisioned resources and their rationale.
| Resource | Reason |
|---|---|
| API services | Required by Google Cloud before dependent resources can be provisioned |
| Dataflow Worker Service Account | Runs Dataflow workers as a dedicated service account instead of the default |
| Worker IAM Roles | Grants the worker service account only the roles it needs (principle of least privilege) |
| Artifact Registry Repository | Stores Dataflow template artifacts |
| Google Cloud Storage bucket | Provides a bucket for staging and other pipeline storage needs |
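The Worker Service Account and Worker IAM Roles entries can be illustrated with a minimal Terraform sketch. The account ID and role list below are assumptions for illustration, not the module's actual definitions; see the module's `.tf` files for the real bindings.

```hcl
# Hypothetical sketch: run Dataflow workers as a dedicated service account
# and grant it only the roles it needs. Role names are illustrative
# assumptions, except roles/dataflow.worker, which workers require.
resource "google_service_account" "dataflow_worker" {
  account_id   = "dataflow-worker" # assumed name
  display_name = "Dataflow Worker Service Account"
}

resource "google_project_iam_member" "worker_roles" {
  for_each = toset([
    "roles/dataflow.worker",     # minimum role for Dataflow workers
    "roles/storage.objectAdmin", # assumed: staging bucket access
  ])
  project = var.project
  role    = each.value
  member  = "serviceAccount:${google_service_account.dataflow_worker.email}"
}
```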
Follow the conventional Terraform workflow to apply this module. All commands below assume the working directory is `.test-infra/pipelines`.
This module uses a Google Cloud Storage bucket backend.
Initialize the terraform workspace. For the `apache-beam-testing` project:

```
DIR=infrastructure/01.setup
terraform -chdir=$DIR init -backend-config=apache-beam-testing.tfbackend
```

(`DIR` is set on its own line so that `$DIR` expands correctly in the following command.)
or, for your own Google Cloud project:

```
DIR=infrastructure/01.setup
terraform -chdir=$DIR init -backend-config=path/to/your/backend-config-file.tfbackend
```
where your `backend-config-file.tfbackend` contains:

```
bucket = <Google Cloud Storage Bucket Name>
```
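The GCS backend also accepts an optional `prefix` setting that namespaces the state object within the bucket. A fuller backend file might look like this; the bucket name and prefix are illustrative assumptions:

```
bucket = "my-terraform-state-bucket"  # assumed name; use your own bucket
prefix = "pipelines/01.setup"         # optional: path prefix for the state object
```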
Then apply the module. Notice the `-var-file` flag referencing `common.tfvars`, which provides opinionated variable defaults.

For the `apache-beam-testing` project:

```
DIR=infrastructure/01.setup
terraform -chdir=$DIR apply -var-file=common.tfvars -var-file=apache-beam-testing.tfvars
```
or, for your own Google Cloud project:

```
DIR=infrastructure/01.setup
terraform -chdir=$DIR apply -var-file=common.tfvars
```