.test-infra/pipelines/infrastructure/01.setup/README.md

## Overview

This directory sets up the Google Cloud project environment for Dataflow usage.

## List of all provisioned GCP resources

The following table lists all provisioned resources and their rationale.

| Resource | Reason |
|----------|--------|
| API services | Required by GCP to provision resources |
| Dataflow Worker Service Account | Use a GCP service account other than the default |
| Worker IAM Roles | Follow the principle of least privilege |
| Artifact Registry Repository | Required to store template artifacts |
| Google Cloud Storage bucket | Required for various storage needs |
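
The worker service account and its least-privilege role bindings could be sketched in Terraform roughly as follows. This is an illustrative assumption, not the module's actual contents — the resource names, variable names, and role list below are made up for the example (the real definitions live in `iam.tf` and `variables.tf`):

```hcl
# Illustrative sketch only; see iam.tf for the module's actual definitions.
# All names and the role list below are assumptions for this example.
resource "google_service_account" "dataflow_worker" {
  project      = var.project
  account_id   = "dataflow-worker"
  display_name = "Dataflow Worker Service Account"
}

# Bind only the roles Dataflow workers need, per least privilege.
resource "google_project_iam_member" "dataflow_worker_roles" {
  for_each = toset([
    "roles/dataflow.worker",
    "roles/storage.objectAdmin",
    "roles/artifactregistry.reader",
  ])
  project = var.project
  role    = each.value
  member  = "serviceAccount:${google_service_account.dataflow_worker.email}"
}
```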

## Usage

Follow the standard Terraform workflow (init, plan, apply) to apply this module. All commands below assume the working directory is `.test-infra/pipelines`.

## Terraform Init

This module uses a Google Cloud Storage bucket backend.

Initialize the terraform workspace for the apache-beam-testing project:

```
DIR=infrastructure/01.setup
terraform -chdir=$DIR init -backend-config=apache-beam-testing.tfbackend
```

or for your own Google Cloud project:

```
DIR=infrastructure/01.setup
terraform -chdir=$DIR init -backend-config=path/to/your/backend-config-file.tfbackend
```

where your backend-config-file.tfbackend contains:

```
bucket = <Google Cloud Storage Bucket Name>
```
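
Supplying the bucket at init time works because the backend block in `state.tf` is presumably a partial `gcs` backend configuration. A minimal sketch of that pattern follows; it is an assumption about the file's shape, not necessarily its exact contents:

```hcl
# Partial backend configuration: the bucket name is supplied via
# -backend-config at `terraform init` time rather than hard-coded here.
terraform {
  backend "gcs" {}
}
```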

## Terraform Apply

Note the `-var-file` flag referencing `common.tfvars`, which provides opinionated variable defaults.

For apache-beam-testing:

```
DIR=infrastructure/01.setup
terraform -chdir=$DIR apply -var-file=common.tfvars -var-file=apache-beam-testing.tfvars
```

or for your own Google Cloud project:

```
DIR=infrastructure/01.setup
terraform -chdir=$DIR apply -var-file=common.tfvars
```
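
When applying against your own project, Terraform will prompt interactively for any required variables that `common.tfvars` does not set. To avoid the prompts, you could pass an additional tfvars file of your own; the file name and variable names below are hypothetical (the real declarations are in `variables.tf`):

```hcl
# Hypothetical my-project.tfvars; check variables.tf for the actual
# variable names this module declares.
project = "my-gcp-project"
region  = "us-central1"
```

It would then be passed alongside the defaults, e.g. `terraform -chdir=$DIR apply -var-file=common.tfvars -var-file=my-project.tfvars`; later `-var-file` flags override earlier ones where variables overlap.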