Merge pull request #29102: [flink] Flush buffer during drain operation for requiresStableInput operator
diff --git a/.github/actions/setup-action/action.yml b/.github/actions/setup-action/action.yml
index da69dd9..743e89a 100644
--- a/.github/actions/setup-action/action.yml
+++ b/.github/actions/setup-action/action.yml
@@ -69,6 +69,4 @@
- name: expose gcloud path
shell: bash
run: |
- echo KUBELET_GCLOUD_CONFIG_PATH=/var/lib/kubelet/pods/$POD_UID/volumes/kubernetes.io~empty-dir/gcloud >> $GITHUB_ENV
- - name: Setup environment
- uses: ./.github/actions/setup-environment-action
+ echo KUBELET_GCLOUD_CONFIG_PATH=/var/lib/kubelet/pods/$POD_UID/volumes/kubernetes.io~empty-dir/gcloud >> $GITHUB_ENV
\ No newline at end of file
diff --git a/.github/workflows/README.md b/.github/workflows/README.md
index ea497ae..5777a84 100644
--- a/.github/workflows/README.md
+++ b/.github/workflows/README.md
@@ -178,262 +178,285 @@
# Workflows
Please note that jobs with matrix need to have matrix element in the comment. Example:
```Run Python PreCommit (3.8)```
+
+### PreCommit Jobs
+
| Workflow name | Matrix | Trigger Phrase | Cron Status |
|:-------------:|:------:|:--------------:|:-----------:|
-| [ Cancel Stale Dataflow Jobs ](https://github.com/apache/beam/actions/workflows/beam_CancelStaleDataflowJobs.yml) | N/A | `Run Cancel Stale Dataflow Jobs` | [](https://github.com/apache/beam/actions/workflows/beam_CancelStaleDataflowJobs.yml) |
-| [ Clean Up GCP Resources ](https://github.com/apache/beam/actions/workflows/beam_CleanUpGCPResources.yml) | N/A | `Run Clean GCP Resources` | [](https://github.com/apache/beam/actions/workflows/beam_CleanUpGCPResources.yml) |
-| [ Clean Up Prebuilt SDK Images ](https://github.com/apache/beam/actions/workflows/beam_CleanUpPrebuiltSDKImages.yml) | N/A | `Run Clean Prebuilt Images` | [](https://github.com/apache/beam/actions/workflows/beam_CleanUpPrebuiltSDKImages.yml) |
-| [ Cleanup Dataproc Resources ](https://github.com/apache/beam/actions/workflows/beam_CleanUpDataprocResources.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_CleanUpDataprocResources.yml)
-| [ CloudML Benchmarks Dataflow ](https://github.com/apache/beam/actions/workflows/beam_CloudML_Benchmarks_Dataflow.yml) | N/A |`Run TFT Criteo Benchmarks`| [](https://github.com/apache/beam/actions/workflows/beam_CloudML_Benchmarks_Dataflow.yml)
-| [ Community Metrics Prober ](https://github.com/apache/beam/actions/workflows/beam_Prober_CommunityMetrics.yml) | N/A |`Run Community Metrics Prober`| [](https://github.com/apache/beam/actions/workflows/beam_Prober_CommunityMetrics.yml)
-| [ Inference Python Benchmarks Dataflow ](https://github.com/apache/beam/actions/workflows/beam_Inference_Python_Benchmarks_Dataflow.yml) | N/A |`Run Inference Benchmarks`| [](https://github.com/apache/beam/actions/workflows/beam_Inference_Python_Benchmarks_Dataflow.yml)
-| [ Java InfluxDbIO Integration Test ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_InfluxDbIO_IT.yml) | N/A |`Run Java InfluxDbIO_IT`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_InfluxDbIO_IT.yml)
-| [ Java JMH ](https://github.com/apache/beam/actions/workflows/beam_Java_JMH.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_Java_JMH.yml)
-| [ Load Tests GBK Dataflow Batch Go ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_GBK_Dataflow_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_GBK_Dataflow_Batch.yml)
-| [ Load Tests CoGBK Dataflow Batch Go ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_CoGBK_Dataflow_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_CoGBK_Dataflow_Batch.yml)
-| [ Load Tests CoGBK Dataflow Streaming Java ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_Dataflow_Streaming.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_Dataflow_Streaming.yml)
-| [ Load Tests Combine Dataflow Batch Java ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_Combine_Dataflow_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_Combine_Dataflow_Batch.yml)
-| [ Load Tests Combine Dataflow Batch Python ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Combine_Dataflow_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Combine_Dataflow_Batch.yml)
-| [ Load Tests Combine Dataflow Batch Python ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Combine_Dataflow_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Combine_Dataflow_Batch.yml)
-| [ Load Tests FnApiRunner Microbenchmark Python ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_FnApiRunner_Microbenchmark.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_FnApiRunner_Microbenchmark.yml)
-| [ Load Tests Go CoGBK Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_CoGBK_Flink_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_CoGBK_Flink_Batch.yml)
-| [ Load Tests Go Combine Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_Combine_Flink_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_Combine_Flink_Batch.yml)
-| [ Load Tests Go GBK Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_GBK_Flink_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_GBK_Flink_Batch.yml)
-| [ Load Tests Go ParDo Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_ParDo_Flink_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_ParDo_Flink_Batch.yml)
-| [ Load Tests Go SideInput Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_SideInput_Flink_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_SideInput_Flink_Batch.yml)
-| [ LoadTests Java CoGBK Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_Dataflow_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_Dataflow_Batch.yml)
-| [ LoadTests Java CoGBK Dataflow V2 Batch JavaVersions ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_Dataflow_V2_Batch_JavaVersions.yml) | ['11','17'] | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_Dataflow_V2_Batch_JavaVersions.yml)
-| [ LoadTests Java CoGBK Dataflow V2 Streaming JavaVersions ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_JavaVersions.yml) | ['11','17'] | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_JavaVersions.yml)
-| [ LoadTests Java CoGBK SparkStructuredStreaming Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_SparkStructuredStreaming_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_SparkStructuredStreaming_Batch.yml)
-| [ Load Tests Java Combine Dataflow Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_Combine_Dataflow_Streaming.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_Combine_Dataflow_Streaming.yml)
-| [ Load Tests Java Combine SparkStructuredStreaming Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_Combine_SparkStructuredStreaming_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_Combine_SparkStructuredStreaming_Batch.yml)
-| [ Load Tests Java ParDo Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_Dataflow_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_Dataflow_Batch.yml)
-| [ Load Tests Java ParDo Dataflow Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_Dataflow_Streaming.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_Dataflow_Streaming.yml)
-| [ LoadTests Java ParDo Dataflow V2 Batch JavaVersions ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_JavaVersions.yml) | ['11','17'] | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_JavaVersions.yml)
-| [ LoadTests Java ParDo Dataflow V2 Streaming JavaVersions ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_JavaVersions.yml) | ['11','17'] | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_JavaVersions.yml)
-| [ Load Tests Java ParDo SparkStructuredStreaming Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch.yml)
-| [ Load Tests ParDo Dataflow Batch Go ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_ParDo_Dataflow_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_ParDo_Dataflow_Batch.yml)
-| [ Load Tests SideInput Dataflow Batch Go ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_SideInput_Dataflow_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_SideInput_Dataflow_Batch.yml)
-| [ LoadTests Java Combine Smoke ](https://github.com/apache/beam/actions/workflows/beam_Java_LoadTests_Combine_Smoke.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_Java_LoadTests_Combine_Smoke.yml)
-| [ LoadTests Java GBK Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_Batch.yml)
-| [ LoadTests Java GBK Dataflow Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_Streaming.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_Streaming.yml)
-| [ LoadTests Java GBK Dataflow V2 Batch Java11 ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_V2_Batch_Java11.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_V2_Batch_Java11.yml)
-| [ LoadTests Java GBK Dataflow V2 Batch Java17 ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_V2_Batch_Java17.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_V2_Batch_Java17.yml)
-| [ LoadTests Java GBK Dataflow V2 Streaming Java11 ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11.yml)
-| [ LoadTests Java GBK Dataflow V2 Streaming Java17 ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java17.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java17.yml)
-| [ LoadTests Java GBK Smoke ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Smoke.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Smoke.yml)
-| [ LoadTests Python GBK reiterate Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_reiterate_Dataflow_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_reiterate_Dataflow_Batch.yml)
-| [ LoadTests Python GBK reiterate Dataflow Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_reiterate_Dataflow_Streaming.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_reiterate_Dataflow_Streaming.yml)
-| [ LoadTests Python CoGBK Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_CoGBK_Dataflow_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_CoGBK_Dataflow_Batch.yml)
-| [ LoadTests Python CoGBK Dataflow Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_CoGBK_Dataflow_Streaming.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_CoGBK_Dataflow_Streaming.yml)
-| [ LoadTests Python Combine Dataflow Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Combine_Dataflow_Streaming.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Combine_Dataflow_Streaming.yml)
-| [ LoadTests Python Combine Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Combine_Flink_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Combine_Flink_Batch.yml)
-| [ LoadTests Python Combine Flink Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Combine_Flink_Streaming.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Combine_Flink_Streaming.yml)
-| [ LoadTests Python GBK Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_Dataflow_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_Dataflow_Batch.yml)
-| [ LoadTests Python GBK Dataflow Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_Dataflow_Streaming.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_Dataflow_Streaming.yml)
-| [ LoadTests Python GBK Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_Flink_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_Flink_Batch.yml)
-| [ LoadTests Python CoGBK Dataflow Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_CoGBK_Flink_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_CoGBK_Flink_Batch.yml)
-| [ LoadTests Python ParDo Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_ParDo_Dataflow_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_ParDo_Dataflow_Batch.yml)
-| [ LoadTests Python ParDo Dataflow Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_ParDo_Dataflow_Streaming.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_ParDo_Dataflow_Streaming.yml)
-| [ LoadTests Python ParDo Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_ParDo_Flink_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_ParDo_Flink_Batch.yml)
-| [ LoadTests Python ParDo Flink Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_ParDo_Flink_Streaming.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_ParDo_Flink_Streaming.yml)
-| [ LoadTests Python SideInput Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_SideInput_Dataflow_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_SideInput_Dataflow_Batch.yml)
-| [ LoadTests Python Smoke ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Smoke.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Smoke.yml)
-| [ Performance Tests AvroIOIT HDFS ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_AvroIOIT_HDFS.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_AvroIOIT_HDFS.yml)
-| [ Performance Tests AvroIOIT ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_AvroIOIT.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_AvroIOIT.yml)
-| [ Performance Tests BigQueryIO Batch Java Avro ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BigQueryIO_Batch_Java_Avro.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BigQueryIO_Batch_Java_Avro.yml)
-| [ Performance Tests BigQueryIO Batch Java Json ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BigQueryIO_Batch_Java_Json.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BigQueryIO_Batch_Java_Json.yml)
-| [ Performance Tests BigQueryIO Streaming Java ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BigQueryIO_Streaming_Java.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BigQueryIO_Streaming_Java.yml)
-| [ Performance Tests BigQueryIO Read Python ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BiqQueryIO_Read_Python.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BiqQueryIO_Read_Python.yml)
-| [ Performance Tests BigQueryIO Write Python Batch ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch.yml)
-| [ PerformanceTests Cdap ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_Cdap.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_Cdap.yml)
-| [ PerformanceTests Compressed TextIOIT HDFS ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_Compressed_TextIOIT_HDFS.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_Compressed_TextIOIT_HDFS.yml)
-| [ PerformanceTests Compressed TextIOIT ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_Compressed_TextIOIT.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_Compressed_TextIOIT.yml)
-| [ PerformanceTests HadoopFormat ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_HadoopFormat.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_HadoopFormat.yml)
-| [ PerformanceTests JDBC ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_JDBC.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_JDBC.yml)
-| [ PerformanceTests Kafka IO ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_Kafka_IO.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_Kafka_IO.yml)
-| [ PerformanceTests ManyFiles TextIOIT HDFS ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_ManyFiles_TextIOIT_HDFS.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_ManyFiles_TextIOIT_HDFS.yml)
-| [ PerformanceTests ManyFiles TextIOIT ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_ManyFiles_TextIOIT.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_ManyFiles_TextIOIT.yml)
-| [ PerformanceTests MongoDBIO IT ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_MongoDBIO_IT.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_MongoDBIO_IT.yml)
-| [ PerformanceTests ParquetIOIT HDFS ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_ParquetIOIT_HDFS.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_ParquetIOIT_HDFS.yml)
-| [ PerformanceTests ParquetIOIT ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_ParquetIOIT.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_ParquetIOIT.yml)
-| [ PerformanceTests PubsubIOIT Python Streaming ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_PubsubIOIT_Python_Streaming.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_PubsubIOIT_Python_Streaming.yml)
-| [ PerformanceTests SingleStoreIO ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SingleStoreIO.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SingleStoreIO.yml)
-| [ PerformanceTests SpannerIO Read 2GB Python ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SpannerIO_Read_2GB_Python.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SpannerIO_Read_2GB_Python.yml)
-| [ PerformanceTests SpannerIO Write 2GB Python Batch ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SpannerIO_Write_2GB_Python_Batch.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SpannerIO_Write_2GB_Python_Batch.yml)
-| [ PerformanceTests SparkReceiver IO ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SparkReceiver_IO.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SparkReceiver_IO.yml)
-| [ PerformanceTests SQLBigQueryIO Batch Java ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SQLBigQueryIO_Batch_Java.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SQLBigQueryIO_Batch_Java.yml)
-| [ PerformanceTests TextIOIT HDFS ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TextIOIT_HDFS.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TextIOIT_HDFS.yml)
-| [ PerformanceTests TextIOIT Python ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TextIOIT_Python.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TextIOIT_Python.yml)
-| [ PerformanceTests TextIOIT ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TextIOIT.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TextIOIT.yml)
-| [ PerformanceTests TFRecordIOIT HDFS ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TFRecordIOIT_HDFS.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TFRecordIOIT_HDFS.yml)
-| [ PerformanceTests TFRecordIOIT ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TFRecordIOIT.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TFRecordIOIT.yml)
-| [ PerformanceTests WordCountIT PythonVersions ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_WordCountIT_PythonVersions.yml) | ['3.8'] | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_WordCountIT_PythonVersions.yml)
-| [ PerformanceTests XmlIOIT HDFS ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_XmlIOIT_HDFS.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_XmlIOIT_HDFS.yml)
-| [ PerformanceTests XmlIOIT ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_XmlIOIT.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_XmlIOIT.yml)
-| [ PerformanceTests xlang KafkaIO Python ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_xlang_KafkaIO_Python.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_xlang_KafkaIO_Python.yml)
-| [ PostCommit BeamMetrics Publish ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_BeamMetrics_Publish.yml) | N/A |`Run Beam Metrics Deployment`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_BeamMetrics_Publish.yml)
-| [ PostCommit Go ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go.yml) | N/A |`Run Go PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go.yml) |
-| [ PostCommit Go Dataflow ARM](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go_Dataflow_ARM.yml) | N/A |`Run Go PostCommit Dataflow ARM`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go_Dataflow_ARM.yml) |
-| [ PostCommit Go VR Flink](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go_VR_Flink.yml) | N/A |`Run Go Flink ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go_VR_Flink.yml) |
-| [ PostCommit Go VR Samza](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go_VR_Samza.yml) | N/A |`Run Go Samza ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go_VR_Samza.yml) |
-| [ PostCommit Go VR Spark](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go_VR_Spark.yml) | N/A |`Run Go Spark ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go_VR_Spark.yml) |
-| [ PostCommit Java Avro Versions ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Avro_Versions.yml) | N/A |`Run Java Avro Versions PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Avro_Versions.yml) |
-| [ PostCommit Java Dataflow V1 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_DataflowV1.yml) | N/A |`Run PostCommit_Java_Dataflow`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_DataflowV1.yml) |
-| [ PostCommit Java Dataflow V2 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_DataflowV2.yml) | N/A |`Run PostCommit_Java_DataflowV2`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_DataflowV2.yml) |
-| [ PostCommit Java Examples Dataflow ARM ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow_ARM.yml) | ['8','11','17'] |`Run Java_Examples_Dataflow_ARM PostCommit (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow_ARM.yml) |
-| [ PostCommit Java Examples Dataflow](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow.yml) | N/A |`Run Java examples on Dataflow`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow.yml) |
-| [ PostCommit Java Examples Dataflow Java ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow_Java.yml) | ['11','17'] |`Run Java examples on Dataflow Java (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow_Java.yml) |
-| [ PostCommit Java Examples Dataflow V2 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow_V2.yml) | N/A |`Run Java Examples on Dataflow Runner V2`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow_V2.yml) |
-| [ PostCommit Java Examples Dataflow V2 Java ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow_V2_Java.yml) | ['11','17'] |`Run Java (matrix_element) Examples on Dataflow Runner V2`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow_V2_Java.yml) |
-| [ PostCommit Java Examples Direct ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Direct.yml) | N/A |`Run Java Examples_Direct`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Direct.yml) |
-| [ PostCommit Java Examples Flink ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Flink.yml) | N/A |`Run Java Examples_Flink`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Flink.yml) |
-| [ PostCommit Java Examples Spark ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Spark.yml) | N/A |`Run Java Examples_Spark`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Spark.yml) |
-| [ PostCommit Java Hadoop Versions ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Hadoop_Versions.yml) | N/A |`Run PostCommit_Java_Hadoop_Versions`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Hadoop_Versions.yml) |
-| [ PostCommit Java Jpms Dataflow Java11 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Dataflow_Java11.yml) | N/A |`Run Jpms Dataflow Java 11 PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Dataflow_Java11.yml) |
-| [ PostCommit Java Jpms Dataflow Java17 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Dataflow_Java17.yml) | N/A |`Run Jpms Dataflow Java 17 PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Dataflow_Java17.yml) |
-| [ PostCommit Java Jpms Direct Java11 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Direct_Java11.yml) | N/A |`Run Jpms Direct Java 11 PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Direct_Java11.yml) |
-| [ PostCommit Java Jpms Direct Java17 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Direct_Java17.yml) | N/A |`Run Jpms Direct Java 17 PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Direct_Java17.yml) |
-| [ PostCommit Java Jpms Flink Java11 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Flink_Java11.yml) | N/A |`Run Jpms Flink Java 11 PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Flink_Java11.yml) |
-| [ PostCommit Java Jpms Spark Java11 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Spark_Java11.yml) | N/A |`Run Jpms Spark Java 11 PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Spark_Java11.yml) |
-| [ PostCommit Java Nexmark Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Dataflow.yml) | N/A |`Run Dataflow Runner Nexmark Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Dataflow.yml) |
-| [ PostCommit Java Nexmark Dataflow V2 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Dataflow_V2.yml) | N/A |`Run Dataflow Runner V2 Nexmark Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Dataflow_V2.yml) |
-| [ PostCommit Java Nexmark Dataflow V2 Java ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Dataflow_V2_Java.yml) | ['11','17'] |`Run Dataflow Runner V2 Java (matrix) Nexmark Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Dataflow_V2_Java.yml) |
-| [ PostCommit Java Nexmark Direct ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Direct.yml) | N/A |`Run Direct Runner Nexmark Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Direct.yml) |
-| [ PostCommit Java Nexmark Flink ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Flink.yml) | N/A |`Run Flink Runner Nexmark Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Flink.yml) |
-| [ PostCommit Java Nexmark Spark ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Spark.yml) | N/A |`Run Spark Runner Nexmark Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Spark.yml) |
-| [ PostCommit Java PVR Flink Streaming ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_PVR_Flink_Streaming.yml) | N/A |`Run Java Flink PortableValidatesRunner Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_PVR_Flink_Streaming.yml) |
-| [ PostCommit Java PVR Samza ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_PVR_Samza.yml) | N/A |`Run Java Samza PortableValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_PVR_Samza.yml) |
-| [ PostCommit Java PVR Spark3 Streaming ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_PVR_Spark3_Streaming.yml) | N/A |`Run Java Spark v3 PortableValidatesRunner Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_PVR_Spark3_Streaming.yml) |
-| [ PostCommit Java PVR Spark Batch ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_PVR_Spark_Batch.yml) | N/A |`Run Java Spark PortableValidatesRunner Batch`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_PVR_Spark_Batch.yml) |
-| [ PostCommit Java Sickbay ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Sickbay.yml) | N/A |`Run Java Sickbay`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Sickbay.yml) |
-| [ PostCommit Java Tpcds Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Tpcds_Dataflow.yml) | N/A |`Run Dataflow Runner Tpcds Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Tpcds_Dataflow.yml) |
-| [ PostCommit Java Tpcds Flink ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Tpcds_Flink.yml) | N/A |`Run Flink Runner Tpcds Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Tpcds_Flink.yml) |
-| [ PostCommit Java Tpcds Spark ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Tpcds_Spark.yml) | N/A |`Run Spark Runner Tpcds Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Tpcds_Spark.yml) |
-| [ PostCommit Java ValidatesRunner Dataflow JavaVersions ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_JavaVersions.yml) | ['11','17'] |`Run Dataflow ValidatesRunner Java (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_JavaVersions.yml) |
-| [ PostCommit Java ValidatesRunner Dataflow Streaming ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_Streaming.yml) | N/A |`Run Dataflow Streaming ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_Streaming.yml) |
-| [ PostCommit Java ValidatesRunner Dataflow V2 Streaming ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_V2_Streaming.yml) | N/A |`Run Java Dataflow V2 ValidatesRunner Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_V2_Streaming.yml) |
-| [ PostCommit Java ValidatesRunner Dataflow V2 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_V2.yml) | N/A |`Run Java Dataflow V2 ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_V2.yml) |
-| [ PostCommit Java ValidatesRunner Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow.yml) | N/A |`Run Dataflow ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow.yml) |
-| [ PostCommit Java ValidatesRunner Direct JavaVersions ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Direct_JavaVersions.yml) | ['11','17'] |`Run Direct ValidatesRunner Java (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Direct_JavaVersions.yml) |
-| [ PostCommit Java ValidatesRunner Direct ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Direct.yml) | N/A |`Run Direct ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Direct.yml) |
-| [ PostCommit Java ValidatesRunner Flink Java11 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Flink_Java11.yml) | N/A |`Run Flink ValidatesRunner Java 11`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Flink_Java11.yml) |
-| [ PostCommit Java ValidatesRunner Flink ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Flink.yml) | N/A |`Run Flink ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Flink.yml) |
-| [ PostCommit Java ValidatesRunner Samza ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Samza.yml) | N/A |`Run Samza ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Samza.yml) |
-| [ PostCommit Java ValidatesRunner Spark Java11 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Spark_Java11.yml) | N/A |`Run Spark ValidatesRunner Java 11`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Spark_Java11.yml) |
-| [ PostCommit Java ValidatesRunner Spark ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Spark.yml) | N/A |`Run Spark ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Spark.yml) |
-| [ PostCommit Java ValidatesRunner SparkStructuredStreaming ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming.yml) | N/A |`Run Spark StructuredStreaming ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming.yml) |
-| [ PostCommit Java ValidatesRunner Twister2 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Twister2.yml) | N/A |`Run Twister2 ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Twister2.yml) |
-| [ PostCommit Java ValidatesRunner ULR ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_ULR.yml) | N/A |`Run ULR Loopback ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_ULR.yml) |
-| [ PostCommit Java ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java.yml) | N/A |`Run Java PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java.yml) |
-| [ PostCommit Javadoc ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Javadoc.yml) | N/A |`Run Javadoc PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Javadoc.yml) |
-| [ PostCommit PortableJar Flink ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_PortableJar_Flink.yml) | N/A |`Run PortableJar_Flink PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_PortableJar_Flink.yml) |
-| [ PostCommit PortableJar Spark ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_PortableJar_Spark.yml) | N/A |`Run PortableJar_Spark PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_PortableJar_Spark.yml) |
-| [ PostCommit Python ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python.yml) | ['3.8','3.9','3.10','3.11'] |`Run Python PostCommit (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python.yml) |
-| [ PostCommit Python Arm](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Arm.yml) | ['3.8','3.9','3.10','3.11'] |`Run Python PostCommit Arm (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Arm.yml) |
-| [ PostCommit Python Examples Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Examples_Dataflow.yml) | N/A |`Run Python Examples_Dataflow`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Examples_Dataflow.yml) |
-| [ PostCommit Python Examples Direct ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Examples_Direct.yml) | ['3.8','3.9','3.10','3.11'] |`Run Python Examples_Direct (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Examples_Direct.yml) |
-| [ PostCommit Python Examples Flink ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Examples_Flink.yml) | ['3.8','3.11'] |`Run Python Examples_Flink (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Examples_Flink.yml) |
-| [ PostCommit Python Examples Spark ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Examples_Spark.yml) | ['3.8','3.11'] |`Run Python Examples_Spark (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Examples_Spark.yml) |
-| [ PostCommit Python MongoDBIO IT ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_MongoDBIO_IT.yml) | N/A |`Run Python MongoDBIO_IT`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_MongoDBIO_IT.yml) |
-| [ PostCommit Python Nexmark Direct ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Nexmark_Direct.yml) | N/A |`Run Python Direct Runner Nexmark Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Nexmark_Direct.yml) |
-| [ PostCommit Python ValidatesContainer Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesContainer_Dataflow.yml) | ['3.8','3.9','3.10','3.11'] |`Run Python Dataflow ValidatesContainer (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesContainer_Dataflow.yml) |
-| [ PostCommit Python ValidatesContainer Dataflow With RC ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesContainer_Dataflow_With_RC.yml) | ['3.8','3.9','3.10','3.11'] |`Run Python RC Dataflow ValidatesContainer (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesContainer_Dataflow_With_RC.yml) |
-| [ PostCommit Python ValidatesRunner Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesRunner_Dataflow.yml) | ['3.8','3.11'] |`Run Python Dataflow ValidatesRunner (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesRunner_Dataflow.yml) |
-| [ PostCommit Python ValidatesRunner Flink ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesRunner_Flink.yml) | ['3.8','3.11'] |`Run Python Flink ValidatesRunner (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesRunner_Flink.yml) |
-| [ PostCommit Python ValidatesRunner Samza ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesRunner_Samza.yml) | ['3.8','3.11'] |`Run Python Samza ValidatesRunner (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesRunner_Samza.yml) |
-| [ PostCommit Python ValidatesRunner Spark ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesRunner_Spark.yml) | ['3.8','3.9','3.11'] |`Run Python Spark ValidatesRunner (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesRunner_Spark.yml) |
-| [ PostCommit Python Xlang Gcp Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Xlang_Gcp_Dataflow.yml) | N/A |`Run Python_Xlang_Gcp_Dataflow PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Xlang_Gcp_Dataflow.yml) |
-| [ PostCommit Python Xlang Gcp Direct ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Xlang_Gcp_Direct.yml) | N/A |`Run Python_Xlang_Gcp_Direct PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Xlang_Gcp_Direct.yml) |
-| [ PostCommit Python Xlang IO Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Xlang_IO_Dataflow.yml) | N/A |`Run Python_Xlang_IO_Dataflow PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Xlang_IO_Dataflow.yml) |
-| [ PostCommit Sickbay Python ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Sickbay_Python.yml) | ['3.8','3.9','3.10','3.11'] |`Run Python (matrix_element) PostCommit Sickbay`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Sickbay_Python.yml) |
-| [ PostCommit SQL ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_SQL.yml) | N/A |`Run SQL PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_SQL.yml) |
-| [ PostCommit TransformService Direct ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_TransformService_Direct.yml) | N/A |`Run TransformService_Direct PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_TransformService_Direct.yml)
-| [ PostCommit Website Publish ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Website_Publish.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Website_Publish.yml) |
-| [ PostCommit Website Test](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Website_Test.yml) | N/A |`Run Full Website Test`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Website_Test.yml) |
-| [ PostCommit XVR GoUsingJava Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_GoUsingJava_Dataflow.yml) | N/A |`Run XVR_GoUsingJava_Dataflow PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_GoUsingJava_Dataflow.yml) |
-| [ PostCommit XVR Direct ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_Direct.yml) | N/A |`Run XVR_Direct PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_Direct.yml) |
-| [ PostCommit XVR Flink ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_Flink.yml) | N/A |`Run XVR_Flink PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_Flink.yml) |
-| [ PostCommit XVR JavaUsingPython Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_JavaUsingPython_Dataflow.yml) | N/A |`Run XVR_JavaUsingPython_Dataflow PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_JavaUsingPython_Dataflow.yml) |
-| [ PostCommit XVR PythonUsingJava Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_PythonUsingJava_Dataflow.yml) | N/A |`Run XVR_PythonUsingJava_Dataflow PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_PythonUsingJava_Dataflow.yml) |
-| [ PostCommit XVR PythonUsingJavaSQL Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow.yml) | N/A |`Run XVR_PythonUsingJavaSQL_Dataflow PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow.yml) |
-| [ PostCommit XVR Samza ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_Samza.yml) | N/A |`Run XVR_Samza PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_Samza.yml) |
-| [ PostCommit XVR Spark3 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_Spark3.yml) | N/A |`Run XVR_Spark3 PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_Spark3.yml) |
-| [ PreCommit Community Metrics ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_CommunityMetrics.yml) | N/A |`Run CommunityMetrics PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_CommunityMetrics.yml) |
-| [ PreCommit GHA ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_GHA.yml) | N/A |`Run GHA PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_GHA.yml) |
-| [ PreCommit Go ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Go.yml) | N/A |`Run Go PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Go.yml) |
-| [ PreCommit Java ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java.yml) | N/A |`Run Java PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java.yml) |
-| [ PreCommit Java Amazon Web Services IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Amazon-Web-Services_IO_Direct.yml) | N/A |`Run Java_Amazon-Web-Services_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Amazon-Web-Services_IO_Direct.yml) |
-| [ PreCommit Java Amazon Web Services2 IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Amazon-Web-Services2_IO_Direct.yml) | N/A |`Run Java_Amazon-Web-Services2_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Amazon-Web-Services2_IO_Direct.yml) |
-| [ PreCommit Java Amqp IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Amqp_IO_Direct.yml) | N/A |`Run Java_Amqp_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Amqp_IO_Direct.yml) |
-| [ PreCommit Java Azure IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Azure_IO_Direct.yml) | N/A |`Run Java_Azure_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Azure_IO_Direct.yml) |
-| [ PreCommit Java Cassandra IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Cassandra_IO_Direct.yml) | N/A |`Run Java_Cassandra_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Cassandra_IO_Direct.yml) |
-| [ PreCommit Java Cdap IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Cdap_IO_Direct.yml) | N/A |`Run Java_Cdap_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Cdap_IO_Direct.yml) |
-| [ PreCommit Java Clickhouse IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Clickhouse_IO_Direct.yml) | N/A |`Run Java_Clickhouse_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Clickhouse_IO_Direct.yml) |
-| [ PreCommit Java Csv IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Csv_IO_Direct.yml) | N/A |`Run Java_Csv_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Csv_IO_Direct.yml) |
-| [ PreCommit Java Debezium IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Debezium_IO_Direct.yml) | N/A |`Run Java_Debezium_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Debezium_IO_Direct.yml) |
-| [ PreCommit Java ElasticSearch IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_ElasticSearch_IO_Direct.yml) | N/A |`Run Java_ElasticSearch_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_ElasticSearch_IO_Direct.yml) |
-| [ PreCommit Java Examples Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Examples_Dataflow.yml) | N/A |`Run Java_Examples_Dataflow PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Examples_Dataflow.yml) |
-| [ PreCommit Java Examples Dataflow Java11 ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Examples_Dataflow_Java11.yml) | N/A | `Run Java_Examples_Dataflow_Java11 PreCommit` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Examples_Dataflow_Java11.yml) |
-| [ PreCommit Java Examples Dataflow Java17 ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Examples_Dataflow_Java17.yml) | N/A | `Run Java_Examples_Dataflow_Java17 PreCommit` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Examples_Dataflow_Java17.yml) |
-| [ PreCommit Java File-schema-transform IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_File-schema-transform_IO_Direct.yml) | N/A |`Run Java_File-schema-transform_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_File-schema-transform_IO_Direct.yml) |
-| [ PreCommit Java Flink Versions ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Flink_Versions.yml) | N/A |`Run Java_Flink_Versions PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Flink_Versions.yml) |
-| [ PreCommit Java GCP IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_GCP_IO_Direct.yml) | N/A |`Run Java_GCP_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_GCP_IO_Direct.yml) |
-| [ PreCommit Java Google-ads IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Google-ads_IO_Direct.yml) | N/A |`Run Java_Google-ads_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Google-ads_IO_Direct.yml) |
-| [ PreCommit Java Hadoop IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Hadoop_IO_Direct.yml) | N/A |`Run Java_Hadoop_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Hadoop_IO_Direct.yml) |
-| [ PreCommit Java HBase IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_HBase_IO_Direct.yml) | N/A |`Run Java_HBase_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_HBase_IO_Direct.yml) |
-| [ PreCommit Java HCatalog IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_HCatalog_IO_Direct.yml) | N/A |`Run Java_HCatalog_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_HCatalog_IO_Direct.yml) |
-| [ PreCommit Java Kafka IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Kafka_IO_Direct.yml) | N/A |`Run Java_Kafka_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Kafka_IO_Direct.yml) |
-| [ PreCommit Java InfluxDb IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_InfluxDb_IO_Direct.yml) | N/A |`Run Java_InfluxDb_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_InfluxDb_IO_Direct.yml) |
+| [ PreCommit Community Metrics ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_CommunityMetrics.yml) | N/A |`Run CommunityMetrics PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_CommunityMetrics.yml?query=event%3Aschedule) |
+| [ PreCommit GHA ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_GHA.yml) | N/A |`Run GHA PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_GHA.yml?query=event%3Aschedule) |
+| [ PreCommit Go ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Go.yml) | N/A |`Run Go PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Go.yml?query=event%3Aschedule) |
+| [ PreCommit GoPortable ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_GoPortable.yml) | N/A |`Run GoPortable PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_GoPortable.yml?query=event%3Aschedule) |
+| [ PreCommit Java ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java.yml) | N/A |`Run Java PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java.yml?query=event%3Aschedule) |
+| [ PreCommit Java Amazon Web Services IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Amazon-Web-Services_IO_Direct.yml) | N/A |`Run Java_Amazon-Web-Services_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Amazon-Web-Services_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Amazon Web Services2 IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Amazon-Web-Services2_IO_Direct.yml) | N/A |`Run Java_Amazon-Web-Services2_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Amazon-Web-Services2_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Amqp IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Amqp_IO_Direct.yml) | N/A |`Run Java_Amqp_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Amqp_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Azure IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Azure_IO_Direct.yml) | N/A |`Run Java_Azure_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Azure_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Cassandra IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Cassandra_IO_Direct.yml) | N/A |`Run Java_Cassandra_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Cassandra_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Cdap IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Cdap_IO_Direct.yml) | N/A |`Run Java_Cdap_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Cdap_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Clickhouse IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Clickhouse_IO_Direct.yml) | N/A |`Run Java_Clickhouse_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Clickhouse_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Csv IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Csv_IO_Direct.yml) | N/A |`Run Java_Csv_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Csv_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Debezium IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Debezium_IO_Direct.yml) | N/A |`Run Java_Debezium_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Debezium_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java ElasticSearch IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_ElasticSearch_IO_Direct.yml) | N/A |`Run Java_ElasticSearch_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_ElasticSearch_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Examples Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Examples_Dataflow.yml) | N/A |`Run Java_Examples_Dataflow PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Examples_Dataflow.yml?query=event%3Aschedule) |
+| [ PreCommit Java Examples Dataflow Java11 ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Examples_Dataflow_Java11.yml) | N/A | `Run Java_Examples_Dataflow_Java11 PreCommit` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Examples_Dataflow_Java11.yml?query=event%3Aschedule) |
+| [ PreCommit Java Examples Dataflow Java17 ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Examples_Dataflow_Java17.yml) | N/A | `Run Java_Examples_Dataflow_Java17 PreCommit` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Examples_Dataflow_Java17.yml?query=event%3Aschedule) |
+| [ PreCommit Java File-schema-transform IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_File-schema-transform_IO_Direct.yml) | N/A |`Run Java_File-schema-transform_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_File-schema-transform_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Flink Versions ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Flink_Versions.yml) | N/A |`Run Java_Flink_Versions PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Flink_Versions.yml?query=event%3Aschedule) |
+| [ PreCommit Java GCP IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_GCP_IO_Direct.yml) | N/A |`Run Java_GCP_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_GCP_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Google-ads IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Google-ads_IO_Direct.yml) | N/A |`Run Java_Google-ads_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Google-ads_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Hadoop IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Hadoop_IO_Direct.yml) | N/A |`Run Java_Hadoop_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Hadoop_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java HBase IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_HBase_IO_Direct.yml) | N/A |`Run Java_HBase_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_HBase_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java HCatalog IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_HCatalog_IO_Direct.yml) | N/A |`Run Java_HCatalog_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_HCatalog_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Kafka IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Kafka_IO_Direct.yml) | N/A |`Run Java_Kafka_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Kafka_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java InfluxDb IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_InfluxDb_IO_Direct.yml) | N/A |`Run Java_InfluxDb_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_InfluxDb_IO_Direct.yml?query=event%3Aschedule) |
| [ PreCommit Java IOs Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_IOs_Direct.yml) | N/A |`Run Java_IOs_Direct PreCommit`| N/A |
-| [ PreCommit Java JDBC IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_JDBC_IO_Direct.yml) | N/A |`Run Java_JDBC_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_JDBC_IO_Direct.yml) |
-| [ PreCommit Java Jms IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Jms_IO_Direct.yml) | N/A |`Run Java_Jms_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Jms_IO_Direct.yml) |
-| [ PreCommit Java Kinesis IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Kinesis_IO_Direct.yml) | N/A |`Run Java_Kinesis_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Kinesis_IO_Direct.yml) |
-| [ PreCommit Java Kudu IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Kudu_IO_Direct.yml) | N/A |`Run Java_Kudu_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Kudu_IO_Direct.yml) |
-| [ PreCommit Java MongoDb IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_MongoDb_IO_Direct.yml) | N/A |`Run Java_MongoDb_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_MongoDb_IO_Direct.yml) |
-| [ PreCommit Java Mqtt IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Mqtt_IO_Direct.yml) | N/A |`Run Java_Mqtt_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Mqtt_IO_Direct.yml) |
-| [ PreCommit Java Neo4j IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Neo4j_IO_Direct.yml) | N/A |`Run Java_Neo4j_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Neo4j_IO_Direct.yml) |
-| [ PreCommit Java Parquet IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Parquet_IO_Direct.yml) | N/A |`Run Java_Parquet_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Parquet_IO_Direct.yml) |
-| [ PreCommit Java Pulsar IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Pulsar_IO_Direct.yml) | N/A |`Run Java_Pulsar_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Pulsar_IO_Direct.yml) |
-| [ PreCommit Java PVR Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_PVR_Flink_Batch.yml) | N/A |`Run Java_PVR_Flink_Batch PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_PVR_Flink_Batch.yml) |
-| [ PreCommit Java PVR Flink Docker ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_PVR_Flink_Docker.yml) | N/A |`Run Java_PVR_Flink_Docker PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_PVR_Flink_Docker.yml) |
-| [ PreCommit Java RabbitMq IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_RabbitMq_IO_Direct.yml) | N/A |`Run Java_RabbitMq_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_RabbitMq_IO_Direct.yml) |
-| [ PreCommit Java Redis IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Redis_IO_Direct.yml) | N/A |`Run Java_Redis_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Redis_IO_Direct.yml) |
-| [ PreCommit Java RequestResponse IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_RequestResponse_IO_Direct.yml) | N/A |`Run Java_RequestResponse_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_RequestResponse_IO_Direct.yml) |
-| [ PreCommit Java SingleStore IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_SingleStore_IO_Direct.yml) | N/A |`Run Java_SingleStore_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_SingleStore_IO_Direct.yml) |
-| [ PreCommit Java Snowflake IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Snowflake_IO_Direct.yml) | N/A |`Run Java_Snowflake_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Snowflake_IO_Direct.yml) |
-| [ PreCommit Java Solr IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Solr_IO_Direct.yml) | N/A |`Run Java_Solr_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Solr_IO_Direct.yml) |
-| [ PreCommit Java Spark3 Versions ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Spark3_Versions.yml) | N/A | `Run Java_Spark3_Versions PreCommit` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Spark3_Versions.yml) |
-| [ PreCommit Java Splunk IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Splunk_IO_Direct.yml) | N/A |`Run Java_Splunk_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Splunk_IO_Direct.yml) |
-| [ PreCommit Java Thrift IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Thrift_IO_Direct.yml) | N/A |`Run Java_Thrift_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Thrift_IO_Direct.yml) |
-| [ PreCommit Java Tika IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Tika_IO_Direct.yml) | N/A |`Run Java_Tika_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Tika_IO_Direct.yml) |
-| [ PreCommit Python ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python.yml) | ['3.8','3.9','3.10','3.11'] | `Run Python PreCommit (matrix_element)` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python.yml) |
-| [ PreCommit Python Coverage ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Coverage.yml) | N/A | `Run Python_Coverage PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Coverage.yml) |
-| [ PreCommit Python Dataframes ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Dataframes.yml) | ['3.8','3.9','3.10','3.11'] | `Run Python_Dataframes PreCommit (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Dataframes.yml) |
-| [ PreCommit Python Docker ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_PythonDocker.yml) | ['3.8','3.9','3.10','3.11'] | `Run PythonDocker PreCommit (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_PythonDocker.yml) |
-| [ PreCommit Python Docs ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_PythonDocs.yml) | N/A | `Run PythonDocs PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_PythonDocs.yml) |
-| [ PreCommit Python Examples ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Examples.yml) | ['3.8','3.9','3.10','3.11'] | `Run Python_Examples PreCommit (matrix_element)` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Examples.yml) |
-| [ PreCommit Python Formatter ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_PythonFormatter.yml) | N/A | `Run PythonFormatter PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_PythonFormatter.yml) |
-| [ PreCommit Python Integration](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Integration.yml) | ['3.8','3.11'] | `Run Python_Integration PreCommit (matrix_element)` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Integration.yml) |
-| [ PreCommit Python Lint ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_PythonLint.yml) | N/A | `Run PythonLint PreCommit` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_PythonLint.yml) |
-| [ PreCommit Python PVR Flink ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_PVR_Flink.yml) | N/A | `Run Python_PVR_Flink PreCommit` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_PVR_Flink.yml) |
-| [ PreCommit Python Runners ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Runners.yml) | ['3.8','3.9','3.10','3.11'] | `Run Python_Runners PreCommit (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Runners.yml) |
-| [ PreCommit Python Transforms ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Transforms.yml) | ['3.8','3.9','3.10','3.11'] | `Run Python_Transforms PreCommit (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Transforms.yml) |
-| [ PreCommit RAT ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_RAT.yml) | N/A | `Run RAT PreCommit` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_RAT.yml) |
-| [ PreCommit Spotless ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Spotless.yml) | N/A | `Run Spotless PreCommit` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Spotless.yml) |
-| [ PreCommit SQL ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_SQL.yml) | N/A |`Run SQL PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_SQL.yml) |
-| [ PreCommit SQL Java11 ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_SQL_Java11.yml) | N/A |`Run SQL_Java11 PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_SQL_Java11.yml) |
-| [ PreCommit SQL Java17 ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_SQL_Java17.yml) | N/A |`Run SQL_Java17 PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_SQL_Java17.yml) |
-| [ PreCommit Typescript ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Typescript.yml) | N/A |`Run Typescript PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Typescript.yml) |
-| [ PreCommit Website ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Website.yml) | N/A |`Run Website PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Website.yml) |
-| [ PreCommit Website Stage GCS ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Website_Stage_GCS.yml) | N/A |`Run Website_Stage_GCS PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Website_Stage_GCS.yml) |
-| [ PreCommit Whitespace ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Whitespace.yml) | N/A |`Run Whitespace PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Whitespace.yml) |
-| [ Python Validates Container Dataflow ARM ](https://github.com/apache/beam/actions/workflows/beam_Python_ValidatesContainer_Dataflow_ARM.yml) | ['3.8','3.9','3.10','3.11'] |beam_Python_ValidatesContainer_Dataflow_ARM.yml
-| [](https://github.com/apache/beam/actions/workflows/beam_Python_ValidatesContainer_Dataflow_ARM.yml) |
-| [ PreCommit GoPortable ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_GoPortable.yml) | N/A |`Run GoPortable PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_GoPortable.yml) |
-| [ PreCommit Kotlin Examples ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Kotlin_Examples.yml) | N/A | `Run Kotlin_Examples PreCommit` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Kotlin_Examples.yml) |
-| [ PreCommit Portable Python ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Portable_Python.yml) | ['3.8','3.11'] | `Run Portable_Python PreCommit` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Portable_Python.yml) |
-| [ Publish Beam SDK Snapshots ](https://github.com/apache/beam/actions/workflows/beam_Publish_Beam_SDK_Snapshots.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_Publish_Beam_SDK_Snapshots.yml) |
-| [ Publish Docker Snapshots ](https://github.com/apache/beam/actions/workflows/beam_Publish_Docker_Snapshots.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_Publish_Docker_Snapshots.yml) |
-| [ Rotate IO-Datastores Cluster Credentials ](https://github.com/apache/beam/actions/workflows/beam_IODatastoresCredentialsRotation.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_IODatastoresCredentialsRotation.yml) |
-| [ Rotate Metrics Cluster Credentials ](https://github.com/apache/beam/actions/workflows/beam_MetricsCredentialsRotation.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_MetricsCredentialsRotation.yml) |
+| [ PreCommit Java JDBC IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_JDBC_IO_Direct.yml) | N/A |`Run Java_JDBC_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_JDBC_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Jms IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Jms_IO_Direct.yml) | N/A |`Run Java_Jms_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Jms_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Kinesis IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Kinesis_IO_Direct.yml) | N/A |`Run Java_Kinesis_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Kinesis_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Kudu IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Kudu_IO_Direct.yml) | N/A |`Run Java_Kudu_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Kudu_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java MongoDb IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_MongoDb_IO_Direct.yml) | N/A |`Run Java_MongoDb_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_MongoDb_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Mqtt IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Mqtt_IO_Direct.yml) | N/A |`Run Java_Mqtt_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Mqtt_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Neo4j IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Neo4j_IO_Direct.yml) | N/A |`Run Java_Neo4j_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Neo4j_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Parquet IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Parquet_IO_Direct.yml) | N/A |`Run Java_Parquet_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Parquet_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Pulsar IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Pulsar_IO_Direct.yml) | N/A |`Run Java_Pulsar_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Pulsar_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java PVR Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_PVR_Flink_Batch.yml) | N/A |`Run Java_PVR_Flink_Batch PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_PVR_Flink_Batch.yml?query=event%3Aschedule) |
+| [ PreCommit Java PVR Flink Docker ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_PVR_Flink_Docker.yml) | N/A |`Run Java_PVR_Flink_Docker PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_PVR_Flink_Docker.yml?query=event%3Aschedule) |
+| [ PreCommit Java RabbitMq IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_RabbitMq_IO_Direct.yml) | N/A |`Run Java_RabbitMq_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_RabbitMq_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Redis IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Redis_IO_Direct.yml) | N/A |`Run Java_Redis_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Redis_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java RequestResponse IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_RequestResponse_IO_Direct.yml) | N/A |`Run Java_RequestResponse_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_RequestResponse_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java SingleStore IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_SingleStore_IO_Direct.yml) | N/A |`Run Java_SingleStore_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_SingleStore_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Snowflake IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Snowflake_IO_Direct.yml) | N/A |`Run Java_Snowflake_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Snowflake_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Solr IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Solr_IO_Direct.yml) | N/A |`Run Java_Solr_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Solr_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Spark3 Versions ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Spark3_Versions.yml) | N/A | `Run Java_Spark3_Versions PreCommit` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Spark3_Versions.yml?query=event%3Aschedule) |
+| [ PreCommit Java Splunk IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Splunk_IO_Direct.yml) | N/A |`Run Java_Splunk_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Splunk_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Thrift IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Thrift_IO_Direct.yml) | N/A |`Run Java_Thrift_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Thrift_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Java Tika IO Direct ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Tika_IO_Direct.yml) | N/A |`Run Java_Tika_IO_Direct PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Java_Tika_IO_Direct.yml?query=event%3Aschedule) |
+| [ PreCommit Kotlin Examples ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Kotlin_Examples.yml) | N/A | `Run Kotlin_Examples PreCommit` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Kotlin_Examples.yml?query=event%3Aschedule) |
+| [ PreCommit Portable Python ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Portable_Python.yml) | ['3.8','3.11'] | `Run Portable_Python PreCommit` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Portable_Python.yml?query=event%3Aschedule) |
+| [ PreCommit Python ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python.yml) | ['3.8','3.9','3.10','3.11'] | `Run Python PreCommit (matrix_element)` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python.yml?query=event%3Aschedule) |
+| [ PreCommit Python Coverage ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Coverage.yml) | N/A | `Run Python_Coverage PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Coverage.yml?query=event%3Aschedule) |
+| [ PreCommit Python Dataframes ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Dataframes.yml) | ['3.8','3.9','3.10','3.11'] | `Run Python_Dataframes PreCommit (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Dataframes.yml?query=event%3Aschedule) |
+| [ PreCommit Python Docker ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_PythonDocker.yml) | ['3.8','3.9','3.10','3.11'] | `Run PythonDocker PreCommit (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_PythonDocker.yml?query=event%3Aschedule) |
+| [ PreCommit Python Docs ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_PythonDocs.yml) | N/A | `Run PythonDocs PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_PythonDocs.yml?query=event%3Aschedule) |
+| [ PreCommit Python Examples ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Examples.yml) | ['3.8','3.9','3.10','3.11'] | `Run Python_Examples PreCommit (matrix_element)` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Examples.yml?query=event%3Aschedule) |
+| [ PreCommit Python Formatter ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_PythonFormatter.yml) | N/A | `Run PythonFormatter PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_PythonFormatter.yml?query=event%3Aschedule) |
+| [ PreCommit Python Integration ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Integration.yml) | ['3.8','3.11'] | `Run Python_Integration PreCommit (matrix_element)` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Integration.yml?query=event%3Aschedule) |
+| [ PreCommit Python Lint ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_PythonLint.yml) | N/A | `Run PythonLint PreCommit` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_PythonLint.yml?query=event%3Aschedule) |
+| [ PreCommit Python PVR Flink ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_PVR_Flink.yml) | N/A | `Run Python_PVR_Flink PreCommit` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_PVR_Flink.yml?query=event%3Aschedule) |
+| [ PreCommit Python Runners ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Runners.yml) | ['3.8','3.9','3.10','3.11'] | `Run Python_Runners PreCommit (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Runners.yml?query=event%3Aschedule) |
+| [ PreCommit Python Transforms ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Transforms.yml) | ['3.8','3.9','3.10','3.11'] | `Run Python_Transforms PreCommit (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_Transforms.yml?query=event%3Aschedule) |
+| [ PreCommit RAT ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_RAT.yml) | N/A | `Run RAT PreCommit` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_RAT.yml?query=event%3Aschedule) |
+| [ PreCommit Spotless ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Spotless.yml) | N/A | `Run Spotless PreCommit` | [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Spotless.yml?query=event%3Aschedule) |
+| [ PreCommit SQL ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_SQL.yml) | N/A |`Run SQL PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_SQL.yml?query=event%3Aschedule) |
+| [ PreCommit SQL Java11 ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_SQL_Java11.yml) | N/A |`Run SQL_Java11 PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_SQL_Java11.yml?query=event%3Aschedule) |
+| [ PreCommit SQL Java17 ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_SQL_Java17.yml) | N/A |`Run SQL_Java17 PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_SQL_Java17.yml?query=event%3Aschedule) |
+| [ PreCommit Typescript ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Typescript.yml) | N/A |`Run Typescript PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Typescript.yml?query=event%3Aschedule) |
+| [ PreCommit Website ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Website.yml) | N/A |`Run Website PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Website.yml?query=event%3Aschedule) |
+| [ PreCommit Website Stage GCS ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Website_Stage_GCS.yml) | N/A |`Run Website_Stage_GCS PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Website_Stage_GCS.yml?query=event%3Aschedule) |
+| [ PreCommit Whitespace ](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Whitespace.yml) | N/A |`Run Whitespace PreCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PreCommit_Whitespace.yml?query=event%3Aschedule) |
+
+### PostCommit Jobs
+
+| Workflow name | Matrix | Trigger Phrase | Cron Status |
+|:-------------:|:------:|:--------------:|:-----------:|
+| [ PostCommit BeamMetrics Publish ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_BeamMetrics_Publish.yml) | N/A |`Run Beam Metrics Deployment`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_BeamMetrics_Publish.yml?query=event%3Aschedule) |
+| [ PostCommit Go ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go.yml) | N/A |`Run Go PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go.yml?query=event%3Aschedule) |
+| [ PostCommit Go Dataflow ARM ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go_Dataflow_ARM.yml) | N/A |`Run Go PostCommit Dataflow ARM`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go_Dataflow_ARM.yml?query=event%3Aschedule) |
+| [ PostCommit Go VR Flink ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go_VR_Flink.yml) | N/A |`Run Go Flink ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go_VR_Flink.yml?query=event%3Aschedule) |
+| [ PostCommit Go VR Samza ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go_VR_Samza.yml) | N/A |`Run Go Samza ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go_VR_Samza.yml?query=event%3Aschedule) |
+| [ PostCommit Go VR Spark ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go_VR_Spark.yml) | N/A |`Run Go Spark ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Go_VR_Spark.yml?query=event%3Aschedule) |
+| [ PostCommit Java Avro Versions ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Avro_Versions.yml) | N/A |`Run Java Avro Versions PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Avro_Versions.yml?query=event%3Aschedule) |
+| [ PostCommit Java Dataflow V1 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_DataflowV1.yml) | N/A |`Run PostCommit_Java_Dataflow`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_DataflowV1.yml?query=event%3Aschedule) |
+| [ PostCommit Java Dataflow V2 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_DataflowV2.yml) | N/A |`Run PostCommit_Java_DataflowV2`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_DataflowV2.yml?query=event%3Aschedule) |
+| [ PostCommit Java Examples Dataflow ARM ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow_ARM.yml) | ['8','11','17'] |`Run Java_Examples_Dataflow_ARM PostCommit (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow_ARM.yml?query=event%3Aschedule) |
+| [ PostCommit Java Examples Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow.yml) | N/A |`Run Java examples on Dataflow`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow.yml?query=event%3Aschedule) |
+| [ PostCommit Java Examples Dataflow Java ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow_Java.yml) | ['11','17'] |`Run Java examples on Dataflow Java (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow_Java.yml?query=event%3Aschedule) |
+| [ PostCommit Java Examples Dataflow V2 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow_V2.yml) | N/A |`Run Java Examples on Dataflow Runner V2`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow_V2.yml?query=event%3Aschedule) |
+| [ PostCommit Java Examples Dataflow V2 Java ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow_V2_Java.yml) | ['11','17'] |`Run Java (matrix_element) Examples on Dataflow Runner V2`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Dataflow_V2_Java.yml?query=event%3Aschedule) |
+| [ PostCommit Java Examples Direct ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Direct.yml) | N/A |`Run Java Examples_Direct`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Direct.yml?query=event%3Aschedule) |
+| [ PostCommit Java Examples Flink ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Flink.yml) | N/A |`Run Java Examples_Flink`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Flink.yml?query=event%3Aschedule) |
+| [ PostCommit Java Examples Spark ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Spark.yml) | N/A |`Run Java Examples_Spark`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Examples_Spark.yml?query=event%3Aschedule) |
+| [ PostCommit Java Hadoop Versions ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Hadoop_Versions.yml) | N/A |`Run PostCommit_Java_Hadoop_Versions`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Hadoop_Versions.yml?query=event%3Aschedule) |
+| [ PostCommit Java InfluxDbIO Integration Test ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_InfluxDbIO_IT.yml) | N/A |`Run Java InfluxDbIO_IT`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_InfluxDbIO_IT.yml?query=event%3Aschedule) |
+| [ PostCommit Java Jpms Dataflow Java11 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Dataflow_Java11.yml) | N/A |`Run Jpms Dataflow Java 11 PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Dataflow_Java11.yml?query=event%3Aschedule) |
+| [ PostCommit Java Jpms Dataflow Java17 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Dataflow_Java17.yml) | N/A |`Run Jpms Dataflow Java 17 PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Dataflow_Java17.yml?query=event%3Aschedule) |
+| [ PostCommit Java Jpms Direct Java11 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Direct_Java11.yml) | N/A |`Run Jpms Direct Java 11 PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Direct_Java11.yml?query=event%3Aschedule) |
+| [ PostCommit Java Jpms Direct Java17 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Direct_Java17.yml) | N/A |`Run Jpms Direct Java 17 PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Direct_Java17.yml?query=event%3Aschedule) |
+| [ PostCommit Java Jpms Flink Java11 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Flink_Java11.yml) | N/A |`Run Jpms Flink Java 11 PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Flink_Java11.yml?query=event%3Aschedule) |
+| [ PostCommit Java Jpms Spark Java11 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Spark_Java11.yml) | N/A |`Run Jpms Spark Java 11 PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Jpms_Spark_Java11.yml?query=event%3Aschedule) |
+| [ PostCommit Java Nexmark Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Dataflow.yml) | N/A |`Run Dataflow Runner Nexmark Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Dataflow.yml?query=event%3Aschedule) |
+| [ PostCommit Java Nexmark Dataflow V2 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Dataflow_V2.yml) | N/A |`Run Dataflow Runner V2 Nexmark Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Dataflow_V2.yml?query=event%3Aschedule) |
+| [ PostCommit Java Nexmark Dataflow V2 Java ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Dataflow_V2_Java.yml) | ['11','17'] |`Run Dataflow Runner V2 Java (matrix_element) Nexmark Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Dataflow_V2_Java.yml?query=event%3Aschedule) |
+| [ PostCommit Java Nexmark Direct ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Direct.yml) | N/A |`Run Direct Runner Nexmark Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Direct.yml?query=event%3Aschedule) |
+| [ PostCommit Java Nexmark Flink ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Flink.yml) | N/A |`Run Flink Runner Nexmark Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Flink.yml?query=event%3Aschedule) |
+| [ PostCommit Java Nexmark Spark ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Spark.yml) | N/A |`Run Spark Runner Nexmark Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Nexmark_Spark.yml?query=event%3Aschedule) |
+| [ PostCommit Java PVR Flink Streaming ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_PVR_Flink_Streaming.yml) | N/A |`Run Java Flink PortableValidatesRunner Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_PVR_Flink_Streaming.yml?query=event%3Aschedule) |
+| [ PostCommit Java PVR Samza ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_PVR_Samza.yml) | N/A |`Run Java Samza PortableValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_PVR_Samza.yml?query=event%3Aschedule) |
+| [ PostCommit Java PVR Spark3 Streaming ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_PVR_Spark3_Streaming.yml) | N/A |`Run Java Spark v3 PortableValidatesRunner Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_PVR_Spark3_Streaming.yml?query=event%3Aschedule) |
+| [ PostCommit Java PVR Spark Batch ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_PVR_Spark_Batch.yml) | N/A |`Run Java Spark PortableValidatesRunner Batch`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_PVR_Spark_Batch.yml?query=event%3Aschedule) |
+| [ PostCommit Java Sickbay ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Sickbay.yml) | N/A |`Run Java Sickbay`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Sickbay.yml?query=event%3Aschedule) |
+| [ PostCommit Java Tpcds Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Tpcds_Dataflow.yml) | N/A |`Run Dataflow Runner Tpcds Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Tpcds_Dataflow.yml?query=event%3Aschedule) |
+| [ PostCommit Java Tpcds Flink ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Tpcds_Flink.yml) | N/A |`Run Flink Runner Tpcds Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Tpcds_Flink.yml?query=event%3Aschedule) |
+| [ PostCommit Java Tpcds Spark ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Tpcds_Spark.yml) | N/A |`Run Spark Runner Tpcds Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_Tpcds_Spark.yml?query=event%3Aschedule) |
+| [ PostCommit Java ValidatesRunner Dataflow JavaVersions ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_JavaVersions.yml) | ['11','17'] |`Run Dataflow ValidatesRunner Java (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_JavaVersions.yml?query=event%3Aschedule) |
+| [ PostCommit Java ValidatesRunner Dataflow Streaming ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_Streaming.yml) | N/A |`Run Dataflow Streaming ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_Streaming.yml?query=event%3Aschedule) |
+| [ PostCommit Java ValidatesRunner Dataflow V2 Streaming ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_V2_Streaming.yml) | N/A |`Run Java Dataflow V2 ValidatesRunner Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_V2_Streaming.yml?query=event%3Aschedule) |
+| [ PostCommit Java ValidatesRunner Dataflow V2 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_V2.yml) | N/A |`Run Java Dataflow V2 ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_V2.yml?query=event%3Aschedule) |
+| [ PostCommit Java ValidatesRunner Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow.yml) | N/A |`Run Dataflow ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow.yml?query=event%3Aschedule) |
+| [ PostCommit Java ValidatesRunner Direct JavaVersions ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Direct_JavaVersions.yml) | ['11','17'] |`Run Direct ValidatesRunner Java (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Direct_JavaVersions.yml?query=event%3Aschedule) |
+| [ PostCommit Java ValidatesRunner Direct ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Direct.yml) | N/A |`Run Direct ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Direct.yml?query=event%3Aschedule) |
+| [ PostCommit Java ValidatesRunner Flink Java11 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Flink_Java11.yml) | N/A |`Run Flink ValidatesRunner Java 11`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Flink_Java11.yml?query=event%3Aschedule) |
+| [ PostCommit Java ValidatesRunner Flink ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Flink.yml) | N/A |`Run Flink ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Flink.yml?query=event%3Aschedule) |
+| [ PostCommit Java ValidatesRunner Samza ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Samza.yml) | N/A |`Run Samza ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Samza.yml?query=event%3Aschedule) |
+| [ PostCommit Java ValidatesRunner Spark Java11 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Spark_Java11.yml) | N/A |`Run Spark ValidatesRunner Java 11`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Spark_Java11.yml?query=event%3Aschedule) |
+| [ PostCommit Java ValidatesRunner Spark ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Spark.yml) | N/A |`Run Spark ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Spark.yml?query=event%3Aschedule) |
+| [ PostCommit Java ValidatesRunner SparkStructuredStreaming ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming.yml) | N/A |`Run Spark StructuredStreaming ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming.yml?query=event%3Aschedule) |
+| [ PostCommit Java ValidatesRunner Twister2 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Twister2.yml) | N/A |`Run Twister2 ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Twister2.yml?query=event%3Aschedule) |
+| [ PostCommit Java ValidatesRunner ULR ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_ULR.yml) | N/A |`Run ULR Loopback ValidatesRunner`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_ULR.yml?query=event%3Aschedule) |
+| [ PostCommit Java ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java.yml) | N/A |`Run Java PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java.yml?query=event%3Aschedule) |
+| [ PostCommit Javadoc ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Javadoc.yml) | N/A |`Run Javadoc PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Javadoc.yml?query=event%3Aschedule) |
+| [ PostCommit PortableJar Flink ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_PortableJar_Flink.yml) | N/A |`Run PortableJar_Flink PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_PortableJar_Flink.yml?query=event%3Aschedule) |
+| [ PostCommit PortableJar Spark ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_PortableJar_Spark.yml) | N/A |`Run PortableJar_Spark PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_PortableJar_Spark.yml?query=event%3Aschedule) |
+| [ PostCommit Python ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python.yml) | ['3.8','3.9','3.10','3.11'] |`Run Python PostCommit (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python.yml?query=event%3Aschedule) |
+| [ PostCommit Python Arm ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Arm.yml) | ['3.8','3.9','3.10','3.11'] |`Run Python PostCommit Arm (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Arm.yml?query=event%3Aschedule) |
+| [ PostCommit Python Examples Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Examples_Dataflow.yml) | N/A |`Run Python Examples_Dataflow`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Examples_Dataflow.yml?query=event%3Aschedule) |
+| [ PostCommit Python Examples Direct ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Examples_Direct.yml) | ['3.8','3.9','3.10','3.11'] |`Run Python Examples_Direct (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Examples_Direct.yml?query=event%3Aschedule) |
+| [ PostCommit Python Examples Flink ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Examples_Flink.yml) | ['3.8','3.11'] |`Run Python Examples_Flink (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Examples_Flink.yml?query=event%3Aschedule) |
+| [ PostCommit Python Examples Spark ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Examples_Spark.yml) | ['3.8','3.11'] |`Run Python Examples_Spark (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Examples_Spark.yml?query=event%3Aschedule) |
+| [ PostCommit Python MongoDBIO IT ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_MongoDBIO_IT.yml) | N/A |`Run Python MongoDBIO_IT`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_MongoDBIO_IT.yml?query=event%3Aschedule) |
+| [ PostCommit Python Nexmark Direct ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Nexmark_Direct.yml) | N/A |`Run Python Direct Runner Nexmark Tests`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Nexmark_Direct.yml?query=event%3Aschedule) |
+| [ PostCommit Python ValidatesContainer Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesContainer_Dataflow.yml) | ['3.8','3.9','3.10','3.11'] |`Run Python Dataflow ValidatesContainer (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesContainer_Dataflow.yml?query=event%3Aschedule) |
+| [ PostCommit Python ValidatesContainer Dataflow With RC ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesContainer_Dataflow_With_RC.yml) | ['3.8','3.9','3.10','3.11'] |`Run Python RC Dataflow ValidatesContainer (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesContainer_Dataflow_With_RC.yml?query=event%3Aschedule) |
+| [ PostCommit Python ValidatesRunner Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesRunner_Dataflow.yml) | ['3.8','3.11'] |`Run Python Dataflow ValidatesRunner (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesRunner_Dataflow.yml?query=event%3Aschedule) |
+| [ PostCommit Python ValidatesRunner Flink ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesRunner_Flink.yml) | ['3.8','3.11'] |`Run Python Flink ValidatesRunner (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesRunner_Flink.yml?query=event%3Aschedule) |
+| [ PostCommit Python ValidatesRunner Samza ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesRunner_Samza.yml) | ['3.8','3.11'] |`Run Python Samza ValidatesRunner (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesRunner_Samza.yml?query=event%3Aschedule) |
+| [ PostCommit Python ValidatesRunner Spark ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesRunner_Spark.yml) | ['3.8','3.9','3.11'] |`Run Python Spark ValidatesRunner (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_ValidatesRunner_Spark.yml?query=event%3Aschedule) |
+| [ PostCommit Python Xlang Gcp Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Xlang_Gcp_Dataflow.yml) | N/A |`Run Python_Xlang_Gcp_Dataflow PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Xlang_Gcp_Dataflow.yml?query=event%3Aschedule) |
+| [ PostCommit Python Xlang Gcp Direct ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Xlang_Gcp_Direct.yml) | N/A |`Run Python_Xlang_Gcp_Direct PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Xlang_Gcp_Direct.yml?query=event%3Aschedule) |
+| [ PostCommit Python Xlang IO Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Xlang_IO_Dataflow.yml) | N/A |`Run Python_Xlang_IO_Dataflow PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python_Xlang_IO_Dataflow.yml?query=event%3Aschedule) |
+| [ PostCommit Sickbay Python ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Sickbay_Python.yml) | ['3.8','3.9','3.10','3.11'] |`Run Python (matrix_element) PostCommit Sickbay`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Sickbay_Python.yml?query=event%3Aschedule) |
+| [ PostCommit SQL ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_SQL.yml) | N/A |`Run SQL PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_SQL.yml?query=event%3Aschedule) |
+| [ PostCommit TransformService Direct ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_TransformService_Direct.yml) | N/A |`Run TransformService_Direct PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_TransformService_Direct.yml?query=event%3Aschedule) |
+| [ PostCommit Website Publish ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Website_Publish.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Website_Publish.yml?query=event%3Aschedule) |
+| [ PostCommit Website Test ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Website_Test.yml) | N/A |`Run Full Website Test`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Website_Test.yml?query=event%3Aschedule) |
+| [ PostCommit XVR GoUsingJava Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_GoUsingJava_Dataflow.yml) | N/A |`Run XVR_GoUsingJava_Dataflow PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_GoUsingJava_Dataflow.yml?query=event%3Aschedule) |
+| [ PostCommit XVR Direct ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_Direct.yml) | N/A |`Run XVR_Direct PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_Direct.yml?query=event%3Aschedule) |
+| [ PostCommit XVR Flink ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_Flink.yml) | N/A |`Run XVR_Flink PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_Flink.yml?query=event%3Aschedule) |
+| [ PostCommit XVR JavaUsingPython Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_JavaUsingPython_Dataflow.yml) | N/A |`Run XVR_JavaUsingPython_Dataflow PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_JavaUsingPython_Dataflow.yml?query=event%3Aschedule) |
+| [ PostCommit XVR PythonUsingJava Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_PythonUsingJava_Dataflow.yml) | N/A |`Run XVR_PythonUsingJava_Dataflow PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_PythonUsingJava_Dataflow.yml?query=event%3Aschedule) |
+| [ PostCommit XVR PythonUsingJavaSQL Dataflow ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow.yml) | N/A |`Run XVR_PythonUsingJavaSQL_Dataflow PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow.yml?query=event%3Aschedule) |
+| [ PostCommit XVR Samza ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_Samza.yml) | N/A |`Run XVR_Samza PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_Samza.yml?query=event%3Aschedule) |
+| [ PostCommit XVR Spark3 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_Spark3.yml) | N/A |`Run XVR_Spark3 PostCommit`| [](https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_Spark3.yml?query=event%3Aschedule) |
+| [ Python ValidatesContainer Dataflow ARM ](https://github.com/apache/beam/actions/workflows/beam_Python_ValidatesContainer_Dataflow_ARM.yml) | ['3.8','3.9','3.10','3.11'] |`Run Python ValidatesContainer Dataflow ARM (matrix_element)`| [](https://github.com/apache/beam/actions/workflows/beam_Python_ValidatesContainer_Dataflow_ARM.yml?query=event%3Aschedule) |
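+
+For jobs with a matrix, replace `(matrix_element)` in the trigger phrase with a concrete value from the Matrix column. For example, to trigger the `PostCommit Python` job for Python 3.9, comment:
+```Run Python PostCommit (3.9)```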
+
+### PerformanceTests and Benchmark Jobs
+
+| Workflow name | Matrix | Trigger Phrase | Cron Status |
+|:-------------:|:------:|:--------------:|:-----------:|
+| [ CloudML Benchmarks Dataflow ](https://github.com/apache/beam/actions/workflows/beam_CloudML_Benchmarks_Dataflow.yml) | N/A |`Run TFT Criteo Benchmarks`| [](https://github.com/apache/beam/actions/workflows/beam_CloudML_Benchmarks_Dataflow.yml?query=event%3Aschedule)
+| [ Inference Python Benchmarks Dataflow ](https://github.com/apache/beam/actions/workflows/beam_Inference_Python_Benchmarks_Dataflow.yml) | N/A |`Run Inference Benchmarks`| [](https://github.com/apache/beam/actions/workflows/beam_Inference_Python_Benchmarks_Dataflow.yml?query=event%3Aschedule)
+| [ Java JMH ](https://github.com/apache/beam/actions/workflows/beam_Java_JMH.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_Java_JMH.yml?query=event%3Aschedule)
+| [ Performance Tests AvroIOIT HDFS ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_AvroIOIT_HDFS.yml) | N/A |`Run Java AvroIO Performance Test HDFS`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_AvroIOIT_HDFS.yml?query=event%3Aschedule)
+| [ Performance Tests AvroIOIT ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_AvroIOIT.yml) | N/A |`Run Java AvroIO Performance Test`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_AvroIOIT.yml?query=event%3Aschedule)
+| [ Performance Tests BigQueryIO Batch Java Avro ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BigQueryIO_Batch_Java_Avro.yml) | N/A |`Run BigQueryIO Batch Performance Test Java Avro`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BigQueryIO_Batch_Java_Avro.yml?query=event%3Aschedule)
+| [ Performance Tests BigQueryIO Batch Java Json ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BigQueryIO_Batch_Java_Json.yml) | N/A |`Run BigQueryIO Batch Performance Test Java Json`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BigQueryIO_Batch_Java_Json.yml?query=event%3Aschedule)
+| [ Performance Tests BigQueryIO Streaming Java ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BigQueryIO_Streaming_Java.yml) | N/A |`Run BigQueryIO Streaming Performance Test Java`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BigQueryIO_Streaming_Java.yml?query=event%3Aschedule)
+| [ Performance Tests BigQueryIO Read Python ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BiqQueryIO_Read_Python.yml) | N/A |`Run BigQueryIO Read Performance Test Python`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BiqQueryIO_Read_Python.yml?query=event%3Aschedule)
+| [ Performance Tests BigQueryIO Write Python Batch ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch.yml) | N/A |`Run BigQueryIO Write Performance Test Python`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch.yml?query=event%3Aschedule)
+| [ PerformanceTests Cdap ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_Cdap.yml) | N/A |`Run Java CdapIO Performance Test`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_Cdap.yml?query=event%3Aschedule)
+| [ PerformanceTests Compressed TextIOIT HDFS ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_Compressed_TextIOIT_HDFS.yml) | N/A |`Run Java CompressedTextIO Performance Test HDFS`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_Compressed_TextIOIT_HDFS.yml?query=event%3Aschedule)
+| [ PerformanceTests Compressed TextIOIT ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_Compressed_TextIOIT.yml) | N/A |`Run Java CompressedTextIO Performance Test`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_Compressed_TextIOIT.yml?query=event%3Aschedule)
+| [ PerformanceTests HadoopFormat ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_HadoopFormat.yml) | N/A |`Run Java HadoopFormatIO Performance Test`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_HadoopFormat.yml?query=event%3Aschedule)
+| [ PerformanceTests JDBC ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_JDBC.yml) | N/A |`Run Java JdbcIO Performance Test`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_JDBC.yml?query=event%3Aschedule)
+| [ PerformanceTests Kafka IO ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_Kafka_IO.yml) | N/A |`Run Java KafkaIO Performance Test`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_Kafka_IO.yml?query=event%3Aschedule)
+| [ PerformanceTests ManyFiles TextIOIT HDFS ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_ManyFiles_TextIOIT_HDFS.yml) | N/A |`Run Java ManyFilesTextIO Performance Test HDFS`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_ManyFiles_TextIOIT_HDFS.yml?query=event%3Aschedule)
+| [ PerformanceTests ManyFiles TextIOIT ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_ManyFiles_TextIOIT.yml) | N/A |`Run Java ManyFilesTextIO Performance Test`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_ManyFiles_TextIOIT.yml?query=event%3Aschedule)
+| [ PerformanceTests MongoDBIO IT ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_MongoDBIO_IT.yml) | N/A |`Run Java MongoDBIO Performance Test`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_MongoDBIO_IT.yml?query=event%3Aschedule)
+| [ PerformanceTests ParquetIOIT HDFS ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_ParquetIOIT_HDFS.yml) | N/A |`Run Java ParquetIO Performance Test HDFS`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_ParquetIOIT_HDFS.yml?query=event%3Aschedule)
+| [ PerformanceTests ParquetIOIT ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_ParquetIOIT.yml) | N/A |`Run Java ParquetIO Performance Test`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_ParquetIOIT.yml?query=event%3Aschedule)
+| [ PerformanceTests PubsubIOIT Python Streaming ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_PubsubIOIT_Python_Streaming.yml) | N/A |`Run PubsubIO Performance Test Python`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_PubsubIOIT_Python_Streaming.yml?query=event%3Aschedule)
+| [ PerformanceTests SingleStoreIO ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SingleStoreIO.yml) | N/A |`Run Java SingleStoreIO Performance Test`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SingleStoreIO.yml?query=event%3Aschedule)
+| [ PerformanceTests SpannerIO Read 2GB Python ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SpannerIO_Read_2GB_Python.yml) | N/A |`Run SpannerIO Read 2GB Performance Test Python`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SpannerIO_Read_2GB_Python.yml?query=event%3Aschedule)
+| [ PerformanceTests SpannerIO Write 2GB Python Batch ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SpannerIO_Write_2GB_Python_Batch.yml) | N/A |`Run SpannerIO Write 2GB Performance Test Python Batch`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SpannerIO_Write_2GB_Python_Batch.yml?query=event%3Aschedule)
+| [ PerformanceTests SparkReceiver IO ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SparkReceiver_IO.yml) | N/A |`Run Java SparkReceiverIO Performance Test`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SparkReceiver_IO.yml?query=event%3Aschedule)
+| [ PerformanceTests SQLBigQueryIO Batch Java ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SQLBigQueryIO_Batch_Java.yml) | N/A |`Run SQLBigQueryIO Batch Performance Test Java`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_SQLBigQueryIO_Batch_Java.yml?query=event%3Aschedule)
+| [ PerformanceTests TextIOIT HDFS ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TextIOIT_HDFS.yml) | N/A |`Run Java TextIO Performance Test HDFS`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TextIOIT_HDFS.yml?query=event%3Aschedule)
+| [ PerformanceTests TextIOIT Python ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TextIOIT_Python.yml) | N/A |`Run Python TextIO Performance Test`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TextIOIT_Python.yml?query=event%3Aschedule)
+| [ PerformanceTests TextIOIT ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TextIOIT.yml) | N/A |`Run Java TextIO Performance Test`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TextIOIT.yml?query=event%3Aschedule)
+| [ PerformanceTests TFRecordIOIT HDFS ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TFRecordIOIT_HDFS.yml) | N/A |`Run Java TFRecordIO Performance Test HDFS`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TFRecordIOIT_HDFS.yml?query=event%3Aschedule)
+| [ PerformanceTests TFRecordIOIT ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TFRecordIOIT.yml) | N/A |`Run Java TFRecordIO Performance Test`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_TFRecordIOIT.yml?query=event%3Aschedule)
+| [ PerformanceTests WordCountIT PythonVersions ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_WordCountIT_PythonVersions.yml) | ['3.8'] |`Run Python (matrix_element) WordCountIT Performance Test`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_WordCountIT_PythonVersions.yml?query=event%3Aschedule)
+| [ PerformanceTests XmlIOIT HDFS ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_XmlIOIT_HDFS.yml) | N/A |`Run Java XmlIO Performance Test HDFS`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_XmlIOIT_HDFS.yml?query=event%3Aschedule)
+| [ PerformanceTests XmlIOIT ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_XmlIOIT.yml) | N/A |`Run Java XmlIO Performance Test`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_XmlIOIT.yml?query=event%3Aschedule)
+| [ PerformanceTests xlang KafkaIO Python ](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_xlang_KafkaIO_Python.yml) | N/A |`Run Python xlang KafkaIO Performance Test`| [](https://github.com/apache/beam/actions/workflows/beam_PerformanceTests_xlang_KafkaIO_Python.yml?query=event%3Aschedule)
+
+### LoadTests Jobs
+
+| Workflow name | Matrix | Trigger Phrase | Cron Status |
+|:-------------:|:------:|:--------------:|:-----------:|
+| [ LoadTests Go CoGBK Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_CoGBK_Dataflow_Batch.yml) | N/A |`Run LoadTests Go CoGBK Dataflow Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_CoGBK_Dataflow_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Go CoGBK Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_CoGBK_Flink_batch.yml) | N/A |`Run Load Tests Go CoGBK Flink Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_CoGBK_Flink_batch.yml?query=event%3Aschedule)
+| [ LoadTests Go Combine Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_Combine_Dataflow_Batch.yml) | N/A |`Run Load Tests Go Combine Dataflow Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_Combine_Dataflow_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Go Combine Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_Combine_Flink_Batch.yml) | N/A |`Run Load Tests Go Combine Flink Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_Combine_Flink_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Go GBK Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_GBK_Dataflow_Batch.yml) | N/A |`Run Load Tests Go GBK Dataflow Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_GBK_Dataflow_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Go GBK Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_GBK_Flink_Batch.yml) | N/A |`Run Load Tests Go GBK Flink Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_GBK_Flink_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Go ParDo Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_ParDo_Dataflow_Batch.yml) | N/A |`Run Load Tests Go ParDo Dataflow Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_ParDo_Dataflow_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Go ParDo Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_ParDo_Flink_Batch.yml) | N/A |`Run Load Tests Go ParDo Flink Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_ParDo_Flink_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Go SideInput Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_SideInput_Dataflow_Batch.yml) | N/A |`Run Load Tests Go SideInput Dataflow Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_SideInput_Dataflow_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Go SideInput Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_SideInput_Flink_Batch.yml) | N/A |`Run Load Tests Go SideInput Flink Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Go_SideInput_Flink_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Java CoGBK Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_Dataflow_Batch.yml) | N/A |`Run Load Tests Java CoGBK Dataflow Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_Dataflow_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Java CoGBK Dataflow Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_Dataflow_Streaming.yml) | N/A |`Run Load Tests Java CoGBK Dataflow Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_Dataflow_Streaming.yml?query=event%3Aschedule)
+| [ LoadTests Java CoGBK Dataflow V2 Batch JavaVersions ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_Dataflow_V2_Batch_JavaVersions.yml) | ['11','17'] |`Run Load Tests Java (matrix_element) CoGBK Dataflow V2 Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_Dataflow_V2_Batch_JavaVersions.yml?query=event%3Aschedule)
+| [ LoadTests Java CoGBK Dataflow V2 Streaming JavaVersions ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_JavaVersions.yml) | ['11','17'] |`Run Load Tests Java (matrix_element) CoGBK Dataflow V2 Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_JavaVersions.yml?query=event%3Aschedule)
+| [ LoadTests Java CoGBK SparkStructuredStreaming Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_SparkStructuredStreaming_Batch.yml) | N/A |`Run Load Tests Java CoGBK SparkStructuredStreaming Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_CoGBK_SparkStructuredStreaming_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Java Combine Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_Combine_Dataflow_Batch.yml) | N/A |`Run Load Tests Java Combine Dataflow Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_Combine_Dataflow_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Java Combine Dataflow Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_Combine_Dataflow_Streaming.yml) | N/A |`Run Load Tests Java Combine Dataflow Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_Combine_Dataflow_Streaming.yml?query=event%3Aschedule)
+| [ LoadTests Java Combine SparkStructuredStreaming Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_Combine_SparkStructuredStreaming_Batch.yml) | N/A |`Run Load Tests Java Combine SparkStructuredStreaming Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_Combine_SparkStructuredStreaming_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Java GBK Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_Batch.yml) | N/A |`Run Load Tests Java GBK Dataflow Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Java GBK Dataflow Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_Streaming.yml) | N/A |`Run Load Tests Java GBK Dataflow Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_Streaming.yml?query=event%3Aschedule)
+| [ LoadTests Java GBK Dataflow V2 Batch Java11 ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_V2_Batch_Java11.yml) | N/A |`Run Load Tests Java 11 GBK Dataflow V2 Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_V2_Batch_Java11.yml?query=event%3Aschedule)
+| [ LoadTests Java GBK Dataflow V2 Batch Java17 ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_V2_Batch_Java17.yml) | N/A |`Run Load Tests Java 17 GBK Dataflow V2 Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_V2_Batch_Java17.yml?query=event%3Aschedule)
+| [ LoadTests Java GBK Dataflow V2 Streaming Java11 ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11.yml) | N/A |`Run Load Tests Java 11 GBK Dataflow V2 Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11.yml?query=event%3Aschedule)
+| [ LoadTests Java GBK Dataflow V2 Streaming Java17 ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java17.yml) | N/A |`Run Load Tests Java 17 GBK Dataflow V2 Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java17.yml?query=event%3Aschedule)
+| [ LoadTests Java GBK Smoke ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Smoke.yml) | N/A |`Run Java Load Tests GBK Smoke`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_Smoke.yml?query=event%3Aschedule)
+| [ LoadTests Java GBK SparkStructuredStreaming Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_SparkStructuredStreaming_Batch.yml) | N/A |`Run Load Tests Java GBK SparkStructuredStreaming Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_GBK_SparkStructuredStreaming_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Java ParDo Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_Dataflow_Batch.yml) | N/A |`Run Load Tests Java ParDo Dataflow Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_Dataflow_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Java ParDo Dataflow Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_Dataflow_Streaming.yml) | N/A |`Run Load Tests Java ParDo Dataflow Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_Dataflow_Streaming.yml?query=event%3Aschedule)
+| [ LoadTests Java ParDo Dataflow V2 Batch JavaVersions ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_JavaVersions.yml) | ['11','17'] |`Run Load Tests Java (matrix_element) ParDo Dataflow V2 Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_JavaVersions.yml?query=event%3Aschedule)
+| [ LoadTests Java ParDo Dataflow V2 Streaming JavaVersions ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_JavaVersions.yml) | ['11','17'] |`Run Load Tests Java (matrix_element) ParDo Dataflow V2 Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_JavaVersions.yml?query=event%3Aschedule)
+| [ LoadTests Java ParDo SparkStructuredStreaming Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch.yml) | N/A |`Run Load Tests Java ParDo SparkStructuredStreaming Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Java Combine Smoke ](https://github.com/apache/beam/actions/workflows/beam_Java_LoadTests_Combine_Smoke.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_Java_LoadTests_Combine_Smoke.yml?query=event%3Aschedule)
+| [ LoadTests Python CoGBK Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_CoGBK_Dataflow_Batch.yml) | N/A |`Run Load Tests Python CoGBK Dataflow Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_CoGBK_Dataflow_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Python CoGBK Dataflow Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_CoGBK_Dataflow_Streaming.yml) | N/A |`Run Load Tests Python CoGBK Dataflow Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_CoGBK_Dataflow_Streaming.yml?query=event%3Aschedule)
+| [ LoadTests Python CoGBK Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_CoGBK_Flink_Batch.yml) | N/A |`Run Load Tests Python CoGBK Flink Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_CoGBK_Flink_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Python Combine Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Combine_Dataflow_Batch.yml) | N/A |`Run Load Tests Python Combine Dataflow Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Combine_Dataflow_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Python Combine Dataflow Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Combine_Dataflow_Streaming.yml) | N/A |`Run Load Tests Python Combine Dataflow Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Combine_Dataflow_Streaming.yml?query=event%3Aschedule)
+| [ LoadTests Python Combine Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Combine_Flink_Batch.yml) | N/A |`Run Load Tests Python Combine Flink Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Combine_Flink_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Python Combine Flink Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Combine_Flink_Streaming.yml) | N/A |`Run Load Tests Python Combine Flink Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Combine_Flink_Streaming.yml?query=event%3Aschedule)
+| [ LoadTests Python FnApiRunner Microbenchmark ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_FnApiRunner_Microbenchmark.yml) | N/A |`Run Python Load Tests FnApiRunner Microbenchmark`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_FnApiRunner_Microbenchmark.yml?query=event%3Aschedule)
+| [ LoadTests Python GBK Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_Dataflow_Batch.yml) | N/A |`Run Load Tests Python GBK Dataflow Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_Dataflow_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Python GBK Dataflow Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_Dataflow_Streaming.yml) | N/A |`Run Load Tests Python GBK Dataflow Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_Dataflow_Streaming.yml?query=event%3Aschedule)
+| [ LoadTests Python GBK Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_Flink_Batch.yml) | N/A |`Run Load Tests Python GBK Flink Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_Flink_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Python GBK reiterate Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_reiterate_Dataflow_Batch.yml) | N/A |`Run Load Tests Python GBK reiterate Dataflow Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_reiterate_Dataflow_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Python GBK reiterate Dataflow Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_reiterate_Dataflow_Streaming.yml) | N/A |`Run Load Tests Python GBK reiterate Dataflow Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_GBK_reiterate_Dataflow_Streaming.yml?query=event%3Aschedule)
+| [ LoadTests Python ParDo Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_ParDo_Dataflow_Batch.yml) | N/A |`Run Load Tests Python ParDo Dataflow Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_ParDo_Dataflow_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Python ParDo Dataflow Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_ParDo_Dataflow_Streaming.yml) | N/A |`Run Python Load Tests ParDo Dataflow Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_ParDo_Dataflow_Streaming.yml?query=event%3Aschedule)
+| [ LoadTests Python ParDo Flink Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_ParDo_Flink_Batch.yml) | N/A |`Run Load Tests Python ParDo Flink Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_ParDo_Flink_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Python ParDo Flink Streaming ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_ParDo_Flink_Streaming.yml) | N/A |`Run Load Tests Python ParDo Flink Streaming`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_ParDo_Flink_Streaming.yml?query=event%3Aschedule)
+| [ LoadTests Python SideInput Dataflow Batch ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_SideInput_Dataflow_Batch.yml) | N/A |`Run Load Tests Python SideInput Dataflow Batch`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_SideInput_Dataflow_Batch.yml?query=event%3Aschedule)
+| [ LoadTests Python Smoke ](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Smoke.yml) | N/A |`Run Python Load Tests Smoke`| [](https://github.com/apache/beam/actions/workflows/beam_LoadTests_Python_Smoke.yml?query=event%3Aschedule)
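+
+For rows with a matrix, replace `(matrix_element)` in the trigger phrase with one concrete matrix value when commenting on a PR. For example, for the matrix `['11','17']` a valid comment is:
+
+```Run Load Tests Java 11 CoGBK Dataflow V2 Batch```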
+
+### Other Jobs
+
+| Workflow name | Matrix | Trigger Phrase | Cron Status |
+|:-------------:|:------:|:--------------:|:-----------:|
+| [ Cancel Stale Dataflow Jobs ](https://github.com/apache/beam/actions/workflows/beam_CancelStaleDataflowJobs.yml) | N/A | `Run Cancel Stale Dataflow Jobs` | [](https://github.com/apache/beam/actions/workflows/beam_CancelStaleDataflowJobs.yml?query=event%3Aschedule) |
+| [ Clean Up GCP Resources ](https://github.com/apache/beam/actions/workflows/beam_CleanUpGCPResources.yml) | N/A | `Run Clean GCP Resources` | [](https://github.com/apache/beam/actions/workflows/beam_CleanUpGCPResources.yml?query=event%3Aschedule) |
+| [ Clean Up Prebuilt SDK Images ](https://github.com/apache/beam/actions/workflows/beam_CleanUpPrebuiltSDKImages.yml) | N/A | `Run Clean Prebuilt Images` | [](https://github.com/apache/beam/actions/workflows/beam_CleanUpPrebuiltSDKImages.yml?query=event%3Aschedule) |
+| [ Cleanup Dataproc Resources ](https://github.com/apache/beam/actions/workflows/beam_CleanUpDataprocResources.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_CleanUpDataprocResources.yml?query=event%3Aschedule) |
+| [ Community Metrics Prober ](https://github.com/apache/beam/actions/workflows/beam_Prober_CommunityMetrics.yml) | N/A |`Run Community Metrics Prober`| [](https://github.com/apache/beam/actions/workflows/beam_Prober_CommunityMetrics.yml?query=event%3Aschedule) |
+| [ Publish Beam SDK Snapshots ](https://github.com/apache/beam/actions/workflows/beam_Publish_Beam_SDK_Snapshots.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_Publish_Beam_SDK_Snapshots.yml?query=event%3Aschedule) |
+| [ Publish Docker Snapshots ](https://github.com/apache/beam/actions/workflows/beam_Publish_Docker_Snapshots.yml) | N/A |`Publish Docker Snapshots`| [](https://github.com/apache/beam/actions/workflows/beam_Publish_Docker_Snapshots.yml?query=event%3Aschedule) |
+| [ Rotate IO-Datastores Cluster Credentials ](https://github.com/apache/beam/actions/workflows/beam_IODatastoresCredentialsRotation.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_IODatastoresCredentialsRotation.yml?query=event%3Aschedule) |
+| [ Rotate Metrics Cluster Credentials ](https://github.com/apache/beam/actions/workflows/beam_MetricsCredentialsRotation.yml) | N/A | N/A | [](https://github.com/apache/beam/actions/workflows/beam_MetricsCredentialsRotation.yml?query=event%3Aschedule) |
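+
+These workflows check out the repository and run the shared `setup-environment-action` before any test or GCP steps. A minimal sketch of that step sequence (the surrounding job definition is illustrative; exact placement varies per workflow):
+
+```yaml
+steps:
+  - uses: actions/checkout@v3
+  # Shared setup step used across these workflows
+  - name: Setup environment
+    uses: ./.github/actions/setup-environment-action
+```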
diff --git a/.github/workflows/beam_CancelStaleDataflowJobs.yml b/.github/workflows/beam_CancelStaleDataflowJobs.yml
index 63e780c..46ff76d 100644
--- a/.github/workflows/beam_CancelStaleDataflowJobs.yml
+++ b/.github/workflows/beam_CancelStaleDataflowJobs.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Authenticate on GCP
id: auth
uses: google-github-actions/auth@v1
diff --git a/.github/workflows/beam_CleanUpGCPResources.yml b/.github/workflows/beam_CleanUpGCPResources.yml
index 9aa92f0..a7267ad 100644
--- a/.github/workflows/beam_CleanUpGCPResources.yml
+++ b/.github/workflows/beam_CleanUpGCPResources.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Authenticate on GCP
id: auth
uses: google-github-actions/auth@v1
diff --git a/.github/workflows/beam_CleanUpPrebuiltSDKImages.yml b/.github/workflows/beam_CleanUpPrebuiltSDKImages.yml
index 345624f..f327f09 100644
--- a/.github/workflows/beam_CleanUpPrebuiltSDKImages.yml
+++ b/.github/workflows/beam_CleanUpPrebuiltSDKImages.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Authenticate on GCP
id: auth
uses: google-github-actions/auth@v1
diff --git a/.github/workflows/beam_IODatastoresCredentialsRotation.yml b/.github/workflows/beam_IODatastoresCredentialsRotation.yml
index 36e6b23..7a402e4 100644
--- a/.github/workflows/beam_IODatastoresCredentialsRotation.yml
+++ b/.github/workflows/beam_IODatastoresCredentialsRotation.yml
@@ -66,6 +66,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }}
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Starting credential rotation
run: |
gcloud container clusters update io-datastores --start-credential-rotation --zone=us-central1-a --quiet
diff --git a/.github/workflows/beam_Java_JMH.yml b/.github/workflows/beam_Java_JMH.yml
index 07beb1d..a25c3fa 100644
--- a/.github/workflows/beam_Java_JMH.yml
+++ b/.github/workflows/beam_Java_JMH.yml
@@ -58,6 +58,8 @@
name: "beam_Java_JMH"
steps:
- uses: actions/checkout@v3
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run the Java JMH micro-benchmark harness suite
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_Java_LoadTests_Combine_Smoke.yml b/.github/workflows/beam_Java_LoadTests_Combine_Smoke.yml
index 19d5b0f..b252f4e 100644
--- a/.github/workflows/beam_Java_LoadTests_Combine_Smoke.yml
+++ b/.github/workflows/beam_Java_LoadTests_Combine_Smoke.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Go_CoGBK_Dataflow_Batch.yml b/.github/workflows/beam_LoadTests_Go_CoGBK_Dataflow_Batch.yml
index dac3733..e7e5b90 100644
--- a/.github/workflows/beam_LoadTests_Go_CoGBK_Dataflow_Batch.yml
+++ b/.github/workflows/beam_LoadTests_Go_CoGBK_Dataflow_Batch.yml
@@ -13,7 +13,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-name: Load Tests CoGBK Dataflow Batch Go
+name: LoadTests Go CoGBK Dataflow Batch
on:
schedule:
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Go_CoGBK_Flink_batch.yml b/.github/workflows/beam_LoadTests_Go_CoGBK_Flink_batch.yml
index 05b61f2..feec436 100644
--- a/.github/workflows/beam_LoadTests_Go_CoGBK_Flink_batch.yml
+++ b/.github/workflows/beam_LoadTests_Go_CoGBK_Flink_batch.yml
@@ -79,6 +79,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Go_Combine_Dataflow_Batch.yml b/.github/workflows/beam_LoadTests_Go_Combine_Dataflow_Batch.yml
index cb198bb..93b65ac 100644
--- a/.github/workflows/beam_LoadTests_Go_Combine_Dataflow_Batch.yml
+++ b/.github/workflows/beam_LoadTests_Go_Combine_Dataflow_Batch.yml
@@ -13,7 +13,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-name: Load Tests Combine Dataflow Batch Go
+name: LoadTests Go Combine Dataflow Batch
on:
schedule:
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Go_GBK_Dataflow_Batch.yml b/.github/workflows/beam_LoadTests_Go_GBK_Dataflow_Batch.yml
index 9778b54..476338d 100644
--- a/.github/workflows/beam_LoadTests_Go_GBK_Dataflow_Batch.yml
+++ b/.github/workflows/beam_LoadTests_Go_GBK_Dataflow_Batch.yml
@@ -13,7 +13,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-name: Load Tests GBK Dataflow Batch Go
+name: LoadTests Go GBK Dataflow Batch
on:
schedule:
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Go_GBK_Flink_Batch.yml b/.github/workflows/beam_LoadTests_Go_GBK_Flink_Batch.yml
index 37ae1e4..6f17a4b 100644
--- a/.github/workflows/beam_LoadTests_Go_GBK_Flink_Batch.yml
+++ b/.github/workflows/beam_LoadTests_Go_GBK_Flink_Batch.yml
@@ -79,6 +79,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Go_ParDo_Dataflow_Batch.yml b/.github/workflows/beam_LoadTests_Go_ParDo_Dataflow_Batch.yml
index 99292c8..e5b33ba 100644
--- a/.github/workflows/beam_LoadTests_Go_ParDo_Dataflow_Batch.yml
+++ b/.github/workflows/beam_LoadTests_Go_ParDo_Dataflow_Batch.yml
@@ -13,7 +13,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-name: Load Tests ParDo Dataflow Batch Go
+name: LoadTests Go ParDo Dataflow Batch
on:
schedule:
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Go_ParDo_Flink_Batch.yml b/.github/workflows/beam_LoadTests_Go_ParDo_Flink_Batch.yml
index 666799b..ba54442 100644
--- a/.github/workflows/beam_LoadTests_Go_ParDo_Flink_Batch.yml
+++ b/.github/workflows/beam_LoadTests_Go_ParDo_Flink_Batch.yml
@@ -79,6 +79,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Go_SideInput_Dataflow_Batch.yml b/.github/workflows/beam_LoadTests_Go_SideInput_Dataflow_Batch.yml
index 123e839..439bb47 100644
--- a/.github/workflows/beam_LoadTests_Go_SideInput_Dataflow_Batch.yml
+++ b/.github/workflows/beam_LoadTests_Go_SideInput_Dataflow_Batch.yml
@@ -13,7 +13,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-name: Load Tests SideInput Dataflow Batch Go
+name: LoadTests Go SideInput Dataflow Batch
on:
schedule:
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Go_SideInput_Flink_Batch.yml b/.github/workflows/beam_LoadTests_Go_SideInput_Flink_Batch.yml
index 592af7e..5be573e 100644
--- a/.github/workflows/beam_LoadTests_Go_SideInput_Flink_Batch.yml
+++ b/.github/workflows/beam_LoadTests_Go_SideInput_Flink_Batch.yml
@@ -79,6 +79,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Java_CoGBK_Dataflow_Batch.yml b/.github/workflows/beam_LoadTests_Java_CoGBK_Dataflow_Batch.yml
index 8677f6f..27e6a8a 100644
--- a/.github/workflows/beam_LoadTests_Java_CoGBK_Dataflow_Batch.yml
+++ b/.github/workflows/beam_LoadTests_Java_CoGBK_Dataflow_Batch.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Java_CoGBK_Dataflow_Streaming.yml b/.github/workflows/beam_LoadTests_Java_CoGBK_Dataflow_Streaming.yml
index bdd0ada..27cc983 100644
--- a/.github/workflows/beam_LoadTests_Java_CoGBK_Dataflow_Streaming.yml
+++ b/.github/workflows/beam_LoadTests_Java_CoGBK_Dataflow_Streaming.yml
@@ -13,7 +13,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-name: Load Tests CoGBK Dataflow Streaming Java
+name: LoadTests Java CoGBK Dataflow Streaming
on:
schedule:
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Java_CoGBK_SparkStructuredStreaming_Batch.yml b/.github/workflows/beam_LoadTests_Java_CoGBK_SparkStructuredStreaming_Batch.yml
index 7f19352..b77c0ae 100644
--- a/.github/workflows/beam_LoadTests_Java_CoGBK_SparkStructuredStreaming_Batch.yml
+++ b/.github/workflows/beam_LoadTests_Java_CoGBK_SparkStructuredStreaming_Batch.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Java_Combine_Dataflow_Batch.yml b/.github/workflows/beam_LoadTests_Java_Combine_Dataflow_Batch.yml
index 02f5da3..91ccdce 100644
--- a/.github/workflows/beam_LoadTests_Java_Combine_Dataflow_Batch.yml
+++ b/.github/workflows/beam_LoadTests_Java_Combine_Dataflow_Batch.yml
@@ -13,7 +13,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-name: Load Tests Combine Dataflow Batch Java
+name: LoadTests Java Combine Dataflow Batch
on:
schedule:
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Java_Combine_Dataflow_Streaming.yml b/.github/workflows/beam_LoadTests_Java_Combine_Dataflow_Streaming.yml
index d652064..243cb94 100644
--- a/.github/workflows/beam_LoadTests_Java_Combine_Dataflow_Streaming.yml
+++ b/.github/workflows/beam_LoadTests_Java_Combine_Dataflow_Streaming.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Java_Combine_SparkStructuredStreaming_Batch.yml b/.github/workflows/beam_LoadTests_Java_Combine_SparkStructuredStreaming_Batch.yml
index d4862ad..dcf6015 100644
--- a/.github/workflows/beam_LoadTests_Java_Combine_SparkStructuredStreaming_Batch.yml
+++ b/.github/workflows/beam_LoadTests_Java_Combine_SparkStructuredStreaming_Batch.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Java_GBK_Dataflow_Batch.yml b/.github/workflows/beam_LoadTests_Java_GBK_Dataflow_Batch.yml
index 403deba..25a4878 100644
--- a/.github/workflows/beam_LoadTests_Java_GBK_Dataflow_Batch.yml
+++ b/.github/workflows/beam_LoadTests_Java_GBK_Dataflow_Batch.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Java_GBK_Dataflow_Streaming.yml b/.github/workflows/beam_LoadTests_Java_GBK_Dataflow_Streaming.yml
index 483c06c..9d8ccf2 100644
--- a/.github/workflows/beam_LoadTests_Java_GBK_Dataflow_Streaming.yml
+++ b/.github/workflows/beam_LoadTests_Java_GBK_Dataflow_Streaming.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Java_GBK_Smoke.yml b/.github/workflows/beam_LoadTests_Java_GBK_Smoke.yml
index 2512911..cf31693 100644
--- a/.github/workflows/beam_LoadTests_Java_GBK_Smoke.yml
+++ b/.github/workflows/beam_LoadTests_Java_GBK_Smoke.yml
@@ -66,6 +66,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Java_GBK_SparkStructuredStreaming_Batch.yml b/.github/workflows/beam_LoadTests_Java_GBK_SparkStructuredStreaming_Batch.yml
index de66760..eeba909 100644
--- a/.github/workflows/beam_LoadTests_Java_GBK_SparkStructuredStreaming_Batch.yml
+++ b/.github/workflows/beam_LoadTests_Java_GBK_SparkStructuredStreaming_Batch.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Java_ParDo_Dataflow_Batch.yml b/.github/workflows/beam_LoadTests_Java_ParDo_Dataflow_Batch.yml
index ebc30fb..5f2eaab 100644
--- a/.github/workflows/beam_LoadTests_Java_ParDo_Dataflow_Batch.yml
+++ b/.github/workflows/beam_LoadTests_Java_ParDo_Dataflow_Batch.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Java_ParDo_Dataflow_Streaming.yml b/.github/workflows/beam_LoadTests_Java_ParDo_Dataflow_Streaming.yml
index 6dec9be..8279643 100644
--- a/.github/workflows/beam_LoadTests_Java_ParDo_Dataflow_Streaming.yml
+++ b/.github/workflows/beam_LoadTests_Java_ParDo_Dataflow_Streaming.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch.yml b/.github/workflows/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch.yml
index 26c1c88..8e37932 100644
--- a/.github/workflows/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch.yml
+++ b/.github/workflows/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Python_CoGBK_Flink_Batch.yml b/.github/workflows/beam_LoadTests_Python_CoGBK_Flink_Batch.yml
index cdf0c09..87de130 100644
--- a/.github/workflows/beam_LoadTests_Python_CoGBK_Flink_Batch.yml
+++ b/.github/workflows/beam_LoadTests_Python_CoGBK_Flink_Batch.yml
@@ -13,7 +13,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-name: LoadTests Python CoGBK Dataflow Flink Batch
+name: LoadTests Python CoGBK Flink Batch
on:
schedule:
@@ -79,10 +79,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- python-version: '3.8'
+ python-version: 3.8
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Python_Combine_Dataflow_Batch.yml b/.github/workflows/beam_LoadTests_Python_Combine_Dataflow_Batch.yml
index 2a1220c..3aa9b35 100644
--- a/.github/workflows/beam_LoadTests_Python_Combine_Dataflow_Batch.yml
+++ b/.github/workflows/beam_LoadTests_Python_Combine_Dataflow_Batch.yml
@@ -13,7 +13,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-name: Load Tests Combine Dataflow Batch Python
+name: LoadTests Python Combine Dataflow Batch
on:
schedule:
@@ -69,10 +69,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- python-version: '3.8'
+ python-version: 3.8
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_LoadTests_Python_FnApiRunner_Microbenchmark.yml b/.github/workflows/beam_LoadTests_Python_FnApiRunner_Microbenchmark.yml
index 674f37f..5755f25 100644
--- a/.github/workflows/beam_LoadTests_Python_FnApiRunner_Microbenchmark.yml
+++ b/.github/workflows/beam_LoadTests_Python_FnApiRunner_Microbenchmark.yml
@@ -13,7 +13,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-name: Load Tests FnApiRunner Microbenchmark Python
+name: LoadTests Python FnApiRunner Microbenchmark
on:
schedule:
@@ -69,10 +69,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- python-version: '3.8'
+ python-version: 3.8
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_MetricsCredentialsRotation.yml b/.github/workflows/beam_MetricsCredentialsRotation.yml
index 9bd795f..7b97270 100644
--- a/.github/workflows/beam_MetricsCredentialsRotation.yml
+++ b/.github/workflows/beam_MetricsCredentialsRotation.yml
@@ -66,6 +66,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }}
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Starting credential rotation
run: |
gcloud container clusters update metrics --start-credential-rotation --zone=us-central1-a --quiet
diff --git a/.github/workflows/beam_PerformanceTests_AvroIOIT.yml b/.github/workflows/beam_PerformanceTests_AvroIOIT.yml
index 011f8a7..8c781b8 100644
--- a/.github/workflows/beam_PerformanceTests_AvroIOIT.yml
+++ b/.github/workflows/beam_PerformanceTests_AvroIOIT.yml
@@ -13,7 +13,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-name: Performance Tests AvroIOIT
+name: PerformanceTests AvroIOIT
on:
schedule:
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_PerformanceTests_AvroIOIT_HDFS.yml b/.github/workflows/beam_PerformanceTests_AvroIOIT_HDFS.yml
index c85c8d4..e137af0 100644
--- a/.github/workflows/beam_PerformanceTests_AvroIOIT_HDFS.yml
+++ b/.github/workflows/beam_PerformanceTests_AvroIOIT_HDFS.yml
@@ -13,7 +13,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-name: Performance Tests AvroIOIT HDFS
+name: PerformanceTests AvroIOIT HDFS
on:
schedule:
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Authenticate on GCP
id: auth
uses: google-github-actions/auth@v1
diff --git a/.github/workflows/beam_PerformanceTests_BigQueryIO_Batch_Java_Avro.yml b/.github/workflows/beam_PerformanceTests_BigQueryIO_Batch_Java_Avro.yml
index 0f5f69d..00a817c 100644
--- a/.github/workflows/beam_PerformanceTests_BigQueryIO_Batch_Java_Avro.yml
+++ b/.github/workflows/beam_PerformanceTests_BigQueryIO_Batch_Java_Avro.yml
@@ -13,7 +13,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-name: Performance Tests BigQueryIO Batch Java Avro
+name: PerformanceTests BigQueryIO Batch Java Avro
on:
schedule:
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_PerformanceTests_BigQueryIO_Batch_Java_Json.yml b/.github/workflows/beam_PerformanceTests_BigQueryIO_Batch_Java_Json.yml
index df2f406..6cb5e8f 100644
--- a/.github/workflows/beam_PerformanceTests_BigQueryIO_Batch_Java_Json.yml
+++ b/.github/workflows/beam_PerformanceTests_BigQueryIO_Batch_Java_Json.yml
@@ -13,7 +13,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-name: Performance Tests BigQueryIO Batch Java Json
+name: PerformanceTests BigQueryIO Batch Java Json
on:
schedule:
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_PerformanceTests_BigQueryIO_Streaming_Java.yml b/.github/workflows/beam_PerformanceTests_BigQueryIO_Streaming_Java.yml
index cc62801..f8c978c 100644
--- a/.github/workflows/beam_PerformanceTests_BigQueryIO_Streaming_Java.yml
+++ b/.github/workflows/beam_PerformanceTests_BigQueryIO_Streaming_Java.yml
@@ -13,7 +13,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-name: Performance Tests BigQueryIO Streaming Java
+name: PerformanceTests BigQueryIO Streaming Java
on:
schedule:
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_PerformanceTests_Cdap.yml b/.github/workflows/beam_PerformanceTests_Cdap.yml
index 49184ab..b4e0b2e 100644
--- a/.github/workflows/beam_PerformanceTests_Cdap.yml
+++ b/.github/workflows/beam_PerformanceTests_Cdap.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Authenticate on GCP
id: auth
uses: google-github-actions/auth@v1
diff --git a/.github/workflows/beam_PerformanceTests_Compressed_TextIOIT.yml b/.github/workflows/beam_PerformanceTests_Compressed_TextIOIT.yml
index 7ea42f1..465a77b 100644
--- a/.github/workflows/beam_PerformanceTests_Compressed_TextIOIT.yml
+++ b/.github/workflows/beam_PerformanceTests_Compressed_TextIOIT.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
# The env variable is created and populated in the test-arguments-action as "<github.job>_test_arguments_<argument_file_paths_index>"
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
diff --git a/.github/workflows/beam_PerformanceTests_Compressed_TextIOIT_HDFS.yml b/.github/workflows/beam_PerformanceTests_Compressed_TextIOIT_HDFS.yml
index 38b28ea..f52adaa 100644
--- a/.github/workflows/beam_PerformanceTests_Compressed_TextIOIT_HDFS.yml
+++ b/.github/workflows/beam_PerformanceTests_Compressed_TextIOIT_HDFS.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Authenticate on GCP
id: auth
uses: google-github-actions/auth@v1
diff --git a/.github/workflows/beam_PerformanceTests_HadoopFormat.yml b/.github/workflows/beam_PerformanceTests_HadoopFormat.yml
index 9e61600..d303a00 100644
--- a/.github/workflows/beam_PerformanceTests_HadoopFormat.yml
+++ b/.github/workflows/beam_PerformanceTests_HadoopFormat.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Authenticate on GCP
id: auth
uses: google-github-actions/auth@v1
diff --git a/.github/workflows/beam_PerformanceTests_JDBC.yml b/.github/workflows/beam_PerformanceTests_JDBC.yml
index 206f7cd..f6f5fce 100644
--- a/.github/workflows/beam_PerformanceTests_JDBC.yml
+++ b/.github/workflows/beam_PerformanceTests_JDBC.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Authenticate on GCP
id: auth
uses: google-github-actions/auth@v1
diff --git a/.github/workflows/beam_PerformanceTests_Kafka_IO.yml b/.github/workflows/beam_PerformanceTests_Kafka_IO.yml
index 47cafaa..772142a 100644
--- a/.github/workflows/beam_PerformanceTests_Kafka_IO.yml
+++ b/.github/workflows/beam_PerformanceTests_Kafka_IO.yml
@@ -71,6 +71,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Set k8s access
uses: ./.github/actions/setup-k8s-access
with:
diff --git a/.github/workflows/beam_PerformanceTests_ManyFiles_TextIOIT.yml b/.github/workflows/beam_PerformanceTests_ManyFiles_TextIOIT.yml
index a804f66..9482aae 100644
--- a/.github/workflows/beam_PerformanceTests_ManyFiles_TextIOIT.yml
+++ b/.github/workflows/beam_PerformanceTests_ManyFiles_TextIOIT.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_PerformanceTests_ManyFiles_TextIOIT_HDFS.yml b/.github/workflows/beam_PerformanceTests_ManyFiles_TextIOIT_HDFS.yml
index 9510a98..e8e5210 100644
--- a/.github/workflows/beam_PerformanceTests_ManyFiles_TextIOIT_HDFS.yml
+++ b/.github/workflows/beam_PerformanceTests_ManyFiles_TextIOIT_HDFS.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Authenticate on GCP
id: auth
uses: google-github-actions/auth@v1
diff --git a/.github/workflows/beam_PerformanceTests_MongoDBIO_IT.yml b/.github/workflows/beam_PerformanceTests_MongoDBIO_IT.yml
index 4f3d15f..5b25bd8 100644
--- a/.github/workflows/beam_PerformanceTests_MongoDBIO_IT.yml
+++ b/.github/workflows/beam_PerformanceTests_MongoDBIO_IT.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Authenticate on GCP
id: auth
uses: google-github-actions/auth@v1
diff --git a/.github/workflows/beam_PerformanceTests_ParquetIOIT.yml b/.github/workflows/beam_PerformanceTests_ParquetIOIT.yml
index 879d622..6ade6b2 100644
--- a/.github/workflows/beam_PerformanceTests_ParquetIOIT.yml
+++ b/.github/workflows/beam_PerformanceTests_ParquetIOIT.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
# The env variable is created and populated in the test-arguments-action as "<github.job>_test_arguments_<argument_file_paths_index>"
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
diff --git a/.github/workflows/beam_PerformanceTests_ParquetIOIT_HDFS.yml b/.github/workflows/beam_PerformanceTests_ParquetIOIT_HDFS.yml
index e9bc1f3..05f1198 100644
--- a/.github/workflows/beam_PerformanceTests_ParquetIOIT_HDFS.yml
+++ b/.github/workflows/beam_PerformanceTests_ParquetIOIT_HDFS.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Authenticate on GCP
id: auth
uses: google-github-actions/auth@v1
diff --git a/.github/workflows/beam_PerformanceTests_SparkReceiver_IO.yml b/.github/workflows/beam_PerformanceTests_SparkReceiver_IO.yml
index bf750ac..7aa71c3 100644
--- a/.github/workflows/beam_PerformanceTests_SparkReceiver_IO.yml
+++ b/.github/workflows/beam_PerformanceTests_SparkReceiver_IO.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Authenticate on GCP
id: auth
uses: google-github-actions/auth@v1
diff --git a/.github/workflows/beam_PerformanceTests_TFRecordIOIT.yml b/.github/workflows/beam_PerformanceTests_TFRecordIOIT.yml
index 7088dab..0d80693 100644
--- a/.github/workflows/beam_PerformanceTests_TFRecordIOIT.yml
+++ b/.github/workflows/beam_PerformanceTests_TFRecordIOIT.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_PerformanceTests_TFRecordIOIT_HDFS.yml b/.github/workflows/beam_PerformanceTests_TFRecordIOIT_HDFS.yml
index f49be83..10099f7 100644
--- a/.github/workflows/beam_PerformanceTests_TFRecordIOIT_HDFS.yml
+++ b/.github/workflows/beam_PerformanceTests_TFRecordIOIT_HDFS.yml
@@ -71,6 +71,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Authenticate on GCP
id: auth
uses: google-github-actions/auth@v1
diff --git a/.github/workflows/beam_PerformanceTests_TextIOIT.yml b/.github/workflows/beam_PerformanceTests_TextIOIT.yml
index 8be4370..2c92ec3 100644
--- a/.github/workflows/beam_PerformanceTests_TextIOIT.yml
+++ b/.github/workflows/beam_PerformanceTests_TextIOIT.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_PerformanceTests_TextIOIT_HDFS.yml b/.github/workflows/beam_PerformanceTests_TextIOIT_HDFS.yml
index e0f4b03..4caad4f 100644
--- a/.github/workflows/beam_PerformanceTests_TextIOIT_HDFS.yml
+++ b/.github/workflows/beam_PerformanceTests_TextIOIT_HDFS.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Set k8s access
uses: ./.github/actions/setup-k8s-access
with:
diff --git a/.github/workflows/beam_PerformanceTests_WordCountIT_PythonVersions.yml b/.github/workflows/beam_PerformanceTests_WordCountIT_PythonVersions.yml
index 6df9ef4..7903bf0 100644
--- a/.github/workflows/beam_PerformanceTests_WordCountIT_PythonVersions.yml
+++ b/.github/workflows/beam_PerformanceTests_WordCountIT_PythonVersions.yml
@@ -103,9 +103,16 @@
--info \
-Ptest=apache_beam/examples/wordcount_it_test.py::WordCountIT::test_wordcount_it \
"-Ptest-pipeline-options=${{ env.beam_PerformanceTests_WordCountIT_PythonVersions_test_arguments_1 }}"
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PerformanceTests_XmlIOIT.yml b/.github/workflows/beam_PerformanceTests_XmlIOIT.yml
index d610d36..1dd1efc 100644
--- a/.github/workflows/beam_PerformanceTests_XmlIOIT.yml
+++ b/.github/workflows/beam_PerformanceTests_XmlIOIT.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Prepare test arguments
uses: ./.github/actions/test-arguments-action
with:
diff --git a/.github/workflows/beam_PerformanceTests_XmlIOIT_HDFS.yml b/.github/workflows/beam_PerformanceTests_XmlIOIT_HDFS.yml
index 6ceb38c..b03a699 100644
--- a/.github/workflows/beam_PerformanceTests_XmlIOIT_HDFS.yml
+++ b/.github/workflows/beam_PerformanceTests_XmlIOIT_HDFS.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Authenticate on GCP
id: auth
uses: google-github-actions/auth@v1
diff --git a/.github/workflows/beam_PostCommit_BeamMetrics_Publish.yml b/.github/workflows/beam_PostCommit_BeamMetrics_Publish.yml
index 643a159..9cff830 100644
--- a/.github/workflows/beam_PostCommit_BeamMetrics_Publish.yml
+++ b/.github/workflows/beam_PostCommit_BeamMetrics_Publish.yml
@@ -72,6 +72,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Authenticate on GCP
uses: google-github-actions/setup-gcloud@v0
with:
diff --git a/.github/workflows/beam_PostCommit_Go.yml b/.github/workflows/beam_PostCommit_Go.yml
index 27d1012..1a0a616 100644
--- a/.github/workflows/beam_PostCommit_Go.yml
+++ b/.github/workflows/beam_PostCommit_Go.yml
@@ -67,6 +67,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
- name: Authenticate on GCP
diff --git a/.github/workflows/beam_PostCommit_Go_VR_Flink.yml b/.github/workflows/beam_PostCommit_Go_VR_Flink.yml
index 8339334..6434f57 100644
--- a/.github/workflows/beam_PostCommit_Go_VR_Flink.yml
+++ b/.github/workflows/beam_PostCommit_Go_VR_Flink.yml
@@ -67,6 +67,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Go Flink ValidatesRunner script
env:
CLOUDSDK_CONFIG: ${{ env.KUBELET_GCLOUD_CONFIG_PATH}}
diff --git a/.github/workflows/beam_PostCommit_Go_VR_Spark.yml b/.github/workflows/beam_PostCommit_Go_VR_Spark.yml
index b902919..7b80df1 100644
--- a/.github/workflows/beam_PostCommit_Go_VR_Spark.yml
+++ b/.github/workflows/beam_PostCommit_Go_VR_Spark.yml
@@ -67,6 +67,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Go Spark ValidatesRunner script
env:
CLOUDSDK_CONFIG: ${{ env.KUBELET_GCLOUD_CONFIG_PATH}}
diff --git a/.github/workflows/beam_PostCommit_Java.yml b/.github/workflows/beam_PostCommit_Java.yml
index 3f56b2c..905cb3a 100644
--- a/.github/workflows/beam_PostCommit_Java.yml
+++ b/.github/workflows/beam_PostCommit_Java.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit Java script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_Avro_Versions.yml b/.github/workflows/beam_PostCommit_Java_Avro_Versions.yml
index 02f309c..1cd59fe 100644
--- a/.github/workflows/beam_PostCommit_Java_Avro_Versions.yml
+++ b/.github/workflows/beam_PostCommit_Java_Avro_Versions.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit Java Avro Versions script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_BigQueryEarlyRollout.yml b/.github/workflows/beam_PostCommit_Java_BigQueryEarlyRollout.yml
index 5edfe37..33748db 100644
--- a/.github/workflows/beam_PostCommit_Java_BigQueryEarlyRollout.yml
+++ b/.github/workflows/beam_PostCommit_Java_BigQueryEarlyRollout.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Authenticate on GCP
uses: google-github-actions/setup-gcloud@v0
with:
diff --git a/.github/workflows/beam_PostCommit_Java_DataflowV1.yml b/.github/workflows/beam_PostCommit_Java_DataflowV1.yml
index 53c0b0b..a15730a 100644
--- a/.github/workflows/beam_PostCommit_Java_DataflowV1.yml
+++ b/.github/workflows/beam_PostCommit_Java_DataflowV1.yml
@@ -69,10 +69,9 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Set up Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
java-version: |
11
8
diff --git a/.github/workflows/beam_PostCommit_Java_DataflowV2.yml b/.github/workflows/beam_PostCommit_Java_DataflowV2.yml
index 6c8d9aa..f90fc8f 100644
--- a/.github/workflows/beam_PostCommit_Java_DataflowV2.yml
+++ b/.github/workflows/beam_PostCommit_Java_DataflowV2.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit Java Dataflow V2 script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_Examples_Dataflow.yml b/.github/workflows/beam_PostCommit_Java_Examples_Dataflow.yml
index 03e9497..2c21374 100644
--- a/.github/workflows/beam_PostCommit_Java_Examples_Dataflow.yml
+++ b/.github/workflows/beam_PostCommit_Java_Examples_Dataflow.yml
@@ -71,6 +71,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit Java Examples Dataflow script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_Examples_Dataflow_Java.yml b/.github/workflows/beam_PostCommit_Java_Examples_Dataflow_Java.yml
index 6b92c18..ee6596a 100644
--- a/.github/workflows/beam_PostCommit_Java_Examples_Dataflow_Java.yml
+++ b/.github/workflows/beam_PostCommit_Java_Examples_Dataflow_Java.yml
@@ -71,10 +71,9 @@
comment_phrase: ${{ matrix.job_phrase }} ${{ matrix.java_version }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }} ${{ matrix.java_version }})
- - name: Set up Java${{ matrix.java_version }}
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
java-version: |
${{ matrix.java_version }}
8
diff --git a/.github/workflows/beam_PostCommit_Java_Examples_Dataflow_V2.yml b/.github/workflows/beam_PostCommit_Java_Examples_Dataflow_V2.yml
index 1f817a3..7283bbe 100644
--- a/.github/workflows/beam_PostCommit_Java_Examples_Dataflow_V2.yml
+++ b/.github/workflows/beam_PostCommit_Java_Examples_Dataflow_V2.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit Java Examples Dataflow V2 script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_Examples_Dataflow_V2_Java.yml b/.github/workflows/beam_PostCommit_Java_Examples_Dataflow_V2_Java.yml
index 411cc65..1b00659 100644
--- a/.github/workflows/beam_PostCommit_Java_Examples_Dataflow_V2_Java.yml
+++ b/.github/workflows/beam_PostCommit_Java_Examples_Dataflow_V2_Java.yml
@@ -73,10 +73,9 @@
comment_phrase: ${{ matrix.job_phrase_1 }} ${{ matrix.java_version }} ${{ matrix.job_phrase_2 }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase_1 }} ${{ matrix.java_version }} ${{ matrix.job_phrase_2 }})
- - name: Set up Java${{ matrix.java_version }}
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
java-version: ${{ matrix.java_version }}
- name: run PostCommit Java Examples Dataflow V2 Java${{ matrix.java_version }} script
uses: ./.github/actions/gradle-command-self-hosted-action
diff --git a/.github/workflows/beam_PostCommit_Java_Examples_Direct.yml b/.github/workflows/beam_PostCommit_Java_Examples_Direct.yml
index b0f3508..60f5c1d 100644
--- a/.github/workflows/beam_PostCommit_Java_Examples_Direct.yml
+++ b/.github/workflows/beam_PostCommit_Java_Examples_Direct.yml
@@ -69,11 +69,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'zulu'
- java-version: '8'
+ java-version: 8
- name: run examplesIntegrationTest script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_Examples_Spark.yml b/.github/workflows/beam_PostCommit_Java_Examples_Spark.yml
index 7570086..e3f70bd 100644
--- a/.github/workflows/beam_PostCommit_Java_Examples_Spark.yml
+++ b/.github/workflows/beam_PostCommit_Java_Examples_Spark.yml
@@ -69,11 +69,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'zulu'
- java-version: '8'
+ java-version: 8
- name: run examplesIntegrationTest script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_Hadoop_Versions.yml b/.github/workflows/beam_PostCommit_Java_Hadoop_Versions.yml
index 29e9756..41eec12 100644
--- a/.github/workflows/beam_PostCommit_Java_Hadoop_Versions.yml
+++ b/.github/workflows/beam_PostCommit_Java_Hadoop_Versions.yml
@@ -67,11 +67,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'zulu'
- java-version: '8'
+ java-version: 8
- name: run validatesRunner script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_InfluxDbIO_IT.yml b/.github/workflows/beam_PostCommit_Java_InfluxDbIO_IT.yml
index fe0cfaf..c80ec6c 100644
--- a/.github/workflows/beam_PostCommit_Java_InfluxDbIO_IT.yml
+++ b/.github/workflows/beam_PostCommit_Java_InfluxDbIO_IT.yml
@@ -15,7 +15,7 @@
# specific language governing permissions and limitations
# under the License.
-name: Java InfluxDbIO Integration Test
+name: PostCommit Java InfluxDbIO Integration Test
on:
schedule:
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Authenticate on GCP
id: auth
uses: google-github-actions/auth@v1
diff --git a/.github/workflows/beam_PostCommit_Java_Jpms_Dataflow_Java11.yml b/.github/workflows/beam_PostCommit_Java_Jpms_Dataflow_Java11.yml
index 1fd46b9..57dca87 100644
--- a/.github/workflows/beam_PostCommit_Java_Jpms_Dataflow_Java11.yml
+++ b/.github/workflows/beam_PostCommit_Java_Jpms_Dataflow_Java11.yml
@@ -67,11 +67,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Set up Java 11
- uses: actions/setup-java@v3.11.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
- java-version: '11'
+ java-version: 11
- name: run PostCommit Java Jpms Dataflow Java11 script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_Jpms_Dataflow_Java17.yml b/.github/workflows/beam_PostCommit_Java_Jpms_Dataflow_Java17.yml
index ff27bae..0204606 100644
--- a/.github/workflows/beam_PostCommit_Java_Jpms_Dataflow_Java17.yml
+++ b/.github/workflows/beam_PostCommit_Java_Jpms_Dataflow_Java17.yml
@@ -67,10 +67,9 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Set up Java
- uses: actions/setup-java@v3.11.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
java-version: |
17
8
diff --git a/.github/workflows/beam_PostCommit_Java_Jpms_Direct_Java11.yml b/.github/workflows/beam_PostCommit_Java_Jpms_Direct_Java11.yml
index 1dec7b5..61148ac 100644
--- a/.github/workflows/beam_PostCommit_Java_Jpms_Direct_Java11.yml
+++ b/.github/workflows/beam_PostCommit_Java_Jpms_Direct_Java11.yml
@@ -67,11 +67,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Set up Java 11
- uses: actions/setup-java@v3.11.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
- java-version: '11'
+ java-version: 11
- name: run PostCommit Java Jpms Direct Java11 script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_Jpms_Direct_Java17.yml b/.github/workflows/beam_PostCommit_Java_Jpms_Direct_Java17.yml
index 748a74d..823cea0 100644
--- a/.github/workflows/beam_PostCommit_Java_Jpms_Direct_Java17.yml
+++ b/.github/workflows/beam_PostCommit_Java_Jpms_Direct_Java17.yml
@@ -67,10 +67,9 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Set up Java
- uses: actions/setup-java@v3.11.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
java-version: |
17
8
diff --git a/.github/workflows/beam_PostCommit_Java_Jpms_Flink_Java11.yml b/.github/workflows/beam_PostCommit_Java_Jpms_Flink_Java11.yml
index 6265651..2b21763 100644
--- a/.github/workflows/beam_PostCommit_Java_Jpms_Flink_Java11.yml
+++ b/.github/workflows/beam_PostCommit_Java_Jpms_Flink_Java11.yml
@@ -67,11 +67,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Set up Java 11
- uses: actions/setup-java@v3.11.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
- java-version: '11'
+ java-version: 11
- name: run PostCommit Java Jpms Flink Java11 script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_Jpms_Spark_Java11.yml b/.github/workflows/beam_PostCommit_Java_Jpms_Spark_Java11.yml
index 745bec6..4026ce9 100644
--- a/.github/workflows/beam_PostCommit_Java_Jpms_Spark_Java11.yml
+++ b/.github/workflows/beam_PostCommit_Java_Jpms_Spark_Java11.yml
@@ -67,11 +67,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Set up Java 11
- uses: actions/setup-java@v3.11.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
- java-version: '11'
+ java-version: 11
- name: run PostCommit Java Jpms Spark Java11 script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_Nexmark_Dataflow.yml b/.github/workflows/beam_PostCommit_Java_Nexmark_Dataflow.yml
index 8a3ff9a..d40723a 100644
--- a/.github/workflows/beam_PostCommit_Java_Nexmark_Dataflow.yml
+++ b/.github/workflows/beam_PostCommit_Java_Nexmark_Dataflow.yml
@@ -95,6 +95,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit Java Nexmark Dataflow (${{ matrix.streaming }} ${{ matrix.queryLanguage }}) script
if: matrix.queryLanguage != 'none'
uses: ./.github/actions/gradle-command-self-hosted-action
diff --git a/.github/workflows/beam_PostCommit_Java_Nexmark_Dataflow_V2.yml b/.github/workflows/beam_PostCommit_Java_Nexmark_Dataflow_V2.yml
index 428ea23..787ca05 100644
--- a/.github/workflows/beam_PostCommit_Java_Nexmark_Dataflow_V2.yml
+++ b/.github/workflows/beam_PostCommit_Java_Nexmark_Dataflow_V2.yml
@@ -95,6 +95,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit Java Nexmark Dataflow V2 (streaming = ${{ matrix.streaming }}) script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_Nexmark_Dataflow_V2_Java.yml b/.github/workflows/beam_PostCommit_Java_Nexmark_Dataflow_V2_Java.yml
index 076c030..f06302c 100644
--- a/.github/workflows/beam_PostCommit_Java_Nexmark_Dataflow_V2_Java.yml
+++ b/.github/workflows/beam_PostCommit_Java_Nexmark_Dataflow_V2_Java.yml
@@ -97,10 +97,9 @@
comment_phrase: ${{ matrix.job_phrase_1 }} ${{ matrix.java_version }} ${{ matrix.job_phrase_2 }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase_1 }} ${{ matrix.java_version }} ${{ matrix.job_phrase_2 }})
- - name: Set up Java${{ matrix.java_version }}
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
java-version: ${{ matrix.java_version }}
- name: run PostCommit Java ${{ matrix.java_version }} Nexmark Dataflow V2 (streaming = ${{ matrix.streaming }}) script
uses: ./.github/actions/gradle-command-self-hosted-action
diff --git a/.github/workflows/beam_PostCommit_Java_Nexmark_Direct.yml b/.github/workflows/beam_PostCommit_Java_Nexmark_Direct.yml
index a1842c7..837582a 100644
--- a/.github/workflows/beam_PostCommit_Java_Nexmark_Direct.yml
+++ b/.github/workflows/beam_PostCommit_Java_Nexmark_Direct.yml
@@ -90,6 +90,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit Java Nexmark Direct (${{ matrix.streaming }} ${{ matrix.queryLanguage }}) script
if: matrix.queryLanguage != 'none'
uses: ./.github/actions/gradle-command-self-hosted-action
diff --git a/.github/workflows/beam_PostCommit_Java_Nexmark_Flink.yml b/.github/workflows/beam_PostCommit_Java_Nexmark_Flink.yml
index 76278de..afcc906 100644
--- a/.github/workflows/beam_PostCommit_Java_Nexmark_Flink.yml
+++ b/.github/workflows/beam_PostCommit_Java_Nexmark_Flink.yml
@@ -89,6 +89,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit Java Nexmark Flink (${{ matrix.streaming }} ${{ matrix.queryLanguage }}) script
if: matrix.queryLanguage != 'none'
uses: ./.github/actions/gradle-command-self-hosted-action
diff --git a/.github/workflows/beam_PostCommit_Java_Nexmark_Spark.yml b/.github/workflows/beam_PostCommit_Java_Nexmark_Spark.yml
index 110cb20..937a843 100644
--- a/.github/workflows/beam_PostCommit_Java_Nexmark_Spark.yml
+++ b/.github/workflows/beam_PostCommit_Java_Nexmark_Spark.yml
@@ -89,6 +89,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit Java Nexmark Spark (runner = ${{ matrix.runner }} queryLanguage = ${{ matrix.queryLanguage }}) script
if: matrix.queryLanguage != 'none'
uses: ./.github/actions/gradle-command-self-hosted-action
diff --git a/.github/workflows/beam_PostCommit_Java_PVR_Flink_Streaming.yml b/.github/workflows/beam_PostCommit_Java_PVR_Flink_Streaming.yml
index 0213785..de5253f 100644
--- a/.github/workflows/beam_PostCommit_Java_PVR_Flink_Streaming.yml
+++ b/.github/workflows/beam_PostCommit_Java_PVR_Flink_Streaming.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit Java Flink PortableValidatesRunner Streaming script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_PVR_Samza.yml b/.github/workflows/beam_PostCommit_Java_PVR_Samza.yml
index cf875306..912cba9 100644
--- a/.github/workflows/beam_PostCommit_Java_PVR_Samza.yml
+++ b/.github/workflows/beam_PostCommit_Java_PVR_Samza.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit Java Samza script
env:
CLOUDSDK_CONFIG: ${{ env.KUBELET_GCLOUD_CONFIG_PATH}}
diff --git a/.github/workflows/beam_PostCommit_Java_PVR_Spark3_Streaming.yml b/.github/workflows/beam_PostCommit_Java_PVR_Spark3_Streaming.yml
index 79eadd7..b94e400 100644
--- a/.github/workflows/beam_PostCommit_Java_PVR_Spark3_Streaming.yml
+++ b/.github/workflows/beam_PostCommit_Java_PVR_Spark3_Streaming.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit Java PortableValidatesRunner Spark3 Streaming script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_Sickbay.yml b/.github/workflows/beam_PostCommit_Java_Sickbay.yml
index 18ef48b..a9e21f4 100644
--- a/.github/workflows/beam_PostCommit_Java_Sickbay.yml
+++ b/.github/workflows/beam_PostCommit_Java_Sickbay.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit Java Sickbay script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_SingleStoreIO_IT.yml b/.github/workflows/beam_PostCommit_Java_SingleStoreIO_IT.yml
index 5af333e..ba2c72f 100644
--- a/.github/workflows/beam_PostCommit_Java_SingleStoreIO_IT.yml
+++ b/.github/workflows/beam_PostCommit_Java_SingleStoreIO_IT.yml
@@ -71,6 +71,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Authenticate on GCP
id: auth
uses: google-github-actions/auth@v1
diff --git a/.github/workflows/beam_PostCommit_Java_Tpcds_Dataflow.yml b/.github/workflows/beam_PostCommit_Java_Tpcds_Dataflow.yml
index d6b6375..5fb70ef 100644
--- a/.github/workflows/beam_PostCommit_Java_Tpcds_Dataflow.yml
+++ b/.github/workflows/beam_PostCommit_Java_Tpcds_Dataflow.yml
@@ -92,6 +92,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit Java Tpcds Dataflow script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_Tpcds_Flink.yml b/.github/workflows/beam_PostCommit_Java_Tpcds_Flink.yml
index d2ba0ca..fcd87f4 100644
--- a/.github/workflows/beam_PostCommit_Java_Tpcds_Flink.yml
+++ b/.github/workflows/beam_PostCommit_Java_Tpcds_Flink.yml
@@ -89,6 +89,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit Java Tpcds Flink script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_Tpcds_Spark.yml b/.github/workflows/beam_PostCommit_Java_Tpcds_Spark.yml
index d93f0e1..dd1b3a1 100644
--- a/.github/workflows/beam_PostCommit_Java_Tpcds_Spark.yml
+++ b/.github/workflows/beam_PostCommit_Java_Tpcds_Spark.yml
@@ -88,6 +88,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit Java Tpcds Spark script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow.yml b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow.yml
index bd7f297..4d0357d 100644
--- a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow.yml
+++ b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow.yml
@@ -69,11 +69,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'zulu'
- java-version: '8'
+ java-version: 8
- name: run validatesRunner script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_JavaVersions.yml b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_JavaVersions.yml
index a6afa04..62f742f 100644
--- a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_JavaVersions.yml
+++ b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_JavaVersions.yml
@@ -71,10 +71,9 @@
comment_phrase: ${{ matrix.job_phrase }} ${{ matrix.java_version }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }}) ${{ matrix.java_version }}
- - name: Set up Java${{ matrix.java_version }}
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
java-version: |
${{ matrix.java_version }}
8
diff --git a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_Streaming.yml b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_Streaming.yml
index 116f8f6..175893f 100644
--- a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_Streaming.yml
+++ b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_Streaming.yml
@@ -69,11 +69,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'zulu'
- java-version: '8'
+ java-version: 8
- name: run validatesRunnerStreaming script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_V2.yml b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_V2.yml
index 5506469..723d17c 100644
--- a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_V2.yml
+++ b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_V2.yml
@@ -69,11 +69,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'zulu'
- java-version: '8'
+ java-version: 8
- name: run validatesRunnerV2 script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_V2_Streaming.yml b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_V2_Streaming.yml
index 8314db6..d81f7cf 100644
--- a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_V2_Streaming.yml
+++ b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Dataflow_V2_Streaming.yml
@@ -69,11 +69,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'zulu'
- java-version: '8'
+ java-version: 8
- name: run validatesRunnerV2Streaming script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Direct.yml b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Direct.yml
index cf62a78..03453f0 100644
--- a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Direct.yml
+++ b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Direct.yml
@@ -69,11 +69,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'zulu'
- java-version: '8'
+ java-version: 8
- name: run validatesRunner script
run: ./gradlew :runners:direct-java:validatesRunner
- name: Archive JUnit Test Results
diff --git a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Direct_JavaVersions.yml b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Direct_JavaVersions.yml
index 2d879fa..b16a20f6 100644
--- a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Direct_JavaVersions.yml
+++ b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Direct_JavaVersions.yml
@@ -71,10 +71,9 @@
comment_phrase: ${{ matrix.job_phrase }} ${{ matrix.java_version }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }}) ${{ matrix.java_version }}
- - name: Set up Java${{ matrix.java_version }}
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
java-version: |
${{ matrix.java_version }}
8
diff --git a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Flink_Java11.yml b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Flink_Java11.yml
index e96f8ef..1814835 100644
--- a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Flink_Java11.yml
+++ b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Flink_Java11.yml
@@ -69,10 +69,9 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Set up Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
java-version: |
11
8
diff --git a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Samza.yml b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Samza.yml
index b577b58..986eab3 100644
--- a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Samza.yml
+++ b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Samza.yml
@@ -67,11 +67,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'zulu'
- java-version: '8'
+ java-version: 8
- name: run validatesRunner script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Spark.yml b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Spark.yml
index 3465ff9..bc01dac 100644
--- a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Spark.yml
+++ b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Spark.yml
@@ -67,11 +67,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'zulu'
- java-version: '8'
+ java-version: 8
- name: run validatesRunner script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming.yml b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming.yml
index 6df7d3a..404c15e 100644
--- a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming.yml
+++ b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming.yml
@@ -67,11 +67,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'zulu'
- java-version: '8'
+ java-version: 8
- name: run validatesStructuredStreamingRunnerBatch script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Spark_Java11.yml b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Spark_Java11.yml
index 1f82a45..d8249a1 100644
--- a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Spark_Java11.yml
+++ b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Spark_Java11.yml
@@ -69,10 +69,9 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Set up Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
java-version: |
11
8
diff --git a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Twister2.yml b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Twister2.yml
index 3bcaff4..87db69c 100644
--- a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Twister2.yml
+++ b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_Twister2.yml
@@ -67,11 +67,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'zulu'
- java-version: '8'
+ java-version: 8
- name: run validatesRunner script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_ULR.yml b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_ULR.yml
index 9d42ecd..4e3b9c4 100644
--- a/.github/workflows/beam_PostCommit_Java_ValidatesRunner_ULR.yml
+++ b/.github/workflows/beam_PostCommit_Java_ValidatesRunner_ULR.yml
@@ -67,15 +67,11 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'zulu'
- java-version: '8'
- - name: Install Python
- uses: actions/setup-python@v4
- with:
- python-version: '3.8'
+ java-version: 8
+ python-version: 3.8
- name: run ulrLoopbackValidatesRunner script
run: ./gradlew :runners:portability:java:ulrLoopbackValidatesRunner
- name: Archive JUnit Test Results
diff --git a/.github/workflows/beam_PostCommit_Javadoc.yml b/.github/workflows/beam_PostCommit_Javadoc.yml
index a70b37c..fe72554 100644
--- a/.github/workflows/beam_PostCommit_Javadoc.yml
+++ b/.github/workflows/beam_PostCommit_Javadoc.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run aggregateJavadoc script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_PortableJar_Flink.yml b/.github/workflows/beam_PostCommit_PortableJar_Flink.yml
index ef27d16..a0d5f51 100644
--- a/.github/workflows/beam_PostCommit_PortableJar_Flink.yml
+++ b/.github/workflows/beam_PostCommit_PortableJar_Flink.yml
@@ -67,8 +67,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
python-version: 3.8
- name: run testPipelineJarFlinkRunner script
@@ -79,9 +79,16 @@
gradle-command: :sdks:python:test-suites:portable:py38:testPipelineJarFlinkRunner
arguments: |
-PpythonVersion=3.8 \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_PortableJar_Spark.yml b/.github/workflows/beam_PostCommit_PortableJar_Spark.yml
index a84384b..500c591 100644
--- a/.github/workflows/beam_PostCommit_PortableJar_Spark.yml
+++ b/.github/workflows/beam_PostCommit_PortableJar_Spark.yml
@@ -23,12 +23,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -67,8 +67,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
python-version: 3.8
- name: run testPipelineJarSparkRunner script
@@ -79,9 +79,16 @@
gradle-command: :sdks:python:test-suites:portable:py38:testPipelineJarSparkRunner
arguments: |
-PpythonVersion=3.8 \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_Python.yml b/.github/workflows/beam_PostCommit_Python.yml
index ffd2979..e3e4ce9 100644
--- a/.github/workflows/beam_PostCommit_Python.yml
+++ b/.github/workflows/beam_PostCommit_Python.yml
@@ -30,12 +30,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -71,8 +71,8 @@
comment_phrase: ${{ matrix.job_phrase }} ${{ matrix.python_version }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }} ${{ matrix.python_version }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
python-version: ${{ matrix.python_version }}
- name: Install docker compose
@@ -94,9 +94,16 @@
-PpythonVersion=${{ matrix.python_version }} \
env:
CLOUDSDK_CONFIG: ${{ env.KUBELET_GCLOUD_CONFIG_PATH}}
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_Python_Arm.yml b/.github/workflows/beam_PostCommit_Python_Arm.yml
index cf41ed1..f28d6b1 100644
--- a/.github/workflows/beam_PostCommit_Python_Arm.yml
+++ b/.github/workflows/beam_PostCommit_Python_Arm.yml
@@ -32,12 +32,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -73,8 +73,8 @@
comment_phrase: ${{ matrix.job_phrase }} ${{ matrix.python_version }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }} ${{ matrix.python_version }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
python-version: ${{ matrix.python_version }}
- name: Install docker compose
@@ -110,9 +110,16 @@
CLOUDSDK_CONFIG: ${{ env.KUBELET_GCLOUD_CONFIG_PATH}}
MULTIARCH_TAG: ${{ steps.set_tag.outputs.TAG }}
USER: github-actions
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_Python_Examples_Dataflow.yml b/.github/workflows/beam_PostCommit_Python_Examples_Dataflow.yml
index 61fbb1f..b659b25 100644
--- a/.github/workflows/beam_PostCommit_Python_Examples_Dataflow.yml
+++ b/.github/workflows/beam_PostCommit_Python_Examples_Dataflow.yml
@@ -23,12 +23,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -79,9 +79,16 @@
arguments: |
-PuseWheelDistribution \
-PpythonVersion=3.11 \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_Python_Examples_Direct.yml b/.github/workflows/beam_PostCommit_Python_Examples_Direct.yml
index c8855b2..b7344be 100644
--- a/.github/workflows/beam_PostCommit_Python_Examples_Direct.yml
+++ b/.github/workflows/beam_PostCommit_Python_Examples_Direct.yml
@@ -23,12 +23,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -86,9 +86,16 @@
gradle-command: :sdks:python:test-suites:direct:py${{steps.set_py_ver_clean.outputs.py_ver_clean}}:examples
arguments: |
-PpythonVersion=${{ matrix.python_version }} \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_Python_Examples_Flink.yml b/.github/workflows/beam_PostCommit_Python_Examples_Flink.yml
index 2e4db61..af349c0 100644
--- a/.github/workflows/beam_PostCommit_Python_Examples_Flink.yml
+++ b/.github/workflows/beam_PostCommit_Python_Examples_Flink.yml
@@ -23,12 +23,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -86,9 +86,16 @@
gradle-command: :sdks:python:test-suites:portable:py${{steps.set_py_ver_clean.outputs.py_ver_clean}}:flinkExamples
arguments: |
-PpythonVersion=${{ matrix.python_version }} \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_Python_Examples_Spark.yml b/.github/workflows/beam_PostCommit_Python_Examples_Spark.yml
index 498bc4f..66a240e 100644
--- a/.github/workflows/beam_PostCommit_Python_Examples_Spark.yml
+++ b/.github/workflows/beam_PostCommit_Python_Examples_Spark.yml
@@ -23,12 +23,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -86,9 +86,16 @@
gradle-command: :sdks:python:test-suites:portable:py${{steps.set_py_ver_clean.outputs.py_ver_clean}}:sparkExamples
arguments: |
-PpythonVersion=${{ matrix.python_version }} \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_Python_MongoDBIO_IT.yml b/.github/workflows/beam_PostCommit_Python_MongoDBIO_IT.yml
index d8e9934..e406a58 100644
--- a/.github/workflows/beam_PostCommit_Python_MongoDBIO_IT.yml
+++ b/.github/workflows/beam_PostCommit_Python_MongoDBIO_IT.yml
@@ -23,12 +23,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -78,9 +78,16 @@
gradle-command: :sdks:python:test-suites:direct:py311:mongodbioIT
arguments: |
-PpythonVersion=3.11 \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_Python_Nexmark_Direct.yml b/.github/workflows/beam_PostCommit_Python_Nexmark_Direct.yml
index c63bcf2..9f00b6f 100644
--- a/.github/workflows/beam_PostCommit_Python_Nexmark_Direct.yml
+++ b/.github/workflows/beam_PostCommit_Python_Nexmark_Direct.yml
@@ -112,8 +112,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
python-version: 3.8
- name: run Java Testing Nexmark (query ${{ matrix.query }})
diff --git a/.github/workflows/beam_PostCommit_Python_ValidatesContainer_Dataflow.yml b/.github/workflows/beam_PostCommit_Python_ValidatesContainer_Dataflow.yml
index 51b22fa..e572c5c 100644
--- a/.github/workflows/beam_PostCommit_Python_ValidatesContainer_Dataflow.yml
+++ b/.github/workflows/beam_PostCommit_Python_ValidatesContainer_Dataflow.yml
@@ -23,12 +23,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -90,9 +90,16 @@
gradle-command: :sdks:python:test-suites:dataflow:py${{steps.set_py_ver_clean.outputs.py_ver_clean}}:validatesContainer
arguments: |
-PpythonVersion=${{ matrix.python_version }} \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_Python_ValidatesContainer_Dataflow_With_RC.yml b/.github/workflows/beam_PostCommit_Python_ValidatesContainer_Dataflow_With_RC.yml
index 33fa6b5..cfc5db3f 100644
--- a/.github/workflows/beam_PostCommit_Python_ValidatesContainer_Dataflow_With_RC.yml
+++ b/.github/workflows/beam_PostCommit_Python_ValidatesContainer_Dataflow_With_RC.yml
@@ -23,12 +23,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -91,9 +91,16 @@
arguments: |
-PtestRCDependencies=true \
-PpythonVersion=${{ matrix.python_version }} \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_Python_ValidatesRunner_Dataflow.yml b/.github/workflows/beam_PostCommit_Python_ValidatesRunner_Dataflow.yml
index b01ea81..79b2269 100644
--- a/.github/workflows/beam_PostCommit_Python_ValidatesRunner_Dataflow.yml
+++ b/.github/workflows/beam_PostCommit_Python_ValidatesRunner_Dataflow.yml
@@ -23,12 +23,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -94,9 +94,16 @@
arguments: |
-PuseWheelDistribution \
-PpythonVersion=${{ matrix.python_version }} \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_Python_ValidatesRunner_Flink.yml b/.github/workflows/beam_PostCommit_Python_ValidatesRunner_Flink.yml
index 79a7550..367a5da 100644
--- a/.github/workflows/beam_PostCommit_Python_ValidatesRunner_Flink.yml
+++ b/.github/workflows/beam_PostCommit_Python_ValidatesRunner_Flink.yml
@@ -23,12 +23,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -88,9 +88,16 @@
gradle-command: :sdks:python:test-suites:portable:py${{steps.set_py_ver_clean.outputs.py_ver_clean}}:flinkValidatesRunner
arguments: |
-PpythonVersion=${{ matrix.python_version }} \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_Python_ValidatesRunner_Samza.yml b/.github/workflows/beam_PostCommit_Python_ValidatesRunner_Samza.yml
index 300cee6..9fff393 100644
--- a/.github/workflows/beam_PostCommit_Python_ValidatesRunner_Samza.yml
+++ b/.github/workflows/beam_PostCommit_Python_ValidatesRunner_Samza.yml
@@ -23,12 +23,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -86,9 +86,16 @@
gradle-command: :sdks:python:test-suites:portable:py${{steps.set_py_ver_clean.outputs.py_ver_clean}}:samzaValidatesRunner
arguments: |
-PpythonVersion=${{ matrix.python_version }} \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_Python_ValidatesRunner_Spark.yml b/.github/workflows/beam_PostCommit_Python_ValidatesRunner_Spark.yml
index c1f5559..67ec4b5 100644
--- a/.github/workflows/beam_PostCommit_Python_ValidatesRunner_Spark.yml
+++ b/.github/workflows/beam_PostCommit_Python_ValidatesRunner_Spark.yml
@@ -23,12 +23,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -86,9 +86,16 @@
gradle-command: :sdks:python:test-suites:portable:py${{steps.set_py_ver_clean.outputs.py_ver_clean}}:sparkValidatesRunner
arguments: |
-PpythonVersion=${{ matrix.python_version }} \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_Python_Xlang_Gcp_Dataflow.yml b/.github/workflows/beam_PostCommit_Python_Xlang_Gcp_Dataflow.yml
index e402347..65d3f70 100644
--- a/.github/workflows/beam_PostCommit_Python_Xlang_Gcp_Dataflow.yml
+++ b/.github/workflows/beam_PostCommit_Python_Xlang_Gcp_Dataflow.yml
@@ -23,12 +23,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -67,8 +67,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
python-version: |
3.8
@@ -78,9 +78,16 @@
with:
gradle-command: :sdks:python:test-suites:dataflow:gcpCrossLanguagePostCommit
arguments: -PuseWheelDistribution
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_Python_Xlang_Gcp_Direct.yml b/.github/workflows/beam_PostCommit_Python_Xlang_Gcp_Direct.yml
index ec73464..00bcabf 100644
--- a/.github/workflows/beam_PostCommit_Python_Xlang_Gcp_Direct.yml
+++ b/.github/workflows/beam_PostCommit_Python_Xlang_Gcp_Direct.yml
@@ -23,12 +23,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -67,8 +67,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
python-version: |
3.8
@@ -77,9 +77,16 @@
uses: ./.github/actions/gradle-command-self-hosted-action
with:
gradle-command: :sdks:python:test-suites:direct:gcpCrossLanguagePostCommit
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_Python_Xlang_IO_Dataflow.yml b/.github/workflows/beam_PostCommit_Python_Xlang_IO_Dataflow.yml
index 9a4a300..0e73b0e 100644
--- a/.github/workflows/beam_PostCommit_Python_Xlang_IO_Dataflow.yml
+++ b/.github/workflows/beam_PostCommit_Python_Xlang_IO_Dataflow.yml
@@ -23,12 +23,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -67,8 +67,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
python-version: |
3.8
@@ -80,9 +80,16 @@
arguments: |
-PuseWheelDistribution \
-PkafkaBootstrapServer=10.128.0.40:9094,10.128.0.28:9094,10.128.0.165:9094 \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: archiveJunit
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_SQL.yml b/.github/workflows/beam_PostCommit_SQL.yml
index 1ad8e05..0ac8af3 100644
--- a/.github/workflows/beam_PostCommit_SQL.yml
+++ b/.github/workflows/beam_PostCommit_SQL.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit SQL script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Sickbay_Python.yml b/.github/workflows/beam_PostCommit_Sickbay_Python.yml
index 5e8b717..b01ee16 100644
--- a/.github/workflows/beam_PostCommit_Sickbay_Python.yml
+++ b/.github/workflows/beam_PostCommit_Sickbay_Python.yml
@@ -30,12 +30,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -73,8 +73,8 @@
comment_phrase: ${{ matrix.job_phrase_1 }} ${{ matrix.python_version }} ${{ matrix.job_phrase_2 }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase_1 }} ${{ matrix.python_version }} ${{ matrix.job_phrase_2 }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
python-version: ${{ matrix.python_version }}
- name: Set PY_VER_CLEAN
@@ -89,9 +89,16 @@
gradle-command: :sdks:python:test-suites:dataflow:py${{steps.set_py_ver_clean.outputs.py_ver_clean}}:postCommitSickbay
arguments: |
-PpythonVersion=${{ matrix.python_version }} \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_TransformService_Direct.yml b/.github/workflows/beam_PostCommit_TransformService_Direct.yml
index 4c9bb3a..a180adf 100644
--- a/.github/workflows/beam_PostCommit_TransformService_Direct.yml
+++ b/.github/workflows/beam_PostCommit_TransformService_Direct.yml
@@ -23,12 +23,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -68,14 +68,10 @@
comment_phrase: ${{ matrix.job_phrase }} ${{ matrix.python_version }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }} ${{ matrix.python_version }})
- - name: Set up Java 11
- uses: actions/setup-java@v3.11.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
- java-version: '11'
- - name: Install Python
- uses: actions/setup-python@v4
- with:
+ java-version: 11
python-version: |
3.8
3.11
@@ -88,9 +84,16 @@
-Pjava11Home=$JAVA_HOME_11_X64 \
-PuseWheelDistribution \
-PpythonVersion=${{ matrix.python_version }} \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_Website_Publish.yml b/.github/workflows/beam_PostCommit_Website_Publish.yml
index e5759e3..047f956 100644
--- a/.github/workflows/beam_PostCommit_Website_Publish.yml
+++ b/.github/workflows/beam_PostCommit_Website_Publish.yml
@@ -56,6 +56,8 @@
name: beam_PostCommit_Website_Publish
steps:
- uses: actions/checkout@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit Website Publish script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_Website_Test.yml b/.github/workflows/beam_PostCommit_Website_Test.yml
index a73d971..3782b9c 100644
--- a/.github/workflows/beam_PostCommit_Website_Test.yml
+++ b/.github/workflows/beam_PostCommit_Website_Test.yml
@@ -67,6 +67,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PostCommit Website Test script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PostCommit_XVR_Direct.yml b/.github/workflows/beam_PostCommit_XVR_Direct.yml
index 8a0c469..c639690 100644
--- a/.github/workflows/beam_PostCommit_XVR_Direct.yml
+++ b/.github/workflows/beam_PostCommit_XVR_Direct.yml
@@ -68,8 +68,8 @@
comment_phrase: ${{ matrix.job_phrase }} ${{ matrix.python_version }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }} ${{ matrix.python_version }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
python-version: |
3.8
diff --git a/.github/workflows/beam_PostCommit_XVR_Flink.yml b/.github/workflows/beam_PostCommit_XVR_Flink.yml
index 3e5a588..ea6e0b5 100644
--- a/.github/workflows/beam_PostCommit_XVR_Flink.yml
+++ b/.github/workflows/beam_PostCommit_XVR_Flink.yml
@@ -69,8 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }} ${{ matrix.python_version }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }} ${{ matrix.python_version }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
python-version: |
3.8
diff --git a/.github/workflows/beam_PostCommit_XVR_GoUsingJava_Dataflow.yml b/.github/workflows/beam_PostCommit_XVR_GoUsingJava_Dataflow.yml
index 64c87c9..2d37e05 100644
--- a/.github/workflows/beam_PostCommit_XVR_GoUsingJava_Dataflow.yml
+++ b/.github/workflows/beam_PostCommit_XVR_GoUsingJava_Dataflow.yml
@@ -67,10 +67,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- python-version: '3.8'
+ python-version: 3.8
- name: run XVR GoUsingJava Dataflow script
env:
USER: github-actions
diff --git a/.github/workflows/beam_PostCommit_XVR_JavaUsingPython_Dataflow.yml b/.github/workflows/beam_PostCommit_XVR_JavaUsingPython_Dataflow.yml
index 4f3cf65..494ab1e 100644
--- a/.github/workflows/beam_PostCommit_XVR_JavaUsingPython_Dataflow.yml
+++ b/.github/workflows/beam_PostCommit_XVR_JavaUsingPython_Dataflow.yml
@@ -68,8 +68,8 @@
comment_phrase: ${{ matrix.job_phrase }} ${{ matrix.python_version }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }} ${{ matrix.python_version }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
python-version: |
3.8
diff --git a/.github/workflows/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow.yml b/.github/workflows/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow.yml
index be4fcb9..52adbc2 100644
--- a/.github/workflows/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow.yml
+++ b/.github/workflows/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow.yml
@@ -23,12 +23,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -67,8 +67,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
python-version: 3.11
- name: run PostCommit XVR PythonUsingJavaSQL Dataflow script
@@ -79,9 +79,16 @@
gradle-command: :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingSql
arguments: |
-PpythonVersion=3.11 \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
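The change repeated throughout these hunks — replacing separate `actions/setup-java` and `actions/setup-python` steps with the shared `./.github/actions/setup-environment-action` — implies a composite action along these lines. This is a hedged sketch only, inferred from the `with:` blocks visible in the diff (inputs `java-version` and `python-version`); the real `action.yml` in the repository may differ.

```yaml
# Hypothetical sketch of .github/actions/setup-environment-action/action.yml,
# inferred from how the workflows in this diff call it; not the actual file.
name: 'Setup environment action'
description: 'Installs Java and/or Python as requested by the calling workflow'
inputs:
  java-version:
    description: 'Java version(s) to install; skipped when empty'
    required: false
    default: ''
  python-version:
    description: 'Python version(s) to install; skipped when empty'
    required: false
    default: ''
runs:
  using: 'composite'
  steps:
    - name: Install Java
      if: ${{ inputs.java-version != '' }}
      uses: actions/setup-java@v3
      with:
        distribution: 'temurin'
        java-version: ${{ inputs.java-version }}
    - name: Install Python
      if: ${{ inputs.python-version != '' }}
      uses: actions/setup-python@v4
      with:
        python-version: ${{ inputs.python-version }}
```

Centralizing the setup this way lets each workflow pass only the versions it needs (a single version, or a multi-line list as in the hunks above) while the distribution and caching policy live in one place.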
diff --git a/.github/workflows/beam_PostCommit_XVR_PythonUsingJava_Dataflow.yml b/.github/workflows/beam_PostCommit_XVR_PythonUsingJava_Dataflow.yml
index 0318f73..7965445 100644
--- a/.github/workflows/beam_PostCommit_XVR_PythonUsingJava_Dataflow.yml
+++ b/.github/workflows/beam_PostCommit_XVR_PythonUsingJava_Dataflow.yml
@@ -23,12 +23,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -68,8 +68,8 @@
comment_phrase: ${{ matrix.job_phrase }} ${{ matrix.python_version }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }} ${{ matrix.python_version }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
python-version: |
3.8
@@ -82,9 +82,16 @@
gradle-command: :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingJava
arguments: |
-PpythonVersion=${{ matrix.python_version }} \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PostCommit_XVR_Samza.yml b/.github/workflows/beam_PostCommit_XVR_Samza.yml
index 486a451..0f08490 100644
--- a/.github/workflows/beam_PostCommit_XVR_Samza.yml
+++ b/.github/workflows/beam_PostCommit_XVR_Samza.yml
@@ -68,8 +68,8 @@
comment_phrase: ${{ matrix.job_phrase }} ${{ matrix.python_version }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }} ${{ matrix.python_version }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
python-version: |
3.8
diff --git a/.github/workflows/beam_PostCommit_XVR_Spark3.yml b/.github/workflows/beam_PostCommit_XVR_Spark3.yml
index 5524f66..f497653 100644
--- a/.github/workflows/beam_PostCommit_XVR_Spark3.yml
+++ b/.github/workflows/beam_PostCommit_XVR_Spark3.yml
@@ -68,8 +68,8 @@
comment_phrase: ${{ matrix.job_phrase }} ${{ matrix.python_version }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }} ${{ matrix.python_version }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
python-version: |
3.8
diff --git a/.github/workflows/beam_PreCommit_Java.yml b/.github/workflows/beam_PreCommit_Java.yml
index e1a3361..ea3340b 100644
--- a/.github/workflows/beam_PreCommit_Java.yml
+++ b/.github/workflows/beam_PreCommit_Java.yml
@@ -175,6 +175,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Java PreCommit script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Amazon-Web-Services2_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Amazon-Web-Services2_IO_Direct.yml
index 0a117ec..4591fcd 100644
--- a/.github/workflows/beam_PreCommit_Java_Amazon-Web-Services2_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Amazon-Web-Services2_IO_Direct.yml
@@ -100,6 +100,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Amazon-Web-Services2 IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Amazon-Web-Services_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Amazon-Web-Services_IO_Direct.yml
index 2a80616..960c595 100644
--- a/.github/workflows/beam_PreCommit_Java_Amazon-Web-Services_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Amazon-Web-Services_IO_Direct.yml
@@ -100,6 +100,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Amazon-Web-Services IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Amqp_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Amqp_IO_Direct.yml
index 093afbe..e23ccea 100644
--- a/.github/workflows/beam_PreCommit_Java_Amqp_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Amqp_IO_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Amqp IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Azure_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Azure_IO_Direct.yml
index 013bd58..1b99717 100644
--- a/.github/workflows/beam_PreCommit_Java_Azure_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Azure_IO_Direct.yml
@@ -100,6 +100,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Azure IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Cassandra_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Cassandra_IO_Direct.yml
index 7713c02..e026a15 100644
--- a/.github/workflows/beam_PreCommit_Java_Cassandra_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Cassandra_IO_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Cassandra IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Cdap_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Cdap_IO_Direct.yml
index 08650c3..b1c3247 100644
--- a/.github/workflows/beam_PreCommit_Java_Cdap_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Cdap_IO_Direct.yml
@@ -86,6 +86,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Cdap IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Clickhouse_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Clickhouse_IO_Direct.yml
index 9d2cbcf..f20daac 100644
--- a/.github/workflows/beam_PreCommit_Java_Clickhouse_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Clickhouse_IO_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Clickhouse IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Csv_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Csv_IO_Direct.yml
index e2b331c..cf7f996 100644
--- a/.github/workflows/beam_PreCommit_Java_Csv_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Csv_IO_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Csv IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Debezium_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Debezium_IO_Direct.yml
index 5497926..ac68cc6 100644
--- a/.github/workflows/beam_PreCommit_Java_Debezium_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Debezium_IO_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Debezium IO build task
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_ElasticSearch_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_ElasticSearch_IO_Direct.yml
index ad02187..e617672 100644
--- a/.github/workflows/beam_PreCommit_Java_ElasticSearch_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_ElasticSearch_IO_Direct.yml
@@ -84,6 +84,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run ElasticSearch IO build task
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Examples_Dataflow_Java11.yml b/.github/workflows/beam_PreCommit_Java_Examples_Dataflow_Java11.yml
index b12590c..132665d 100644
--- a/.github/workflows/beam_PreCommit_Java_Examples_Dataflow_Java11.yml
+++ b/.github/workflows/beam_PreCommit_Java_Examples_Dataflow_Java11.yml
@@ -103,11 +103,10 @@
export_default_credentials: true
# The workflow installs java 11 and as default jvm. This is different from
# PreCommit_Java_Examples_Dataflow_Java17 where the build system and sources are compiled with Java8
- - name: Set up Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
- java-version: '11'
+ java-version: 11
- name: run javaExamplesDataflowPrecommit script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Examples_Dataflow_Java17.yml b/.github/workflows/beam_PreCommit_Java_Examples_Dataflow_Java17.yml
index d4a55aa..6ae9eaf 100644
--- a/.github/workflows/beam_PreCommit_Java_Examples_Dataflow_Java17.yml
+++ b/.github/workflows/beam_PreCommit_Java_Examples_Dataflow_Java17.yml
@@ -94,10 +94,9 @@
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
# The test requires Java 17 and Java 8 versions.
# Java 8 is installed second because JAVA_HOME needs to point to Java 8.
- - name: Set up Java 17 and 8
- uses: actions/setup-java@v3.11.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
java-version: |
17
8
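The comment in the hunk above notes that Java 8 is listed second so that `JAVA_HOME` ends up pointing at Java 8. With `actions/setup-java`, each version in a multi-line `java-version` list is installed in order and each installation updates `JAVA_HOME`, so the last listed version becomes the default JVM while earlier ones remain reachable through their versioned environment variables. A hedged sketch of the direct `setup-java` usage this replaces (the variable name `JAVA_HOME_17_X64` is the conventional runner form and may vary by architecture):

```yaml
# Sketch: installing two JDKs where the LAST listed version wins JAVA_HOME.
- name: Set up Java 17 and 8
  uses: actions/setup-java@v3
  with:
    distribution: 'temurin'
    java-version: |
      17
      8
# After this step JAVA_HOME points at Java 8 (listed last); Java 17 is still
# available via its versioned variable, e.g. JAVA_HOME_17_X64 on x64 runners.
```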
diff --git a/.github/workflows/beam_PreCommit_Java_File-schema-transform_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_File-schema-transform_IO_Direct.yml
index aeb9ed6..2446eb9 100644
--- a/.github/workflows/beam_PreCommit_Java_File-schema-transform_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_File-schema-transform_IO_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run File-schema-transform IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_GCP_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_GCP_IO_Direct.yml
index dc937f2..e5ef36c 100644
--- a/.github/workflows/beam_PreCommit_Java_GCP_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_GCP_IO_Direct.yml
@@ -100,6 +100,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PreCommit Java GCP IO Direct script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_HBase_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_HBase_IO_Direct.yml
index c71c464..dce33b3 100644
--- a/.github/workflows/beam_PreCommit_Java_HBase_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_HBase_IO_Direct.yml
@@ -84,6 +84,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run HBase IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_HCatalog_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_HCatalog_IO_Direct.yml
index ed99e58..8572382 100644
--- a/.github/workflows/beam_PreCommit_Java_HCatalog_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_HCatalog_IO_Direct.yml
@@ -84,6 +84,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run HCatalog IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Hadoop_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Hadoop_IO_Direct.yml
index 4f90ace..ec3a0c5 100644
--- a/.github/workflows/beam_PreCommit_Java_Hadoop_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Hadoop_IO_Direct.yml
@@ -108,6 +108,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Hadoop IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_IOs_Direct.yml b/.github/workflows/beam_PreCommit_Java_IOs_Direct.yml
index 1137d9f..4a221a9 100644
--- a/.github/workflows/beam_PreCommit_Java_IOs_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_IOs_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Java IOs PreCommit script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_InfluxDb_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_InfluxDb_IO_Direct.yml
index ac35176..67ee59b 100644
--- a/.github/workflows/beam_PreCommit_Java_InfluxDb_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_InfluxDb_IO_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run InfluxDb IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_JDBC_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_JDBC_IO_Direct.yml
index d2ef342..4b86f8b 100644
--- a/.github/workflows/beam_PreCommit_Java_JDBC_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_JDBC_IO_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run JDBC IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Jms_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Jms_IO_Direct.yml
index 223c90c..686cc75 100644
--- a/.github/workflows/beam_PreCommit_Java_Jms_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Jms_IO_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Jms IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Kafka_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Kafka_IO_Direct.yml
index 61ecc1d..86d59f7 100644
--- a/.github/workflows/beam_PreCommit_Java_Kafka_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Kafka_IO_Direct.yml
@@ -90,6 +90,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Kafka IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Kinesis_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Kinesis_IO_Direct.yml
index 8cca395..bfe0d50 100644
--- a/.github/workflows/beam_PreCommit_Java_Kinesis_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Kinesis_IO_Direct.yml
@@ -100,6 +100,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Kinesis IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Kudu_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Kudu_IO_Direct.yml
index bc3d257..55c81bb 100644
--- a/.github/workflows/beam_PreCommit_Java_Kudu_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Kudu_IO_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Kudu IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_MongoDb_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_MongoDb_IO_Direct.yml
index b8d28c1..563c871 100644
--- a/.github/workflows/beam_PreCommit_Java_MongoDb_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_MongoDb_IO_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run MongoDb IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Mqtt_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Mqtt_IO_Direct.yml
index 7b19949..b85697d 100644
--- a/.github/workflows/beam_PreCommit_Java_Mqtt_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Mqtt_IO_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Mqtt IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Neo4j_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Neo4j_IO_Direct.yml
index 26b4637..21f746e 100644
--- a/.github/workflows/beam_PreCommit_Java_Neo4j_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Neo4j_IO_Direct.yml
@@ -84,6 +84,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Neo4j IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_PVR_Flink_Batch.yml b/.github/workflows/beam_PreCommit_Java_PVR_Flink_Batch.yml
index 602bb87..eb2f85b 100644
--- a/.github/workflows/beam_PreCommit_Java_PVR_Flink_Batch.yml
+++ b/.github/workflows/beam_PreCommit_Java_PVR_Flink_Batch.yml
@@ -88,6 +88,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run validatesPortableRunnerBatch script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_PVR_Flink_Docker.yml b/.github/workflows/beam_PreCommit_Java_PVR_Flink_Docker.yml
index b2f0c66..b3b5bd5 100644
--- a/.github/workflows/beam_PreCommit_Java_PVR_Flink_Docker.yml
+++ b/.github/workflows/beam_PreCommit_Java_PVR_Flink_Docker.yml
@@ -93,6 +93,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run PreCommit Java PVR Flink Docker script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Parquet_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Parquet_IO_Direct.yml
index 4a72512..9e52618e 100644
--- a/.github/workflows/beam_PreCommit_Java_Parquet_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Parquet_IO_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Parquet IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Pulsar_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Pulsar_IO_Direct.yml
index cac0872..b60f53c 100644
--- a/.github/workflows/beam_PreCommit_Java_Pulsar_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Pulsar_IO_Direct.yml
@@ -100,6 +100,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Pulsar IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_RabbitMq_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_RabbitMq_IO_Direct.yml
index 4983a84..d917c28 100644
--- a/.github/workflows/beam_PreCommit_Java_RabbitMq_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_RabbitMq_IO_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run RabbitMq IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Redis_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Redis_IO_Direct.yml
index 51970fc..91095a0 100644
--- a/.github/workflows/beam_PreCommit_Java_Redis_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Redis_IO_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Redis IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_SingleStore_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_SingleStore_IO_Direct.yml
index 1cae0b5..b380e96 100644
--- a/.github/workflows/beam_PreCommit_Java_SingleStore_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_SingleStore_IO_Direct.yml
@@ -84,6 +84,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run SingleStore IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Snowflake_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Snowflake_IO_Direct.yml
index 7d28989..fab5fd4 100644
--- a/.github/workflows/beam_PreCommit_Java_Snowflake_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Snowflake_IO_Direct.yml
@@ -86,6 +86,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Snowflake IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Solr_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Solr_IO_Direct.yml
index b009641..06ed141 100644
--- a/.github/workflows/beam_PreCommit_Java_Solr_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Solr_IO_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Solr IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Spark3_Versions.yml b/.github/workflows/beam_PreCommit_Java_Spark3_Versions.yml
index 42b9a24..aaab301 100644
--- a/.github/workflows/beam_PreCommit_Java_Spark3_Versions.yml
+++ b/.github/workflows/beam_PreCommit_Java_Spark3_Versions.yml
@@ -91,13 +91,10 @@
service_account_key: ${{ secrets.GCP_SA_KEY }}
project_id: ${{ secrets.GCP_PROJECT_ID }}
export_default_credentials: true
- - name: Install Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'zulu'
- java-version: '8'
- cache: 'gradle'
- check-latest: true
+ java-version: 8
- name: run sparkVersionsTest script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Splunk_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Splunk_IO_Direct.yml
index 57adc64..f31ad63 100644
--- a/.github/workflows/beam_PreCommit_Java_Splunk_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Splunk_IO_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Splunk IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Thrift_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Thrift_IO_Direct.yml
index 88eabd1..bb22fa8 100644
--- a/.github/workflows/beam_PreCommit_Java_Thrift_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Thrift_IO_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Thrift IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Java_Tika_IO_Direct.yml b/.github/workflows/beam_PreCommit_Java_Tika_IO_Direct.yml
index 62f96ae..1dbb51b 100644
--- a/.github/workflows/beam_PreCommit_Java_Tika_IO_Direct.yml
+++ b/.github/workflows/beam_PreCommit_Java_Tika_IO_Direct.yml
@@ -82,6 +82,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Tika IO build script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Kotlin_Examples.yml b/.github/workflows/beam_PreCommit_Kotlin_Examples.yml
index f3e6e36..613a082 100644
--- a/.github/workflows/beam_PreCommit_Kotlin_Examples.yml
+++ b/.github/workflows/beam_PreCommit_Kotlin_Examples.yml
@@ -94,13 +94,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'zulu'
- java-version: '8'
- cache: 'gradle'
- check-latest: true
+ java-version: 8
- name: run Kotlin Examples script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Portable_Python.yml b/.github/workflows/beam_PreCommit_Portable_Python.yml
index c130246..1540600 100644
--- a/.github/workflows/beam_PreCommit_Portable_Python.yml
+++ b/.github/workflows/beam_PreCommit_Portable_Python.yml
@@ -99,16 +99,10 @@
comment_phrase: ${{ matrix.job_phrase }} ${{ matrix.python_version }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }} ${{ matrix.python_version }})
- - name: Install Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'zulu'
- java-version: '8'
- cache: 'gradle'
- check-latest: true
- - name: Install Python
- uses: actions/setup-python@v4
- with:
+ java-version: 8
python-version: |
${{ matrix.python_version }}
3.8
diff --git a/.github/workflows/beam_PreCommit_Python.yml b/.github/workflows/beam_PreCommit_Python.yml
index d507a8e..53cc2af 100644
--- a/.github/workflows/beam_PreCommit_Python.yml
+++ b/.github/workflows/beam_PreCommit_Python.yml
@@ -31,12 +31,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -98,9 +98,16 @@
-Pposargs="--ignore=apache_beam/dataframe/ --ignore=apache_beam/examples/ --ignore=apache_beam/runners/ --ignore=apache_beam/transforms/" \
-PpythonVersion=${{ matrix.python_version }} \
-PuseWheelDistribution
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PreCommit_Python_Coverage.yml b/.github/workflows/beam_PreCommit_Python_Coverage.yml
index 2127f08..e04f100 100644
--- a/.github/workflows/beam_PreCommit_Python_Coverage.yml
+++ b/.github/workflows/beam_PreCommit_Python_Coverage.yml
@@ -31,12 +31,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -88,9 +88,16 @@
gradle-command: :sdks:python:test-suites:tox:py38:preCommitPyCoverage
arguments: |
-PuseWheelDistribution
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PreCommit_Python_Dataframes.yml b/.github/workflows/beam_PreCommit_Python_Dataframes.yml
index 9211e25..8024f8d 100644
--- a/.github/workflows/beam_PreCommit_Python_Dataframes.yml
+++ b/.github/workflows/beam_PreCommit_Python_Dataframes.yml
@@ -31,12 +31,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -98,9 +98,16 @@
-Pposargs=apache_beam/dataframe/ \
-PpythonVersion=${{ matrix.python_version }} \
-PuseWheelDistribution
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PreCommit_Python_Examples.yml b/.github/workflows/beam_PreCommit_Python_Examples.yml
index 164734f..1983911 100644
--- a/.github/workflows/beam_PreCommit_Python_Examples.yml
+++ b/.github/workflows/beam_PreCommit_Python_Examples.yml
@@ -31,12 +31,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -98,9 +98,16 @@
-Pposargs=apache_beam/examples/ \
-PpythonVersion=${{ matrix.python_version }} \
-PuseWheelDistribution
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PreCommit_Python_Integration.yml b/.github/workflows/beam_PreCommit_Python_Integration.yml
index f9c32a1..c8f2233 100644
--- a/.github/workflows/beam_PreCommit_Python_Integration.yml
+++ b/.github/workflows/beam_PreCommit_Python_Integration.yml
@@ -31,12 +31,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -104,9 +104,16 @@
arguments: |
-PuseWheelDistribution \
-PpythonVersion=${{ matrix.python_version }} \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PreCommit_Python_PVR_Flink.yml b/.github/workflows/beam_PreCommit_Python_PVR_Flink.yml
index 06f1865..2725cc3 100644
--- a/.github/workflows/beam_PreCommit_Python_PVR_Flink.yml
+++ b/.github/workflows/beam_PreCommit_Python_PVR_Flink.yml
@@ -54,12 +54,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -100,8 +100,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Install Python
- uses: actions/setup-python@v4
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
python-version: 3.11
- name: run Python PVR Flink PreCommit script
@@ -112,9 +112,16 @@
gradle-command: :sdks:python:test-suites:portable:py311:flinkValidatesRunner
arguments: |
-PpythonVersion=3.11 \
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: '**/pytest*.xml'
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PreCommit_Python_Runners.yml b/.github/workflows/beam_PreCommit_Python_Runners.yml
index d170290..d622f9e 100644
--- a/.github/workflows/beam_PreCommit_Python_Runners.yml
+++ b/.github/workflows/beam_PreCommit_Python_Runners.yml
@@ -31,12 +31,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -98,9 +98,16 @@
-Pposargs=apache_beam/runners/ \
-PpythonVersion=${{ matrix.python_version }} \
-PuseWheelDistribution
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PreCommit_Python_Transforms.yml b/.github/workflows/beam_PreCommit_Python_Transforms.yml
index 7422d7d..d034296 100644
--- a/.github/workflows/beam_PreCommit_Python_Transforms.yml
+++ b/.github/workflows/beam_PreCommit_Python_Transforms.yml
@@ -31,12 +31,12 @@
#Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
actions: write
- pull-requests: read
- checks: read
+ pull-requests: write
+ checks: write
contents: read
deployments: read
id-token: none
- issues: read
+ issues: write
discussions: read
packages: read
pages: read
@@ -98,9 +98,16 @@
-Pposargs=apache_beam/transforms/ \
-PpythonVersion=${{ matrix.python_version }} \
-PuseWheelDistribution
- - name: Archive code coverage results
+ - name: Archive Python Test Results
uses: actions/upload-artifact@v3
+ if: failure()
+ with:
+ name: Python Test Results
+ path: '**/pytest*.xml'
+ - name: Publish Python Test Results
+ uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
- name: python-code-coverage-report
- path: "**/pytest*.xml"
\ No newline at end of file
+ commit: '${{ env.prsha || env.GITHUB_SHA }}'
+ comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
+ files: '**/pytest*.xml'
\ No newline at end of file
diff --git a/.github/workflows/beam_PreCommit_SQL.yml b/.github/workflows/beam_PreCommit_SQL.yml
index 1fe577d..f4a4e30 100644
--- a/.github/workflows/beam_PreCommit_SQL.yml
+++ b/.github/workflows/beam_PreCommit_SQL.yml
@@ -78,11 +78,10 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Set up Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
- java-version: '11'
+ java-version: 11
- name: Build and Test
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_SQL_Java11.yml b/.github/workflows/beam_PreCommit_SQL_Java11.yml
index 4acbecc..92b8fe3 100644
--- a/.github/workflows/beam_PreCommit_SQL_Java11.yml
+++ b/.github/workflows/beam_PreCommit_SQL_Java11.yml
@@ -78,19 +78,12 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
- - name: Set up Java
- uses: actions/setup-java@v3.8.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
- java-version: '11'
- - name: Install Go
- uses: actions/setup-go@v4
- with:
- go-version: '1.21'
- - name: Install Python
- uses: actions/setup-python@v4
- with:
- python-version: '3.8'
+ java-version: 11
+ python-version: 3.8
+ go-version: 1.21
- name: Install Flutter
uses: subosito/flutter-action@v2
with:
diff --git a/.github/workflows/beam_PreCommit_SQL_Java17.yml b/.github/workflows/beam_PreCommit_SQL_Java17.yml
index a5899ff..7d66171 100644
--- a/.github/workflows/beam_PreCommit_SQL_Java17.yml
+++ b/.github/workflows/beam_PreCommit_SQL_Java17.yml
@@ -80,20 +80,13 @@
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
# The test requires Java 17 and Java 8 versions.
# Java 8 is installed second because JAVA_HOME needs to point to Java 8.
- - name: Set up Java 17
- uses: actions/setup-java@v3.11.0
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
with:
- distribution: 'temurin'
- java-version: '17'
- - name: Set up Java 8
- uses: actions/setup-java@v3.11.0
- with:
- distribution: 'temurin'
- java-version: '8'
- - name: Install Python
- uses: actions/setup-python@v4
- with:
- python-version: '3.8'
+ java-version: |
+ 17
+ 8
+ python-version: 3.8
- name: Build and Test
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_PreCommit_Spotless.yml b/.github/workflows/beam_PreCommit_Spotless.yml
index a637add..b4bd436 100644
--- a/.github/workflows/beam_PreCommit_Spotless.yml
+++ b/.github/workflows/beam_PreCommit_Spotless.yml
@@ -92,6 +92,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: run Spotless PreCommit script
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_Prober_CommunityMetrics.yml b/.github/workflows/beam_Prober_CommunityMetrics.yml
index d22b7db..ce788ac 100644
--- a/.github/workflows/beam_Prober_CommunityMetrics.yml
+++ b/.github/workflows/beam_Prober_CommunityMetrics.yml
@@ -69,6 +69,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Health check probes for the Community Metrics infrastructure
uses: ./.github/actions/gradle-command-self-hosted-action
with:
diff --git a/.github/workflows/beam_Publish_Beam_SDK_Snapshots.yml b/.github/workflows/beam_Publish_Beam_SDK_Snapshots.yml
index 9bc268b..9e48325 100644
--- a/.github/workflows/beam_Publish_Beam_SDK_Snapshots.yml
+++ b/.github/workflows/beam_Publish_Beam_SDK_Snapshots.yml
@@ -60,7 +60,16 @@
matrix:
job_name: ["beam_Publish_Beam_SDK_Snapshots"]
job_phrase: ["N/A"]
- container_task: ["go:container", "java:container:java8", "java:container:java11", "java:container:java17", "python:container:py38", "python:container:py39", "python:container:py310", "python:container:py311"]
+ container_task:
+ - "go:container"
+ - "java:container:java8"
+ - "java:container:java11"
+ - "java:container:java17"
+ - "java:container:java21"
+ - "python:container:py38"
+ - "python:container:py39"
+ - "python:container:py310"
+ - "python:container:py311"
steps:
- uses: actions/checkout@v3
- name: Setup repository
diff --git a/.github/workflows/beam_Publish_Docker_Snapshots.yml b/.github/workflows/beam_Publish_Docker_Snapshots.yml
index cf2520f..01b846e 100644
--- a/.github/workflows/beam_Publish_Docker_Snapshots.yml
+++ b/.github/workflows/beam_Publish_Docker_Snapshots.yml
@@ -68,6 +68,8 @@
comment_phrase: ${{ matrix.job_phrase }}
github_token: ${{ secrets.GITHUB_TOKEN }}
github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
+ - name: Setup environment
+ uses: ./.github/actions/setup-environment-action
- name: Authenticate on GCP
uses: google-github-actions/setup-gcloud@v0
with:
diff --git a/.test-infra/mock-apis/README.md b/.test-infra/mock-apis/README.md
new file mode 100644
index 0000000..0165421
--- /dev/null
+++ b/.test-infra/mock-apis/README.md
@@ -0,0 +1,116 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+-->
+
+# Overview
+
+This directory holds code and related artifacts to support API related
+integration tests.
+
+## System overview
+
+The diagram below summarizes the system design. Integration tests use an API
+client that makes calls to a backend service. Prior to fulfilling the response,
+the service checks and decrements a quota. Said quota persists in a backend
+redis instance that is refreshed on an interval by the
+[Refresher](./src/main/go/cmd/service/refresher).
+
+## Echo Service
+
+The [Echo Service](./src/main/go/cmd/service/echo) implements a simple gRPC
+service that echoes a payload. See [echo.proto](./proto/echo/v1/echo.proto)
+for details.
+
+```mermaid
+flowchart LR
+ echoClient --> echoSvc
+ subgraph "Integration Tests"
+ echoClient[Echo Client]
+ end
+ subgraph Backend
+ echoSvc[Echo Service<./src/main/go/cmd/service/echo>]
+ refresher[Refresher<./src/main/go/cmd/service/refresher>]
+ redis[redis://:6379]
+ refresher -- SetQuota(<string>,<int64>,<time.Duration>) --> redis
+ echoSvc -- DecrementQuota(<string>) --> redis
+ end
+```
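
The quota flow in the diagram reduces to two operations: the Refresher periodically resets a counter, and the Echo service decrements it once per request, rejecting calls after it reaches zero. The sketch below models those semantics with an in-memory map standing in for redis; the names `SetQuota` and `DecrementQuota` mirror the diagram, not the actual service code.

```go
package main

import (
	"errors"
	"fmt"
	"sync"
)

// quotaStore is an in-memory stand-in for the redis instance in the
// diagram: the Refresher calls SetQuota on an interval and the Echo
// service calls DecrementQuota once per fulfilled request.
type quotaStore struct {
	mu     sync.Mutex
	quotas map[string]int64
}

// SetQuota resets the remaining quota for id to size.
func (s *quotaStore) SetQuota(id string, size int64) {
	s.mu.Lock()
	defer s.mu.Unlock()
	s.quotas[id] = size
}

var errQuotaExhausted = errors.New("quota exhausted")

// DecrementQuota consumes one unit of quota, failing once it is spent.
func (s *quotaStore) DecrementQuota(id string) error {
	s.mu.Lock()
	defer s.mu.Unlock()
	n, ok := s.quotas[id]
	if !ok || n <= 0 {
		return errQuotaExhausted
	}
	s.quotas[id] = n - 1
	return nil
}

func main() {
	s := &quotaStore{quotas: map[string]int64{}}
	s.SetQuota("echo-quota", 2)
	for i := 0; i < 3; i++ {
		fmt.Println(s.DecrementQuota("echo-quota"))
	}
}
```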
+
+# Writing Integration Tests
+
+TODO: See https://github.com/apache/beam/issues/28859
+
+# Development Dependencies
+
+| Dependency | Reason |
+|-----------------------------------------------------|----------------------------------------------------------------------------------------|
+| [go](https://go.dev) | For making code changes in this directory. See [go.mod](go.mod) for required version. |
+| [buf](https://github.com/bufbuild/buf#installation) | Optional; only needed when changing the protos. |
+| [ko](https://ko.build/install/) | To easily build Go container images. |
+
+# Testing
+
+## Unit
+
+To run unit tests in this project, execute the following command:
+
+```
+go test ./src/main/go/internal/...
+```
+
+## Integration
+
+TODO: See https://github.com/apache/beam/issues/28859
+
+# Local Usage
+
+## Requirements
+
+To execute the services on your local machine, you'll need [redis](https://redis.io/docs/getting-started/installation/).
+
+## Execute services
+
+Follow these steps to run the services on your local machine.
+
+1. Start redis
+
+ Start redis using the following command.
+ ```
+ redis-server
+ ```
+
+1. Start the refresher service in a new terminal.
+ ```
+ export CACHE_HOST=localhost:6379; \
+ export QUOTA_ID=$(uuidgen); \
+ export QUOTA_REFRESH_INTERVAL=10s; \
+ export QUOTA_SIZE=100; \
+ go run ./src/main/go/cmd/service/refresher
+ ```
+1. Start the echo service in a new terminal.
+ ```
+ export HTTP_PORT=8080; \
+ export GRPC_PORT=50051; \
+ export CACHE_HOST=localhost:6379; \
+ go run ./src/main/go/cmd/service/echo
+ ```
+
+# Deployment
+
+TODO: See https://github.com/apache/beam/issues/28709
diff --git a/.test-infra/mock-apis/buf.gen.yaml b/.test-infra/mock-apis/buf.gen.yaml
new file mode 100644
index 0000000..31e57ff
--- /dev/null
+++ b/.test-infra/mock-apis/buf.gen.yaml
@@ -0,0 +1,40 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements. See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# buf.gen.yaml configures proto stub generation using buf.
+#
+# Requirements:
+# - go install google.golang.org/protobuf/cmd/protoc-gen-go@latest
+# - go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@latest
+# - https://github.com/grpc/grpc-java/blob/master/compiler/README.md#grpc-java-codegen-plugin-for-protobuf-compiler
+# - https://grpc.io/docs/languages/python/quickstart/#grpc-tools
+#
+# Usage:
+# Open a terminal in the same directory as this file and run:
+#
+# buf generate
+#
+# See https://buf.build/docs/ for more details.
+
+version: v1
+plugins:
+- name: go
+ out: src/main/go/internal
+- name: go-grpc
+ out: src/main/go/internal
+- name: java
+ out: src/main/java
+- name: grpc-java
+ out: src/main/java
\ No newline at end of file
diff --git a/.test-infra/mock-apis/buf.lock b/.test-infra/mock-apis/buf.lock
new file mode 100644
index 0000000..1304ceb9
--- /dev/null
+++ b/.test-infra/mock-apis/buf.lock
@@ -0,0 +1,7 @@
+# Generated by buf. DO NOT EDIT.
+version: v1
+deps:
+ - remote: buf.build
+ owner: googleapis
+ repository: googleapis
+ commit: 28151c0d0a1641bf938a7672c500e01d
diff --git a/.test-infra/mock-apis/buf.yaml b/.test-infra/mock-apis/buf.yaml
new file mode 100644
index 0000000..419e020
--- /dev/null
+++ b/.test-infra/mock-apis/buf.yaml
@@ -0,0 +1,20 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements. See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# Configures buf to include various proto dependencies.
+# See buf.build for details.
+version: v1
+deps:
+- buf.build/googleapis/googleapis
\ No newline at end of file
diff --git a/.test-infra/mock-apis/build.gradle b/.test-infra/mock-apis/build.gradle
new file mode 100644
index 0000000..64b7e8c
--- /dev/null
+++ b/.test-infra/mock-apis/build.gradle
@@ -0,0 +1,44 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * License); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an AS IS BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+plugins {
+  id 'org.apache.beam.module'
+}
+
+applyJavaNature(
+  exportJavadoc: false,
+  publish: false,
+)
+
+description = "Apache Beam :: Test Infra :: Mock APIs"
+ext.summary = "Supports API related integration tests."
+
+def guavaVersion = "31.1-jre"
+def ioGrpcApiVersion = "1.53.0"
+def ioGrpcVersion = "1.55.1"
+def protobufJavaVersion = "3.23.2"
+
+dependencies {
+
+  // Required by autogenerated proto classes.
+  implementation "io.grpc:grpc-api:${ioGrpcApiVersion}"
+  implementation "com.google.guava:guava:${guavaVersion}"
+  implementation "io.grpc:grpc-protobuf:${ioGrpcVersion}"
+  implementation "com.google.protobuf:protobuf-java:${protobufJavaVersion}"
+  implementation "io.grpc:grpc-stub:${ioGrpcVersion}"
+}
\ No newline at end of file
diff --git a/.test-infra/mock-apis/go.mod b/.test-infra/mock-apis/go.mod
new file mode 100644
index 0000000..6f88254
--- /dev/null
+++ b/.test-infra/mock-apis/go.mod
@@ -0,0 +1,58 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements. See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License. You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// This module contains the Go code for the mock API services and related
+// tooling used by API-related integration tests in this directory.
+module github.com/apache/beam/test-infra/mock-apis
+
+go 1.21
+
+require (
+ cloud.google.com/go/logging v1.8.1
+ cloud.google.com/go/monitoring v1.16.0
+ github.com/google/go-cmp v0.5.9
+ github.com/redis/go-redis/v9 v9.2.1
+ google.golang.org/genproto/googleapis/api v0.0.0-20230822172742-b8732ec3820d
+ google.golang.org/grpc v1.58.2
+ google.golang.org/protobuf v1.31.0
+)
+
+require (
+ cloud.google.com/go v0.110.6 // indirect
+ cloud.google.com/go/compute v1.23.0 // indirect
+ cloud.google.com/go/compute/metadata v0.2.3 // indirect
+ cloud.google.com/go/longrunning v0.5.1 // indirect
+ github.com/cespare/xxhash/v2 v2.2.0 // indirect
+ github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f // indirect
+ github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da // indirect
+ github.com/golang/protobuf v1.5.3 // indirect
+ github.com/google/s2a-go v0.1.4 // indirect
+ github.com/googleapis/enterprise-certificate-proxy v0.2.4 // indirect
+ github.com/googleapis/gax-go/v2 v2.12.0 // indirect
+ go.opencensus.io v0.24.0 // indirect
+ golang.org/x/crypto v0.13.0 // indirect
+ golang.org/x/net v0.15.0 // indirect
+ golang.org/x/oauth2 v0.12.0 // indirect
+ golang.org/x/sync v0.3.0 // indirect
+ golang.org/x/sys v0.12.0 // indirect
+ golang.org/x/text v0.13.0 // indirect
+ google.golang.org/api v0.128.0 // indirect
+ google.golang.org/appengine v1.6.7 // indirect
+ google.golang.org/genproto v0.0.0-20230803162519-f966b187b2e5 // indirect
+ google.golang.org/genproto/googleapis/rpc v0.0.0-20230822172742-b8732ec3820d // indirect
+)
diff --git a/.test-infra/mock-apis/go.sum b/.test-infra/mock-apis/go.sum
new file mode 100644
index 0000000..56d3a2b
--- /dev/null
+++ b/.test-infra/mock-apis/go.sum
@@ -0,0 +1,214 @@
+cloud.google.com/go v0.26.0/go.mod h1:aQUYkXzVsufM+DwF1aE+0xfcU+56JwCaLick0ClmMTw=
+cloud.google.com/go v0.34.0/go.mod h1:aQUYkXzVsufM+DwF1aE+0xfcU+56JwCaLick0ClmMTw=
+cloud.google.com/go v0.110.6 h1:8uYAkj3YHTP/1iwReuHPxLSbdcyc+dSBbzFMrVwDR6Q=
+cloud.google.com/go v0.110.6/go.mod h1:+EYjdK8e5RME/VY/qLCAtuyALQ9q67dvuum8i+H5xsI=
+cloud.google.com/go/compute v1.23.0 h1:tP41Zoavr8ptEqaW6j+LQOnyBBhO7OkOMAGrgLopTwY=
+cloud.google.com/go/compute v1.23.0/go.mod h1:4tCnrn48xsqlwSAiLf1HXMQk8CONslYbdiEZc9FEIbM=
+cloud.google.com/go/compute/metadata v0.2.3 h1:mg4jlk7mCAj6xXp9UJ4fjI9VUI5rubuGBW5aJ7UnBMY=
+cloud.google.com/go/compute/metadata v0.2.3/go.mod h1:VAV5nSsACxMJvgaAuX6Pk2AawlZn8kiOGuCv6gTkwuA=
+cloud.google.com/go/iam v1.1.1 h1:lW7fzj15aVIXYHREOqjRBV9PsH0Z6u8Y46a1YGvQP4Y=
+cloud.google.com/go/iam v1.1.1/go.mod h1:A5avdyVL2tCppe4unb0951eI9jreack+RJ0/d+KUZOU=
+cloud.google.com/go/logging v1.8.1 h1:26skQWPeYhvIasWKm48+Eq7oUqdcdbwsCVwz5Ys0FvU=
+cloud.google.com/go/logging v1.8.1/go.mod h1:TJjR+SimHwuC8MZ9cjByQulAMgni+RkXeI3wwctHJEI=
+cloud.google.com/go/longrunning v0.5.1 h1:Fr7TXftcqTudoyRJa113hyaqlGdiBQkp0Gq7tErFDWI=
+cloud.google.com/go/longrunning v0.5.1/go.mod h1:spvimkwdz6SPWKEt/XBij79E9fiTkHSQl/fRUUQJYJc=
+cloud.google.com/go/monitoring v1.16.0 h1:rlndy4K8yknMY9JuGe2aK4SbCh21FXoCdX7SAGHmRgI=
+cloud.google.com/go/monitoring v1.16.0/go.mod h1:Ptp15HgAyM1fNICAojDMoNc/wUmn67mLHQfyqbw+poY=
+github.com/BurntSushi/toml v0.3.1/go.mod h1:xHWCNGjB5oqiDr8zfno3MHue2Ht5sIBksp03qcyfWMU=
+github.com/antihax/optional v1.0.0/go.mod h1:uupD/76wgC+ih3iEmQUL+0Ugr19nfwCT1kdvxnR2qWY=
+github.com/bsm/ginkgo/v2 v2.12.0 h1:Ny8MWAHyOepLGlLKYmXG4IEkioBysk6GpaRTLC8zwWs=
+github.com/bsm/ginkgo/v2 v2.12.0/go.mod h1:SwYbGRRDovPVboqFv0tPTcG1sN61LM1Z4ARdbAV9g4c=
+github.com/bsm/gomega v1.27.10 h1:yeMWxP2pV2fG3FgAODIY8EiRE3dy0aeFYt4l7wh6yKA=
+github.com/bsm/gomega v1.27.10/go.mod h1:JyEr/xRbxbtgWNi8tIEVPUYZ5Dzef52k01W3YH0H+O0=
+github.com/census-instrumentation/opencensus-proto v0.2.1/go.mod h1:f6KPmirojxKA12rnyqOA5BBL4O983OfeGPqjHWSTneU=
+github.com/cespare/xxhash/v2 v2.1.1/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=
+github.com/cespare/xxhash/v2 v2.2.0 h1:DC2CZ1Ep5Y4k3ZQ899DldepgrayRUGE6BBZ/cd9Cj44=
+github.com/cespare/xxhash/v2 v2.2.0/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=
+github.com/client9/misspell v0.3.4/go.mod h1:qj6jICC3Q7zFZvVWo7KLAzC3yx5G7kyvSDkc90ppPyw=
+github.com/cncf/udpa/go v0.0.0-20191209042840-269d4d468f6f/go.mod h1:M8M6+tZqaGXZJjfX53e64911xZQV5JYwmTeXPW+k8Sc=
+github.com/cncf/udpa/go v0.0.0-20201120205902-5459f2c99403/go.mod h1:WmhPx2Nbnhtbo57+VJT5O0JRkEi1Wbu0z5j0R8u5Hbk=
+github.com/cncf/udpa/go v0.0.0-20210930031921-04548b0d99d4/go.mod h1:6pvJx4me5XPnfI9Z40ddWsdw2W/uZgQLFXToKeRcDiI=
+github.com/cncf/xds/go v0.0.0-20210805033703-aa0b78936158/go.mod h1:eXthEFrGJvWHgFFCl3hGmgk+/aYT6PnTQLykKQRLhEs=
+github.com/cncf/xds/go v0.0.0-20210922020428-25de7278fc84/go.mod h1:eXthEFrGJvWHgFFCl3hGmgk+/aYT6PnTQLykKQRLhEs=
+github.com/cncf/xds/go v0.0.0-20211011173535-cb28da3451f1/go.mod h1:eXthEFrGJvWHgFFCl3hGmgk+/aYT6PnTQLykKQRLhEs=
+github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
+github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
+github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f h1:lO4WD4F/rVNCu3HqELle0jiPLLBs70cWOduZpkS1E78=
+github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f/go.mod h1:cuUVRXasLTGF7a8hSLbxyZXjz+1KgoB3wDUb6vlszIc=
+github.com/envoyproxy/go-control-plane v0.9.0/go.mod h1:YTl/9mNaCwkRvm6d1a2C3ymFceY/DCBVvsKhRF0iEA4=
+github.com/envoyproxy/go-control-plane v0.9.1-0.20191026205805-5f8ba28d4473/go.mod h1:YTl/9mNaCwkRvm6d1a2C3ymFceY/DCBVvsKhRF0iEA4=
+github.com/envoyproxy/go-control-plane v0.9.4/go.mod h1:6rpuAdCZL397s3pYoYcLgu1mIlRU8Am5FuJP05cCM98=
+github.com/envoyproxy/go-control-plane v0.9.9-0.20201210154907-fd9021fe5dad/go.mod h1:cXg6YxExXjJnVBQHBLXeUAgxn2UodCpnH306RInaBQk=
+github.com/envoyproxy/go-control-plane v0.9.10-0.20210907150352-cf90f659a021/go.mod h1:AFq3mo9L8Lqqiid3OhADV3RfLJnjiw63cSpi+fDTRC0=
+github.com/envoyproxy/protoc-gen-validate v0.1.0/go.mod h1:iSmxcyjqTsJpI2R4NaDN7+kN2VEUnK/pcBlmesArF7c=
+github.com/ghodss/yaml v1.0.0/go.mod h1:4dBDuWmgqj2HViK6kFavaiC9ZROes6MMH2rRYeMEF04=
+github.com/golang/glog v0.0.0-20160126235308-23def4e6c14b/go.mod h1:SBH7ygxi8pfUlaOkMMuAQtPIUF8ecWP5IEl/CR7VP2Q=
+github.com/golang/groupcache v0.0.0-20200121045136-8c9f03a8e57e/go.mod h1:cIg4eruTrX1D+g88fzRXU5OdNfaM+9IcxsU14FzY7Hc=
+github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da h1:oI5xCqsCo564l8iNU+DwB5epxmsaqB+rhGL0m5jtYqE=
+github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da/go.mod h1:cIg4eruTrX1D+g88fzRXU5OdNfaM+9IcxsU14FzY7Hc=
+github.com/golang/mock v1.1.1/go.mod h1:oTYuIxOrZwtPieC+H1uAHpcLFnEyAGVDL/k47Jfbm0A=
+github.com/golang/protobuf v1.2.0/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
+github.com/golang/protobuf v1.3.1/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
+github.com/golang/protobuf v1.3.2/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
+github.com/golang/protobuf v1.3.3/go.mod h1:vzj43D7+SQXF/4pzW/hwtAqwc6iTitCiVSaWz5lYuqw=
+github.com/golang/protobuf v1.4.0-rc.1/go.mod h1:ceaxUfeHdC40wWswd/P6IGgMaK3YpKi5j83Wpe3EHw8=
+github.com/golang/protobuf v1.4.0-rc.1.0.20200221234624-67d41d38c208/go.mod h1:xKAWHe0F5eneWXFV3EuXVDTCmh+JuBKY0li0aMyXATA=
+github.com/golang/protobuf v1.4.0-rc.2/go.mod h1:LlEzMj4AhA7rCAGe4KMBDvJI+AwstrUpVNzEA03Pprs=
+github.com/golang/protobuf v1.4.0-rc.4.0.20200313231945-b860323f09d0/go.mod h1:WU3c8KckQ9AFe+yFwt9sWVRKCVIyN9cPHBJSNnbL67w=
+github.com/golang/protobuf v1.4.0/go.mod h1:jodUvKwWbYaEsadDk5Fwe5c77LiNKVO9IDvqG2KuDX0=
+github.com/golang/protobuf v1.4.1/go.mod h1:U8fpvMrcmy5pZrNK1lt4xCsGvpyWQ/VVv6QDs8UjoX8=
+github.com/golang/protobuf v1.4.2/go.mod h1:oDoupMAO8OvCJWAcko0GGGIgR6R6ocIYbsSw735rRwI=
+github.com/golang/protobuf v1.4.3/go.mod h1:oDoupMAO8OvCJWAcko0GGGIgR6R6ocIYbsSw735rRwI=
+github.com/golang/protobuf v1.5.0/go.mod h1:FsONVRAS9T7sI+LIUmWTfcYkHO4aIWwzhcaSAoJOfIk=
+github.com/golang/protobuf v1.5.2/go.mod h1:XVQd3VNwM+JqD3oG2Ue2ip4fOMUkwXdXDdiuN0vRsmY=
+github.com/golang/protobuf v1.5.3 h1:KhyjKVUg7Usr/dYsdSqoFveMYd5ko72D+zANwlG1mmg=
+github.com/golang/protobuf v1.5.3/go.mod h1:XVQd3VNwM+JqD3oG2Ue2ip4fOMUkwXdXDdiuN0vRsmY=
+github.com/google/go-cmp v0.2.0/go.mod h1:oXzfMopK8JAjlY9xF4vHSVASa0yLyX7SntLO5aqRK0M=
+github.com/google/go-cmp v0.3.0/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU=
+github.com/google/go-cmp v0.3.1/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU=
+github.com/google/go-cmp v0.4.0/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
+github.com/google/go-cmp v0.5.0/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
+github.com/google/go-cmp v0.5.3/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
+github.com/google/go-cmp v0.5.5/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
+github.com/google/go-cmp v0.5.9 h1:O2Tfq5qg4qc4AmwVlvv0oLiVAGB7enBSJ2x2DqQFi38=
+github.com/google/go-cmp v0.5.9/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY=
+github.com/google/s2a-go v0.1.4 h1:1kZ/sQM3srePvKs3tXAvQzo66XfcReoqFpIpIccE7Oc=
+github.com/google/s2a-go v0.1.4/go.mod h1:Ej+mSEMGRnqRzjc7VtF+jdBwYG5fuJfiZ8ELkjEwM0A=
+github.com/google/uuid v1.1.2/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
+github.com/googleapis/enterprise-certificate-proxy v0.2.4 h1:uGy6JWR/uMIILU8wbf+OkstIrNiMjGpEIyhx8f6W7s4=
+github.com/googleapis/enterprise-certificate-proxy v0.2.4/go.mod h1:AwSRAtLfXpU5Nm3pW+v7rGDHp09LsPtGY9MduiEsR9k=
+github.com/googleapis/gax-go/v2 v2.12.0 h1:A+gCJKdRfqXkr+BIRGtZLibNXf0m1f9E4HG56etFpas=
+github.com/googleapis/gax-go/v2 v2.12.0/go.mod h1:y+aIqrI5eb1YGMVJfuV3185Ts/D7qKpsEkdD5+I6QGU=
+github.com/grpc-ecosystem/grpc-gateway v1.16.0/go.mod h1:BDjrQk3hbvj6Nolgz8mAMFbcEtjT1g+wF4CSlocrBnw=
+github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
+github.com/prometheus/client_model v0.0.0-20190812154241-14fe0d1b01d4/go.mod h1:xMI15A0UPsDsEKsMN9yxemIoYk6Tm2C1GtYGdfGttqA=
+github.com/redis/go-redis/v9 v9.2.1 h1:WlYJg71ODF0dVspZZCpYmoF1+U1Jjk9Rwd7pq6QmlCg=
+github.com/redis/go-redis/v9 v9.2.1/go.mod h1:hdY0cQFCN4fnSYT6TkisLufl/4W5UIXyv0b/CLO2V2M=
+github.com/rogpeppe/fastuuid v1.2.0/go.mod h1:jVj6XXZzXRy/MSR5jhDC/2q6DgLz+nrA6LYCDYWNEvQ=
+github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
+github.com/stretchr/objx v0.4.0/go.mod h1:YvHI0jy2hoMjB+UWwv71VJQ9isScKT/TqJzVSSt89Yw=
+github.com/stretchr/objx v0.5.0/go.mod h1:Yh+to48EsGEfYuaHDzXPcE3xhTkx73EhmCGUpEOglKo=
+github.com/stretchr/testify v1.5.1/go.mod h1:5W2xD1RspED5o8YsWQXVCued0rvSQ+mT+I5cxcmMvtA=
+github.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
+github.com/stretchr/testify v1.7.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
+github.com/stretchr/testify v1.8.0/go.mod h1:yNjHg4UonilssWZ8iaSj1OCr/vHnekPRkoO+kdMU+MU=
+github.com/stretchr/testify v1.8.1/go.mod h1:w2LPCIKwWwSfY2zedu0+kehJoqGctiVI29o6fzry7u4=
+github.com/yuin/goldmark v1.4.13/go.mod h1:6yULJ656Px+3vBD8DxQVa3kxgyrAnzto9xy5taEt/CY=
+go.opencensus.io v0.24.0 h1:y73uSU6J157QMP2kn2r30vwW1A2W2WFwSCGnAVxeaD0=
+go.opencensus.io v0.24.0/go.mod h1:vNK8G9p7aAivkbmorf4v+7Hgx+Zs0yY+0fOtgBfjQKo=
+go.opentelemetry.io/proto/otlp v0.7.0/go.mod h1:PqfVotwruBrMGOCsRd/89rSnXhoiJIqeYNgFYFoEGnI=
+golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
+golang.org/x/crypto v0.0.0-20200622213623-75b288015ac9/go.mod h1:LzIPMQfyMNhhGPhUkYOs5KpL4U8rLKemX1yGLhDgUto=
+golang.org/x/crypto v0.0.0-20210921155107-089bfa567519/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=
+golang.org/x/crypto v0.0.0-20220314234659-1baeb1ce4c0b/go.mod h1:IxCIyHEi3zRg3s0A5j5BB6A9Jmi73HwBIUl50j+osU4=
+golang.org/x/crypto v0.13.0 h1:mvySKfSWJ+UKUii46M40LOvyWfN0s2U+46/jDd0e6Ck=
+golang.org/x/crypto v0.13.0/go.mod h1:y6Z2r+Rw4iayiXXAIxJIDAJ1zMW4yaTpebo8fPOliYc=
+golang.org/x/exp v0.0.0-20190121172915-509febef88a4/go.mod h1:CJ0aWSM057203Lf6IL+f9T1iT9GByDxfZKAQTCR3kQA=
+golang.org/x/lint v0.0.0-20181026193005-c67002cb31c3/go.mod h1:UVdnD1Gm6xHRNCYTkRU2/jEulfH38KcIWyp/GAMgvoE=
+golang.org/x/lint v0.0.0-20190227174305-5b3e6a55c961/go.mod h1:wehouNa3lNwaWXcvxsM5YxQ5yQlVC4a0KAMCusXpPoU=
+golang.org/x/lint v0.0.0-20190313153728-d0100b6bd8b3/go.mod h1:6SW0HCj/g11FgYtHlgUYUwCkIfeOF89ocIRzGO/8vkc=
+golang.org/x/mod v0.6.0-dev.0.20220419223038-86c51ed26bb4/go.mod h1:jJ57K6gSWd91VN4djpZkiMVwK6gcyfeH4XE8wZrZaV4=
+golang.org/x/net v0.0.0-20180724234803-3673e40ba225/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
+golang.org/x/net v0.0.0-20180826012351-8a410e7b638d/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
+golang.org/x/net v0.0.0-20190108225652-1e06a53dbb7e/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
+golang.org/x/net v0.0.0-20190213061140-3a22650c66bd/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
+golang.org/x/net v0.0.0-20190311183353-d8887717615a/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=
+golang.org/x/net v0.0.0-20190404232315-eb5bcb51f2a3/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=
+golang.org/x/net v0.0.0-20190603091049-60506f45cf65/go.mod h1:HSz+uSET+XFnRR8LxR5pz3Of3rY3CfYBVs4xY44aLks=
+golang.org/x/net v0.0.0-20190620200207-3b0461eec859/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=
+golang.org/x/net v0.0.0-20200822124328-c89045814202/go.mod h1:/O7V0waA8r7cgGh81Ro3o1hOxt32SMVPicZroKQ2sZA=
+golang.org/x/net v0.0.0-20201110031124-69a78807bb2b/go.mod h1:sp8m0HH+o8qH0wwXwYZr8TS3Oi6o0r6Gce1SSxlDquU=
+golang.org/x/net v0.0.0-20210226172049-e18ecbb05110/go.mod h1:m0MpNAwzfU5UDzcl9v0D8zg8gWTRqZa9RBIspLL5mdg=
+golang.org/x/net v0.0.0-20211112202133-69e39bad7dc2/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
+golang.org/x/net v0.0.0-20220722155237-a158d28d115b/go.mod h1:XRhObCWvk6IyKnWLug+ECip1KBveYUHfp+8e9klMJ9c=
+golang.org/x/net v0.15.0 h1:ugBLEUaxABaB5AJqW9enI0ACdci2RUd4eP51NTBvuJ8=
+golang.org/x/net v0.15.0/go.mod h1:idbUs1IY1+zTqbi8yxTbhexhEEk5ur9LInksu6HrEpk=
+golang.org/x/oauth2 v0.0.0-20180821212333-d2e6202438be/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=
+golang.org/x/oauth2 v0.0.0-20200107190931-bf48bf16ab8d/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=
+golang.org/x/oauth2 v0.12.0 h1:smVPGxink+n1ZI5pkQa8y6fZT0RW0MgCO5bFpepy4B4=
+golang.org/x/oauth2 v0.12.0/go.mod h1:A74bZ3aGXgCY0qaIC9Ahg6Lglin4AMAco8cIv9baba4=
+golang.org/x/sync v0.0.0-20180314180146-1d60e4601c6f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
+golang.org/x/sync v0.0.0-20181108010431-42b317875d0f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
+golang.org/x/sync v0.0.0-20181221193216-37e7f081c4d4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
+golang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
+golang.org/x/sync v0.0.0-20220722155255-886fb9371eb4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
+golang.org/x/sync v0.3.0 h1:ftCYgMx6zT/asHUrPw8BLLscYtGznsLAnjq5RH9P66E=
+golang.org/x/sync v0.3.0/go.mod h1:FU7BRWz2tNW+3quACPkgCx/L+uEAv1htQ0V83Z9Rj+Y=
+golang.org/x/sys v0.0.0-20180830151530-49385e6e1522/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
+golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
+golang.org/x/sys v0.0.0-20190412213103-97732733099d/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
+golang.org/x/sys v0.0.0-20200323222414-85ca7c5b95cd/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
+golang.org/x/sys v0.0.0-20200930185726-fdedc70b468f/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
+golang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
+golang.org/x/sys v0.0.0-20210423082822-04245dca01da/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
+golang.org/x/sys v0.0.0-20210615035016-665e8c7367d1/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
+golang.org/x/sys v0.0.0-20220520151302-bc2c85ada10a/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
+golang.org/x/sys v0.0.0-20220722155257-8c9f86f7a55f/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
+golang.org/x/sys v0.12.0 h1:CM0HF96J0hcLAwsHPJZjfdNzs0gftsLfgKt57wWHJ0o=
+golang.org/x/sys v0.12.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
+golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
+golang.org/x/term v0.0.0-20210927222741-03fcf44c2211/go.mod h1:jbD1KX2456YbFQfuXm/mYQcufACuNUgVhRMnK/tPxf8=
+golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
+golang.org/x/text v0.3.2/go.mod h1:bEr9sfX3Q8Zfm5fL9x+3itogRgK3+ptLWKqgva+5dAk=
+golang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
+golang.org/x/text v0.3.6/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
+golang.org/x/text v0.3.7/go.mod h1:u+2+/6zg+i71rQMx5EYifcz6MCKuco9NR6JIITiCfzQ=
+golang.org/x/text v0.3.8/go.mod h1:E6s5w1FMmriuDzIBO73fBruAKo1PCIq6d2Q6DHfQ8WQ=
+golang.org/x/text v0.13.0 h1:ablQoSUd0tRdKxZewP80B+BaqeKJuVhuRxj/dkrun3k=
+golang.org/x/text v0.13.0/go.mod h1:TvPlkZtksWOMsz7fbANvkp4WM8x/WCo/om8BMLbz+aE=
+golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
+golang.org/x/tools v0.0.0-20190114222345-bf090417da8b/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
+golang.org/x/tools v0.0.0-20190226205152-f727befe758c/go.mod h1:9Yl7xja0Znq3iFh3HoIrodX9oNMXvdceNzlUR8zjMvY=
+golang.org/x/tools v0.0.0-20190311212946-11955173bddd/go.mod h1:LCzVGOaR6xXOjkQ3onu1FJEFr0SW1gC7cKk1uF8kGRs=
+golang.org/x/tools v0.0.0-20190524140312-2c0ae7006135/go.mod h1:RgjU9mgBXZiqYHBnxXauZ1Gv1EHHAz9KjViQ78xBX0Q=
+golang.org/x/tools v0.0.0-20191119224855-298f0cb1881e/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=
+golang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc=
+golang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
+golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
+golang.org/x/xerrors v0.0.0-20200804184101-5ec99f83aff1/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
+google.golang.org/api v0.128.0 h1:RjPESny5CnQRn9V6siglged+DZCgfu9l6mO9dkX9VOg=
+google.golang.org/api v0.128.0/go.mod h1:Y611qgqaE92On/7g65MQgxYul3c0rEB894kniWLY750=
+google.golang.org/appengine v1.1.0/go.mod h1:EbEs0AVv82hx2wNQdGPgUI5lhzA/G0D9YwlJXL52JkM=
+google.golang.org/appengine v1.4.0/go.mod h1:xpcJRLb0r/rnEns0DIKYYv+WjYCduHsrkT7/EB5XEv4=
+google.golang.org/appengine v1.6.7 h1:FZR1q0exgwxzPzp/aF+VccGrSfxfPpkBqjIIEq3ru6c=
+google.golang.org/appengine v1.6.7/go.mod h1:8WjMMxjGQR8xUklV/ARdw2HLXBOI7O7uCIDZVag1xfc=
+google.golang.org/genproto v0.0.0-20180817151627-c66870c02cf8/go.mod h1:JiN7NxoALGmiZfu7CAH4rXhgtRTLTxftemlI0sWmxmc=
+google.golang.org/genproto v0.0.0-20190819201941-24fa4b261c55/go.mod h1:DMBHOl98Agz4BDEuKkezgsaosCRResVns1a3J2ZsMNc=
+google.golang.org/genproto v0.0.0-20200513103714-09dca8ec2884/go.mod h1:55QSHmfGQM9UVYDPBsyGGes0y52j32PQ3BqQfXhyH3c=
+google.golang.org/genproto v0.0.0-20200526211855-cb27e3aa2013/go.mod h1:NbSheEEYHJ7i3ixzK3sjbqSGDJWnxyFXZblF3eUsNvo=
+google.golang.org/genproto v0.0.0-20230803162519-f966b187b2e5 h1:L6iMMGrtzgHsWofoFcihmDEMYeDR9KN/ThbPWGrh++g=
+google.golang.org/genproto v0.0.0-20230803162519-f966b187b2e5/go.mod h1:oH/ZOT02u4kWEp7oYBGYFFkCdKS/uYR9Z7+0/xuuFp8=
+google.golang.org/genproto/googleapis/api v0.0.0-20230822172742-b8732ec3820d h1:DoPTO70H+bcDXcd39vOqb2viZxgqeBeSGtZ55yZU4/Q=
+google.golang.org/genproto/googleapis/api v0.0.0-20230822172742-b8732ec3820d/go.mod h1:KjSP20unUpOx5kyQUFa7k4OJg0qeJ7DEZflGDu2p6Bk=
+google.golang.org/genproto/googleapis/rpc v0.0.0-20230822172742-b8732ec3820d h1:uvYuEyMHKNt+lT4K3bN6fGswmK8qSvcreM3BwjDh+y4=
+google.golang.org/genproto/googleapis/rpc v0.0.0-20230822172742-b8732ec3820d/go.mod h1:+Bk1OCOj40wS2hwAMA+aCW9ypzm63QTBBHp6lQ3p+9M=
+google.golang.org/grpc v1.19.0/go.mod h1:mqu4LbDTu4XGKhr4mRzUsmM4RtVoemTSY81AxZiDr8c=
+google.golang.org/grpc v1.23.0/go.mod h1:Y5yQAOtifL1yxbo5wqy6BxZv8vAUGQwXBOALyacEbxg=
+google.golang.org/grpc v1.25.1/go.mod h1:c3i+UQWmh7LiEpx4sFZnkU36qjEYZ0imhYfXVyQciAY=
+google.golang.org/grpc v1.27.0/go.mod h1:qbnxyOmOxrQa7FizSgH+ReBfzJrCY1pSN7KXBS8abTk=
+google.golang.org/grpc v1.33.1/go.mod h1:fr5YgcSWrqhRRxogOsw7RzIpsmvOZ6IcH4kBYTpR3n0=
+google.golang.org/grpc v1.33.2/go.mod h1:JMHMWHQWaTccqQQlmk3MJZS+GWXOdAesneDmEnv2fbc=
+google.golang.org/grpc v1.36.0/go.mod h1:qjiiYl8FncCW8feJPdyg3v6XW24KsRHe+dy9BAGRRjU=
+google.golang.org/grpc v1.45.0/go.mod h1:lN7owxKUQEqMfSyQikvvk5tf/6zMPsrK+ONuO11+0rQ=
+google.golang.org/grpc v1.58.2 h1:SXUpjxeVF3FKrTYQI4f4KvbGD5u2xccdYdurwowix5I=
+google.golang.org/grpc v1.58.2/go.mod h1:tgX3ZQDlNJGU96V6yHh1T/JeoBQ2TXdr43YbYSsCJk0=
+google.golang.org/protobuf v0.0.0-20200109180630-ec00e32a8dfd/go.mod h1:DFci5gLYBciE7Vtevhsrf46CRTquxDuWsQurQQe4oz8=
+google.golang.org/protobuf v0.0.0-20200221191635-4d8936d0db64/go.mod h1:kwYJMbMJ01Woi6D6+Kah6886xMZcty6N08ah7+eCXa0=
+google.golang.org/protobuf v0.0.0-20200228230310-ab0ca4ff8a60/go.mod h1:cfTl7dwQJ+fmap5saPgwCLgHXTUD7jkjRqWcaiX5VyM=
+google.golang.org/protobuf v1.20.1-0.20200309200217-e05f789c0967/go.mod h1:A+miEFZTKqfCUM6K7xSMQL9OKL/b6hQv+e19PK+JZNE=
+google.golang.org/protobuf v1.21.0/go.mod h1:47Nbq4nVaFHyn7ilMalzfO3qCViNmqZ2kzikPIcrTAo=
+google.golang.org/protobuf v1.22.0/go.mod h1:EGpADcykh3NcUnDUJcl1+ZksZNG86OlYog2l/sGQquU=
+google.golang.org/protobuf v1.23.0/go.mod h1:EGpADcykh3NcUnDUJcl1+ZksZNG86OlYog2l/sGQquU=
+google.golang.org/protobuf v1.23.1-0.20200526195155-81db48ad09cc/go.mod h1:EGpADcykh3NcUnDUJcl1+ZksZNG86OlYog2l/sGQquU=
+google.golang.org/protobuf v1.25.0/go.mod h1:9JNX74DMeImyA3h4bdi1ymwjUzf21/xIlbajtzgsN7c=
+google.golang.org/protobuf v1.26.0-rc.1/go.mod h1:jlhhOSvTdKEhbULTjvd4ARK9grFBp09yW+WbY/TyQbw=
+google.golang.org/protobuf v1.26.0/go.mod h1:9q0QmTI4eRPtz6boOQmLYwt+qCgq0jsYwAQnmE0givc=
+google.golang.org/protobuf v1.31.0 h1:g0LDEJHgrBl9N9r17Ru3sqWhkIx2NB67okBHPwC7hs8=
+google.golang.org/protobuf v1.31.0/go.mod h1:HV8QOd/L58Z+nl8r43ehVNZIU/HEI6OcFqwMG9pJV4I=
+gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
+gopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
+gopkg.in/yaml.v2 v2.2.3/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
+gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
+gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
+honnef.co/go/tools v0.0.0-20190102054323-c2f93a96b099/go.mod h1:rf3lG4BRIbNafJWhAfAdb/ePZxsR/4RtNHQocxwk9r4=
+honnef.co/go/tools v0.0.0-20190523083050-ea95bdfd59fc/go.mod h1:rf3lG4BRIbNafJWhAfAdb/ePZxsR/4RtNHQocxwk9r4=
diff --git a/.test-infra/mock-apis/proto/echo/v1/echo.proto b/.test-infra/mock-apis/proto/echo/v1/echo.proto
new file mode 100644
index 0000000..826dc0f
--- /dev/null
+++ b/.test-infra/mock-apis/proto/echo/v1/echo.proto
@@ -0,0 +1,46 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+/*
+ * Protocol buffers describing a simple mock API that echoes a request.
+ */
+
+syntax = "proto3";
+
+package proto.echo.v1;
+option go_package = "proto/echo/v1";
+option java_package = "org.apache.beam.testinfra.mockapis.echo.v1";
+
+// EchoService simulates a mock API that echoes a request.
+service EchoService {
+
+ // Echo an EchoRequest payload in an EchoResponse.
+ rpc Echo(EchoRequest) returns (EchoResponse) {}
+}
+
+// The request to echo a payload.
+message EchoRequest {
+ string id = 1;
+ bytes payload = 2;
+}
+
+// The response containing the echoed request payload.
+message EchoResponse {
+ string id = 1;
+ bytes payload = 2;
+}
\ No newline at end of file
diff --git a/.test-infra/mock-apis/src/main/go/cmd/service/echo/main.go b/.test-infra/mock-apis/src/main/go/cmd/service/echo/main.go
new file mode 100644
index 0000000..891468a
--- /dev/null
+++ b/.test-infra/mock-apis/src/main/go/cmd/service/echo/main.go
@@ -0,0 +1,148 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements. See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License. You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// echo is an executable that runs the echov1.EchoService.
+package main
+
+import (
+ "context"
+ "fmt"
+ "log/slog"
+ "net"
+ "net/http"
+ "os"
+ "os/signal"
+
+ gcplogging "cloud.google.com/go/logging"
+ "github.com/apache/beam/test-infra/mock-apis/src/main/go/internal/cache"
+ "github.com/apache/beam/test-infra/mock-apis/src/main/go/internal/environment"
+ "github.com/apache/beam/test-infra/mock-apis/src/main/go/internal/logging"
+ "github.com/apache/beam/test-infra/mock-apis/src/main/go/internal/service/echo"
+ "github.com/redis/go-redis/v9"
+ "google.golang.org/grpc"
+)
+
+var (
+ env = []environment.Variable{
+ environment.CacheHost,
+ environment.GrpcPort,
+ environment.HttpPort,
+ }
+
+ logger *slog.Logger
+ logAttrs []slog.Attr
+ opts = &logging.Options{
+ Name: "echo",
+ }
+)
+
+func init() {
+ for _, v := range env {
+ logAttrs = append(logAttrs, slog.Attr{
+ Key: v.Key(),
+ Value: slog.StringValue(v.Value()),
+ })
+ }
+}
+
+func main() {
+ ctx := context.Background()
+
+ if !environment.ProjectId.Missing() {
+ client, err := gcplogging.NewClient(ctx, environment.ProjectId.Value())
+ if err != nil {
+ slog.LogAttrs(ctx, slog.LevelError, err.Error(), logAttrs...)
+ os.Exit(1)
+ }
+
+ opts.Client = client
+ }
+
+ logger = logging.New(opts)
+
+ if err := run(ctx); err != nil {
+ logger.LogAttrs(ctx, slog.LevelError, err.Error(), logAttrs...)
+ os.Exit(1)
+ }
+}
+
+func run(ctx context.Context) error {
+ ctx, cancel := signal.NotifyContext(ctx, os.Interrupt)
+ defer cancel()
+
+ if err := environment.Missing(env...); err != nil {
+ return err
+ }
+
+ grpcPort, err := environment.GrpcPort.Int()
+ if err != nil {
+ return err
+ }
+ grpcAddress := fmt.Sprintf(":%v", grpcPort)
+
+ httpPort, err := environment.HttpPort.Int()
+ if err != nil {
+ return err
+ }
+ httpAddress := fmt.Sprintf(":%v", httpPort)
+
+ s := grpc.NewServer()
+ defer s.GracefulStop()
+
+ r := redis.NewClient(&redis.Options{
+ Addr: environment.CacheHost.Value(),
+ })
+
+ echoOpts := &echo.Options{
+ Decrementer: (*cache.RedisCache)(r),
+ LoggingAttrs: logAttrs,
+ Logger: logger,
+ // TODO(damondouglas): add GCP metrics client
+ // MetricsWriter:
+ }
+
+ handler, err := echo.Register(s, echoOpts)
+ if err != nil {
+ return err
+ }
+
+ logger.LogAttrs(ctx, slog.LevelInfo, "starting service", logAttrs...)
+
+ lis, err := net.Listen("tcp", grpcAddress)
+ if err != nil {
+ return err
+ }
+
+ errChan := make(chan error)
+ go func() {
+ if err := s.Serve(lis); err != nil {
+ errChan <- err
+ }
+ }()
+
+ go func() {
+ if err := http.ListenAndServe(httpAddress, handler); err != nil {
+ errChan <- err
+ }
+ }()
+
+ select {
+ case err := <-errChan:
+ return err
+ case <-ctx.Done():
+ logger.LogAttrs(ctx, slog.LevelInfo, "shutting down", logAttrs...)
+ return nil
+ }
+}
diff --git a/.test-infra/mock-apis/src/main/go/cmd/service/refresher/main.go b/.test-infra/mock-apis/src/main/go/cmd/service/refresher/main.go
new file mode 100644
index 0000000..63e3267
--- /dev/null
+++ b/.test-infra/mock-apis/src/main/go/cmd/service/refresher/main.go
@@ -0,0 +1,121 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements. See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License. You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// refresher is an executable that runs the cache.Refresher service.
+package main
+
+import (
+ "context"
+ "log/slog"
+ "os"
+ "os/signal"
+
+ gcplogging "cloud.google.com/go/logging"
+ "github.com/apache/beam/test-infra/mock-apis/src/main/go/internal/cache"
+ "github.com/apache/beam/test-infra/mock-apis/src/main/go/internal/environment"
+ "github.com/apache/beam/test-infra/mock-apis/src/main/go/internal/logging"
+ "github.com/redis/go-redis/v9"
+)
+
+var (
+ env = []environment.Variable{
+ environment.CacheHost,
+ environment.QuotaId,
+ environment.QuotaSize,
+ environment.QuotaRefreshInterval,
+ }
+ logger *slog.Logger
+ logAttrs []slog.Attr
+ opts = &logging.Options{
+ Name: "refresher",
+ }
+)
+
+func init() {
+ for _, v := range env {
+ logAttrs = append(logAttrs, slog.Attr{
+ Key: v.Key(),
+ Value: slog.StringValue(v.Value()),
+ })
+ }
+}
+
+func main() {
+ ctx := context.Background()
+
+ if !environment.ProjectId.Missing() {
+ client, err := gcplogging.NewClient(ctx, environment.ProjectId.Value())
+ if err != nil {
+ slog.LogAttrs(ctx, slog.LevelError, err.Error(), logAttrs...)
+ os.Exit(1)
+ }
+
+ opts.Client = client
+ }
+
+ logger = logging.New(opts)
+ if err := run(ctx); err != nil {
+ logger.LogAttrs(ctx, slog.LevelError, err.Error(), logAttrs...)
+ os.Exit(1)
+ }
+}
+
+func run(ctx context.Context) error {
+ ctx, cancel := signal.NotifyContext(ctx, os.Interrupt)
+ defer cancel()
+
+ if err := environment.Missing(env...); err != nil {
+ return err
+ }
+
+ size, err := environment.QuotaSize.UInt64()
+ if err != nil {
+ return err
+ }
+
+ interval, err := environment.QuotaRefreshInterval.Duration()
+ if err != nil {
+ return err
+ }
+
+ r := redis.NewClient(&redis.Options{
+ Addr: environment.CacheHost.Value(),
+ })
+
+ opts := &cache.Options{
+ Logger: logger,
+ Setter: (*cache.RedisCache)(r),
+ }
+
+ ref, err := cache.NewRefresher(ctx, opts)
+ if err != nil {
+ return err
+ }
+
+ errChan := make(chan error)
+ go func() {
+ if err := ref.Refresh(ctx, environment.QuotaId.Value(), size, interval); err != nil {
+ errChan <- err
+ }
+ }()
+
+ select {
+ case err := <-errChan:
+ return err
+ case <-ctx.Done():
+ return nil
+ }
+}
diff --git a/.test-infra/mock-apis/src/main/go/internal/cache/cache.go b/.test-infra/mock-apis/src/main/go/internal/cache/cache.go
new file mode 100644
index 0000000..cab20ad
--- /dev/null
+++ b/.test-infra/mock-apis/src/main/go/internal/cache/cache.go
@@ -0,0 +1,122 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements. See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License. You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package cache
+
+import (
+ "context"
+ "errors"
+ "fmt"
+ "log/slog"
+ "reflect"
+ "time"
+
+ "github.com/apache/beam/test-infra/mock-apis/src/main/go/internal/logging"
+)
+
+var (
+
+ // ErrNotExist is an error indicating that a resource does not exist.
+ ErrNotExist = errors.New("resource does not exist")
+)
+
+// IsNotExist reports whether err is ErrNotExist.
+func IsNotExist(err error) bool {
+ return errors.Is(err, ErrNotExist)
+}
+
+// Options for running the Refresher.
+type Options struct {
+ Setter UInt64Setter
+ Logger *slog.Logger
+}
+
+// Refresher refreshes a value in a cache on a set interval.
+type Refresher struct {
+ opts *Options
+ stop chan struct{}
+}
+
+// NewRefresher instantiates a Refresher.
+func NewRefresher(ctx context.Context, opts *Options) (*Refresher, error) {
+ if opts.Logger == nil {
+ opts.Logger = logging.New(&logging.Options{
+ Name: reflect.TypeOf((*Refresher)(nil)).PkgPath(),
+ })
+ }
+
+ if opts.Setter == nil {
+ return nil, fmt.Errorf("%T.Setter is nil but required", opts)
+ }
+
+ if err := opts.Setter.Alive(ctx); err != nil {
+ return nil, err
+ }
+
+ ref := &Refresher{
+ opts: opts,
+ }
+
+ return ref, nil
+}
+
+// Stop the Refresher.
+func (ref *Refresher) Stop() {
+ ref.stop <- struct{}{}
+}
+
+// Refresh the size of the associated key at an interval.
+func (ref *Refresher) Refresh(ctx context.Context, key string, size uint64, interval time.Duration) error {
+ ctx, cancel := context.WithCancel(ctx)
+ defer cancel()
+ ref.stop = make(chan struct{})
+ attrs := []slog.Attr{
+ {
+ Key: "key",
+ Value: slog.StringValue(key),
+ },
+ {
+ Key: "size",
+ Value: slog.Uint64Value(size),
+ },
+ {
+ Key: "interval",
+ Value: slog.StringValue(interval.String()),
+ },
+ }
+
+ ref.opts.Logger.LogAttrs(ctx, slog.LevelInfo, "starting refresher service", attrs...)
+
+ if err := ref.opts.Setter.Set(ctx, key, size, interval); err != nil {
+ return err
+ }
+ ref.opts.Logger.LogAttrs(ctx, slog.LevelDebug, "successful initial refresh", attrs...)
+
+ tick := time.Tick(interval)
+ for {
+ select {
+ case <-tick:
+ if err := ref.opts.Setter.Set(ctx, key, size, interval); err != nil {
+ return err
+ }
+ ref.opts.Logger.LogAttrs(ctx, slog.LevelDebug, "refresh successful", attrs...)
+ case <-ref.stop:
+ ref.opts.Logger.LogAttrs(ctx, slog.LevelInfo, "stopping refresher service", attrs...)
+ return nil
+ case <-ctx.Done():
+ return nil
+ }
+ }
+}
diff --git a/.test-infra/mock-apis/src/main/go/internal/cache/doc.go b/.test-infra/mock-apis/src/main/go/internal/cache/doc.go
new file mode 100644
index 0000000..c0f937a
--- /dev/null
+++ b/.test-infra/mock-apis/src/main/go/internal/cache/doc.go
@@ -0,0 +1,17 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements. See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License. You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// Package cache stores and retrieves data from a cache.
+package cache
diff --git a/.test-infra/mock-apis/src/main/go/internal/cache/interface.go b/.test-infra/mock-apis/src/main/go/internal/cache/interface.go
new file mode 100644
index 0000000..8266f72
--- /dev/null
+++ b/.test-infra/mock-apis/src/main/go/internal/cache/interface.go
@@ -0,0 +1,45 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements. See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License. You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package cache
+
+import (
+ "context"
+ "time"
+)
+
+// HealthChecker checks the health and availability of a resource.
+type HealthChecker interface {
+
+ // Alive checks whether the resource is healthy and available.
+ Alive(ctx context.Context) error
+}
+
+// UInt64Setter associates a key with a value for an expiry time.Duration.
+type UInt64Setter interface {
+ HealthChecker
+
+ // Set a key with a value for an expiry time.Duration.
+ Set(ctx context.Context, key string, value uint64, expiry time.Duration) error
+}
+
+// Decrementer decrements a value associated with a key.
+type Decrementer interface {
+ HealthChecker
+
+ // Decrement the value associated with a key; returns the value after
+ // decrementing it.
+ Decrement(ctx context.Context, key string) (int64, error)
+}
diff --git a/.test-infra/mock-apis/src/main/go/internal/cache/redis.go b/.test-infra/mock-apis/src/main/go/internal/cache/redis.go
new file mode 100644
index 0000000..51ad730
--- /dev/null
+++ b/.test-infra/mock-apis/src/main/go/internal/cache/redis.go
@@ -0,0 +1,59 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements. See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License. You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package cache
+
+import (
+ "context"
+ "time"
+
+ "github.com/redis/go-redis/v9"
+)
+
+// Validate interface implementations
+var _ UInt64Setter = &RedisCache{}
+var _ Decrementer = &RedisCache{}
+var _ HealthChecker = &RedisCache{}
+
+// RedisCache adapts a redis.Client to implement UInt64Setter, Decrementer,
+// and HealthChecker.
+type RedisCache redis.Client
+
+// Set implements UInt64Setter's Set method using a redis cache; an expiry of
+// 0 means the key never expires. Returns any error from the redis client.
+func (client *RedisCache) Set(ctx context.Context, key string, value uint64, expiry time.Duration) error {
+ r := (*redis.Client)(client)
+ return r.Set(ctx, key, value, expiry).Err()
+}
+
+// Decrement implements Decrementer's Decrement method using a redis cache.
+// Returns ErrNotExist when the key does not exist, or any error from the
+// redis client.
+func (client *RedisCache) Decrement(ctx context.Context, key string) (int64, error) {
+ r := (*redis.Client)(client)
+ v, err := r.Exists(ctx, key).Result()
+ if err != nil {
+ return -1, err
+ }
+ if v == 0 {
+ return -1, ErrNotExist
+ }
+ return r.Decr(ctx, key).Result()
+}
+
+// Alive implements HealthChecker's Alive method by pinging the redis cache.
+// Returns an error when a connection cannot be established.
+func (client *RedisCache) Alive(ctx context.Context) error {
+ r := (*redis.Client)(client)
+ return r.Ping(ctx).Err()
+}
diff --git a/.test-infra/mock-apis/src/main/go/internal/environment/variable.go b/.test-infra/mock-apis/src/main/go/internal/environment/variable.go
new file mode 100644
index 0000000..b1e3a8e
--- /dev/null
+++ b/.test-infra/mock-apis/src/main/go/internal/environment/variable.go
@@ -0,0 +1,118 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements. See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License. You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// Package environment provides helpers for interacting with environment variables.
+package environment
+
+import (
+ "fmt"
+ "os"
+ "strconv"
+ "strings"
+ "time"
+)
+
+var (
+ // HttpPort is the port to bind an HTTP service.
+ HttpPort Variable = "HTTP_PORT"
+
+ // GrpcPort is the port to bind a gRPC service.
+ GrpcPort Variable = "GRPC_PORT"
+
+ // CacheHost is the host address of the cache.
+ CacheHost Variable = "CACHE_HOST"
+
+ // ProjectId is the ID of the Google Cloud host project.
+ ProjectId Variable = "PROJECT_ID"
+
+ // QuotaId uniquely identifies a quota measure.
+ QuotaId Variable = "QUOTA_ID"
+
+ // QuotaSize specifies the size of the quota.
+ QuotaSize Variable = "QUOTA_SIZE"
+
+ // QuotaRefreshInterval configures how often a quota is refreshed.
+ QuotaRefreshInterval Variable = "QUOTA_REFRESH_INTERVAL"
+)
+
+// Variable defines an environment variable via a string type alias.
+// A Variable's string value is the system environment variable key.
+type Variable string
+
+// Default assigns a default value to the system environment variable when it
+// is not already set.
+func (v Variable) Default(value string) error {
+ if v.Missing() {
+ return os.Setenv((string)(v), value)
+ }
+ return nil
+}
+
+// MustDefault assigns a default value to the system environment variable.
+// Panics on error.
+func (v Variable) MustDefault(value string) {
+ if err := v.Default(value); err != nil {
+ panic(err)
+ }
+}
+
+// Missing reports whether the system environment variable is an empty string.
+func (v Variable) Missing() bool {
+ return v.Value() == ""
+}
+
+// Key returns the system environment variable key.
+func (v Variable) Key() string {
+ return (string)(v)
+}
+
+// Value returns the system environment variable value.
+func (v Variable) Value() string {
+ return os.Getenv((string)(v))
+}
+
+// Int returns the system environment variable parsed as an int.
+func (v Variable) Int() (int, error) {
+ return strconv.Atoi(v.Value())
+}
+
+// UInt64 returns the system environment variable value parsed as a uint64.
+func (v Variable) UInt64() (uint64, error) {
+ return strconv.ParseUint(v.Value(), 10, 64)
+}
+
+// Duration returns the system environment variable value parsed as time.Duration.
+func (v Variable) Duration() (time.Duration, error) {
+ return time.ParseDuration(v.Value())
+}
+
+// KeyValue returns the system environment variable as a concatenated
+// <key>=<value> string.
+func (v Variable) KeyValue() string {
+ return fmt.Sprintf("%s=%s", (string)(v), v.Value())
+}
+
+// Missing returns an error listing every Variable among vars that is not
+// assigned in the system environment.
+func Missing(vars ...Variable) error {
+ var missing []string
+ for _, v := range vars {
+ if v.Missing() {
+ missing = append(missing, v.KeyValue())
+ }
+ }
+ if len(missing) > 0 {
+ return fmt.Errorf("variables empty but expected from environment: %s", strings.Join(missing, "; "))
+ }
+ return nil
+}
diff --git a/.test-infra/mock-apis/src/main/go/internal/environment/variable_test.go b/.test-infra/mock-apis/src/main/go/internal/environment/variable_test.go
new file mode 100644
index 0000000..b566f14
--- /dev/null
+++ b/.test-infra/mock-apis/src/main/go/internal/environment/variable_test.go
@@ -0,0 +1,312 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements. See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License. You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package environment
+
+import (
+ "errors"
+ "os"
+ "testing"
+
+ "github.com/google/go-cmp/cmp"
+)
+
+func TestMissing(t *testing.T) {
+ type args struct {
+ vars []Variable
+ values []string
+ }
+ tests := []struct {
+ name string
+ args args
+ want error
+ }{
+ {
+ name: "{}",
+ args: args{},
+ },
+ {
+ name: "{A=}",
+ args: args{
+ vars: []Variable{
+ "A",
+ },
+ values: []string{
+ "",
+ },
+ },
+ want: errors.New("variables empty but expected from environment: A="),
+ },
+ {
+ name: "{A=1}",
+ args: args{
+ vars: []Variable{
+ "A",
+ },
+ values: []string{
+ "1",
+ },
+ },
+ want: nil,
+ },
+ {
+ name: "{A=; B=}",
+ args: args{
+ vars: []Variable{
+ "A",
+ "B",
+ },
+ values: []string{
+ "",
+ "",
+ },
+ },
+ want: errors.New("variables empty but expected from environment: A=; B="),
+ },
+ }
+ for _, tt := range tests {
+ t.Run(tt.name, func(t *testing.T) {
+ var got, want string
+ clearVars(tt.args.vars...)
+ set(t, tt.args.vars, tt.args.values)
+ err := Missing(tt.args.vars...)
+ if err != nil {
+ got = err.Error()
+ }
+ if tt.want != nil {
+ want = tt.want.Error()
+ }
+ if diff := cmp.Diff(want, got); diff != "" {
+ t.Errorf("Missing() error returned unexpected difference in error messages (-want +got):\n%s", diff)
+ }
+ })
+ }
+}
+
+func TestVariable_Default(t *testing.T) {
+ type args struct {
+ setValue string
+ defaultValue string
+ }
+ tests := []struct {
+ name string
+ v Variable
+ args args
+ want string
+ }{
+ {
+ name: "environment variable not set",
+ v: "A",
+ args: args{
+ defaultValue: "1",
+ },
+ want: "1",
+ },
+ {
+ name: "environment variable default is overridden by set value",
+ v: "A",
+ args: args{
+ setValue: "2",
+ defaultValue: "1",
+ },
+ want: "2",
+ },
+ }
+ for _, tt := range tests {
+ t.Run(tt.name, func(t *testing.T) {
+ clearVars(tt.v)
+ if tt.args.setValue != "" {
+ set(t, []Variable{tt.v}, []string{tt.args.setValue})
+ }
+ if err := tt.v.Default(tt.args.defaultValue); err != nil {
+ t.Fatalf("could not set default environment variable value during test execution: %v", err)
+ }
+ got := os.Getenv(tt.v.Key())
+ if got != tt.want {
+ t.Errorf("Default() = %s, want %s", got, tt.want)
+ }
+ })
+ }
+}
+
+func TestVariable_KeyValue(t *testing.T) {
+ tests := []struct {
+ name string
+ v Variable
+ value string
+ want string
+ }{
+ {
+ name: "environment variable not set",
+ v: "A",
+ want: "A=",
+ },
+ {
+ name: "environment variable is set",
+ v: "A",
+ value: "1",
+ want: "A=1",
+ },
+ }
+ for _, tt := range tests {
+ clearVars(tt.v)
+ t.Run(tt.name, func(t *testing.T) {
+ set(t, []Variable{tt.v}, []string{tt.value})
+ got := tt.v.KeyValue()
+ if got != tt.want {
+ t.Errorf("KeyValue() = %s, want %s", got, tt.want)
+ }
+ })
+ }
+}
+
+func TestVariable_Missing(t *testing.T) {
+ type args struct {
+ setValue string
+ defaultValue string
+ }
+ tests := []struct {
+ name string
+ args args
+ v Variable
+ want bool
+ }{
+ {
+ name: "no default and not set",
+ args: args{},
+ v: "A",
+ want: true,
+ },
+ {
+ name: "has default but not set",
+ args: args{
+ defaultValue: "1",
+ },
+ v: "A",
+ want: false,
+ },
+ {
+ name: "no default but set",
+ args: args{
+ setValue: "1",
+ },
+ v: "A",
+ want: false,
+ },
+ {
+ name: "has default and set",
+ args: args{
+ setValue: "2",
+ defaultValue: "1",
+ },
+ v: "A",
+ want: false,
+ },
+ }
+ for _, tt := range tests {
+ t.Run(tt.name, func(t *testing.T) {
+ clearVars(tt.v)
+ if tt.args.defaultValue != "" {
+ if err := tt.v.Default(tt.args.defaultValue); err != nil {
+ t.Fatalf("could not set default environment variable value during test execution: %v", err)
+ }
+ }
+ if tt.args.setValue != "" {
+ set(t, []Variable{tt.v}, []string{tt.args.setValue})
+ }
+ if got := tt.v.Missing(); got != tt.want {
+ t.Errorf("Missing() = %v, want %v", got, tt.want)
+ }
+ })
+ }
+}
+
+func TestVariable_Value(t *testing.T) {
+ type args struct {
+ setValue string
+ defaultValue string
+ }
+ tests := []struct {
+ name string
+ args args
+ v Variable
+ want string
+ }{
+ {
+ name: "no default and not set",
+ args: args{},
+ v: "A",
+ want: "",
+ },
+ {
+ name: "has default but not set",
+ args: args{
+ defaultValue: "1",
+ },
+ v: "A",
+ want: "1",
+ },
+ {
+ name: "no default but set",
+ args: args{
+ setValue: "1",
+ },
+ v: "A",
+ want: "1",
+ },
+ {
+ name: "has default and set",
+ args: args{
+ setValue: "2",
+ defaultValue: "1",
+ },
+ v: "A",
+ want: "2",
+ },
+ }
+ for _, tt := range tests {
+ clearVars(tt.v)
+ if tt.args.defaultValue != "" {
+ if err := tt.v.Default(tt.args.defaultValue); err != nil {
+ t.Fatalf("could not set default environment variable value during test execution: %v", err)
+ }
+ }
+ if tt.args.setValue != "" {
+ set(t, []Variable{tt.v}, []string{tt.args.setValue})
+ }
+ t.Run(tt.name, func(t *testing.T) {
+ if got := tt.v.Value(); got != tt.want {
+ t.Errorf("Value() = %v, want %v", got, tt.want)
+ }
+ })
+ }
+}
+
+func clearVars(vars ...Variable) {
+ for _, k := range vars {
+ _ = os.Setenv(k.Key(), "")
+ }
+}
+
+func set(t *testing.T, vars []Variable, values []string) {
+ if len(vars) != len(values) {
+ t.Fatalf("test cases should be configured with matching args.vars and args.values: len(tt.args.vars): %v != len(tt.args.values): %v", len(vars), len(values))
+ }
+ for i := range vars {
+ key := vars[i].Key()
+ value := values[i]
+ _ = os.Setenv(key, value)
+ }
+}
diff --git a/.test-infra/mock-apis/src/main/go/internal/logging/logging.go b/.test-infra/mock-apis/src/main/go/internal/logging/logging.go
new file mode 100644
index 0000000..53cead4
--- /dev/null
+++ b/.test-infra/mock-apis/src/main/go/internal/logging/logging.go
@@ -0,0 +1,137 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements. See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License. You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// Package logging performs structured output of log entries.
+package logging
+
+import (
+ "context"
+ "encoding/json"
+ "io"
+ "log/slog"
+ "os"
+ "path"
+ "runtime"
+ "sync"
+
+ "cloud.google.com/go/logging"
+ "cloud.google.com/go/logging/apiv2/loggingpb"
+)
+
+// Options for the slog.Logger
+type Options struct {
+ *slog.HandlerOptions
+ Name string
+ Writer io.Writer
+ Client *logging.Client
+}
+
+// New instantiates a slog.Logger to output using Google Cloud logging entries.
+// When running locally, output is JSON strings of Cloud logging entries and
+// does not make any API calls to the service. When running in Google Cloud,
+// logging entries are submitted to the Cloud logging service.
+func New(opts *Options) *slog.Logger {
+ if opts.HandlerOptions == nil {
+ opts.HandlerOptions = &slog.HandlerOptions{}
+ }
+
+ opts.AddSource = true
+
+ if opts.Writer == nil {
+ opts.Writer = os.Stdout
+ }
+
+ handler := &gcpHandler{
+ name: opts.Name,
+ mu: &sync.Mutex{},
+ out: opts.Writer,
+ JSONHandler: slog.NewJSONHandler(opts.Writer, opts.HandlerOptions),
+ }
+
+ if opts.Client != nil {
+ handler.logger = opts.Client.Logger(path.Base(opts.Name))
+ }
+
+ return slog.New(handler)
+}
+
+var _ slog.Handler = &gcpHandler{}
+
+type gcpHandler struct {
+ name string
+ *slog.JSONHandler
+ mu *sync.Mutex
+ out io.Writer
+ logger *logging.Logger
+}
+
+func (g *gcpHandler) Enabled(ctx context.Context, level slog.Level) bool {
+ return g.JSONHandler.Enabled(ctx, level)
+}
+
+func severity(lvl slog.Level) logging.Severity {
+ switch lvl {
+ case slog.LevelDebug:
+ return logging.Debug
+ case slog.LevelInfo:
+ return logging.Info
+ case slog.LevelWarn:
+ return logging.Warning
+ case slog.LevelError:
+ return logging.Error
+ }
+ return logging.Default
+}
+
+func (g *gcpHandler) Handle(_ context.Context, record slog.Record) error {
+ payload := map[string]any{
+ "message": record.Message,
+ }
+ record.Attrs(func(attr slog.Attr) bool {
+ payload[attr.Key] = attr.Value.Any()
+ return true
+ })
+ fs := runtime.CallersFrames([]uintptr{record.PC})
+ f, _ := fs.Next()
+ entry := logging.Entry{
+ LogName: g.name,
+ Timestamp: record.Time,
+ Severity: severity(record.Level),
+ Payload: payload,
+ SourceLocation: &loggingpb.LogEntrySourceLocation{
+ File: f.File,
+ Line: int64(f.Line),
+ },
+ }
+ g.mu.Lock()
+ defer g.mu.Unlock()
+ if g.logger == nil {
+ return json.NewEncoder(g.out).Encode(entry)
+ }
+
+ entry.LogName = ""
+ g.logger.Log(entry)
+ return g.logger.Flush()
+}
+
+func (g *gcpHandler) WithAttrs(attrs []slog.Attr) slog.Handler {
+ h := g.JSONHandler
+ return h.WithAttrs(attrs)
+}
+
+func (g *gcpHandler) WithGroup(name string) slog.Handler {
+ h := g.JSONHandler
+ return h.WithGroup(name)
+}
diff --git a/.test-infra/mock-apis/src/main/go/internal/logging/logging_test.go b/.test-infra/mock-apis/src/main/go/internal/logging/logging_test.go
new file mode 100644
index 0000000..87bfa16
--- /dev/null
+++ b/.test-infra/mock-apis/src/main/go/internal/logging/logging_test.go
@@ -0,0 +1,153 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements. See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License. You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package logging_test
+
+import (
+ "bytes"
+ "context"
+ "encoding/json"
+ "log/slog"
+ "runtime"
+ "testing"
+ "time"
+
+ gcplogging "cloud.google.com/go/logging"
+ "cloud.google.com/go/logging/apiv2/loggingpb"
+ "github.com/apache/beam/test-infra/mock-apis/src/main/go/internal/logging"
+ "github.com/google/go-cmp/cmp"
+ "github.com/google/go-cmp/cmp/cmpopts"
+)
+
+var (
+ opts = []cmp.Option{
+ cmpopts.IgnoreFields(loggingpb.LogEntrySourceLocation{}, "state", "sizeCache", "unknownFields"),
+ cmpopts.IgnoreFields(gcplogging.Entry{}, "Timestamp"),
+ }
+)
+
+func Test_logger_Info(t *testing.T) {
+ type args struct {
+ message string
+ fields []slog.Attr
+ }
+ tests := []struct {
+ name string
+ args args
+ want gcplogging.Entry
+ }{
+ {
+ name: "message only",
+ args: args{
+ message: "hello log",
+ },
+ want: gcplogging.Entry{
+ LogName: "message only",
+ Severity: gcplogging.Info,
+ Payload: map[string]interface{}{
+ "message": "hello log",
+ },
+ },
+ },
+ {
+ name: "with flat fields",
+ args: args{
+ message: "message with fields",
+ fields: []slog.Attr{
+ {
+ Key: "string",
+ Value: slog.StringValue("a string"),
+ },
+ {
+ Key: "int",
+ Value: slog.IntValue(1),
+ },
+ {
+ Key: "bool",
+ Value: slog.BoolValue(true),
+ },
+ {
+ Key: "float",
+ Value: slog.Float64Value(1.23456789),
+ },
+ },
+ },
+ want: gcplogging.Entry{
+ LogName: "with flat fields",
+ Severity: gcplogging.Info,
+ Payload: map[string]interface{}{
+ "message": "message with fields",
+ "string": "a string",
+ "int": float64(1),
+ "bool": true,
+ "float": 1.23456789,
+ },
+ },
+ },
+ }
+ for _, tt := range tests {
+ t.Run(tt.name, func(t *testing.T) {
+ buf := bytes.Buffer{}
+ l := logging.New(&logging.Options{
+ Name: tt.name,
+ Writer: &buf,
+ })
+ l.LogAttrs(context.Background(), slog.LevelInfo, tt.args.message, tt.args.fields...)
+ _, file, line, _ := runtime.Caller(0)
+ tt.want.SourceLocation = &loggingpb.LogEntrySourceLocation{
+ File: file,
+ Line: int64(line) - 1,
+ }
+ var got gcplogging.Entry
+ if err := json.NewDecoder(&buf).Decode(&got); err != nil {
+ t.Fatal(err)
+ }
+ if diff := cmp.Diff(tt.want, got, opts...); diff != "" {
+ t.Errorf("LogAttrs(Info) yielded unexpected difference in log entry (-want, +got):\n%s", diff)
+ }
+ })
+ }
+}
+
+func Test_logger_Error(t *testing.T) {
+ buf := bytes.Buffer{}
+ l := logging.New(&logging.Options{
+ Name: "test logger error",
+ Writer: &buf,
+ })
+ message := "some error"
+ fields := []slog.Attr{
+ {
+ Key: "observed",
+ Value: slog.TimeValue(time.Unix(1000000000, 0)),
+ },
+ }
+ l.LogAttrs(context.Background(), slog.LevelError, message, fields...)
+ _, file, line, _ := runtime.Caller(0)
+ var got gcplogging.Entry
+ if err := json.NewDecoder(&buf).Decode(&got); err != nil {
+ t.Fatal(err)
+ }
+ if diff := cmp.Diff(gcplogging.Entry{
+ LogName: "test logger error",
+ Severity: gcplogging.Error,
+ Payload: map[string]any{"message": "some error", "observed": "2001-09-09T01:46:40Z"},
+ SourceLocation: &loggingpb.LogEntrySourceLocation{
+ File: file,
+ Line: int64(line) - 1,
+ },
+ }, got, opts...); diff != "" {
+ t.Errorf("LogAttrs(Error) yielded unexpected difference in log entry (-want, +got):\n%s", diff)
+ }
+}
diff --git a/.test-infra/mock-apis/src/main/go/internal/metric/doc.go b/.test-infra/mock-apis/src/main/go/internal/metric/doc.go
new file mode 100644
index 0000000..43bfc77
--- /dev/null
+++ b/.test-infra/mock-apis/src/main/go/internal/metric/doc.go
@@ -0,0 +1,17 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements. See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License. You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// Package metric supports monitoring.
+package metric
diff --git a/.test-infra/mock-apis/src/main/go/internal/metric/gcp.go b/.test-infra/mock-apis/src/main/go/internal/metric/gcp.go
new file mode 100644
index 0000000..3d23d53
--- /dev/null
+++ b/.test-infra/mock-apis/src/main/go/internal/metric/gcp.go
@@ -0,0 +1,77 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements. See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License. You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package metric
+
+import (
+ "context"
+ "path"
+
+ monitoring "cloud.google.com/go/monitoring/apiv3"
+ "cloud.google.com/go/monitoring/apiv3/v2/monitoringpb"
+ "google.golang.org/genproto/googleapis/api/metric"
+ "google.golang.org/genproto/googleapis/api/monitoredres"
+ "google.golang.org/protobuf/types/known/timestamppb"
+)
+
+const (
+ metricTypePrefix = "custom.googleapis.com"
+ monitoredResourceType = "generic_task"
+)
+
+// GcpGauge implements a Writer for a Google Cloud gauge.
+// See https://cloud.google.com/monitoring/api/v3/kinds-and-types#metric-kinds
+type GcpGauge monitoring.MetricClient
+
+// Write to a Google Cloud monitoring gauge.
+func (writer *GcpGauge) Write(ctx context.Context, name string, unit string, points ...*Point) error {
+ var mPts []*monitoringpb.Point
+ for _, p := range points {
+ t := timestamppb.New(p.Timestamp)
+ mPts = append(mPts, &monitoringpb.Point{
+ Interval: &monitoringpb.TimeInterval{
+ StartTime: t,
+ EndTime: t,
+ },
+ Value: &monitoringpb.TypedValue{
+ Value: &monitoringpb.TypedValue_Int64Value{
+ Int64Value: p.Value,
+ },
+ },
+ })
+ }
+ ts := timeseries(name, unit, metric.MetricDescriptor_GAUGE, mPts)
+
+ client := (*monitoring.MetricClient)(writer)
+ return client.CreateTimeSeries(ctx, &monitoringpb.CreateTimeSeriesRequest{
+ Name: name,
+ TimeSeries: []*monitoringpb.TimeSeries{ts},
+ })
+}
+
+func timeseries(name string, unit string, kind metric.MetricDescriptor_MetricKind, points []*monitoringpb.Point) *monitoringpb.TimeSeries {
+ return &monitoringpb.TimeSeries{
+ Metric: &metric.Metric{
+ Type: path.Join(metricTypePrefix, name),
+ },
+ Resource: &monitoredres.MonitoredResource{
+ Type: monitoredResourceType,
+ },
+ MetricKind: kind,
+ ValueType: metric.MetricDescriptor_INT64,
+ Unit: unit,
+ Points: points,
+ }
+}
diff --git a/.test-infra/mock-apis/src/main/go/internal/metric/interface.go b/.test-infra/mock-apis/src/main/go/internal/metric/interface.go
new file mode 100644
index 0000000..d0f7e38
--- /dev/null
+++ b/.test-infra/mock-apis/src/main/go/internal/metric/interface.go
@@ -0,0 +1,38 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements. See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License. You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package metric
+
+import (
+ "context"
+ "time"
+)
+
+// Writer writes to a metric sink.
+type Writer interface {
+
+ // Write writes the given points, for the named metric and unit, to the sink.
+ Write(ctx context.Context, name string, unit string, points ...*Point) error
+}
+
+// Point models a metric data point.
+type Point struct {
+
+ // Timestamp of the metric data point.
+ Timestamp time.Time
+
+ // Value of the metric data point.
+ Value int64
+}
diff --git a/.test-infra/mock-apis/src/main/go/internal/proto/echo/v1/echo.pb.go b/.test-infra/mock-apis/src/main/go/internal/proto/echo/v1/echo.pb.go
new file mode 100644
index 0000000..97ced92
--- /dev/null
+++ b/.test-infra/mock-apis/src/main/go/internal/proto/echo/v1/echo.pb.go
@@ -0,0 +1,256 @@
+//
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements. See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership. The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License. You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+//
+// Protocol buffers describing a simple mock API that echos a request.
+
+// Code generated by protoc-gen-go. DO NOT EDIT.
+// versions:
+// protoc-gen-go v1.28.1
+// protoc (unknown)
+// source: proto/echo/v1/echo.proto
+
+package v1
+
+import (
+ protoreflect "google.golang.org/protobuf/reflect/protoreflect"
+ protoimpl "google.golang.org/protobuf/runtime/protoimpl"
+ reflect "reflect"
+ sync "sync"
+)
+
+const (
+ // Verify that this generated code is sufficiently up-to-date.
+ _ = protoimpl.EnforceVersion(20 - protoimpl.MinVersion)
+ // Verify that runtime/protoimpl is sufficiently up-to-date.
+ _ = protoimpl.EnforceVersion(protoimpl.MaxVersion - 20)
+)
+
+// The request to echo a payload.
+type EchoRequest struct {
+ state protoimpl.MessageState
+ sizeCache protoimpl.SizeCache
+ unknownFields protoimpl.UnknownFields
+
+ Id string `protobuf:"bytes,1,opt,name=id,proto3" json:"id,omitempty"`
+ Payload []byte `protobuf:"bytes,2,opt,name=payload,proto3" json:"payload,omitempty"`
+}
+
+func (x *EchoRequest) Reset() {
+ *x = EchoRequest{}
+ if protoimpl.UnsafeEnabled {
+ mi := &file_proto_echo_v1_echo_proto_msgTypes[0]
+ ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x))
+ ms.StoreMessageInfo(mi)
+ }
+}
+
+func (x *EchoRequest) String() string {
+ return protoimpl.X.MessageStringOf(x)
+}
+
+func (*EchoRequest) ProtoMessage() {}
+
+func (x *EchoRequest) ProtoReflect() protoreflect.Message {
+ mi := &file_proto_echo_v1_echo_proto_msgTypes[0]
+ if protoimpl.UnsafeEnabled && x != nil {
+ ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x))
+ if ms.LoadMessageInfo() == nil {
+ ms.StoreMessageInfo(mi)
+ }
+ return ms
+ }
+ return mi.MessageOf(x)
+}
+
+// Deprecated: Use EchoRequest.ProtoReflect.Descriptor instead.
+func (*EchoRequest) Descriptor() ([]byte, []int) {
+ return file_proto_echo_v1_echo_proto_rawDescGZIP(), []int{0}
+}
+
+func (x *EchoRequest) GetId() string {
+ if x != nil {
+ return x.Id
+ }
+ return ""
+}
+
+func (x *EchoRequest) GetPayload() []byte {
+ if x != nil {
+ return x.Payload
+ }
+ return nil
+}
+
+// The response echo of a request payload.
+type EchoResponse struct {
+ state protoimpl.MessageState
+ sizeCache protoimpl.SizeCache
+ unknownFields protoimpl.UnknownFields
+
+ Id string `protobuf:"bytes,1,opt,name=id,proto3" json:"id,omitempty"`
+ Payload []byte `protobuf:"bytes,2,opt,name=payload,proto3" json:"payload,omitempty"`
+}
+
+func (x *EchoResponse) Reset() {
+ *x = EchoResponse{}
+ if protoimpl.UnsafeEnabled {
+ mi := &file_proto_echo_v1_echo_proto_msgTypes[1]
+ ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x))
+ ms.StoreMessageInfo(mi)
+ }
+}
+
+func (x *EchoResponse) String() string {
+ return protoimpl.X.MessageStringOf(x)
+}
+
+func (*EchoResponse) ProtoMessage() {}
+
+func (x *EchoResponse) ProtoReflect() protoreflect.Message {
+ mi := &file_proto_echo_v1_echo_proto_msgTypes[1]
+ if protoimpl.UnsafeEnabled && x != nil {
+ ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x))
+ if ms.LoadMessageInfo() == nil {
+ ms.StoreMessageInfo(mi)
+ }
+ return ms
+ }
+ return mi.MessageOf(x)
+}
+
+// Deprecated: Use EchoResponse.ProtoReflect.Descriptor instead.
+func (*EchoResponse) Descriptor() ([]byte, []int) {
+ return file_proto_echo_v1_echo_proto_rawDescGZIP(), []int{1}
+}
+
+func (x *EchoResponse) GetId() string {
+ if x != nil {
+ return x.Id
+ }
+ return ""
+}
+
+func (x *EchoResponse) GetPayload() []byte {
+ if x != nil {
+ return x.Payload
+ }
+ return nil
+}
+
+var File_proto_echo_v1_echo_proto protoreflect.FileDescriptor
+
+var file_proto_echo_v1_echo_proto_rawDesc = []byte{
+ 0x0a, 0x18, 0x70, 0x72, 0x6f, 0x74, 0x6f, 0x2f, 0x65, 0x63, 0x68, 0x6f, 0x2f, 0x76, 0x31, 0x2f,
+ 0x65, 0x63, 0x68, 0x6f, 0x2e, 0x70, 0x72, 0x6f, 0x74, 0x6f, 0x12, 0x0d, 0x70, 0x72, 0x6f, 0x74,
+ 0x6f, 0x2e, 0x65, 0x63, 0x68, 0x6f, 0x2e, 0x76, 0x31, 0x22, 0x37, 0x0a, 0x0b, 0x45, 0x63, 0x68,
+ 0x6f, 0x52, 0x65, 0x71, 0x75, 0x65, 0x73, 0x74, 0x12, 0x0e, 0x0a, 0x02, 0x69, 0x64, 0x18, 0x01,
+ 0x20, 0x01, 0x28, 0x09, 0x52, 0x02, 0x69, 0x64, 0x12, 0x18, 0x0a, 0x07, 0x70, 0x61, 0x79, 0x6c,
+ 0x6f, 0x61, 0x64, 0x18, 0x02, 0x20, 0x01, 0x28, 0x0c, 0x52, 0x07, 0x70, 0x61, 0x79, 0x6c, 0x6f,
+ 0x61, 0x64, 0x22, 0x38, 0x0a, 0x0c, 0x45, 0x63, 0x68, 0x6f, 0x52, 0x65, 0x73, 0x70, 0x6f, 0x6e,
+ 0x73, 0x65, 0x12, 0x0e, 0x0a, 0x02, 0x69, 0x64, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x02,
+ 0x69, 0x64, 0x12, 0x18, 0x0a, 0x07, 0x70, 0x61, 0x79, 0x6c, 0x6f, 0x61, 0x64, 0x18, 0x02, 0x20,
+ 0x01, 0x28, 0x0c, 0x52, 0x07, 0x70, 0x61, 0x79, 0x6c, 0x6f, 0x61, 0x64, 0x32, 0x50, 0x0a, 0x0b,
+ 0x45, 0x63, 0x68, 0x6f, 0x53, 0x65, 0x72, 0x76, 0x69, 0x63, 0x65, 0x12, 0x41, 0x0a, 0x04, 0x45,
+ 0x63, 0x68, 0x6f, 0x12, 0x1a, 0x2e, 0x70, 0x72, 0x6f, 0x74, 0x6f, 0x2e, 0x65, 0x63, 0x68, 0x6f,
+ 0x2e, 0x76, 0x31, 0x2e, 0x45, 0x63, 0x68, 0x6f, 0x52, 0x65, 0x71, 0x75, 0x65, 0x73, 0x74, 0x1a,
+ 0x1b, 0x2e, 0x70, 0x72, 0x6f, 0x74, 0x6f, 0x2e, 0x65, 0x63, 0x68, 0x6f, 0x2e, 0x76, 0x31, 0x2e,
+ 0x45, 0x63, 0x68, 0x6f, 0x52, 0x65, 0x73, 0x70, 0x6f, 0x6e, 0x73, 0x65, 0x22, 0x00, 0x42, 0x3b,
+ 0x0a, 0x2a, 0x6f, 0x72, 0x67, 0x2e, 0x61, 0x70, 0x61, 0x63, 0x68, 0x65, 0x2e, 0x62, 0x65, 0x61,
+ 0x6d, 0x2e, 0x74, 0x65, 0x73, 0x74, 0x69, 0x6e, 0x66, 0x72, 0x61, 0x2e, 0x6d, 0x6f, 0x63, 0x6b,
+ 0x61, 0x70, 0x69, 0x73, 0x2e, 0x65, 0x63, 0x68, 0x6f, 0x2e, 0x76, 0x31, 0x5a, 0x0d, 0x70, 0x72,
+ 0x6f, 0x74, 0x6f, 0x2f, 0x65, 0x63, 0x68, 0x6f, 0x2f, 0x76, 0x31, 0x62, 0x06, 0x70, 0x72, 0x6f,
+ 0x74, 0x6f, 0x33,
+}
+
+var (
+ file_proto_echo_v1_echo_proto_rawDescOnce sync.Once
+ file_proto_echo_v1_echo_proto_rawDescData = file_proto_echo_v1_echo_proto_rawDesc
+)
+
+func file_proto_echo_v1_echo_proto_rawDescGZIP() []byte {
+ file_proto_echo_v1_echo_proto_rawDescOnce.Do(func() {
+ file_proto_echo_v1_echo_proto_rawDescData = protoimpl.X.CompressGZIP(file_proto_echo_v1_echo_proto_rawDescData)
+ })
+ return file_proto_echo_v1_echo_proto_rawDescData
+}
+
+var file_proto_echo_v1_echo_proto_msgTypes = make([]protoimpl.MessageInfo, 2)
+var file_proto_echo_v1_echo_proto_goTypes = []interface{}{
+ (*EchoRequest)(nil), // 0: proto.echo.v1.EchoRequest
+ (*EchoResponse)(nil), // 1: proto.echo.v1.EchoResponse
+}
+var file_proto_echo_v1_echo_proto_depIdxs = []int32{
+ 0, // 0: proto.echo.v1.EchoService.Echo:input_type -> proto.echo.v1.EchoRequest
+ 1, // 1: proto.echo.v1.EchoService.Echo:output_type -> proto.echo.v1.EchoResponse
+ 1, // [1:2] is the sub-list for method output_type
+ 0, // [0:1] is the sub-list for method input_type
+ 0, // [0:0] is the sub-list for extension type_name
+ 0, // [0:0] is the sub-list for extension extendee
+ 0, // [0:0] is the sub-list for field type_name
+}
+
+func init() { file_proto_echo_v1_echo_proto_init() }
+func file_proto_echo_v1_echo_proto_init() {
+ if File_proto_echo_v1_echo_proto != nil {
+ return
+ }
+ if !protoimpl.UnsafeEnabled {
+ file_proto_echo_v1_echo_proto_msgTypes[0].Exporter = func(v interface{}, i int) interface{} {
+ switch v := v.(*EchoRequest); i {
+ case 0:
+ return &v.state
+ case 1:
+ return &v.sizeCache
+ case 2:
+ return &v.unknownFields
+ default:
+ return nil
+ }
+ }
+ file_proto_echo_v1_echo_proto_msgTypes[1].Exporter = func(v interface{}, i int) interface{} {
+ switch v := v.(*EchoResponse); i {
+ case 0:
+ return &v.state
+ case 1:
+ return &v.sizeCache
+ case 2:
+ return &v.unknownFields
+ default:
+ return nil
+ }
+ }
+ }
+ type x struct{}
+ out := protoimpl.TypeBuilder{
+ File: protoimpl.DescBuilder{
+ GoPackagePath: reflect.TypeOf(x{}).PkgPath(),
+ RawDescriptor: file_proto_echo_v1_echo_proto_rawDesc,
+ NumEnums: 0,
+ NumMessages: 2,
+ NumExtensions: 0,
+ NumServices: 1,
+ },
+ GoTypes: file_proto_echo_v1_echo_proto_goTypes,
+ DependencyIndexes: file_proto_echo_v1_echo_proto_depIdxs,
+ MessageInfos: file_proto_echo_v1_echo_proto_msgTypes,
+ }.Build()
+ File_proto_echo_v1_echo_proto = out.File
+ file_proto_echo_v1_echo_proto_rawDesc = nil
+ file_proto_echo_v1_echo_proto_goTypes = nil
+ file_proto_echo_v1_echo_proto_depIdxs = nil
+}
diff --git a/.test-infra/mock-apis/src/main/go/internal/proto/echo/v1/echo_grpc.pb.go b/.test-infra/mock-apis/src/main/go/internal/proto/echo/v1/echo_grpc.pb.go
new file mode 100644
index 0000000..3ce2bde
--- /dev/null
+++ b/.test-infra/mock-apis/src/main/go/internal/proto/echo/v1/echo_grpc.pb.go
@@ -0,0 +1,107 @@
+// Code generated by protoc-gen-go-grpc. DO NOT EDIT.
+// versions:
+// - protoc-gen-go-grpc v1.2.0
+// - protoc (unknown)
+// source: proto/echo/v1/echo.proto
+
+package v1
+
+import (
+ context "context"
+ grpc "google.golang.org/grpc"
+ codes "google.golang.org/grpc/codes"
+ status "google.golang.org/grpc/status"
+)
+
+// This is a compile-time assertion to ensure that this generated file
+// is compatible with the grpc package it is being compiled against.
+// Requires gRPC-Go v1.32.0 or later.
+const _ = grpc.SupportPackageIsVersion7
+
+// EchoServiceClient is the client API for EchoService service.
+//
+// For semantics around ctx use and closing/ending streaming RPCs, please refer to https://pkg.go.dev/google.golang.org/grpc/?tab=doc#ClientConn.NewStream.
+type EchoServiceClient interface {
+ // Echo an EchoRequest payload in an EchoResponse.
+ Echo(ctx context.Context, in *EchoRequest, opts ...grpc.CallOption) (*EchoResponse, error)
+}
+
+type echoServiceClient struct {
+ cc grpc.ClientConnInterface
+}
+
+func NewEchoServiceClient(cc grpc.ClientConnInterface) EchoServiceClient {
+ return &echoServiceClient{cc}
+}
+
+func (c *echoServiceClient) Echo(ctx context.Context, in *EchoRequest, opts ...grpc.CallOption) (*EchoResponse, error) {
+ out := new(EchoResponse)
+ err := c.cc.Invoke(ctx, "/proto.echo.v1.EchoService/Echo", in, out, opts...)
+ if err != nil {
+ return nil, err
+ }
+ return out, nil
+}
+
+// EchoServiceServer is the server API for EchoService service.
+// All implementations must embed UnimplementedEchoServiceServer
+// for forward compatibility
+type EchoServiceServer interface {
+ // Echo an EchoRequest payload in an EchoResponse.
+ Echo(context.Context, *EchoRequest) (*EchoResponse, error)
+ mustEmbedUnimplementedEchoServiceServer()
+}
+
+// UnimplementedEchoServiceServer must be embedded to have forward compatible implementations.
+type UnimplementedEchoServiceServer struct {
+}
+
+func (UnimplementedEchoServiceServer) Echo(context.Context, *EchoRequest) (*EchoResponse, error) {
+ return nil, status.Errorf(codes.Unimplemented, "method Echo not implemented")
+}
+func (UnimplementedEchoServiceServer) mustEmbedUnimplementedEchoServiceServer() {}
+
+// UnsafeEchoServiceServer may be embedded to opt out of forward compatibility for this service.
+// Use of this interface is not recommended, as added methods to EchoServiceServer will
+// result in compilation errors.
+type UnsafeEchoServiceServer interface {
+ mustEmbedUnimplementedEchoServiceServer()
+}
+
+func RegisterEchoServiceServer(s grpc.ServiceRegistrar, srv EchoServiceServer) {
+ s.RegisterService(&EchoService_ServiceDesc, srv)
+}
+
+func _EchoService_Echo_Handler(srv interface{}, ctx context.Context, dec func(interface{}) error, interceptor grpc.UnaryServerInterceptor) (interface{}, error) {
+ in := new(EchoRequest)
+ if err := dec(in); err != nil {
+ return nil, err
+ }
+ if interceptor == nil {
+ return srv.(EchoServiceServer).Echo(ctx, in)
+ }
+ info := &grpc.UnaryServerInfo{
+ Server: srv,
+ FullMethod: "/proto.echo.v1.EchoService/Echo",
+ }
+ handler := func(ctx context.Context, req interface{}) (interface{}, error) {
+ return srv.(EchoServiceServer).Echo(ctx, req.(*EchoRequest))
+ }
+ return interceptor(ctx, in, info, handler)
+}
+
+// EchoService_ServiceDesc is the grpc.ServiceDesc for EchoService service.
+// It's only intended for direct use with grpc.RegisterService,
+// and not to be introspected or modified (even as a copy)
+var EchoService_ServiceDesc = grpc.ServiceDesc{
+ ServiceName: "proto.echo.v1.EchoService",
+ HandlerType: (*EchoServiceServer)(nil),
+ Methods: []grpc.MethodDesc{
+ {
+ MethodName: "Echo",
+ Handler: _EchoService_Echo_Handler,
+ },
+ },
+ Streams: []grpc.StreamDesc{},
+ Metadata: "proto/echo/v1/echo.proto",
+}
diff --git a/.test-infra/mock-apis/src/main/go/internal/service/echo/echo.go b/.test-infra/mock-apis/src/main/go/internal/service/echo/echo.go
new file mode 100644
index 0000000..d958d18
--- /dev/null
+++ b/.test-infra/mock-apis/src/main/go/internal/service/echo/echo.go
@@ -0,0 +1,185 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements. See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License. You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// Package echo contains the EchoService API implementation.
+package echo
+
+import (
+ "context"
+ "encoding/json"
+ "fmt"
+ "log/slog"
+ "net/http"
+ "path"
+ "reflect"
+ "time"
+
+ "github.com/apache/beam/test-infra/mock-apis/src/main/go/internal/cache"
+ "github.com/apache/beam/test-infra/mock-apis/src/main/go/internal/logging"
+ "github.com/apache/beam/test-infra/mock-apis/src/main/go/internal/metric"
+ echov1 "github.com/apache/beam/test-infra/mock-apis/src/main/go/internal/proto/echo/v1"
+ "google.golang.org/grpc"
+ "google.golang.org/grpc/codes"
+ "google.golang.org/grpc/health/grpc_health_v1"
+ "google.golang.org/grpc/status"
+)
+
+const (
+ metricsNamePrefix = "echo"
+ echoPath = "/proto.echo.v1.EchoService/Echo"
+ echoPathAlias = "/v1/echo"
+ healthPath = "/grpc.health.v1.Health/Check"
+ healthPathAlias = "/v1/healthz"
+)
+
+type Options struct {
+ Decrementer cache.Decrementer
+ MetricsWriter metric.Writer
+ Logger *slog.Logger
+ LoggingAttrs []slog.Attr
+}
+
+// Register registers the echo and health services with the given grpc.Server.
+// Returns an http.Handler for HTTP clients, or an error.
+func Register(s *grpc.Server, opts *Options) (http.Handler, error) {
+ if opts.Logger == nil {
+ opts.Logger = logging.New(&logging.Options{
+ Name: reflect.TypeOf((*echo)(nil)).PkgPath(),
+ })
+ }
+ srv := &echo{
+ opts: opts,
+ }
+
+ echov1.RegisterEchoServiceServer(s, srv)
+ grpc_health_v1.RegisterHealthServer(s, srv)
+
+ return srv, nil
+}
+
+type echo struct {
+ echov1.UnimplementedEchoServiceServer
+ grpc_health_v1.UnimplementedHealthServer
+ opts *Options
+}
+
+// ServeHTTP implements http.Handler, allowing echo to support HTTP clients in addition to gRPC.
+func (srv *echo) ServeHTTP(w http.ResponseWriter, r *http.Request) {
+ switch r.URL.Path {
+ case echoPath, echoPathAlias:
+ srv.httpHandler(w, r)
+ case healthPath, healthPathAlias:
+ srv.checkHandler(w, r)
+ default:
+ http.Error(w, fmt.Sprintf("%s not found", r.URL.Path), http.StatusNotFound)
+ }
+}
+
+// Check checks whether echo service's underlying decrementer is alive.
+func (srv *echo) Check(ctx context.Context, _ *grpc_health_v1.HealthCheckRequest) (*grpc_health_v1.HealthCheckResponse, error) {
+ if err := srv.opts.Decrementer.Alive(ctx); err != nil {
+ return nil, err
+ }
+ return &grpc_health_v1.HealthCheckResponse{
+ Status: grpc_health_v1.HealthCheckResponse_SERVING,
+ }, nil
+}
+
+func (srv *echo) checkHandler(w http.ResponseWriter, r *http.Request) {
+ resp, err := srv.Check(r.Context(), nil)
+ if err != nil {
+ http.Error(w, err.Error(), http.StatusInternalServerError)
+ return
+ }
+ if err := json.NewEncoder(w).Encode(resp); err != nil {
+ srv.opts.Logger.LogAttrs(context.Background(), slog.LevelError, err.Error(), srv.opts.LoggingAttrs...)
+ http.Error(w, err.Error(), http.StatusInternalServerError)
+ }
+}
+
+// Watch the health of the echov1.EchoServiceServer.
+func (srv *echo) Watch(request *grpc_health_v1.HealthCheckRequest, server grpc_health_v1.Health_WatchServer) error {
+ resp, err := srv.Check(server.Context(), request)
+ if err != nil {
+ srv.opts.Logger.LogAttrs(context.Background(), slog.LevelError, err.Error(), srv.opts.LoggingAttrs...)
+ return err
+ }
+ return server.Send(resp)
+}
+
+// Echo echos an EchoRequest with an EchoResponse. Decrements the underlying quota identified by the id of the request.
+// Returns a codes.NotFound status if the request's id does not map to a key in the cache.
+// See cache.Refresher for how the cache refreshes the quota identified by the request id.
+func (srv *echo) Echo(ctx context.Context, request *echov1.EchoRequest) (*echov1.EchoResponse, error) {
+ v, err := srv.opts.Decrementer.Decrement(ctx, request.Id)
+ if cache.IsNotExist(err) {
+ return nil, status.Errorf(codes.NotFound, "error: source not found: %s, err %v", request.Id, err)
+ }
+ if err != nil {
+ srv.opts.Logger.LogAttrs(context.Background(), slog.LevelError, err.Error(), srv.opts.LoggingAttrs...)
+ return nil, status.Errorf(codes.Internal, "error: encountered from cache for resource: %s, err %v", request.Id, err)
+ }
+
+ if err := srv.writeMetric(ctx, request.Id, v); err != nil {
+ return nil, err
+ }
+
+ if v < 0 {
+ return nil, status.Errorf(codes.ResourceExhausted, "error: resource exhausted for: %s", request.Id)
+ }
+
+ return &echov1.EchoResponse{
+ Id: request.Id,
+ Payload: request.Payload,
+ }, nil
+}
+
+func (srv *echo) writeMetric(ctx context.Context, id string, value int64) error {
+ if srv.opts.MetricsWriter == nil {
+ return nil
+ }
+ if err := srv.opts.MetricsWriter.Write(ctx, path.Join(metricsNamePrefix, id), "unit", &metric.Point{
+ Timestamp: time.Now(),
+ Value: value + 1,
+ }); err != nil {
+ srv.opts.Logger.LogAttrs(context.Background(), slog.LevelError, err.Error(), srv.opts.LoggingAttrs...)
+ }
+ return nil
+}
+
+func (srv *echo) httpHandler(w http.ResponseWriter, r *http.Request) {
+ var body *echov1.EchoRequest
+ if err := json.NewDecoder(r.Body).Decode(&body); err != nil {
+ err = fmt.Errorf("error decoding request body, payload field of %T needs to be base64 encoded, error: %w", body, err)
+ srv.opts.Logger.LogAttrs(context.Background(), slog.LevelError, err.Error(), srv.opts.LoggingAttrs...)
+ http.Error(w, err.Error(), http.StatusBadRequest)
+ return
+ }
+
+ resp, err := srv.Echo(r.Context(), body)
+ if status.Code(err) == codes.NotFound {
+ http.Error(w, err.Error(), http.StatusNotFound)
+ return
+ }
+
+ if err != nil {
+ http.Error(w, err.Error(), http.StatusInternalServerError)
+ return
+ }
+
+ if err := json.NewEncoder(w).Encode(resp); err != nil {
+ http.Error(w, err.Error(), http.StatusInternalServerError)
+ }
+}
diff --git a/.test-infra/mock-apis/src/main/java/org/apache/beam/testinfra/mockapis/echo/v1/Echo.java b/.test-infra/mock-apis/src/main/java/org/apache/beam/testinfra/mockapis/echo/v1/Echo.java
new file mode 100644
index 0000000..4652ff7
--- /dev/null
+++ b/.test-infra/mock-apis/src/main/java/org/apache/beam/testinfra/mockapis/echo/v1/Echo.java
@@ -0,0 +1,1447 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.testinfra.mockapis.echo.v1;
+
+@SuppressWarnings({
+ "argument",
+ "assignment",
+ "initialization.fields.uninitialized",
+ "initialization.static.field.uninitialized",
+ "override.param",
+ "ClassTypeParameterName",
+ "ForbidNonVendoredGuava",
+ "JavadocStyle",
+ "LocalVariableName",
+ "MemberName",
+ "NeedBraces",
+ "MissingOverride",
+ "RedundantModifier",
+ "ReferenceEquality",
+ "UnusedVariable",
+})
+public final class Echo {
+ private Echo() {}
+
+ public static void registerAllExtensions(com.google.protobuf.ExtensionRegistryLite registry) {}
+
+ public static void registerAllExtensions(com.google.protobuf.ExtensionRegistry registry) {
+ registerAllExtensions((com.google.protobuf.ExtensionRegistryLite) registry);
+ }
+
+ public interface EchoRequestOrBuilder
+ extends
+ // @@protoc_insertion_point(interface_extends:proto.echo.v1.EchoRequest)
+ com.google.protobuf.MessageOrBuilder {
+
+ /**
+ * <code>string id = 1 [json_name = "id"];</code>
+ *
+ * @return The id.
+ */
+ java.lang.String getId();
+ /**
+ * <code>string id = 1 [json_name = "id"];</code>
+ *
+ * @return The bytes for id.
+ */
+ com.google.protobuf.ByteString getIdBytes();
+
+ /**
+ * <code>bytes payload = 2 [json_name = "payload"];</code>
+ *
+ * @return The payload.
+ */
+ com.google.protobuf.ByteString getPayload();
+ }
+ /**
+ *
+ *
+ * <pre>
+ * The request to echo a payload.
+ * </pre>
+ *
+ * Protobuf type {@code proto.echo.v1.EchoRequest}
+ */
+ public static final class EchoRequest extends com.google.protobuf.GeneratedMessageV3
+ implements
+ // @@protoc_insertion_point(message_implements:proto.echo.v1.EchoRequest)
+ EchoRequestOrBuilder {
+ private static final long serialVersionUID = 0L;
+ // Use EchoRequest.newBuilder() to construct.
+ private EchoRequest(com.google.protobuf.GeneratedMessageV3.Builder<?> builder) {
+ super(builder);
+ }
+
+ private EchoRequest() {
+ id_ = "";
+ payload_ = com.google.protobuf.ByteString.EMPTY;
+ }
+
+ @java.lang.Override
+ @SuppressWarnings({"unused"})
+ protected java.lang.Object newInstance(UnusedPrivateParameter unused) {
+ return new EchoRequest();
+ }
+
+ @java.lang.Override
+ public final com.google.protobuf.UnknownFieldSet getUnknownFields() {
+ return this.unknownFields;
+ }
+
+ public static final com.google.protobuf.Descriptors.Descriptor getDescriptor() {
+ return org.apache.beam.testinfra.mockapis.echo.v1.Echo
+ .internal_static_proto_echo_v1_EchoRequest_descriptor;
+ }
+
+ @java.lang.Override
+ protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable
+ internalGetFieldAccessorTable() {
+ return org.apache.beam.testinfra.mockapis.echo.v1.Echo
+ .internal_static_proto_echo_v1_EchoRequest_fieldAccessorTable
+ .ensureFieldAccessorsInitialized(
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest.class,
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest.Builder.class);
+ }
+
+ public static final int ID_FIELD_NUMBER = 1;
+
+ @SuppressWarnings("serial")
+ private volatile java.lang.Object id_ = "";
+ /**
+ * <code>string id = 1 [json_name = "id"];</code>
+ *
+ * @return The id.
+ */
+ @java.lang.Override
+ public java.lang.String getId() {
+ java.lang.Object ref = id_;
+ if (ref instanceof java.lang.String) {
+ return (java.lang.String) ref;
+ } else {
+ com.google.protobuf.ByteString bs = (com.google.protobuf.ByteString) ref;
+ java.lang.String s = bs.toStringUtf8();
+ id_ = s;
+ return s;
+ }
+ }
+ /**
+ * <code>string id = 1 [json_name = "id"];</code>
+ *
+ * @return The bytes for id.
+ */
+ @java.lang.Override
+ public com.google.protobuf.ByteString getIdBytes() {
+ java.lang.Object ref = id_;
+ if (ref instanceof java.lang.String) {
+ com.google.protobuf.ByteString b =
+ com.google.protobuf.ByteString.copyFromUtf8((java.lang.String) ref);
+ id_ = b;
+ return b;
+ } else {
+ return (com.google.protobuf.ByteString) ref;
+ }
+ }
+
+ public static final int PAYLOAD_FIELD_NUMBER = 2;
+ private com.google.protobuf.ByteString payload_ = com.google.protobuf.ByteString.EMPTY;
+ /**
+ * <code>bytes payload = 2 [json_name = "payload"];</code>
+ *
+ * @return The payload.
+ */
+ @java.lang.Override
+ public com.google.protobuf.ByteString getPayload() {
+ return payload_;
+ }
+
+ private byte memoizedIsInitialized = -1;
+
+ @java.lang.Override
+ public final boolean isInitialized() {
+ byte isInitialized = memoizedIsInitialized;
+ if (isInitialized == 1) return true;
+ if (isInitialized == 0) return false;
+
+ memoizedIsInitialized = 1;
+ return true;
+ }
+
+ @java.lang.Override
+ public void writeTo(com.google.protobuf.CodedOutputStream output) throws java.io.IOException {
+ if (!com.google.protobuf.GeneratedMessageV3.isStringEmpty(id_)) {
+ com.google.protobuf.GeneratedMessageV3.writeString(output, 1, id_);
+ }
+ if (!payload_.isEmpty()) {
+ output.writeBytes(2, payload_);
+ }
+ getUnknownFields().writeTo(output);
+ }
+
+ @java.lang.Override
+ public int getSerializedSize() {
+ int size = memoizedSize;
+ if (size != -1) return size;
+
+ size = 0;
+ if (!com.google.protobuf.GeneratedMessageV3.isStringEmpty(id_)) {
+ size += com.google.protobuf.GeneratedMessageV3.computeStringSize(1, id_);
+ }
+ if (!payload_.isEmpty()) {
+ size += com.google.protobuf.CodedOutputStream.computeBytesSize(2, payload_);
+ }
+ size += getUnknownFields().getSerializedSize();
+ memoizedSize = size;
+ return size;
+ }
+
+ @java.lang.Override
+ public boolean equals(final java.lang.Object obj) {
+ if (obj == this) {
+ return true;
+ }
+ if (!(obj instanceof org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest)) {
+ return super.equals(obj);
+ }
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest other =
+ (org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest) obj;
+
+ if (!getId().equals(other.getId())) return false;
+ if (!getPayload().equals(other.getPayload())) return false;
+ if (!getUnknownFields().equals(other.getUnknownFields())) return false;
+ return true;
+ }
+
+ @java.lang.Override
+ public int hashCode() {
+ if (memoizedHashCode != 0) {
+ return memoizedHashCode;
+ }
+ int hash = 41;
+ hash = (19 * hash) + getDescriptor().hashCode();
+ hash = (37 * hash) + ID_FIELD_NUMBER;
+ hash = (53 * hash) + getId().hashCode();
+ hash = (37 * hash) + PAYLOAD_FIELD_NUMBER;
+ hash = (53 * hash) + getPayload().hashCode();
+ hash = (29 * hash) + getUnknownFields().hashCode();
+ memoizedHashCode = hash;
+ return hash;
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest parseFrom(
+ java.nio.ByteBuffer data) throws com.google.protobuf.InvalidProtocolBufferException {
+ return PARSER.parseFrom(data);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest parseFrom(
+ java.nio.ByteBuffer data, com.google.protobuf.ExtensionRegistryLite extensionRegistry)
+ throws com.google.protobuf.InvalidProtocolBufferException {
+ return PARSER.parseFrom(data, extensionRegistry);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest parseFrom(
+ com.google.protobuf.ByteString data)
+ throws com.google.protobuf.InvalidProtocolBufferException {
+ return PARSER.parseFrom(data);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest parseFrom(
+ com.google.protobuf.ByteString data,
+ com.google.protobuf.ExtensionRegistryLite extensionRegistry)
+ throws com.google.protobuf.InvalidProtocolBufferException {
+ return PARSER.parseFrom(data, extensionRegistry);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest parseFrom(byte[] data)
+ throws com.google.protobuf.InvalidProtocolBufferException {
+ return PARSER.parseFrom(data);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest parseFrom(
+ byte[] data, com.google.protobuf.ExtensionRegistryLite extensionRegistry)
+ throws com.google.protobuf.InvalidProtocolBufferException {
+ return PARSER.parseFrom(data, extensionRegistry);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest parseFrom(
+ java.io.InputStream input) throws java.io.IOException {
+ return com.google.protobuf.GeneratedMessageV3.parseWithIOException(PARSER, input);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest parseFrom(
+ java.io.InputStream input, com.google.protobuf.ExtensionRegistryLite extensionRegistry)
+ throws java.io.IOException {
+ return com.google.protobuf.GeneratedMessageV3.parseWithIOException(
+ PARSER, input, extensionRegistry);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest parseDelimitedFrom(
+ java.io.InputStream input) throws java.io.IOException {
+ return com.google.protobuf.GeneratedMessageV3.parseDelimitedWithIOException(PARSER, input);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest parseDelimitedFrom(
+ java.io.InputStream input, com.google.protobuf.ExtensionRegistryLite extensionRegistry)
+ throws java.io.IOException {
+ return com.google.protobuf.GeneratedMessageV3.parseDelimitedWithIOException(
+ PARSER, input, extensionRegistry);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest parseFrom(
+ com.google.protobuf.CodedInputStream input) throws java.io.IOException {
+ return com.google.protobuf.GeneratedMessageV3.parseWithIOException(PARSER, input);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest parseFrom(
+ com.google.protobuf.CodedInputStream input,
+ com.google.protobuf.ExtensionRegistryLite extensionRegistry)
+ throws java.io.IOException {
+ return com.google.protobuf.GeneratedMessageV3.parseWithIOException(
+ PARSER, input, extensionRegistry);
+ }
+
+ @java.lang.Override
+ public Builder newBuilderForType() {
+ return newBuilder();
+ }
+
+ public static Builder newBuilder() {
+ return DEFAULT_INSTANCE.toBuilder();
+ }
+
+ public static Builder newBuilder(
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest prototype) {
+ return DEFAULT_INSTANCE.toBuilder().mergeFrom(prototype);
+ }
+
+ @java.lang.Override
+ public Builder toBuilder() {
+ return this == DEFAULT_INSTANCE ? new Builder() : new Builder().mergeFrom(this);
+ }
+
+ @java.lang.Override
+ protected Builder newBuilderForType(
+ com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {
+ Builder builder = new Builder(parent);
+ return builder;
+ }
+ /**
+ *
+ *
+ * <pre>
+ * The request to echo a payload.
+ * </pre>
+ *
+ * Protobuf type {@code proto.echo.v1.EchoRequest}
+ */
+ public static final class Builder
+ extends com.google.protobuf.GeneratedMessageV3.Builder<Builder>
+ implements
+ // @@protoc_insertion_point(builder_implements:proto.echo.v1.EchoRequest)
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequestOrBuilder {
+ public static final com.google.protobuf.Descriptors.Descriptor getDescriptor() {
+ return org.apache.beam.testinfra.mockapis.echo.v1.Echo
+ .internal_static_proto_echo_v1_EchoRequest_descriptor;
+ }
+
+ @java.lang.Override
+ protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable
+ internalGetFieldAccessorTable() {
+ return org.apache.beam.testinfra.mockapis.echo.v1.Echo
+ .internal_static_proto_echo_v1_EchoRequest_fieldAccessorTable
+ .ensureFieldAccessorsInitialized(
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest.class,
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest.Builder.class);
+ }
+
+ // Construct using org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest.newBuilder()
+ private Builder() {}
+
+ private Builder(com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {
+ super(parent);
+ }
+
+ @java.lang.Override
+ public Builder clear() {
+ super.clear();
+ bitField0_ = 0;
+ id_ = "";
+ payload_ = com.google.protobuf.ByteString.EMPTY;
+ return this;
+ }
+
+ @java.lang.Override
+ public com.google.protobuf.Descriptors.Descriptor getDescriptorForType() {
+ return org.apache.beam.testinfra.mockapis.echo.v1.Echo
+ .internal_static_proto_echo_v1_EchoRequest_descriptor;
+ }
+
+ @java.lang.Override
+ public org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest
+ getDefaultInstanceForType() {
+ return org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest.getDefaultInstance();
+ }
+
+ @java.lang.Override
+ public org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest build() {
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest result = buildPartial();
+ if (!result.isInitialized()) {
+ throw newUninitializedMessageException(result);
+ }
+ return result;
+ }
+
+ @java.lang.Override
+ public org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest buildPartial() {
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest result =
+ new org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest(this);
+ if (bitField0_ != 0) {
+ buildPartial0(result);
+ }
+ onBuilt();
+ return result;
+ }
+
+ private void buildPartial0(
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest result) {
+ int from_bitField0_ = bitField0_;
+ if (((from_bitField0_ & 0x00000001) != 0)) {
+ result.id_ = id_;
+ }
+ if (((from_bitField0_ & 0x00000002) != 0)) {
+ result.payload_ = payload_;
+ }
+ }
+
+ @java.lang.Override
+ public Builder clone() {
+ return super.clone();
+ }
+
+ @java.lang.Override
+ public Builder setField(
+ com.google.protobuf.Descriptors.FieldDescriptor field, java.lang.Object value) {
+ return super.setField(field, value);
+ }
+
+ @java.lang.Override
+ public Builder clearField(com.google.protobuf.Descriptors.FieldDescriptor field) {
+ return super.clearField(field);
+ }
+
+ @java.lang.Override
+ public Builder clearOneof(com.google.protobuf.Descriptors.OneofDescriptor oneof) {
+ return super.clearOneof(oneof);
+ }
+
+ @java.lang.Override
+ public Builder setRepeatedField(
+ com.google.protobuf.Descriptors.FieldDescriptor field,
+ int index,
+ java.lang.Object value) {
+ return super.setRepeatedField(field, index, value);
+ }
+
+ @java.lang.Override
+ public Builder addRepeatedField(
+ com.google.protobuf.Descriptors.FieldDescriptor field, java.lang.Object value) {
+ return super.addRepeatedField(field, value);
+ }
+
+ @java.lang.Override
+ public Builder mergeFrom(com.google.protobuf.Message other) {
+ if (other instanceof org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest) {
+ return mergeFrom((org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest) other);
+ } else {
+ super.mergeFrom(other);
+ return this;
+ }
+ }
+
+ public Builder mergeFrom(org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest other) {
+ if (other
+ == org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest.getDefaultInstance())
+ return this;
+ if (!other.getId().isEmpty()) {
+ id_ = other.id_;
+ bitField0_ |= 0x00000001;
+ onChanged();
+ }
+ if (other.getPayload() != com.google.protobuf.ByteString.EMPTY) {
+ setPayload(other.getPayload());
+ }
+ this.mergeUnknownFields(other.getUnknownFields());
+ onChanged();
+ return this;
+ }
+
+ @java.lang.Override
+ public final boolean isInitialized() {
+ return true;
+ }
+
+ @java.lang.Override
+ public Builder mergeFrom(
+ com.google.protobuf.CodedInputStream input,
+ com.google.protobuf.ExtensionRegistryLite extensionRegistry)
+ throws java.io.IOException {
+ if (extensionRegistry == null) {
+ throw new java.lang.NullPointerException();
+ }
+ try {
+ boolean done = false;
+ while (!done) {
+ int tag = input.readTag();
+ switch (tag) {
+ case 0:
+ done = true;
+ break;
+ case 10:
+ {
+ id_ = input.readStringRequireUtf8();
+ bitField0_ |= 0x00000001;
+ break;
+ } // case 10
+ case 18:
+ {
+ payload_ = input.readBytes();
+ bitField0_ |= 0x00000002;
+ break;
+ } // case 18
+ default:
+ {
+ if (!super.parseUnknownField(input, extensionRegistry, tag)) {
+ done = true; // was an endgroup tag
+ }
+ break;
+ } // default:
+ } // switch (tag)
+ } // while (!done)
+ } catch (com.google.protobuf.InvalidProtocolBufferException e) {
+ throw e.unwrapIOException();
+ } finally {
+ onChanged();
+ } // finally
+ return this;
+ }
+
+ private int bitField0_;
+
+ private java.lang.Object id_ = "";
+ /**
+ * <code>string id = 1 [json_name = "id"];</code>
+ *
+ * @return The id.
+ */
+ public java.lang.String getId() {
+ java.lang.Object ref = id_;
+ if (!(ref instanceof java.lang.String)) {
+ com.google.protobuf.ByteString bs = (com.google.protobuf.ByteString) ref;
+ java.lang.String s = bs.toStringUtf8();
+ id_ = s;
+ return s;
+ } else {
+ return (java.lang.String) ref;
+ }
+ }
+ /**
+ * <code>string id = 1 [json_name = "id"];</code>
+ *
+ * @return The bytes for id.
+ */
+ public com.google.protobuf.ByteString getIdBytes() {
+ java.lang.Object ref = id_;
+ if (ref instanceof String) {
+ com.google.protobuf.ByteString b =
+ com.google.protobuf.ByteString.copyFromUtf8((java.lang.String) ref);
+ id_ = b;
+ return b;
+ } else {
+ return (com.google.protobuf.ByteString) ref;
+ }
+ }
+ /**
+ * <code>string id = 1 [json_name = "id"];</code>
+ *
+ * @param value The id to set.
+ * @return This builder for chaining.
+ */
+ public Builder setId(java.lang.String value) {
+ if (value == null) {
+ throw new NullPointerException();
+ }
+ id_ = value;
+ bitField0_ |= 0x00000001;
+ onChanged();
+ return this;
+ }
+ /**
+ * <code>string id = 1 [json_name = "id"];</code>
+ *
+ * @return This builder for chaining.
+ */
+ public Builder clearId() {
+ id_ = getDefaultInstance().getId();
+ bitField0_ = (bitField0_ & ~0x00000001);
+ onChanged();
+ return this;
+ }
+ /**
+ * <code>string id = 1 [json_name = "id"];</code>
+ *
+ * @param value The bytes for id to set.
+ * @return This builder for chaining.
+ */
+ public Builder setIdBytes(com.google.protobuf.ByteString value) {
+ if (value == null) {
+ throw new NullPointerException();
+ }
+ checkByteStringIsUtf8(value);
+ id_ = value;
+ bitField0_ |= 0x00000001;
+ onChanged();
+ return this;
+ }
+
+ private com.google.protobuf.ByteString payload_ = com.google.protobuf.ByteString.EMPTY;
+ /**
+ * <code>bytes payload = 2 [json_name = "payload"];</code>
+ *
+ * @return The payload.
+ */
+ @java.lang.Override
+ public com.google.protobuf.ByteString getPayload() {
+ return payload_;
+ }
+ /**
+ * <code>bytes payload = 2 [json_name = "payload"];</code>
+ *
+ * @param value The payload to set.
+ * @return This builder for chaining.
+ */
+ public Builder setPayload(com.google.protobuf.ByteString value) {
+ if (value == null) {
+ throw new NullPointerException();
+ }
+ payload_ = value;
+ bitField0_ |= 0x00000002;
+ onChanged();
+ return this;
+ }
+ /**
+ * <code>bytes payload = 2 [json_name = "payload"];</code>
+ *
+ * @return This builder for chaining.
+ */
+ public Builder clearPayload() {
+ bitField0_ = (bitField0_ & ~0x00000002);
+ payload_ = getDefaultInstance().getPayload();
+ onChanged();
+ return this;
+ }
+
+ @java.lang.Override
+ public final Builder setUnknownFields(
+ final com.google.protobuf.UnknownFieldSet unknownFields) {
+ return super.setUnknownFields(unknownFields);
+ }
+
+ @java.lang.Override
+ public final Builder mergeUnknownFields(
+ final com.google.protobuf.UnknownFieldSet unknownFields) {
+ return super.mergeUnknownFields(unknownFields);
+ }
+
+ // @@protoc_insertion_point(builder_scope:proto.echo.v1.EchoRequest)
+ }
+
+ // @@protoc_insertion_point(class_scope:proto.echo.v1.EchoRequest)
+ private static final org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest
+ DEFAULT_INSTANCE;
+
+ static {
+ DEFAULT_INSTANCE = new org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest();
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest getDefaultInstance() {
+ return DEFAULT_INSTANCE;
+ }
+
+ private static final com.google.protobuf.Parser<EchoRequest> PARSER =
+ new com.google.protobuf.AbstractParser<EchoRequest>() {
+ @java.lang.Override
+ public EchoRequest parsePartialFrom(
+ com.google.protobuf.CodedInputStream input,
+ com.google.protobuf.ExtensionRegistryLite extensionRegistry)
+ throws com.google.protobuf.InvalidProtocolBufferException {
+ Builder builder = newBuilder();
+ try {
+ builder.mergeFrom(input, extensionRegistry);
+ } catch (com.google.protobuf.InvalidProtocolBufferException e) {
+ throw e.setUnfinishedMessage(builder.buildPartial());
+ } catch (com.google.protobuf.UninitializedMessageException e) {
+ throw e.asInvalidProtocolBufferException()
+ .setUnfinishedMessage(builder.buildPartial());
+ } catch (java.io.IOException e) {
+ throw new com.google.protobuf.InvalidProtocolBufferException(e)
+ .setUnfinishedMessage(builder.buildPartial());
+ }
+ return builder.buildPartial();
+ }
+ };
+
+ public static com.google.protobuf.Parser<EchoRequest> parser() {
+ return PARSER;
+ }
+
+ @java.lang.Override
+ public com.google.protobuf.Parser<EchoRequest> getParserForType() {
+ return PARSER;
+ }
+
+ @java.lang.Override
+ public org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest getDefaultInstanceForType() {
+ return DEFAULT_INSTANCE;
+ }
+ }
+
+ public interface EchoResponseOrBuilder
+ extends
+ // @@protoc_insertion_point(interface_extends:proto.echo.v1.EchoResponse)
+ com.google.protobuf.MessageOrBuilder {
+
+ /**
+ * <code>string id = 1 [json_name = "id"];</code>
+ *
+ * @return The id.
+ */
+ java.lang.String getId();
+ /**
+ * <code>string id = 1 [json_name = "id"];</code>
+ *
+ * @return The bytes for id.
+ */
+ com.google.protobuf.ByteString getIdBytes();
+
+ /**
+ * <code>bytes payload = 2 [json_name = "payload"];</code>
+ *
+ * @return The payload.
+ */
+ com.google.protobuf.ByteString getPayload();
+ }
+ /**
+ *
+ *
+ * <pre>
+ * The response echo of a request payload.
+ * </pre>
+ *
+ * Protobuf type {@code proto.echo.v1.EchoResponse}
+ */
+ public static final class EchoResponse extends com.google.protobuf.GeneratedMessageV3
+ implements
+ // @@protoc_insertion_point(message_implements:proto.echo.v1.EchoResponse)
+ EchoResponseOrBuilder {
+ private static final long serialVersionUID = 0L;
+ // Use EchoResponse.newBuilder() to construct.
+ private EchoResponse(com.google.protobuf.GeneratedMessageV3.Builder<?> builder) {
+ super(builder);
+ }
+
+ private EchoResponse() {
+ id_ = "";
+ payload_ = com.google.protobuf.ByteString.EMPTY;
+ }
+
+ @java.lang.Override
+ @SuppressWarnings({"unused"})
+ protected java.lang.Object newInstance(UnusedPrivateParameter unused) {
+ return new EchoResponse();
+ }
+
+ @java.lang.Override
+ public final com.google.protobuf.UnknownFieldSet getUnknownFields() {
+ return this.unknownFields;
+ }
+
+ public static final com.google.protobuf.Descriptors.Descriptor getDescriptor() {
+ return org.apache.beam.testinfra.mockapis.echo.v1.Echo
+ .internal_static_proto_echo_v1_EchoResponse_descriptor;
+ }
+
+ @java.lang.Override
+ protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable
+ internalGetFieldAccessorTable() {
+ return org.apache.beam.testinfra.mockapis.echo.v1.Echo
+ .internal_static_proto_echo_v1_EchoResponse_fieldAccessorTable
+ .ensureFieldAccessorsInitialized(
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse.class,
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse.Builder.class);
+ }
+
+ public static final int ID_FIELD_NUMBER = 1;
+
+ @SuppressWarnings("serial")
+ private volatile java.lang.Object id_ = "";
+ /**
+ * <code>string id = 1 [json_name = "id"];</code>
+ *
+ * @return The id.
+ */
+ @java.lang.Override
+ public java.lang.String getId() {
+ java.lang.Object ref = id_;
+ if (ref instanceof java.lang.String) {
+ return (java.lang.String) ref;
+ } else {
+ com.google.protobuf.ByteString bs = (com.google.protobuf.ByteString) ref;
+ java.lang.String s = bs.toStringUtf8();
+ id_ = s;
+ return s;
+ }
+ }
+ /**
+ * <code>string id = 1 [json_name = "id"];</code>
+ *
+ * @return The bytes for id.
+ */
+ @java.lang.Override
+ public com.google.protobuf.ByteString getIdBytes() {
+ java.lang.Object ref = id_;
+ if (ref instanceof java.lang.String) {
+ com.google.protobuf.ByteString b =
+ com.google.protobuf.ByteString.copyFromUtf8((java.lang.String) ref);
+ id_ = b;
+ return b;
+ } else {
+ return (com.google.protobuf.ByteString) ref;
+ }
+ }
+
+ public static final int PAYLOAD_FIELD_NUMBER = 2;
+ private com.google.protobuf.ByteString payload_ = com.google.protobuf.ByteString.EMPTY;
+ /**
+ * <code>bytes payload = 2 [json_name = "payload"];</code>
+ *
+ * @return The payload.
+ */
+ @java.lang.Override
+ public com.google.protobuf.ByteString getPayload() {
+ return payload_;
+ }
+
+ private byte memoizedIsInitialized = -1;
+
+ @java.lang.Override
+ public final boolean isInitialized() {
+ byte isInitialized = memoizedIsInitialized;
+ if (isInitialized == 1) return true;
+ if (isInitialized == 0) return false;
+
+ memoizedIsInitialized = 1;
+ return true;
+ }
+
+ @java.lang.Override
+ public void writeTo(com.google.protobuf.CodedOutputStream output) throws java.io.IOException {
+ if (!com.google.protobuf.GeneratedMessageV3.isStringEmpty(id_)) {
+ com.google.protobuf.GeneratedMessageV3.writeString(output, 1, id_);
+ }
+ if (!payload_.isEmpty()) {
+ output.writeBytes(2, payload_);
+ }
+ getUnknownFields().writeTo(output);
+ }
+
+ @java.lang.Override
+ public int getSerializedSize() {
+ int size = memoizedSize;
+ if (size != -1) return size;
+
+ size = 0;
+ if (!com.google.protobuf.GeneratedMessageV3.isStringEmpty(id_)) {
+ size += com.google.protobuf.GeneratedMessageV3.computeStringSize(1, id_);
+ }
+ if (!payload_.isEmpty()) {
+ size += com.google.protobuf.CodedOutputStream.computeBytesSize(2, payload_);
+ }
+ size += getUnknownFields().getSerializedSize();
+ memoizedSize = size;
+ return size;
+ }
+
+ @java.lang.Override
+ public boolean equals(final java.lang.Object obj) {
+ if (obj == this) {
+ return true;
+ }
+ if (!(obj instanceof org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse)) {
+ return super.equals(obj);
+ }
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse other =
+ (org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse) obj;
+
+ if (!getId().equals(other.getId())) return false;
+ if (!getPayload().equals(other.getPayload())) return false;
+ if (!getUnknownFields().equals(other.getUnknownFields())) return false;
+ return true;
+ }
+
+ @java.lang.Override
+ public int hashCode() {
+ if (memoizedHashCode != 0) {
+ return memoizedHashCode;
+ }
+ int hash = 41;
+ hash = (19 * hash) + getDescriptor().hashCode();
+ hash = (37 * hash) + ID_FIELD_NUMBER;
+ hash = (53 * hash) + getId().hashCode();
+ hash = (37 * hash) + PAYLOAD_FIELD_NUMBER;
+ hash = (53 * hash) + getPayload().hashCode();
+ hash = (29 * hash) + getUnknownFields().hashCode();
+ memoizedHashCode = hash;
+ return hash;
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse parseFrom(
+ java.nio.ByteBuffer data) throws com.google.protobuf.InvalidProtocolBufferException {
+ return PARSER.parseFrom(data);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse parseFrom(
+ java.nio.ByteBuffer data, com.google.protobuf.ExtensionRegistryLite extensionRegistry)
+ throws com.google.protobuf.InvalidProtocolBufferException {
+ return PARSER.parseFrom(data, extensionRegistry);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse parseFrom(
+ com.google.protobuf.ByteString data)
+ throws com.google.protobuf.InvalidProtocolBufferException {
+ return PARSER.parseFrom(data);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse parseFrom(
+ com.google.protobuf.ByteString data,
+ com.google.protobuf.ExtensionRegistryLite extensionRegistry)
+ throws com.google.protobuf.InvalidProtocolBufferException {
+ return PARSER.parseFrom(data, extensionRegistry);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse parseFrom(
+ byte[] data) throws com.google.protobuf.InvalidProtocolBufferException {
+ return PARSER.parseFrom(data);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse parseFrom(
+ byte[] data, com.google.protobuf.ExtensionRegistryLite extensionRegistry)
+ throws com.google.protobuf.InvalidProtocolBufferException {
+ return PARSER.parseFrom(data, extensionRegistry);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse parseFrom(
+ java.io.InputStream input) throws java.io.IOException {
+ return com.google.protobuf.GeneratedMessageV3.parseWithIOException(PARSER, input);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse parseFrom(
+ java.io.InputStream input, com.google.protobuf.ExtensionRegistryLite extensionRegistry)
+ throws java.io.IOException {
+ return com.google.protobuf.GeneratedMessageV3.parseWithIOException(
+ PARSER, input, extensionRegistry);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse parseDelimitedFrom(
+ java.io.InputStream input) throws java.io.IOException {
+ return com.google.protobuf.GeneratedMessageV3.parseDelimitedWithIOException(PARSER, input);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse parseDelimitedFrom(
+ java.io.InputStream input, com.google.protobuf.ExtensionRegistryLite extensionRegistry)
+ throws java.io.IOException {
+ return com.google.protobuf.GeneratedMessageV3.parseDelimitedWithIOException(
+ PARSER, input, extensionRegistry);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse parseFrom(
+ com.google.protobuf.CodedInputStream input) throws java.io.IOException {
+ return com.google.protobuf.GeneratedMessageV3.parseWithIOException(PARSER, input);
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse parseFrom(
+ com.google.protobuf.CodedInputStream input,
+ com.google.protobuf.ExtensionRegistryLite extensionRegistry)
+ throws java.io.IOException {
+ return com.google.protobuf.GeneratedMessageV3.parseWithIOException(
+ PARSER, input, extensionRegistry);
+ }
+
+ @java.lang.Override
+ public Builder newBuilderForType() {
+ return newBuilder();
+ }
+
+ public static Builder newBuilder() {
+ return DEFAULT_INSTANCE.toBuilder();
+ }
+
+ public static Builder newBuilder(
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse prototype) {
+ return DEFAULT_INSTANCE.toBuilder().mergeFrom(prototype);
+ }
+
+ @java.lang.Override
+ public Builder toBuilder() {
+ return this == DEFAULT_INSTANCE ? new Builder() : new Builder().mergeFrom(this);
+ }
+
+ @java.lang.Override
+ protected Builder newBuilderForType(
+ com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {
+ Builder builder = new Builder(parent);
+ return builder;
+ }
+ /**
+ *
+ *
+ * <pre>
+ * The response echo of a request payload.
+ * </pre>
+ *
+ * Protobuf type {@code proto.echo.v1.EchoResponse}
+ */
+ public static final class Builder
+ extends com.google.protobuf.GeneratedMessageV3.Builder<Builder>
+ implements
+ // @@protoc_insertion_point(builder_implements:proto.echo.v1.EchoResponse)
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponseOrBuilder {
+ public static final com.google.protobuf.Descriptors.Descriptor getDescriptor() {
+ return org.apache.beam.testinfra.mockapis.echo.v1.Echo
+ .internal_static_proto_echo_v1_EchoResponse_descriptor;
+ }
+
+ @java.lang.Override
+ protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable
+ internalGetFieldAccessorTable() {
+ return org.apache.beam.testinfra.mockapis.echo.v1.Echo
+ .internal_static_proto_echo_v1_EchoResponse_fieldAccessorTable
+ .ensureFieldAccessorsInitialized(
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse.class,
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse.Builder.class);
+ }
+
+ // Construct using org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse.newBuilder()
+ private Builder() {}
+
+ private Builder(com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {
+ super(parent);
+ }
+
+ @java.lang.Override
+ public Builder clear() {
+ super.clear();
+ bitField0_ = 0;
+ id_ = "";
+ payload_ = com.google.protobuf.ByteString.EMPTY;
+ return this;
+ }
+
+ @java.lang.Override
+ public com.google.protobuf.Descriptors.Descriptor getDescriptorForType() {
+ return org.apache.beam.testinfra.mockapis.echo.v1.Echo
+ .internal_static_proto_echo_v1_EchoResponse_descriptor;
+ }
+
+ @java.lang.Override
+ public org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse
+ getDefaultInstanceForType() {
+ return org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse.getDefaultInstance();
+ }
+
+ @java.lang.Override
+ public org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse build() {
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse result = buildPartial();
+ if (!result.isInitialized()) {
+ throw newUninitializedMessageException(result);
+ }
+ return result;
+ }
+
+ @java.lang.Override
+ public org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse buildPartial() {
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse result =
+ new org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse(this);
+ if (bitField0_ != 0) {
+ buildPartial0(result);
+ }
+ onBuilt();
+ return result;
+ }
+
+ private void buildPartial0(
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse result) {
+ int from_bitField0_ = bitField0_;
+ if (((from_bitField0_ & 0x00000001) != 0)) {
+ result.id_ = id_;
+ }
+ if (((from_bitField0_ & 0x00000002) != 0)) {
+ result.payload_ = payload_;
+ }
+ }
+
+ @java.lang.Override
+ public Builder clone() {
+ return super.clone();
+ }
+
+ @java.lang.Override
+ public Builder setField(
+ com.google.protobuf.Descriptors.FieldDescriptor field, java.lang.Object value) {
+ return super.setField(field, value);
+ }
+
+ @java.lang.Override
+ public Builder clearField(com.google.protobuf.Descriptors.FieldDescriptor field) {
+ return super.clearField(field);
+ }
+
+ @java.lang.Override
+ public Builder clearOneof(com.google.protobuf.Descriptors.OneofDescriptor oneof) {
+ return super.clearOneof(oneof);
+ }
+
+ @java.lang.Override
+ public Builder setRepeatedField(
+ com.google.protobuf.Descriptors.FieldDescriptor field,
+ int index,
+ java.lang.Object value) {
+ return super.setRepeatedField(field, index, value);
+ }
+
+ @java.lang.Override
+ public Builder addRepeatedField(
+ com.google.protobuf.Descriptors.FieldDescriptor field, java.lang.Object value) {
+ return super.addRepeatedField(field, value);
+ }
+
+ @java.lang.Override
+ public Builder mergeFrom(com.google.protobuf.Message other) {
+ if (other instanceof org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse) {
+ return mergeFrom((org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse) other);
+ } else {
+ super.mergeFrom(other);
+ return this;
+ }
+ }
+
+ public Builder mergeFrom(org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse other) {
+ if (other
+ == org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse.getDefaultInstance())
+ return this;
+ if (!other.getId().isEmpty()) {
+ id_ = other.id_;
+ bitField0_ |= 0x00000001;
+ onChanged();
+ }
+ if (other.getPayload() != com.google.protobuf.ByteString.EMPTY) {
+ setPayload(other.getPayload());
+ }
+ this.mergeUnknownFields(other.getUnknownFields());
+ onChanged();
+ return this;
+ }
+
+ @java.lang.Override
+ public final boolean isInitialized() {
+ return true;
+ }
+
+ @java.lang.Override
+ public Builder mergeFrom(
+ com.google.protobuf.CodedInputStream input,
+ com.google.protobuf.ExtensionRegistryLite extensionRegistry)
+ throws java.io.IOException {
+ if (extensionRegistry == null) {
+ throw new java.lang.NullPointerException();
+ }
+ try {
+ boolean done = false;
+ while (!done) {
+ int tag = input.readTag();
+ switch (tag) {
+ case 0:
+ done = true;
+ break;
+ case 10:
+ {
+ id_ = input.readStringRequireUtf8();
+ bitField0_ |= 0x00000001;
+ break;
+ } // case 10
+ case 18:
+ {
+ payload_ = input.readBytes();
+ bitField0_ |= 0x00000002;
+ break;
+ } // case 18
+ default:
+ {
+ if (!super.parseUnknownField(input, extensionRegistry, tag)) {
+ done = true; // was an endgroup tag
+ }
+ break;
+ } // default:
+ } // switch (tag)
+ } // while (!done)
+ } catch (com.google.protobuf.InvalidProtocolBufferException e) {
+ throw e.unwrapIOException();
+ } finally {
+ onChanged();
+ } // finally
+ return this;
+ }
+
+ private int bitField0_;
+
+ private java.lang.Object id_ = "";
+ /**
+ * <code>string id = 1 [json_name = "id"];</code>
+ *
+ * @return The id.
+ */
+ public java.lang.String getId() {
+ java.lang.Object ref = id_;
+ if (!(ref instanceof java.lang.String)) {
+ com.google.protobuf.ByteString bs = (com.google.protobuf.ByteString) ref;
+ java.lang.String s = bs.toStringUtf8();
+ id_ = s;
+ return s;
+ } else {
+ return (java.lang.String) ref;
+ }
+ }
+ /**
+ * <code>string id = 1 [json_name = "id"];</code>
+ *
+ * @return The bytes for id.
+ */
+ public com.google.protobuf.ByteString getIdBytes() {
+ java.lang.Object ref = id_;
+ if (ref instanceof String) {
+ com.google.protobuf.ByteString b =
+ com.google.protobuf.ByteString.copyFromUtf8((java.lang.String) ref);
+ id_ = b;
+ return b;
+ } else {
+ return (com.google.protobuf.ByteString) ref;
+ }
+ }
+ /**
+ * <code>string id = 1 [json_name = "id"];</code>
+ *
+ * @param value The id to set.
+ * @return This builder for chaining.
+ */
+ public Builder setId(java.lang.String value) {
+ if (value == null) {
+ throw new NullPointerException();
+ }
+ id_ = value;
+ bitField0_ |= 0x00000001;
+ onChanged();
+ return this;
+ }
+ /**
+ * <code>string id = 1 [json_name = "id"];</code>
+ *
+ * @return This builder for chaining.
+ */
+ public Builder clearId() {
+ id_ = getDefaultInstance().getId();
+ bitField0_ = (bitField0_ & ~0x00000001);
+ onChanged();
+ return this;
+ }
+ /**
+ * <code>string id = 1 [json_name = "id"];</code>
+ *
+ * @param value The bytes for id to set.
+ * @return This builder for chaining.
+ */
+ public Builder setIdBytes(com.google.protobuf.ByteString value) {
+ if (value == null) {
+ throw new NullPointerException();
+ }
+ checkByteStringIsUtf8(value);
+ id_ = value;
+ bitField0_ |= 0x00000001;
+ onChanged();
+ return this;
+ }
+
+ private com.google.protobuf.ByteString payload_ = com.google.protobuf.ByteString.EMPTY;
+ /**
+ * <code>bytes payload = 2 [json_name = "payload"];</code>
+ *
+ * @return The payload.
+ */
+ @java.lang.Override
+ public com.google.protobuf.ByteString getPayload() {
+ return payload_;
+ }
+ /**
+ * <code>bytes payload = 2 [json_name = "payload"];</code>
+ *
+ * @param value The payload to set.
+ * @return This builder for chaining.
+ */
+ public Builder setPayload(com.google.protobuf.ByteString value) {
+ if (value == null) {
+ throw new NullPointerException();
+ }
+ payload_ = value;
+ bitField0_ |= 0x00000002;
+ onChanged();
+ return this;
+ }
+ /**
+ * <code>bytes payload = 2 [json_name = "payload"];</code>
+ *
+ * @return This builder for chaining.
+ */
+ public Builder clearPayload() {
+ bitField0_ = (bitField0_ & ~0x00000002);
+ payload_ = getDefaultInstance().getPayload();
+ onChanged();
+ return this;
+ }
+
+ @java.lang.Override
+ public final Builder setUnknownFields(
+ final com.google.protobuf.UnknownFieldSet unknownFields) {
+ return super.setUnknownFields(unknownFields);
+ }
+
+ @java.lang.Override
+ public final Builder mergeUnknownFields(
+ final com.google.protobuf.UnknownFieldSet unknownFields) {
+ return super.mergeUnknownFields(unknownFields);
+ }
+
+ // @@protoc_insertion_point(builder_scope:proto.echo.v1.EchoResponse)
+ }
+
+ // @@protoc_insertion_point(class_scope:proto.echo.v1.EchoResponse)
+ private static final org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse
+ DEFAULT_INSTANCE;
+
+ static {
+ DEFAULT_INSTANCE = new org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse();
+ }
+
+ public static org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse
+ getDefaultInstance() {
+ return DEFAULT_INSTANCE;
+ }
+
+ private static final com.google.protobuf.Parser<EchoResponse> PARSER =
+ new com.google.protobuf.AbstractParser<EchoResponse>() {
+ @java.lang.Override
+ public EchoResponse parsePartialFrom(
+ com.google.protobuf.CodedInputStream input,
+ com.google.protobuf.ExtensionRegistryLite extensionRegistry)
+ throws com.google.protobuf.InvalidProtocolBufferException {
+ Builder builder = newBuilder();
+ try {
+ builder.mergeFrom(input, extensionRegistry);
+ } catch (com.google.protobuf.InvalidProtocolBufferException e) {
+ throw e.setUnfinishedMessage(builder.buildPartial());
+ } catch (com.google.protobuf.UninitializedMessageException e) {
+ throw e.asInvalidProtocolBufferException()
+ .setUnfinishedMessage(builder.buildPartial());
+ } catch (java.io.IOException e) {
+ throw new com.google.protobuf.InvalidProtocolBufferException(e)
+ .setUnfinishedMessage(builder.buildPartial());
+ }
+ return builder.buildPartial();
+ }
+ };
+
+ public static com.google.protobuf.Parser<EchoResponse> parser() {
+ return PARSER;
+ }
+
+ @java.lang.Override
+ public com.google.protobuf.Parser<EchoResponse> getParserForType() {
+ return PARSER;
+ }
+
+ @java.lang.Override
+ public org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse
+ getDefaultInstanceForType() {
+ return DEFAULT_INSTANCE;
+ }
+ }
+
+ private static final com.google.protobuf.Descriptors.Descriptor
+ internal_static_proto_echo_v1_EchoRequest_descriptor;
+ private static final com.google.protobuf.GeneratedMessageV3.FieldAccessorTable
+ internal_static_proto_echo_v1_EchoRequest_fieldAccessorTable;
+ private static final com.google.protobuf.Descriptors.Descriptor
+ internal_static_proto_echo_v1_EchoResponse_descriptor;
+ private static final com.google.protobuf.GeneratedMessageV3.FieldAccessorTable
+ internal_static_proto_echo_v1_EchoResponse_fieldAccessorTable;
+
+ public static com.google.protobuf.Descriptors.FileDescriptor getDescriptor() {
+ return descriptor;
+ }
+
+ private static com.google.protobuf.Descriptors.FileDescriptor descriptor;
+
+ static {
+ java.lang.String[] descriptorData = {
+ "\n\030proto/echo/v1/echo.proto\022\rproto.echo.v"
+ + "1\"7\n\013EchoRequest\022\016\n\002id\030\001 \001(\tR\002id\022\030\n\007payl"
+ + "oad\030\002 \001(\014R\007payload\"8\n\014EchoResponse\022\016\n\002id"
+ + "\030\001 \001(\tR\002id\022\030\n\007payload\030\002 \001(\014R\007payload2P\n\013"
+ + "EchoService\022A\n\004Echo\022\032.proto.echo.v1.Echo"
+ + "Request\032\033.proto.echo.v1.EchoResponse\"\000B;"
+ + "\n*org.apache.beam.testinfra.mockapis.ech"
+ + "o.v1Z\rproto/echo/v1b\006proto3"
+ };
+ descriptor =
+ com.google.protobuf.Descriptors.FileDescriptor.internalBuildGeneratedFileFrom(
+ descriptorData, new com.google.protobuf.Descriptors.FileDescriptor[] {});
+ internal_static_proto_echo_v1_EchoRequest_descriptor = getDescriptor().getMessageTypes().get(0);
+ internal_static_proto_echo_v1_EchoRequest_fieldAccessorTable =
+ new com.google.protobuf.GeneratedMessageV3.FieldAccessorTable(
+ internal_static_proto_echo_v1_EchoRequest_descriptor,
+ new java.lang.String[] {
+ "Id", "Payload",
+ });
+ internal_static_proto_echo_v1_EchoResponse_descriptor =
+ getDescriptor().getMessageTypes().get(1);
+ internal_static_proto_echo_v1_EchoResponse_fieldAccessorTable =
+ new com.google.protobuf.GeneratedMessageV3.FieldAccessorTable(
+ internal_static_proto_echo_v1_EchoResponse_descriptor,
+ new java.lang.String[] {
+ "Id", "Payload",
+ });
+ }
+
+ // @@protoc_insertion_point(outer_class_scope)
+}
diff --git a/.test-infra/mock-apis/src/main/java/org/apache/beam/testinfra/mockapis/echo/v1/EchoServiceGrpc.java b/.test-infra/mock-apis/src/main/java/org/apache/beam/testinfra/mockapis/echo/v1/EchoServiceGrpc.java
new file mode 100644
index 0000000..1443789
--- /dev/null
+++ b/.test-infra/mock-apis/src/main/java/org/apache/beam/testinfra/mockapis/echo/v1/EchoServiceGrpc.java
@@ -0,0 +1,393 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.testinfra.mockapis.echo.v1;
+
+import static io.grpc.MethodDescriptor.generateFullMethodName;
+
+/**
+ *
+ *
+ * <pre>
+ * EchoService simulates a mock API that echos a request.
+ * </pre>
+ */
+@SuppressWarnings({
+ "argument",
+ "assignment",
+ "initialization.fields.uninitialized",
+ "initialization.static.field.uninitialized",
+ "override.param",
+ "ClassTypeParameterName",
+ "ForbidNonVendoredGuava",
+ "JavadocStyle",
+ "LocalVariableName",
+ "MemberName",
+ "NeedBraces",
+ "MissingOverride",
+ "RedundantModifier",
+ "ReferenceEquality",
+ "UnusedVariable",
+})
+@javax.annotation.Generated(
+ value = "by gRPC proto compiler (version 1.58.0)",
+ comments = "Source: proto/echo/v1/echo.proto")
+@io.grpc.stub.annotations.GrpcGenerated
+public final class EchoServiceGrpc {
+
+ private EchoServiceGrpc() {}
+
+ public static final java.lang.String SERVICE_NAME = "proto.echo.v1.EchoService";
+
+ // Static method descriptors that strictly reflect the proto.
+ private static volatile io.grpc.MethodDescriptor<
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest,
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse>
+ getEchoMethod;
+
+ @io.grpc.stub.annotations.RpcMethod(
+ fullMethodName = SERVICE_NAME + '/' + "Echo",
+ requestType = org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest.class,
+ responseType = org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse.class,
+ methodType = io.grpc.MethodDescriptor.MethodType.UNARY)
+ public static io.grpc.MethodDescriptor<
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest,
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse>
+ getEchoMethod() {
+ io.grpc.MethodDescriptor<
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest,
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse>
+ getEchoMethod;
+ if ((getEchoMethod = EchoServiceGrpc.getEchoMethod) == null) {
+ synchronized (EchoServiceGrpc.class) {
+ if ((getEchoMethod = EchoServiceGrpc.getEchoMethod) == null) {
+ EchoServiceGrpc.getEchoMethod =
+ getEchoMethod =
+ io.grpc.MethodDescriptor
+ .<org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest,
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse>
+ newBuilder()
+ .setType(io.grpc.MethodDescriptor.MethodType.UNARY)
+ .setFullMethodName(generateFullMethodName(SERVICE_NAME, "Echo"))
+ .setSampledToLocalTracing(true)
+ .setRequestMarshaller(
+ io.grpc.protobuf.ProtoUtils.marshaller(
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest
+ .getDefaultInstance()))
+ .setResponseMarshaller(
+ io.grpc.protobuf.ProtoUtils.marshaller(
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse
+ .getDefaultInstance()))
+ .setSchemaDescriptor(new EchoServiceMethodDescriptorSupplier("Echo"))
+ .build();
+ }
+ }
+ }
+ return getEchoMethod;
+ }
+
+ /** Creates a new async stub that supports all call types for the service */
+ public static EchoServiceStub newStub(io.grpc.Channel channel) {
+ io.grpc.stub.AbstractStub.StubFactory<EchoServiceStub> factory =
+ new io.grpc.stub.AbstractStub.StubFactory<EchoServiceStub>() {
+ @java.lang.Override
+ public EchoServiceStub newStub(io.grpc.Channel channel, io.grpc.CallOptions callOptions) {
+ return new EchoServiceStub(channel, callOptions);
+ }
+ };
+ return EchoServiceStub.newStub(factory, channel);
+ }
+
+ /**
+ * Creates a new blocking-style stub that supports unary and streaming output calls on the service
+ */
+ public static EchoServiceBlockingStub newBlockingStub(io.grpc.Channel channel) {
+ io.grpc.stub.AbstractStub.StubFactory<EchoServiceBlockingStub> factory =
+ new io.grpc.stub.AbstractStub.StubFactory<EchoServiceBlockingStub>() {
+ @java.lang.Override
+ public EchoServiceBlockingStub newStub(
+ io.grpc.Channel channel, io.grpc.CallOptions callOptions) {
+ return new EchoServiceBlockingStub(channel, callOptions);
+ }
+ };
+ return EchoServiceBlockingStub.newStub(factory, channel);
+ }
+
+ /** Creates a new ListenableFuture-style stub that supports unary calls on the service */
+ public static EchoServiceFutureStub newFutureStub(io.grpc.Channel channel) {
+ io.grpc.stub.AbstractStub.StubFactory<EchoServiceFutureStub> factory =
+ new io.grpc.stub.AbstractStub.StubFactory<EchoServiceFutureStub>() {
+ @java.lang.Override
+ public EchoServiceFutureStub newStub(
+ io.grpc.Channel channel, io.grpc.CallOptions callOptions) {
+ return new EchoServiceFutureStub(channel, callOptions);
+ }
+ };
+ return EchoServiceFutureStub.newStub(factory, channel);
+ }
+
+ /**
+ *
+ *
+ * <pre>
+ * EchoService simulates a mock API that echos a request.
+ * </pre>
+ */
+ public interface AsyncService {
+
+ /**
+ *
+ *
+ * <pre>
+ * Echo an EchoRequest payload in an EchoResponse.
+ * </pre>
+ */
+ default void echo(
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest request,
+ io.grpc.stub.StreamObserver<org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse>
+ responseObserver) {
+ io.grpc.stub.ServerCalls.asyncUnimplementedUnaryCall(getEchoMethod(), responseObserver);
+ }
+ }
+
+ /**
+ * Base class for the server implementation of the service EchoService.
+ *
+ * <pre>
+ * EchoService simulates a mock API that echos a request.
+ * </pre>
+ */
+ public abstract static class EchoServiceImplBase
+ implements io.grpc.BindableService, AsyncService {
+
+ @java.lang.Override
+ public final io.grpc.ServerServiceDefinition bindService() {
+ return EchoServiceGrpc.bindService(this);
+ }
+ }
+
+ /**
+ * A stub to allow clients to do asynchronous rpc calls to service EchoService.
+ *
+ * <pre>
+ * EchoService simulates a mock API that echos a request.
+ * </pre>
+ */
+ public static final class EchoServiceStub
+ extends io.grpc.stub.AbstractAsyncStub<EchoServiceStub> {
+ private EchoServiceStub(io.grpc.Channel channel, io.grpc.CallOptions callOptions) {
+ super(channel, callOptions);
+ }
+
+ @java.lang.Override
+ protected EchoServiceStub build(io.grpc.Channel channel, io.grpc.CallOptions callOptions) {
+ return new EchoServiceStub(channel, callOptions);
+ }
+
+ /**
+ *
+ *
+ * <pre>
+ * Echo an EchoRequest payload in an EchoResponse.
+ * </pre>
+ */
+ public void echo(
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest request,
+ io.grpc.stub.StreamObserver<org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse>
+ responseObserver) {
+ io.grpc.stub.ClientCalls.asyncUnaryCall(
+ getChannel().newCall(getEchoMethod(), getCallOptions()), request, responseObserver);
+ }
+ }
+
+ /**
+ * A stub to allow clients to do synchronous rpc calls to service EchoService.
+ *
+ * <pre>
+ * EchoService simulates a mock API that echos a request.
+ * </pre>
+ */
+ public static final class EchoServiceBlockingStub
+ extends io.grpc.stub.AbstractBlockingStub<EchoServiceBlockingStub> {
+ private EchoServiceBlockingStub(io.grpc.Channel channel, io.grpc.CallOptions callOptions) {
+ super(channel, callOptions);
+ }
+
+ @java.lang.Override
+ protected EchoServiceBlockingStub build(
+ io.grpc.Channel channel, io.grpc.CallOptions callOptions) {
+ return new EchoServiceBlockingStub(channel, callOptions);
+ }
+
+ /**
+ *
+ *
+ * <pre>
+ * Echo an EchoRequest payload in an EchoResponse.
+ * </pre>
+ */
+ public org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse echo(
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest request) {
+ return io.grpc.stub.ClientCalls.blockingUnaryCall(
+ getChannel(), getEchoMethod(), getCallOptions(), request);
+ }
+ }
+
+ /**
+ * A stub to allow clients to do ListenableFuture-style rpc calls to service EchoService.
+ *
+ * <pre>
+ * EchoService simulates a mock API that echos a request.
+ * </pre>
+ */
+ public static final class EchoServiceFutureStub
+ extends io.grpc.stub.AbstractFutureStub<EchoServiceFutureStub> {
+ private EchoServiceFutureStub(io.grpc.Channel channel, io.grpc.CallOptions callOptions) {
+ super(channel, callOptions);
+ }
+
+ @java.lang.Override
+ protected EchoServiceFutureStub build(
+ io.grpc.Channel channel, io.grpc.CallOptions callOptions) {
+ return new EchoServiceFutureStub(channel, callOptions);
+ }
+
+ /**
+ *
+ *
+ * <pre>
+ * Echo an EchoRequest payload in an EchoResponse.
+ * </pre>
+ */
+ public com.google.common.util.concurrent.ListenableFuture<
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse>
+ echo(org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest request) {
+ return io.grpc.stub.ClientCalls.futureUnaryCall(
+ getChannel().newCall(getEchoMethod(), getCallOptions()), request);
+ }
+ }
+
+ private static final int METHODID_ECHO = 0;
+
+ private static final class MethodHandlers<Req, Resp>
+ implements io.grpc.stub.ServerCalls.UnaryMethod<Req, Resp>,
+ io.grpc.stub.ServerCalls.ServerStreamingMethod<Req, Resp>,
+ io.grpc.stub.ServerCalls.ClientStreamingMethod<Req, Resp>,
+ io.grpc.stub.ServerCalls.BidiStreamingMethod<Req, Resp> {
+ private final AsyncService serviceImpl;
+ private final int methodId;
+
+ MethodHandlers(AsyncService serviceImpl, int methodId) {
+ this.serviceImpl = serviceImpl;
+ this.methodId = methodId;
+ }
+
+ @java.lang.Override
+ @java.lang.SuppressWarnings("unchecked")
+ public void invoke(Req request, io.grpc.stub.StreamObserver<Resp> responseObserver) {
+ switch (methodId) {
+ case METHODID_ECHO:
+ serviceImpl.echo(
+ (org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest) request,
+ (io.grpc.stub.StreamObserver<
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse>)
+ responseObserver);
+ break;
+ default:
+ throw new AssertionError();
+ }
+ }
+
+ @java.lang.Override
+ @java.lang.SuppressWarnings("unchecked")
+ public io.grpc.stub.StreamObserver<Req> invoke(
+ io.grpc.stub.StreamObserver<Resp> responseObserver) {
+ switch (methodId) {
+ default:
+ throw new AssertionError();
+ }
+ }
+ }
+
+ public static final io.grpc.ServerServiceDefinition bindService(AsyncService service) {
+ return io.grpc.ServerServiceDefinition.builder(getServiceDescriptor())
+ .addMethod(
+ getEchoMethod(),
+ io.grpc.stub.ServerCalls.asyncUnaryCall(
+ new MethodHandlers<
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoRequest,
+ org.apache.beam.testinfra.mockapis.echo.v1.Echo.EchoResponse>(
+ service, METHODID_ECHO)))
+ .build();
+ }
+
+ private abstract static class EchoServiceBaseDescriptorSupplier
+ implements io.grpc.protobuf.ProtoFileDescriptorSupplier,
+ io.grpc.protobuf.ProtoServiceDescriptorSupplier {
+ EchoServiceBaseDescriptorSupplier() {}
+
+ @java.lang.Override
+ public com.google.protobuf.Descriptors.FileDescriptor getFileDescriptor() {
+ return org.apache.beam.testinfra.mockapis.echo.v1.Echo.getDescriptor();
+ }
+
+ @java.lang.Override
+ public com.google.protobuf.Descriptors.ServiceDescriptor getServiceDescriptor() {
+ return getFileDescriptor().findServiceByName("EchoService");
+ }
+ }
+
+ private static final class EchoServiceFileDescriptorSupplier
+ extends EchoServiceBaseDescriptorSupplier {
+ EchoServiceFileDescriptorSupplier() {}
+ }
+
+ private static final class EchoServiceMethodDescriptorSupplier
+ extends EchoServiceBaseDescriptorSupplier
+ implements io.grpc.protobuf.ProtoMethodDescriptorSupplier {
+ private final java.lang.String methodName;
+
+ EchoServiceMethodDescriptorSupplier(java.lang.String methodName) {
+ this.methodName = methodName;
+ }
+
+ @java.lang.Override
+ public com.google.protobuf.Descriptors.MethodDescriptor getMethodDescriptor() {
+ return getServiceDescriptor().findMethodByName(methodName);
+ }
+ }
+
+ private static volatile io.grpc.ServiceDescriptor serviceDescriptor;
+
+ public static io.grpc.ServiceDescriptor getServiceDescriptor() {
+ io.grpc.ServiceDescriptor result = serviceDescriptor;
+ if (result == null) {
+ synchronized (EchoServiceGrpc.class) {
+ result = serviceDescriptor;
+ if (result == null) {
+ serviceDescriptor =
+ result =
+ io.grpc.ServiceDescriptor.newBuilder(SERVICE_NAME)
+ .setSchemaDescriptor(new EchoServiceFileDescriptorSupplier())
+ .addMethod(getEchoMethod())
+ .build();
+ }
+ }
+ }
+ return result;
+ }
+}
diff --git a/.test-infra/mock-apis/src/main/java/org/apache/beam/testinfra/mockapis/echo/v1/package-info.java b/.test-infra/mock-apis/src/main/java/org/apache/beam/testinfra/mockapis/echo/v1/package-info.java
new file mode 100644
index 0000000..00b7fa2
--- /dev/null
+++ b/.test-infra/mock-apis/src/main/java/org/apache/beam/testinfra/mockapis/echo/v1/package-info.java
@@ -0,0 +1,20 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+/** Autogenerated code supporting a quota-aware gRPC endpoint client. */
+package org.apache.beam.testinfra.mockapis.echo.v1;
diff --git a/build.gradle.kts b/build.gradle.kts
index a97b737..b330bd0 100644
--- a/build.gradle.kts
+++ b/build.gradle.kts
@@ -54,6 +54,7 @@
// Proto/grpc generated wrappers
"**/apache_beam/portability/api/**/*_pb2*.py",
"**/go/pkg/beam/**/*.pb.go",
+ "**/mock-apis/**/*.pb.go",
// Ignore go.sum files, which don't permit headers
"**/go.sum",
@@ -198,6 +199,9 @@
// Ignore typesciript package management.
"sdks/typescript/package-lock.json",
"sdks/typescript/node_modules/**/*",
+
+ // Ignore buf autogenerated files.
+ "**/buf.lock",
)
// Add .gitignore excludes to the Apache Rat exclusion list. We re-create the behavior
diff --git a/examples/notebooks/healthcare/beam_post_hl7_messages_to_hcapi.ipynb b/examples/notebooks/healthcare/beam_post_hl7_messages_to_hcapi.ipynb
new file mode 100644
index 0000000..ab6b2d9
--- /dev/null
+++ b/examples/notebooks/healthcare/beam_post_hl7_messages_to_hcapi.ipynb
@@ -0,0 +1,528 @@
+{
+ "nbformat": 4,
+ "nbformat_minor": 0,
+ "metadata": {
+ "colab": {
+ "provenance": [],
+ "private_outputs": true,
+ "toc_visible": true,
+ "include_colab_link": true
+ },
+ "kernelspec": {
+ "name": "python3",
+ "display_name": "Python 3"
+ },
+ "language_info": {
+ "name": "python"
+ }
+ },
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "view-in-github",
+ "colab_type": "text"
+ },
+ "source": [
+ "<a href=\"https://colab.research.google.com/github/devanshmodi/beam/blob/devanshmodi-patch-healthcare-hl7-to-hcapi/examples/notebooks/healthcare/beam_post_hl7_messages_to_hcapi.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "zQ_JXPR3RoFV"
+ },
+ "outputs": [],
+ "source": [
+ "# @title ###### Licensed to the Apache Software Foundation (ASF), Version 2.0 (the \"License\")\n",
+ "\n",
+ "# Licensed to the Apache Software Foundation (ASF) under one\n",
+ "# or more contributor license agreements. See the NOTICE file\n",
+ "# distributed with this work for additional information\n",
+ "# regarding copyright ownership. The ASF licenses this file\n",
+ "# to you under the Apache License, Version 2.0 (the\n",
+ "# \"License\"); you may not use this file except in compliance\n",
+ "# with the License. You may obtain a copy of the License at\n",
+ "#\n",
+ "# http://www.apache.org/licenses/LICENSE-2.0\n",
+ "#\n",
+ "# Unless required by applicable law or agreed to in writing,\n",
+ "# software distributed under the License is distributed on an\n",
+ "# \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY\n",
+ "# KIND, either express or implied. See the License for the\n",
+ "# specific language governing permissions and limitations\n",
+ "# under the License\n",
+ "\n",
+ "##################################\n",
+ "# Author: Devansh Modi #\n",
+ "##################################\n"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+        "**High-level Architecture**\n",
+ "\n",
+ ""
+ ],
+ "metadata": {
+ "id": "RL1LDp645ogr"
+ }
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+        "# **Post HL7v2 messages to a Google Cloud Healthcare API HL7v2 store pipeline**\n",
+ "\n",
+        "This example demonstrates how to set up an Apache Beam pipeline that reads an HL7 file from [Google Cloud Storage](https://cloud.google.com/storage) and calls the [Google Cloud Healthcare API HL7v2 store](https://cloud.google.com/healthcare-api/docs/how-tos/hl7v2-messages) to store HL7 messages. You can use this pipeline in contexts such as reading raw HL7 messages, parsing or modifying them as required by your HL7v2 store configuration, and storing the data in the HL7v2 store.\n",
+ "\n",
+ "An Apache Beam pipeline is a pipeline that reads input data, transforms that data, and writes output data. It consists of PTransforms and PCollections. A PCollection represents a distributed data set that your Beam pipeline operates on. A PTransform represents a data processing operation, or a step, in your pipeline. It takes one or more PCollections as input, performs a processing function that you provide on the elements of that PCollection, and produces zero or more output PCollection objects.\n",
+ "\n",
+ "For details about Apache Beam pipelines, including PTransforms and PCollections, visit the [Beam Programming Guide](https://beam.apache.org/documentation/programming-guide/).\n",
+ "\n",
+ "You'll be able to use this notebook to explore the data in each PCollection."
+ ],
+ "metadata": {
+ "id": "wC9KRrlORwKu"
+ }
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "**What is an HL7v2 message?**\n",
+ "\n",
+        "HL7 messages are used to transfer electronic data between disparate healthcare systems, each sending information about a particular event such as a patient admission.\n",
+ "\n",
+ "An HL7 message consists of one or more segments. Each segment is displayed on a different line of text. A carriage return character (\\r, which is 0D in hexadecimal) separates one segment from another.\n",
+ "\n",
+ "Each segment consists of one or more composites, also known as fields. A pipe (|) character is used to separate one composite from another. If a composite contains other composites, these sub-composites (or sub-fields) are normally separated by caret (^) characters.\n",
+ "\n"
+ ],
+ "metadata": {
+ "id": "AOVYgtyaqSxa"
+ }
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "***Sample HL7v2 Message***\n",
+ "\n",
+        "The reference message below shows a sample HL7v2 message with segments separated by \\r.\n",
+ "\n",
+ "**MSH|^~\\&|FROM_APP|FROM_FACILITY|TO_APP|TO_FACILITY|20150503223000||ADT^A01|20150503223000|P|2.5|\\r\n",
+ "EVN|A01|20110613083617|\\r\n",
+ "PID|1||21004053^^^^MRN||SULLY^BRIAN||19611209|M|||123 MAIN ST^^MOUNTAIN SPRINGS^CO^80439|\\r\n",
+ "PV1||I|H73 RM1^1^^HIGHWAY 73 CLINIC||||5148^MARY QUINN|||||||||Y||||||||||||||||||||||||||||20150503223000|**\n",
+ "\n",
+        "The file contains many such messages. The objective of this code is to split the file into individual messages and POST each one to the Google Cloud Healthcare API HL7v2 store."
+ ],
+ "metadata": {
+ "id": "-lpbvwHmX1L5"
+ }
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+        "Let's install the necessary packages:"
+ ],
+ "metadata": {
+ "id": "81wCK9XnS6Sc"
+ }
+ },
+ {
+ "cell_type": "code",
+ "source": [
+ "!pip install apache-beam[gcp]"
+ ],
+ "metadata": {
+ "id": "Yv1phmRZS23c"
+ },
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "**Google Cloud Authentication**\n",
+ "\n",
+        "As we are using Google Cloud Storage and the Healthcare API, we need authentication tokens to keep the connection secure.\n",
+ "\n",
+ "Click [this](https://cloud.google.com/free) link to create a new Google Cloud Platform account\n"
+ ],
+ "metadata": {
+ "id": "3EcdPBczYQlB"
+ }
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "**GCP Setup**\n",
+        "1. Authenticate your notebook by running `gcloud auth application-default login` in the Colab terminal.\n",
+ "\n",
+ "2. Run `gcloud config set project <YOUR-PROJECT>`\n",
+ "\n",
+ "Set the variables in the next cell based upon your project and preferences.\n",
+ "\n",
+ "Note that below, **us-central1** is hardcoded as the location. This is because of the limited number of [locations](https://cloud.google.com/healthcare-api/docs/how-tos/hl7v2-messages) the API currently supports."
+ ],
+ "metadata": {
+ "id": "tpePe_yOsdSJ"
+ }
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+        "Before running, please set the following variables as arguments, as shown below.\n"
+ ],
+ "metadata": {
+ "id": "_1Q3mw1usnoE"
+ }
+ },
+ {
+ "cell_type": "code",
+ "source": [
+ "args = {'gcp_project':'xxx', #GCP project ID\n",
+ " 'gcp_region':'xxx', # GCP project region\n",
+        "        'temp_location':'gs://<YOUR Bucket>/tmp', #GCS location for temporary pipeline files\n",
+ " 'input_file':'gs://<YOUR Bucket>/my_message.hl7', #input location where your HL7 messages are stored in GCS bucket\n",
+ " 'hcapi_project_id':'xxxxxx', #healthcare API project ID\n",
+ " 'hcapi_dataset':'xxxx', #healthcare dataset\n",
+        "        'hcapi_version':'v1', #healthcare API version, by default v1\n",
+ " 'hcapi_location':'xxxx', #healthcare API configured location\n",
+ " 'hcapi_hl7_store':'xxx', #healthcare api hl7 store\n",
+ " 'hcapi_fhir_store':''}"
+ ],
+ "metadata": {
+ "id": "a722GbqdvgOX"
+ },
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "**Google Cloud Healthcare (HCAPI) API Utils class**\n",
+ "\n",
+        "The code snippet below defines a class that encapsulates the Healthcare API connections and configuration. Its basic functionality includes constructing the HCAPI URL from the input parameters, cleaning the HL7 message into the proper format, and posting HL7v2 messages to the HL7v2 store. You can add more transformations as per your requirements."
+ ],
+ "metadata": {
+ "id": "NHzk8JIqxQoa"
+ }
+ },
+ {
+ "cell_type": "code",
+ "source": [
+ "import google.auth\n",
+ "import google.auth.transport.requests\n",
+ "import base64\n",
+ "import json\n",
+ "import hashlib\n",
+ "import requests\n",
+ "import logging\n",
+ "import apache_beam as beam\n",
+ "from apache_beam.options.pipeline_options import PipelineOptions\n",
+ "from apache_beam.options.pipeline_options import SetupOptions\n",
+ "from apache_beam.testing.test_pipeline import TestPipeline\n",
+ "import apache_beam.runners.interactive.interactive_beam as ib\n",
+ "from apache_beam import io\n",
+ "\n",
+ "logging.basicConfig(level=logging.INFO, format='%(asctime)s :: %(levelname)s :: %(message)s')\n",
+ "\n",
+ "class hcapi_cls:\n",
+ "\n",
+ " def __init__(self, args):\n",
+ " self.hcapi_hl7_store = str(args['hcapi_hl7_store'])\n",
+ " self.hcapi_project_id = str(args['hcapi_project_id'])\n",
+ " self.hcapi_version = str(args['hcapi_version'])\n",
+ " self.hcapi_location = str(args['hcapi_location'])\n",
+ " self.hcapi_dataset = str(args['hcapi_dataset'])\n",
+ " self.hcapi_fhir_store = str(args['hcapi_fhir_store'])\n",
+ " self.token = None\n",
+ "\n",
+ " def google_api_headers(self):\n",
+ " \"\"\" Function gets the token for the request \"\"\"\n",
+ " logging.info(\"fetching token and refreshing credentials\")\n",
+ " creds, project = google.auth.default()\n",
+ " auth_req = google.auth.transport.requests.Request()\n",
+ " creds.refresh(auth_req)\n",
+ " return {\n",
+ " \"Authorization\": f\"Bearer {creds.token}\",\n",
+ " \"Prefer\": \"handling=strict\"\n",
+ " }\n",
+ "\n",
+ " def hcapi_dataset_url(self, version=None, project=None, location=None, dataset=None):\n",
+ " \"\"\" This function creates base hcapi dataset url and returns it \"\"\"\n",
+ " base = 'https://healthcare.googleapis.com'\n",
+ " version = self.hcapi_version\n",
+ " project = self.hcapi_project_id\n",
+ " location = self.hcapi_location\n",
+ " dataset = self.hcapi_dataset\n",
+ " return f'{base}/{version}/projects/{project}/locations/{location}/datasets/{dataset}'\n",
+ "\n",
+ " def hcapi_get(self, url):\n",
+ " \"\"\" Function to send get request to HCAPI \"\"\"\n",
+ " response = requests.get(url, headers=self.google_api_headers())\n",
+ " if not response.ok:\n",
+ " raise Exception(f'Error with HC API get:\\n{response.text}')\n",
+ " return response.json()\n",
+ "\n",
+ " def hcapi_post(self, url, data):\n",
+ " \"\"\" Function to send post request to HCAPI \"\"\"\n",
+ " response = requests.post(url, headers=self.google_api_headers(), json=data)\n",
+ " if not response.ok:\n",
+ " raise Exception(f'Error with HC API post:\\n{response.text}')\n",
+ " return response.json()\n",
+ "\n",
+ " def hcapi_delete(self, url):\n",
+ " \"\"\" Function to send delete request to HCAPI \"\"\"\n",
+ " response = requests.delete(url, headers=self.google_api_headers())\n",
+ " if not response.ok:\n",
+        "            raise Exception(f'Error with HC API delete:\\n{response.text}')\n",
+ " return response.json()\n",
+ "\n",
+ " def hcapi_hl7_url(self, version=None, project=None, location=None, dataset=None, store=None):\n",
+ " \"\"\" This function creates hcapi hl7V2store url and returns the url \"\"\"\n",
+ " base_url = self.hcapi_dataset_url(version=version, project=project,\n",
+ " location=location, dataset=dataset)\n",
+ " hl7_store = self.hcapi_hl7_store\n",
+ " return f'{base_url}/hl7V2Stores/{hl7_store}'\n",
+ "\n",
+ " def get_hl7_message(self, message_id):\n",
+ " \"\"\" Function to get message from HL7v2 store using HCAPI URL \"\"\"\n",
+ " url = f'{self.hcapi_hl7_url()}/messages/{message_id}'\n",
+ " return self.hcapi_get(url)\n",
+ "\n",
+ " def post_hl7_message(self, payload):\n",
+ " \"\"\" Function to post messages to HL7v2 store \"\"\"\n",
+ " url = f'{self.hcapi_hl7_url()}/messages'\n",
+ " return self.hcapi_post(url, payload)\n",
+ "\n",
+ " def message_to_hl7_store(self, message):\n",
+        "        \"\"\" Function to clean up HL7 messages with \\r separator before posting to HCAPI \"\"\"\n",
+        "        message = str(message)\n",
+ " message = message.replace('\\n', '\\r')\n",
+ " message = message.replace('\\\\r', '\\r')\n",
+ " message = message.replace('\\r\\r', '\\r')\n",
+ " encoded = base64.b64encode(str(message).encode())\n",
+ " payload = {\n",
+ " \"message\": {\n",
+ " \"data\": encoded.decode()\n",
+ " }\n",
+ " }\n",
+ " return self.post_hl7_message(payload)\n",
+ "\n",
+ " def hcapi_fhir_url(self, version=None, project=None, location=None, dataset=None, store=None):\n",
+ " \"\"\" This function creates hcapi fhir store url and returns it \"\"\"\n",
+ " base_url = self.hcapi_dataset_url(version=version, project=project,\n",
+ " location=location, dataset=dataset)\n",
+ " if store is None:\n",
+ " raise Exception('No FHIR store specified')\n",
+ " return f'{base_url}/fhirStores/{store}/fhir'\n",
+ "\n",
+ " def hcapi_fhir_request(self, store_key, query, data={}, method='GET'):\n",
+ " \"\"\" Function to send post request to HCAPI FHIR store \"\"\"\n",
+ " store = self.hcapi_fhir_store\n",
+ " if not store:\n",
+        "            raise Exception(f\"Couldn't find FHIR store named {store_key} in config\")\n",
+ " url = self.hcapi_fhir_url(store=store)\n",
+ " url = f'{url}/{query}' if query else url\n",
+ " get = lambda q, d: self.hcapi_get(url)\n",
+ " post = lambda q, d: self.hcapi_post(url, data)\n",
+ " delete = lambda q, d: self.hcapi_delete(url)\n",
+ " return {'GET': get, 'POST': post, 'DELETE' : delete}[method](query, data)\n",
+ "\n"
+ ],
+ "metadata": {
+ "id": "H7g4_-rGS9P_"
+ },
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "**Pipeline Setup**\n",
+ "\n",
+ "We will use InteractiveRunner in this notebook.\n",
+        "The following are the DoFn classes that carry out their respective operations."
+ ],
+ "metadata": {
+ "id": "lXnzAtbHyUd2"
+ }
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+        "The following class **BuildFileName** takes the file name from the element and converts it into a string. You can enhance this class to construct the GCS bucket URL if your GCS bucket prefix remains constant."
+ ],
+ "metadata": {
+ "id": "TKnL8kxh3Kms"
+ }
+ },
+ {
+ "cell_type": "code",
+ "source": [
+ "class BuildFileName(beam.DoFn):\n",
+        "    \"\"\" Class to get the file name from the element and return it as a string \"\"\"\n",
+ " def process(self, element):\n",
+ " logging.info(\"processing the following file: {}\".format(element))\n",
+ " file_path = str(element)\n",
+ " yield file_path"
+ ],
+ "metadata": {
+ "id": "N01E3dQd3Jr3"
+ },
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+        "The following class **BuildMessages** takes the GCS URL from the previous step, reads the file, separates out each message, appends the messages to a list, and returns the list for the next class."
+ ],
+ "metadata": {
+ "id": "Jej68R8w3i2Z"
+ }
+ },
+ {
+ "cell_type": "code",
+ "source": [
+ "class BuildMessages(beam.DoFn):\n",
+        "    \"\"\" Class to read the file, clean it, and separate messages based on MSH \"\"\"\n",
+ " def process(self, file_name):\n",
+ " try:\n",
+ " logging.info(\"starting to read file: {}\".format(file_name))\n",
+ " file = io.gcsio.GcsIO().open(filename=file_name, mode='r')\n",
+ " read_file = file.read()\n",
+ " new_file = str(read_file, encoding='utf-8').replace('\\n', '\\r')\n",
+        "            logging.info(\"starting to separate HL7 messages into list\")\n",
+        "            messages = []\n",
+        "            for line in new_file.split('\\r'):\n",
+        "                if line[:3] == 'MSH':\n",
+        "                    messages.append(line)\n",
+        "                else:\n",
+        "                    messages[-1] += line\n",
+ "\n",
+ "\n",
+ " logging.info(\"total number of messages parsed are {}\".format(len(messages)))\n",
+ " return messages\n",
+ " except Exception as error:\n",
+ " logging.error(\"got the following error while processing : {}\".format('\\n'+str(error)))\n",
+ "            raise\n",
+ "\n",
+ "\n"
+ ],
+ "metadata": {
+ "id": "MC6tr_sGyNKG"
+ },
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "The following class **PostToHL7V2Store** takes the messages returned by the earlier class and POSTs each message to the HL7v2 store."
+ ],
+ "metadata": {
+ "id": "1hpuoUGA33jo"
+ }
+ },
+ {
+ "cell_type": "code",
+ "source": [
+ "class PostToHL7V2Store(beam.DoFn):\n",
+ "    \"\"\" Class to POST each HL7 message to the HL7v2 store \"\"\"\n",
+ " def process(self, element):\n",
+ " try:\n",
+ " logging.info(\"starting to prepare and post message\")\n",
+ " hl7v2_store_response = hcapi.message_to_hl7_store(element)\n",
+ " message_id = hl7v2_store_response['name'].split(\"/\")[-1]\n",
+ " logging.info(\"successfully posted message to Hl7v2 store with message id :- {}\".format(message_id))\n",
+ "\n",
+ " yield message_id\n",
+ " except Exception as error:\n",
+ " logging.error(\"got the following error while processing : {}\".format(error))\n",
+ "            raise"
+ ],
+ "metadata": {
+ "id": "lVjqYfb2330k"
+ },
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "The following function sets up a Beam pipeline, along with its pipeline options, that extracts messages from the HL7 text file and posts each HL7 message to the HL7v2 store using Google Cloud Healthcare API (HCAPI) methods.\n",
+ "\n",
+ "**\"|\"** is an overloaded operator that applies a PTransform to a PCollection to produce a new PCollection. Together with |, >> allows you to optionally name a PTransform.\n",
+ "\n",
+ "Usage: [PCollection] | [PTransform], **or** [PCollection] | [name] >> [PTransform]"
+ ],
+ "metadata": {
+ "id": "g5oJgXCk4O1a"
+ }
+ },
+ {
+ "cell_type": "code",
+ "source": [
+ "\n",
+ "import apache_beam.runners.interactive.interactive_beam as ib\n",
+ "def run(beam_args,argv=None,save_main_session=True):\n",
+ " runnertype = \"InteractiveRunner\"\n",
+ " project=beam_args['gcp_project']\n",
+ " region=beam_args['gcp_region']\n",
+ " temp_location=beam_args['temp_location']\n",
+ "\n",
+ " options = PipelineOptions(\n",
+ " flags=argv,\n",
+ " runner=runnertype,\n",
+ " project=project,\n",
+ " job_name=\"my-beam-hl7to-hcapi\",\n",
+ " temp_location=temp_location,\n",
+ " region=region)\n",
+ "    options.view_as(SetupOptions).save_main_session = save_main_session\n",
+ "    with beam.Pipeline(options=options) as pipeline:\n",
+ " file = (\n",
+ " pipeline\n",
+ " | 'reading filename' >> beam.Create([args_dict['input_file']])\n",
+ " | 'preparing file path' >> beam.ParDo(BuildFileName())\n",
+ " )\n",
+ " hl7_messages=(\n",
+ " file\n",
+ " | 'parsing hl7 messages' >> beam.ParDo(BuildMessages())\n",
+ " )\n",
+ " post_hl7_messages = (\n",
+ " hl7_messages\n",
+ " | \"posting to hl7v2 Store\" >> beam.ParDo(PostToHL7V2Store())\n",
+ " )\n",
+ "\n",
+ "\n",
+ " ib.show_graph(pipeline)\n",
+ "\n",
+ "\n",
+ "if __name__ == \"__main__\":\n",
+ " logging.getLogger().setLevel(logging.INFO)\n",
+ " args_dict = dict(args)\n",
+ " hcapi= hcapi_cls(args_dict)\n",
+ " run(beam_args=args_dict)"
+ ],
+ "metadata": {
+ "id": "Dynn2PDuyRBT"
+ },
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ ""
+ ],
+ "metadata": {
+ "id": "tweQCiuX5RVK"
+ }
+ }
+ ]
+}
\ No newline at end of file
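The notebook's **BuildMessages** DoFn splits raw HL7 text on `MSH` segment headers. A minimal stand-alone sketch of that splitting logic (plain Python, no Beam; the sample HL7 text is illustrative, and unlike the notebook's version this sketch restores a `\r` separator between segments and guards against leading non-MSH lines):

```python
def split_hl7_messages(text):
    """Split raw HL7 text into messages, each beginning with an MSH segment."""
    messages = []
    # HL7 uses '\r' as the segment separator; normalize '\n' first.
    for line in text.replace('\n', '\r').split('\r'):
        if line[:3] == 'MSH':
            # A new MSH segment starts a new message.
            messages.append(line)
        elif line and messages:
            # Continuation segment: re-attach to the current message.
            messages[-1] += '\r' + line
    return messages


raw = "MSH|^~\\&|SENDER|A\rPID|1|X\nMSH|^~\\&|SENDER|B\rPID|2|Y"
msgs = split_hl7_messages(raw)
print(len(msgs))  # 2
```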
diff --git a/sdks/java/build-tools/src/main/resources/beam/checkstyle/suppressions.xml b/sdks/java/build-tools/src/main/resources/beam/checkstyle/suppressions.xml
index 7037f05..3f05250 100644
--- a/sdks/java/build-tools/src/main/resources/beam/checkstyle/suppressions.xml
+++ b/sdks/java/build-tools/src/main/resources/beam/checkstyle/suppressions.xml
@@ -87,6 +87,7 @@
<suppress id="ForbidNonVendoredGrpcProtobuf" files=".*it.*Client\.java" />
<suppress id="ForbidNonVendoredGrpcProtobuf" files=".*it.*LT\.java" />
<suppress id="ForbidNonVendoredGrpcProtobuf" files=".*it.*ResourceManagerTest\.java" />
+ <suppress id="ForbidNonVendoredGrpcProtobuf" files=".*testinfra.*mockapis.*" />
<!-- Flink -->
<!-- Checkstyle does not correctly detect package files across multiple source directories. -->
diff --git a/sdks/java/core/src/main/java/org/apache/beam/sdk/util/HistogramData.java b/sdks/java/core/src/main/java/org/apache/beam/sdk/util/HistogramData.java
index cca3a44..dd2193d 100644
--- a/sdks/java/core/src/main/java/org/apache/beam/sdk/util/HistogramData.java
+++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/util/HistogramData.java
@@ -79,7 +79,7 @@
}
/**
- * Returns a histogram object wiht exponential boundaries. The input parameter {@code scale}
+ * Returns a histogram object with exponential boundaries. The input parameter {@code scale}
 * determines a coefficient 'base' which specifies bucket boundaries.
*
* <pre>
@@ -381,7 +381,7 @@
return getBucketIndexZeroScale(value) >> (-getScale());
}
- // This method is valid for all 'scale' values but we fallback to more effecient methods for
+ // This method is valid for all 'scale' values but we fallback to more efficient methods for
// non-positive scales.
// For a value>base we would like to find an i s.t. :
// base^i <= value < base^(i+1)
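The relation in the comment above — find `i` such that `base^i <= value < base^(i+1)` — can be sketched in Python. This is a hypothetical helper for illustration only, assuming the OpenTelemetry-style convention `base = 2**(2**-scale)`; it is not Beam's actual implementation, which falls back to cheaper bit-shift methods for non-positive scales.

```python
import math

def exponential_bucket_index(value, scale):
    """Return i such that base**i <= value < base**(i+1),
    where base = 2**(2**-scale)."""
    base = 2.0 ** (2.0 ** -scale)
    # log_base(value) lands in [i, i+1); floor gives the bucket index.
    return math.floor(math.log(value, base))

# With scale = -1, base = 4: buckets are [1, 4), [4, 16), [16, 64), ...
print(exponential_bucket_index(5, -1))   # 1, since 4 <= 5 < 16
print(exponential_bucket_index(20, -1))  # 2, since 16 <= 20 < 64
```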
diff --git a/sdks/java/core/src/test/java/org/apache/beam/sdk/util/HistogramDataTest.java b/sdks/java/core/src/test/java/org/apache/beam/sdk/util/HistogramDataTest.java
index bfad087..133bf78 100644
--- a/sdks/java/core/src/test/java/org/apache/beam/sdk/util/HistogramDataTest.java
+++ b/sdks/java/core/src/test/java/org/apache/beam/sdk/util/HistogramDataTest.java
@@ -205,7 +205,7 @@
// The following tests cover exponential buckets.
@Test
- public void testExponentialBuckets_PostiveScaleRecord() {
+ public void testExponentialBuckets_PositiveScaleRecord() {
// Buckets will be:
// Index Range
// Underflow (-inf, 0)
diff --git a/sdks/java/io/rrio/src/main/java/org/apache/beam/io/requestresponse/ThrottleWithoutExternalResource.java b/sdks/java/io/rrio/src/main/java/org/apache/beam/io/requestresponse/ThrottleWithoutExternalResource.java
new file mode 100644
index 0000000..0648a86
--- /dev/null
+++ b/sdks/java/io/rrio/src/main/java/org/apache/beam/io/requestresponse/ThrottleWithoutExternalResource.java
@@ -0,0 +1,57 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.io.requestresponse;
+
+import com.google.auto.value.AutoValue;
+import org.apache.beam.sdk.transforms.PTransform;
+import org.apache.beam.sdk.values.PCollection;
+
+/**
+ * {@link ThrottleWithoutExternalResource} throttles a {@link RequestT} {@link PCollection} emitting
+ * a {@link RequestT} {@link PCollection} at a maximally configured rate, without using an external
+ * resource.
+ */
+// TODO(damondouglas): expand what "without external resource" means with respect to "with external
+// resource" when the other throttle transforms are implemented.
+// See: https://github.com/apache/beam/issues/28932
+class ThrottleWithoutExternalResource<RequestT>
+ extends PTransform<PCollection<RequestT>, PCollection<RequestT>> {
+
+ // TODO(damondouglas): remove suppress warnings when finally utilized in a future PR.
+ @SuppressWarnings({"unused"})
+ private final Configuration<RequestT> configuration;
+
+ private ThrottleWithoutExternalResource(Configuration<RequestT> configuration) {
+ this.configuration = configuration;
+ }
+
+ @Override
+ public PCollection<RequestT> expand(PCollection<RequestT> input) {
+ // TODO(damondouglas): expand in a future PR.
+ return input;
+ }
+
+ @AutoValue
+ abstract static class Configuration<RequestT> {
+
+ @AutoValue.Builder
+ abstract static class Builder<RequestT> {
+ abstract Configuration<RequestT> build();
+ }
+ }
+}
diff --git a/sdks/python/apache_beam/runners/dataflow/internal/names.py b/sdks/python/apache_beam/runners/dataflow/internal/names.py
index 579764a..b39c4a2 100644
--- a/sdks/python/apache_beam/runners/dataflow/internal/names.py
+++ b/sdks/python/apache_beam/runners/dataflow/internal/names.py
@@ -34,6 +34,6 @@
# Unreleased sdks use container image tag specified below.
# Update this tag whenever there is a change that
# requires changes to SDK harness container or SDK harness launcher.
-BEAM_DEV_SDK_CONTAINER_TAG = 'beam-master-20231009'
+BEAM_DEV_SDK_CONTAINER_TAG = 'beam-master-20231023'
DATAFLOW_CONTAINER_IMAGE_REPOSITORY = 'gcr.io/cloud-dataflow/v1beta3'
diff --git a/sdks/python/apache_beam/yaml/yaml_provider.py b/sdks/python/apache_beam/yaml/yaml_provider.py
index 84399cd..33c1638 100644
--- a/sdks/python/apache_beam/yaml/yaml_provider.py
+++ b/sdks/python/apache_beam/yaml/yaml_provider.py
@@ -460,6 +460,13 @@
}
+def element_to_rows(e):
+ if isinstance(e, dict):
+ return dicts_to_rows(e)
+ else:
+ return beam.Row(element=dicts_to_rows(e))
+
+
def dicts_to_rows(o):
if isinstance(o, dict):
return beam.Row(**{k: dicts_to_rows(v) for k, v in o.items()})
@@ -487,7 +494,7 @@
reshuffle (optional): Whether to introduce a reshuffle if there is more
than one element in the collection. Defaults to True.
"""
- return beam.Create(dicts_to_rows(elements), reshuffle)
+ return beam.Create([element_to_rows(e) for e in elements], reshuffle)
def with_schema(**args):
# TODO: This is preliminary.
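The `element_to_rows` helper added above makes `Create` wrap non-dict elements in a single-field `Row` (so downstream transforms refer to `row.element`), while dicts still map to named fields directly. A minimal stand-in sketch of that wrapping logic, using a namedtuple in place of `beam.Row`; only the dict branch of `dicts_to_rows` shown in the diff is reproduced:

```python
from collections import namedtuple

def Row(**kwargs):
    # Stand-in for beam.Row: a namedtuple with the given fields.
    keys = sorted(kwargs)
    return namedtuple('Row', keys)(*(kwargs[k] for k in keys))

def dicts_to_rows(o):
    # Dicts become Rows with one field per key, recursively.
    if isinstance(o, dict):
        return Row(**{k: dicts_to_rows(v) for k, v in o.items()})
    return o

def element_to_rows(e):
    # Dicts map to named fields; anything else is wrapped in a
    # single-field Row so downstream code can use row.element.
    if isinstance(e, dict):
        return dicts_to_rows(e)
    return Row(element=dicts_to_rows(e))

print(element_to_rows(3).element)   # 3
print(element_to_rows({'a': 1}).a)  # 1
```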
diff --git a/sdks/python/apache_beam/yaml/yaml_transform_test.py b/sdks/python/apache_beam/yaml/yaml_transform_test.py
index 05bbf41..ce60857 100644
--- a/sdks/python/apache_beam/yaml/yaml_transform_test.py
+++ b/sdks/python/apache_beam/yaml/yaml_transform_test.py
@@ -64,11 +64,11 @@
self._error_handling = error_handling
def expand(self, pcoll):
- def raise_on_big(element):
- if len(element) > self._limit:
- raise ValueError(element)
+ def raise_on_big(row):
+ if len(row.element) > self._limit:
+ raise ValueError(row.element)
else:
- return element
+ return row.element
good, bad = pcoll | beam.Map(raise_on_big).with_exception_handling()
return {'small_elements': good, self._error_handling['output']: bad}
@@ -211,7 +211,7 @@
- type: PyMap
input: [CreateBig, CreateSmall]
config:
- fn: "lambda x: x * x"
+ fn: "lambda x: x.element * x.element"
output: PyMap
''',
providers=TEST_PROVIDERS)
@@ -273,7 +273,7 @@
- type: PyMap
name: PyMap
config:
- fn: "lambda elem: elem * elem"
+ fn: "lambda row: row.element * row.element"
input: Create
output: PyMap
''',
@@ -431,11 +431,14 @@
- type: Create
config:
elements: [0, 1, 2, 4]
- - type: PyMap
+ - type: MapToFields
name: ToRow
input: Create
config:
- fn: "lambda x: beam.Row(num=x, str='a' * x or 'bbb')"
+ language: python
+ fields:
+ num: element
+ str: "'a' * element or 'bbb'"
- type: Filter
input: ToRow
config:
@@ -595,7 +598,8 @@
"""
def __init__(self, name, transform_names):
super().__init__({
- transform_name: lambda: beam.Map(lambda x: (x or ()) + (name, ))
+ transform_name:
+ lambda: beam.Map(lambda x: (x if type(x) == tuple else ()) + (name, ))
for transform_name in transform_names.strip().split()
})
self._name = name
@@ -728,7 +732,7 @@
def expand(self, pcoll):
a = self._a
b = self._b
- return pcoll | beam.Map(lambda x: a * x + b)
+ return pcoll | beam.Map(lambda x: a * x.element + b)
if __name__ == '__main__':
diff --git a/settings.gradle.kts b/settings.gradle.kts
index e8e374e..de83c11 100644
--- a/settings.gradle.kts
+++ b/settings.gradle.kts
@@ -326,6 +326,8 @@
// no dots allowed for project paths
include("beam-test-infra-metrics")
project(":beam-test-infra-metrics").projectDir = file(".test-infra/metrics")
+include("beam-test-infra-mock-apis")
+project(":beam-test-infra-mock-apis").projectDir = file(".test-infra/mock-apis")
include("beam-test-infra-pipelines")
project(":beam-test-infra-pipelines").projectDir = file(".test-infra/pipelines")
include("beam-test-tools")
diff --git a/website/www/site/content/en/blog/beam-sql-with-notebooks.md b/website/www/site/content/en/blog/beam-sql-with-notebooks.md
index 4f7c428..d7d80f4 100644
--- a/website/www/site/content/en/blog/beam-sql-with-notebooks.md
+++ b/website/www/site/content/en/blog/beam-sql-with-notebooks.md
@@ -420,7 +420,7 @@
import requests
# The covidtracking project has stopped collecting new data, current data ends on 2021-03-07
-json_current='https://covidtracking.com/api/v1/states/current.json'
+json_current='https://api.covidtracking.com/v1/states/current.json'
def get_json_data(url):
with requests.Session() as session: