Merge pull request #9701: [BEAM-7623] UT for BeamSql DDL with map field having row as value

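In addition to the unit test named in the title, this merge carries Jenkins test-infrastructure changes visible in the hunks below: the per-scenario load-test keys are renamed from itClass/jobProperties to test/pipelineOptions, the redundant maxNumWorkers entries are dropped (numWorkers alone remains), the Java Flink runner target moves from :runners:flink:1.5 to :runners:flink:1.8, and the portable Python Flink pre-commit becomes beam_PreCommit_Python2_PVR_Flink alongside a new beam_PostCommit_Python35_VR_Flink post-commit. A minimal Groovy sketch of a scenario in the new shape (the option values below are illustrative, not copied from any single hunk):

    def scenarios = { datasetName -> [
            [
                    title          : 'Load test: 2GB of 10B records',
                    test           : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
                    runner         : CommonTestProperties.Runner.DATAFLOW,
                    pipelineOptions: [
                            project             : 'apache-beam-testing',
                            tempLocation        : 'gs://temp-storage-for-perf-tests/loadtests',
                            bigQueryDataset     : datasetName,
                            numWorkers          : 5,              // maxNumWorkers is no longer set separately
                            autoscalingAlgorithm: 'NONE'
                    ]
            ]
    ]}

    // Each scenario is handed to the shared builder as (scope, title, runner, sdk, pipelineOptions, test), e.g.
    // loadTestsBuilder.loadTest(scope, it.title, it.runner, CommonTestProperties.SDK.JAVA, it.pipelineOptions, it.test)
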
diff --git a/.github/PULL_REQUEST_TEMPLATE.md b/.github/PULL_REQUEST_TEMPLATE.md
index 487d186..0214f0e 100644
--- a/.github/PULL_REQUEST_TEMPLATE.md
+++ b/.github/PULL_REQUEST_TEMPLATE.md
@@ -15,7 +15,7 @@
 --- | --- | --- | --- | --- | --- | --- | ---
 Go | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Go/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Go/lastCompletedBuild/) | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Go_VR_Flink/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Go_VR_Flink/lastCompletedBuild/) | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Go_VR_Spark/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Go_VR_Spark/lastCompletedBuild/)
 Java | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Apex/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Apex/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Flink/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Flink/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Flink_Batch/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Flink_Batch/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Flink_Streaming/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Flink_Streaming/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Gearpump/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Gearpump/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Samza/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Samza/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/lastCompletedBuild/)
-Python | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Python2/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python2/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python35/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python35/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python36/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python36/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python37/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python37/lastCompletedBuild/) | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Py_ValCont/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_ValCont/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Python_PVR_Flink_Cron/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Python_PVR_Flink_Cron/lastCompletedBuild/) | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/lastCompletedBuild/)
+Python | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Python2/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python2/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python35/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python35/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python36/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python36/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python37/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python37/lastCompletedBuild/) | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Py_ValCont/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_ValCont/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/lastCompletedBuild/) | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/lastCompletedBuild/)
 XLang | --- | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_XVR_Flink/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_XVR_Flink/lastCompletedBuild/) | --- | --- | ---
 
 Pre-Commit Tests Status (on master branch)
diff --git a/.test-infra/jenkins/CommonTestProperties.groovy b/.test-infra/jenkins/CommonTestProperties.groovy
index 14dd7b0..203e398 100644
--- a/.test-infra/jenkins/CommonTestProperties.groovy
+++ b/.test-infra/jenkins/CommonTestProperties.groovy
@@ -35,7 +35,7 @@
                 JAVA: [
                         DATAFLOW: ":runners:google-cloud-dataflow-java",
                         SPARK: ":runners:spark",
-                        FLINK: ":runners:flink:1.5",
+                        FLINK: ":runners:flink:1.8",
                         DIRECT: ":runners:direct-java"
                 ],
                 PYTHON: [
diff --git a/.test-infra/jenkins/LoadTestsBuilder.groovy b/.test-infra/jenkins/LoadTestsBuilder.groovy
index f85011d..c259033 100644
--- a/.test-infra/jenkins/LoadTestsBuilder.groovy
+++ b/.test-infra/jenkins/LoadTestsBuilder.groovy
@@ -30,7 +30,7 @@
     commonJobProperties.setTopLevelMainJobProperties(scope, 'master', 240)
 
     for (testConfiguration in testConfigurations) {
-        loadTest(scope, testConfiguration.title, testConfiguration.runner, sdk, testConfiguration.jobProperties, testConfiguration.itClass)
+        loadTest(scope, testConfiguration.title, testConfiguration.runner, sdk, testConfiguration.pipelineOptions, testConfiguration.test)
     }
   }
 
diff --git a/.test-infra/jenkins/README.md b/.test-infra/jenkins/README.md
index b450f2f..db54b79 100644
--- a/.test-infra/jenkins/README.md
+++ b/.test-infra/jenkins/README.md
@@ -36,7 +36,7 @@
 | beam_PreCommit_Java_Examples_Dataflow | [commit](https://builds.apache.org/job/beam_PreCommit_Java_Examples_Dataflow_Commit/), [cron](https://builds.apache.org/job/beam_PreCommit_Java_Examples_Dataflow_Cron/), [phrase](https://builds.apache.org/job/beam_PreCommit_Java_Examples_Dataflow_Phrase/) | `Run Java PreCommit` | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Java_Examples_Dataflow_Cron/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Java_Examples_Dataflow_Cron) |
 | beam_PreCommit_Portable_Python | [commit](https://builds.apache.org/job/beam_PreCommit_Portable_Python_Commit/), [cron](https://builds.apache.org/job/beam_PreCommit_Portable_Python_Cron/), [phrase](https://builds.apache.org/job/beam_PreCommit_Portable_Python_Phrase/) | `Run Portable PreCommit` | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Portable_Python_Cron/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Portable_Python_Cron) |
 | beam_PreCommit_Python | [commit](https://builds.apache.org/job/beam_PreCommit_Python_Commit/), [cron](https://builds.apache.org/job/beam_PreCommit_Python_Cron/), [phrase](https://builds.apache.org/job/beam_PreCommit_Python_Phrase/) | `Run Python PreCommit` | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Python_Cron/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Python_Cron) |
-| beam_PreCommit_Python_PVR_Flink | [commit](https://builds.apache.org/job/beam_PreCommit_Python_PVR_Flink_Commit/), [cron](https://builds.apache.org/job/beam_PreCommit_Python_PVR_Flink_Cron/), [phrase](https://builds.apache.org/job/beam_PreCommit_Python_PVR_Flink_Phrase/) | `Run Python PreCommit` | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Python_PVR_Flink_Cron/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Python_PVR_Flink_Cron) |
+| beam_PreCommit_Python2_PVR_Flink | [commit](https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Commit/), [cron](https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/), [phrase](https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Phrase/) | `Run Python PreCommit` | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron) |
 | beam_PreCommit_RAT | [commit](https://builds.apache.org/job/beam_PreCommit_RAT_Commit/), [cron](https://builds.apache.org/job/beam_PreCommit_RAT_Cron/), [phrase](https://builds.apache.org/job/beam_PreCommit_RAT_Phrase/) | `Run RAT PreCommit` | [![Build Status](https://builds.apache.org/job/beam_PreCommit_RAT_Cron/badge/icon)](https://builds.apache.org/job/beam_PreCommit_RAT_Cron) |
 | beam_PreCommit_Spotless | [commit](https://builds.apache.org/job/beam_PreCommit_Spotless_Commit/), [cron](https://builds.apache.org/job/beam_PreCommit_Spotless_Cron/), [phrase](https://builds.apache.org/job/beam_PreCommit_Spotless_Phrase/) | `Run Spotless PreCommit` | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Spotless_Cron/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Spotless_Cron) |
 | beam_PreCommit_Website | [commit](https://builds.apache.org/job/beam_PreCommit_Website_Commit/), [cron](https://builds.apache.org/job/beam_PreCommit_Website_Cron/), [phrase](https://builds.apache.org/job/beam_PreCommit_Website_Phrase/) | `Run Website PreCommit` | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Website_Cron/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Website_Cron) |
@@ -70,6 +70,7 @@
 | beam_PostCommit_Java_ValidatesRunner_Spark | [cron](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark/), [phrase](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_PR/) | `Run Spark ValidatesRunner` | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark) |
 | beam_PostCommit_Py_VR_Dataflow | [cron](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/), [phrase](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_PR/) | `Run Python Dataflow ValidatesRunner` | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow) |
 | beam_PostCommit_Py_ValCont | [cron](https://builds.apache.org/job/beam_PostCommit_Py_ValCont/), [phrase](https://builds.apache.org/job/beam_PostCommit_Py_ValCont_PR/) | `Run Python Dataflow ValidatesContainer` | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Py_ValCont/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_ValCont) |
+| beam_PostCommit_Python35_VR_Flink | [cron](https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/), [phrase](https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/) | `Run Python 3.5 Flink ValidatesRunner` | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink) |
 | beam_PostCommit_Python_VR_Spark | [cron](https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/), [phrase](https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/) | `Run Python Spark ValidatesRunner` | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark) |
 | beam_PostCommit_Python2  | [cron](https://builds.apache.org/job/beam_PostCommit_Python2), [phrase](https://builds.apache.org/job/beam_PostCommit_Python2_PR/) | `Run Python 2 PostCommit` | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Python2/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python2) |
 | beam_PostCommit_Python35 | [cron](https://builds.apache.org/job/beam_PostCommit_Python35), [phrase](https://builds.apache.org/job/beam_PostCommit_Python35_PR/) | `Run Python 3.5 PostCommit` | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Python35/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python35) |
diff --git a/.test-infra/jenkins/job_LoadTests_CoGBK_Java.groovy b/.test-infra/jenkins/job_LoadTests_CoGBK_Java.groovy
index fc5c6ac..9c5a535 100644
--- a/.test-infra/jenkins/job_LoadTests_CoGBK_Java.groovy
+++ b/.test-infra/jenkins/job_LoadTests_CoGBK_Java.groovy
@@ -25,10 +25,10 @@
 def loadTestConfigurations = { mode, isStreaming, datasetName ->
     [
             [
-                    title        : 'Load test: CoGBK 2GB 100  byte records - single key',
-                    itClass      : 'org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest',
-                    runner       : CommonTestProperties.Runner.DATAFLOW,
-                    jobProperties: [
+                    title          : 'Load test: CoGBK 2GB 100  byte records - single key',
+                    test           : 'org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest',
+                    runner         : CommonTestProperties.Runner.DATAFLOW,
+                    pipelineOptions: [
                             project               : 'apache-beam-testing',
                             appName               : "load_tests_Java_Dataflow_${mode}_CoGBK_1",
                             tempLocation          : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -52,17 +52,16 @@
                                             }
                                         """.trim().replaceAll("\\s", ""),
                             iterations            : 1,
-                            maxNumWorkers         : 5,
                             numWorkers            : 5,
                             autoscalingAlgorithm  : "NONE",
                             streaming             : isStreaming
                     ]
             ],
             [
-                    title        : 'Load test: CoGBK 2GB 100 byte records - multiple keys',
-                    itClass      : 'org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest',
-                    runner       : CommonTestProperties.Runner.DATAFLOW,
-                    jobProperties: [
+                    title          : 'Load test: CoGBK 2GB 100 byte records - multiple keys',
+                    test           : 'org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest',
+                    runner         : CommonTestProperties.Runner.DATAFLOW,
+                    pipelineOptions: [
                             project               : 'apache-beam-testing',
                             appName               : "load_tests_Java_Dataflow_${mode}_CoGBK_2",
                             tempLocation          : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -86,7 +85,6 @@
                                             }
                                         """.trim().replaceAll("\\s", ""),
                             iterations            : 1,
-                            maxNumWorkers         : 5,
                             numWorkers            : 5,
                             autoscalingAlgorithm  : "NONE",
                             streaming             : isStreaming
@@ -94,10 +92,10 @@
             ],
             [
 
-                    title        : 'Load test: CoGBK 2GB reiteration 10kB value',
-                    itClass      : 'org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest',
-                    runner       : CommonTestProperties.Runner.DATAFLOW,
-                    jobProperties: [
+                    title          : 'Load test: CoGBK 2GB reiteration 10kB value',
+                    test           : 'org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest',
+                    runner         : CommonTestProperties.Runner.DATAFLOW,
+                    pipelineOptions: [
                             project               : 'apache-beam-testing',
                             appName               : "load_tests_Java_Dataflow_${mode}_CoGBK_3",
                             tempLocation          : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -121,7 +119,6 @@
                                             }
                                         """.trim().replaceAll("\\s", ""),
                             iterations            : 4,
-                            maxNumWorkers         : 5,
                             numWorkers            : 5,
                             autoscalingAlgorithm  : "NONE",
                             streaming             : isStreaming
@@ -129,10 +126,10 @@
 
             ],
             [
-                    title        : 'Load test: CoGBK 2GB reiteration 2MB value',
-                    itClass      : 'org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest',
-                    runner       : CommonTestProperties.Runner.DATAFLOW,
-                    jobProperties: [
+                    title          : 'Load test: CoGBK 2GB reiteration 2MB value',
+                    test           : 'org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest',
+                    runner         : CommonTestProperties.Runner.DATAFLOW,
+                    pipelineOptions: [
                             project               : 'apache-beam-testing',
                             appName               : "load_tests_Java_Dataflow_${mode}_CoGBK_4",
                             tempLocation          : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -156,7 +153,6 @@
                                             }
                                         """.trim().replaceAll("\\s", ""),
                             iterations            : 4,
-                            maxNumWorkers         : 5,
                             numWorkers            : 5,
                             autoscalingAlgorithm  : "NONE",
                             streaming             : isStreaming
@@ -171,8 +167,8 @@
 
   def datasetName = loadTestsBuilder.getBigQueryDataset('load_test', triggeringContext)
   for (testConfiguration in loadTestConfigurations('streaming', true, datasetName)) {
-    testConfiguration.jobProperties << [inputWindowDurationSec: 1200, coInputWindowDurationSec: 1200]
-    loadTestsBuilder.loadTest(scope, testConfiguration.title, testConfiguration.runner, CommonTestProperties.SDK.JAVA, testConfiguration.jobProperties, testConfiguration.itClass)
+    testConfiguration.pipelineOptions << [inputWindowDurationSec: 1200, coInputWindowDurationSec: 1200]
+    loadTestsBuilder.loadTest(scope, testConfiguration.title, testConfiguration.runner, CommonTestProperties.SDK.JAVA, testConfiguration.pipelineOptions, testConfiguration.test)
   }
 }
 
diff --git a/.test-infra/jenkins/job_LoadTests_Combine_Flink_Python.groovy b/.test-infra/jenkins/job_LoadTests_Combine_Flink_Python.groovy
index b56ddd4..2f847e7 100644
--- a/.test-infra/jenkins/job_LoadTests_Combine_Flink_Python.groovy
+++ b/.test-infra/jenkins/job_LoadTests_Combine_Flink_Python.groovy
@@ -27,10 +27,10 @@
 
 def scenarios = { datasetName, sdkHarnessImageTag -> [
         [
-                title        : 'Combine Python Load test: 2GB 10 byte records',
-                itClass      : 'apache_beam.testing.load_tests.combine_test:CombineTest.testCombineGlobally',
-                runner       : CommonTestProperties.Runner.PORTABLE,
-                jobProperties: [
+                title          : 'Combine Python Load test: 2GB 10 byte records',
+                test           : 'apache_beam.testing.load_tests.combine_test:CombineTest.testCombineGlobally',
+                runner         : CommonTestProperties.Runner.PORTABLE,
+                pipelineOptions: [
                         job_name            : 'load-tests-python-flink-batch-combine-1-' + now,
                         project             : 'apache-beam-testing',
                         publish_to_big_query: false,
@@ -48,10 +48,10 @@
                 ]
         ],
         [
-                title        : 'Combine Python Load test: 2GB Fanout 4',
-                itClass      : 'apache_beam.testing.load_tests.combine_test:CombineTest.testCombineGlobally',
-                runner       : CommonTestProperties.Runner.PORTABLE,
-                jobProperties: [
+                title          : 'Combine Python Load test: 2GB Fanout 4',
+                test           : 'apache_beam.testing.load_tests.combine_test:CombineTest.testCombineGlobally',
+                runner         : CommonTestProperties.Runner.PORTABLE,
+                pipelineOptions: [
                         job_name            : 'load-tests-python-flink-batch-combine-4-' + now,
                         project             : 'apache-beam-testing',
                         publish_to_big_query: false,
@@ -70,10 +70,10 @@
                 ]
         ],
         [
-                title        : 'Combine Python Load test: 2GB Fanout 8',
-                itClass      : 'apache_beam.testing.load_tests.combine_test:CombineTest.testCombineGlobally',
-                runner       : CommonTestProperties.Runner.PORTABLE,
-                jobProperties: [
+                title          : 'Combine Python Load test: 2GB Fanout 8',
+                test           : 'apache_beam.testing.load_tests.combine_test:CombineTest.testCombineGlobally',
+                runner         : CommonTestProperties.Runner.PORTABLE,
+                pipelineOptions: [
                         job_name            : 'load-tests-python-flink-batch-combine-5-' + now,
                         project             : 'apache-beam-testing',
                         publish_to_big_query: false,
@@ -124,7 +124,7 @@
     return testScenarios
             .findAll { it.title in titles }
             .forEach {
-                loadTestsBuilder.loadTest(scope, it.title, it.runner, CommonTestProperties.SDK.PYTHON, it.jobProperties, it.itClass)
+                loadTestsBuilder.loadTest(scope, it.title, it.runner, CommonTestProperties.SDK.PYTHON, it.pipelineOptions, it.test)
             }
 }
 
diff --git a/.test-infra/jenkins/job_LoadTests_Combine_Java.groovy b/.test-infra/jenkins/job_LoadTests_Combine_Java.groovy
index 40d3a9f..6d35339 100644
--- a/.test-infra/jenkins/job_LoadTests_Combine_Java.groovy
+++ b/.test-infra/jenkins/job_LoadTests_Combine_Java.groovy
@@ -26,10 +26,10 @@
 def commonLoadTestConfig = { jobType, isStreaming, datasetName ->
   [
           [
-                  title        : 'Load test: 2GB of 10B records',
-                  itClass      : 'org.apache.beam.sdk.loadtests.CombineLoadTest',
-                  runner       : CommonTestProperties.Runner.DATAFLOW,
-                  jobProperties: [
+                  title          : 'Load test: 2GB of 10B records',
+                  test           : 'org.apache.beam.sdk.loadtests.CombineLoadTest',
+                  runner         : CommonTestProperties.Runner.DATAFLOW,
+                  pipelineOptions: [
                           project             : 'apache-beam-testing',
                           appName             : "load_tests_Java_Dataflow_${jobType}_Combine_1",
                           tempLocation        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -46,7 +46,6 @@
                           fanout              : 1,
                           iterations          : 1,
                           topCount            : 20,
-                          maxNumWorkers       : 5,
                           numWorkers          : 5,
                           autoscalingAlgorithm: "NONE",
                           perKeyCombiner      : "TOP_LARGEST",
@@ -54,10 +53,10 @@
                   ]
           ],
           [
-                    title        : 'Load test: fanout 4 times with 2GB 10-byte records total',
-                    itClass      : 'org.apache.beam.sdk.loadtests.CombineLoadTest',
-                    runner       : CommonTestProperties.Runner.DATAFLOW,
-                    jobProperties: [
+                    title          : 'Load test: fanout 4 times with 2GB 10-byte records total',
+                    test           : 'org.apache.beam.sdk.loadtests.CombineLoadTest',
+                    runner         : CommonTestProperties.Runner.DATAFLOW,
+                    pipelineOptions: [
                             project             : 'apache-beam-testing',
                             appName             : "load_tests_Java_Dataflow_${jobType}_Combine_4",
                             tempLocation        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -74,7 +73,6 @@
                             fanout              : 4,
                             iterations          : 1,
                             topCount            : 20,
-                            maxNumWorkers       : 16,
                             numWorkers          : 16,
                             autoscalingAlgorithm: "NONE",
                             perKeyCombiner      : "TOP_LARGEST",
@@ -82,10 +80,10 @@
                     ]
             ],
             [
-                    title        : 'Load test: fanout 8 times with 2GB 10-byte records total',
-                    itClass      : 'org.apache.beam.sdk.loadtests.CombineLoadTest',
-                    runner       : CommonTestProperties.Runner.DATAFLOW,
-                    jobProperties: [
+                    title          : 'Load test: fanout 8 times with 2GB 10-byte records total',
+                    test           : 'org.apache.beam.sdk.loadtests.CombineLoadTest',
+                    runner         : CommonTestProperties.Runner.DATAFLOW,
+                    pipelineOptions: [
                             project             : 'apache-beam-testing',
                             appName             : "load_tests_Java_Dataflow_${jobType}_Combine_5",
                             tempLocation        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -102,7 +100,6 @@
                             fanout              : 8,
                             iterations          : 1,
                             topCount            : 20,
-                            maxNumWorkers       : 16,
                             numWorkers          : 16,
                             autoscalingAlgorithm: "NONE",
                             perKeyCombiner      : "TOP_LARGEST",
@@ -124,8 +121,8 @@
 
     def datasetName = loadTestsBuilder.getBigQueryDataset('load_test', triggeringContext)
     for (testConfiguration in commonLoadTestConfig('streaming', true, datasetName)) {
-        testConfiguration.jobProperties << [inputWindowDurationSec: 1200]
-        loadTestsBuilder.loadTest(scope, testConfiguration.title, testConfiguration.runner, CommonTestProperties.SDK.JAVA, testConfiguration.jobProperties, testConfiguration.itClass)
+        testConfiguration.pipelineOptions << [inputWindowDurationSec: 1200]
+        loadTestsBuilder.loadTest(scope, testConfiguration.title, testConfiguration.runner, CommonTestProperties.SDK.JAVA, testConfiguration.pipelineOptions, testConfiguration.test)
     }
 }
 
diff --git a/.test-infra/jenkins/job_LoadTests_Combine_Python.groovy b/.test-infra/jenkins/job_LoadTests_Combine_Python.groovy
index d00196a..cf18a58 100644
--- a/.test-infra/jenkins/job_LoadTests_Combine_Python.groovy
+++ b/.test-infra/jenkins/job_LoadTests_Combine_Python.groovy
@@ -24,11 +24,10 @@
 
 def loadTestConfigurations = { datasetName -> [
         [
-                title        : 'Combine Python Load test: 2GB 10 byte records',
-                itClass      : 'apache_beam.testing.load_tests.combine_test:CombineTest.testCombineGlobally',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                sdk          : CommonTestProperties.SDK.PYTHON,
-                jobProperties: [
+                title          : 'Combine Python Load test: 2GB 10 byte records',
+                test           : 'apache_beam.testing.load_tests.combine_test:CombineTest.testCombineGlobally',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         job_name             : 'load-tests-python-dataflow-batch-combine-1-' + now,
                         project              : 'apache-beam-testing',
                         temp_location        : 'gs://temp-storage-for-perf-tests/smoketests',
@@ -39,18 +38,16 @@
                                 '"num_records": 200000000,' +
                                 '"key_size": 1,' +
                                 '"value_size": 9}\'',
-                        max_num_workers      : 5,
                         num_workers          : 5,
                         autoscaling_algorithm: "NONE",
                         top_count            : 20,
                 ]
         ],
         [
-                title        : 'Combine Python Load test: 2GB Fanout 4',
-                itClass      : 'apache_beam.testing.load_tests.combine_test:CombineTest.testCombineGlobally',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                sdk          : CommonTestProperties.SDK.PYTHON,
-                jobProperties: [
+                title          : 'Combine Python Load test: 2GB Fanout 4',
+                test           : 'apache_beam.testing.load_tests.combine_test:CombineTest.testCombineGlobally',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         job_name             : 'load-tests-python-dataflow-batch-combine-4-' + now,
                         project              : 'apache-beam-testing',
                         temp_location        : 'gs://temp-storage-for-perf-tests/smoketests',
@@ -61,7 +58,6 @@
                                 '"num_records": 5000000,' +
                                 '"key_size": 10,' +
                                 '"value_size": 90}\'',
-                        max_num_workers      : 16,
                         num_workers          : 16,
                         autoscaling_algorithm: "NONE",
                         fanout               : 4,
@@ -69,11 +65,10 @@
                 ]
         ],
         [
-                title        : 'Combine Python Load test: 2GB Fanout 8',
-                itClass      : 'apache_beam.testing.load_tests.combine_test:CombineTest.testCombineGlobally',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                sdk          : CommonTestProperties.SDK.PYTHON,
-                jobProperties: [
+                title          : 'Combine Python Load test: 2GB Fanout 8',
+                test           : 'apache_beam.testing.load_tests.combine_test:CombineTest.testCombineGlobally',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         job_name             : 'load-tests-python-dataflow-batch-combine-5-' + now,
                         project              : 'apache-beam-testing',
                         temp_location        : 'gs://temp-storage-for-perf-tests/smoketests',
@@ -84,7 +79,6 @@
                                 '"num_records": 2500000,' +
                                 '"key_size": 10,' +
                                 '"value_size": 90}\'',
-                        max_num_workers      : 16,
                         num_workers          : 16,
                         autoscaling_algorithm: "NONE",
                         fanout               : 8,
@@ -99,7 +93,7 @@
 
     def datasetName = loadTestsBuilder.getBigQueryDataset('load_test', triggeringContext)
     for (testConfiguration in loadTestConfigurations(datasetName)) {
-        loadTestsBuilder.loadTest(scope, testConfiguration.title, testConfiguration.runner, CommonTestProperties.SDK.PYTHON, testConfiguration.jobProperties, testConfiguration.itClass)
+        loadTestsBuilder.loadTest(scope, testConfiguration.title, testConfiguration.runner, CommonTestProperties.SDK.PYTHON, testConfiguration.pipelineOptions, testConfiguration.test)
     }
 }
 
diff --git a/.test-infra/jenkins/job_LoadTests_GBK_Flink_Python.groovy b/.test-infra/jenkins/job_LoadTests_GBK_Flink_Python.groovy
index 410f3c7..7f3bb21 100644
--- a/.test-infra/jenkins/job_LoadTests_GBK_Flink_Python.groovy
+++ b/.test-infra/jenkins/job_LoadTests_GBK_Flink_Python.groovy
@@ -27,10 +27,10 @@
 
 def scenarios = { datasetName, sdkHarnessImageTag -> [
         [
-                title        : 'Load test: 2GB of 10B records',
-                itClass      : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
-                runner       : CommonTestProperties.Runner.PORTABLE,
-                jobProperties: [
+                title          : 'Load test: 2GB of 10B records',
+                test           : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner         : CommonTestProperties.Runner.PORTABLE,
+                pipelineOptions: [
                         job_name            : "load_tests_Python_Flink_Batch_GBK_1_${now}",
                         publish_to_big_query: false,
                         project             : 'apache-beam-testing',
@@ -46,10 +46,10 @@
                 ]
         ],
         [
-                title        : 'Load test: 2GB of 100B records',
-                itClass      : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
-                runner       : CommonTestProperties.Runner.PORTABLE,
-                jobProperties: [
+                title          : 'Load test: 2GB of 100B records',
+                test           : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner         : CommonTestProperties.Runner.PORTABLE,
+                pipelineOptions: [
                         job_name            : "load_tests_Python_Flink_Batch_GBK_2_${now}",
                         publish_to_big_query: false,
                         project             : 'apache-beam-testing',
@@ -65,10 +65,10 @@
                 ]
         ],
         [
-                title        : 'Load test: 2GB of 100kB records',
-                itClass      : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
-                runner       : CommonTestProperties.Runner.PORTABLE,
-                jobProperties: [
+                title          : 'Load test: 2GB of 100kB records',
+                test           : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner         : CommonTestProperties.Runner.PORTABLE,
+                pipelineOptions: [
                         job_name            : "load_tests_Python_Flink_Batch_GBK_3_${now}",
                         publish_to_big_query: false,
                         project             : 'apache-beam-testing',
@@ -84,10 +84,10 @@
                 ]
         ],
         [
-                title        : 'Load test: fanout 4 times with 2GB 10-byte records total',
-                itClass      : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
-                runner       : CommonTestProperties.Runner.PORTABLE,
-                jobProperties: [
+                title          : 'Load test: fanout 4 times with 2GB 10-byte records total',
+                test           : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner         : CommonTestProperties.Runner.PORTABLE,
+                pipelineOptions: [
                         job_name            : "load_tests_Python_Flink_Batch_GBK_4_${now}",
                         publish_to_big_query: false,
                         project             : 'apache-beam-testing',
@@ -103,10 +103,10 @@
                 ]
         ],
         [
-                title        : 'Load test: fanout 8 times with 2GB 10-byte records total',
-                itClass      : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
-                runner       : CommonTestProperties.Runner.PORTABLE,
-                jobProperties: [
+                title          : 'Load test: fanout 8 times with 2GB 10-byte records total',
+                test           : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner         : CommonTestProperties.Runner.PORTABLE,
+                pipelineOptions: [
                         job_name            : "load_tests_Python_Flink_Batch_GBK_5_${now}",
                         publish_to_big_query: false,
                         project             : 'apache-beam-testing',
@@ -122,10 +122,10 @@
                 ]
         ],
         [
-                title        : 'Load test: reiterate 4 times 10kB values',
-                itClass      : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
-                runner       : CommonTestProperties.Runner.PORTABLE,
-                jobProperties: [
+                title          : 'Load test: reiterate 4 times 10kB values',
+                test           : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner         : CommonTestProperties.Runner.PORTABLE,
+                pipelineOptions: [
                         job_name            : "load_tests_Python_Flink_Batch_GBK_6_${now}",
                         publish_to_big_query: false,
                         project             : 'apache-beam-testing',
@@ -141,10 +141,10 @@
                 ]
         ],
         [
-                title        : 'Load test: reiterate 4 times 2MB values',
-                itClass      : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
-                runner       : CommonTestProperties.Runner.PORTABLE,
-                jobProperties: [
+                title          : 'Load test: reiterate 4 times 2MB values',
+                test           : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner         : CommonTestProperties.Runner.PORTABLE,
+                pipelineOptions: [
                         job_name            : "load_tests_Python_Flink_Batch_GBK_7_${now}",
                         publish_to_big_query: false,
                         project             : 'apache-beam-testing',
@@ -176,13 +176,13 @@
   def flink = new Flink(scope, 'beam_LoadTests_Python_GBK_Flink_Batch')
   flink.setUp([pythonHarnessImageTag], numberOfWorkers, publisher.getFullImageName('flink-job-server'))
 
-  def configurations = testScenarios.findAll { it.jobProperties?.parallelism?.value == numberOfWorkers }
+  def configurations = testScenarios.findAll { it.pipelineOptions?.parallelism?.value == numberOfWorkers }
   loadTestsBuilder.loadTests(scope, sdk, configurations, "GBK", "batch")
 
   numberOfWorkers = 5
   flink.scaleCluster(numberOfWorkers)
 
-  configurations = testScenarios.findAll { it.jobProperties?.parallelism?.value == numberOfWorkers }
+  configurations = testScenarios.findAll { it.pipelineOptions?.parallelism?.value == numberOfWorkers }
   loadTestsBuilder.loadTests(scope, sdk, configurations, "GBK", "batch")
 }
 
diff --git a/.test-infra/jenkins/job_LoadTests_GBK_Java.groovy b/.test-infra/jenkins/job_LoadTests_GBK_Java.groovy
index e925da3..b048c54 100644
--- a/.test-infra/jenkins/job_LoadTests_GBK_Java.groovy
+++ b/.test-infra/jenkins/job_LoadTests_GBK_Java.groovy
@@ -25,10 +25,10 @@
 def loadTestConfigurations = { mode, isStreaming, datasetName ->
     [
             [
-                    title        : 'Load test: 2GB of 10B records',
-                    itClass      : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
-                    runner       : CommonTestProperties.Runner.DATAFLOW,
-                    jobProperties: [
+                    title          : 'Load test: 2GB of 10B records',
+                    test           : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
+                    runner         : CommonTestProperties.Runner.DATAFLOW,
+                    pipelineOptions: [
                             project               : 'apache-beam-testing',
                             appName               : "load_tests_Java_Dataflow_${mode}_GBK_1",
                             tempLocation          : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -44,17 +44,16 @@
                                        """.trim().replaceAll("\\s", ""),
                             fanout                : 1,
                             iterations            : 1,
-                            maxNumWorkers         : 5,
                             numWorkers            : 5,
                             autoscalingAlgorithm  : "NONE",
                             streaming             : isStreaming
                     ]
             ],
             [
-                    title        : 'Load test: 2GB of 100B records',
-                    itClass      : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
-                    runner       : CommonTestProperties.Runner.DATAFLOW,
-                    jobProperties: [
+                    title          : 'Load test: 2GB of 100B records',
+                    test           : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
+                    runner         : CommonTestProperties.Runner.DATAFLOW,
+                    pipelineOptions: [
                             project               : 'apache-beam-testing',
                             appName               : "load_tests_Java_Dataflow_${mode}_GBK_2",
                             tempLocation          : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -70,7 +69,6 @@
                                        """.trim().replaceAll("\\s", ""),
                             fanout                : 1,
                             iterations            : 1,
-                            maxNumWorkers         : 5,
                             numWorkers            : 5,
                             autoscalingAlgorithm  : "NONE",
                             streaming             : isStreaming
@@ -78,10 +76,10 @@
             ],
             [
 
-                    title        : 'Load test: 2GB of 100kB records',
-                    itClass      : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
-                    runner       : CommonTestProperties.Runner.DATAFLOW,
-                    jobProperties: [
+                    title          : 'Load test: 2GB of 100kB records',
+                    test           : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
+                    runner         : CommonTestProperties.Runner.DATAFLOW,
+                    pipelineOptions: [
                             project               : 'apache-beam-testing',
                             appName               : "load_tests_Java_Dataflow_${mode}_GBK_3",
                             tempLocation          : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -97,7 +95,6 @@
                                        """.trim().replaceAll("\\s", ""),
                             fanout                : 1,
                             iterations            : 1,
-                            maxNumWorkers         : 5,
                             numWorkers            : 5,
                             autoscalingAlgorithm  : "NONE",
                             streaming             : isStreaming
@@ -105,10 +102,10 @@
 
             ],
             [
-                    title        : 'Load test: fanout 4 times with 2GB 10-byte records total',
-                    itClass      : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
-                    runner       : CommonTestProperties.Runner.DATAFLOW,
-                    jobProperties: [
+                    title          : 'Load test: fanout 4 times with 2GB 10-byte records total',
+                    test           : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
+                    runner         : CommonTestProperties.Runner.DATAFLOW,
+                    pipelineOptions: [
                             project               : 'apache-beam-testing',
                             appName               : 'load_tests_Java_Dataflow_${mode}_GBK_4',
                             tempLocation          : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -124,17 +121,16 @@
                                        """.trim().replaceAll("\\s", ""),
                             fanout                : 4,
                             iterations            : 1,
-                            maxNumWorkers         : 16,
                             numWorkers            : 16,
                             autoscalingAlgorithm  : "NONE",
                             streaming             : isStreaming
                     ]
             ],
             [
-                    title        : 'Load test: fanout 8 times with 2GB 10-byte records total',
-                    itClass      : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
-                    runner       : CommonTestProperties.Runner.DATAFLOW,
-                    jobProperties: [
+                    title          : 'Load test: fanout 8 times with 2GB 10-byte records total',
+                    test           : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
+                    runner         : CommonTestProperties.Runner.DATAFLOW,
+                    pipelineOptions: [
                             project               : 'apache-beam-testing',
                             appName               : "load_tests_Java_Dataflow_${mode}_GBK_5",
                             tempLocation          : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -150,17 +146,16 @@
                                        """.trim().replaceAll("\\s", ""),
                             fanout                : 8,
                             iterations            : 1,
-                            maxNumWorkers         : 16,
                             numWorkers            : 16,
                             autoscalingAlgorithm  : "NONE",
                             streaming             : isStreaming
                     ]
             ],
             [
-                    title        : 'Load test: reiterate 4 times 10kB values',
-                    itClass      : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
-                    runner       : CommonTestProperties.Runner.DATAFLOW,
-                    jobProperties: [
+                    title          : 'Load test: reiterate 4 times 10kB values',
+                    test           : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
+                    runner         : CommonTestProperties.Runner.DATAFLOW,
+                    pipelineOptions: [
                             project               : 'apache-beam-testing',
                             appName               : "load_tests_Java_Dataflow_${mode}_GBK_6",
                             tempLocation          : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -178,17 +173,16 @@
                                        """.trim().replaceAll("\\s", ""),
                             fanout                : 1,
                             iterations            : 4,
-                            maxNumWorkers         : 5,
                             numWorkers            : 5,
                             autoscalingAlgorithm  : "NONE",
                             streaming             : isStreaming
                     ]
             ],
             [
-                    title        : 'Load test: reiterate 4 times 2MB values',
-                    itClass      : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
-                    runner       : CommonTestProperties.Runner.DATAFLOW,
-                    jobProperties: [
+                    title          : 'Load test: reiterate 4 times 2MB values',
+                    test           : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
+                    runner         : CommonTestProperties.Runner.DATAFLOW,
+                    pipelineOptions: [
                             project               : 'apache-beam-testing',
                             appName               : "load_tests_Java_Dataflow_${mode}_GBK_7",
                             tempLocation          : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -206,7 +200,6 @@
                                        """.trim().replaceAll("\\s", ""),
                             fanout                : 1,
                             iterations            : 4,
-                            maxNumWorkers         : 5,
                             numWorkers            : 5,
                             autoscalingAlgorithm  : "NONE",
                             streaming             : isStreaming
@@ -221,8 +214,8 @@
 
   def datasetName = loadTestsBuilder.getBigQueryDataset('load_test', triggeringContext)
   for (testConfiguration in loadTestConfigurations('streaming', true, datasetName)) {
-    testConfiguration.jobProperties << [inputWindowDurationSec: 1200]
-    loadTestsBuilder.loadTest(scope, testConfiguration.title, testConfiguration.runner, CommonTestProperties.SDK.JAVA, testConfiguration.jobProperties, testConfiguration.itClass)
+    testConfiguration.pipelineOptions << [inputWindowDurationSec: 1200]
+    loadTestsBuilder.loadTest(scope, testConfiguration.title, testConfiguration.runner, CommonTestProperties.SDK.JAVA, testConfiguration.pipelineOptions, testConfiguration.test)
   }
 }
 
diff --git a/.test-infra/jenkins/job_LoadTests_GBK_Python.groovy b/.test-infra/jenkins/job_LoadTests_GBK_Python.groovy
index caf4ba9..b4b8c68 100644
--- a/.test-infra/jenkins/job_LoadTests_GBK_Python.groovy
+++ b/.test-infra/jenkins/job_LoadTests_GBK_Python.groovy
@@ -23,11 +23,10 @@
 
 def loadTestConfigurations = { datasetName -> [
         [
-                title        : 'GroupByKey Python Load test: 2GB of 10B records',
-                itClass      : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                sdk          : CommonTestProperties.SDK.PYTHON,
-                jobProperties: [
+                title          : 'GroupByKey Python Load test: 2GB of 10B records',
+                test           : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         job_name             : 'load-tests-python-dataflow-batch-gbk-1-' + now,
                         project              : 'apache-beam-testing',
                         temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -39,17 +38,15 @@
                                 '"value_size": 9}\'',
                         iterations           : 1,
                         fanout               : 1,
-                        max_num_workers      : 5,
                         num_workers          : 5,
                         autoscaling_algorithm: "NONE"
                 ]
         ],
         [
-                title        : 'GroupByKey Python Load test: 2GB of 100B records',
-                itClass      : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                sdk          : CommonTestProperties.SDK.PYTHON,
-                jobProperties: [
+                title          : 'GroupByKey Python Load test: 2GB of 100B records',
+                test           : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         job_name             : 'load-tests-python-dataflow-batch-gbk-2-' + now,
                         project              : 'apache-beam-testing',
                         temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -61,17 +58,15 @@
                                 '"value_size": 90}\'',
                         iterations           : 1,
                         fanout               : 1,
-                        max_num_workers      : 5,
                         num_workers          : 5,
                         autoscaling_algorithm: "NONE"
                 ]
         ],
         [
-                title        : 'GroupByKey Python Load test: 2GB of 100kB records',
-                itClass      : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                sdk          : CommonTestProperties.SDK.PYTHON,
-                jobProperties: [
+                title          : 'GroupByKey Python Load test: 2GB of 100kB records',
+                test           : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         job_name             : 'load-tests-python-dataflow-batch-gbk-3-' + now,
                         project              : 'apache-beam-testing',
                         temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -83,17 +78,15 @@
                                 '"value_size": 900000}\'',
                         iterations           : 1,
                         fanout               : 1,
-                        max_num_workers      : 5,
                         num_workers          : 5,
                         autoscaling_algorithm: "NONE"
                 ]
         ],
         [
-                title        : 'GroupByKey Python Load test: fanout 4 times with 2GB 10-byte records total',
-                itClass      : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                sdk          : CommonTestProperties.SDK.PYTHON,
-                jobProperties: [
+                title          : 'GroupByKey Python Load test: fanout 4 times with 2GB 10-byte records total',
+                test           : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         job_name             : 'load-tests-python-dataflow-batch-gbk-4-' + now,
                         project              : 'apache-beam-testing',
                         temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -105,17 +98,15 @@
                                 '"value_size": 90}\'',
                         iterations           : 1,
                         fanout               : 4,
-                        max_num_workers      : 5,
                         num_workers          : 5,
                         autoscaling_algorithm: "NONE"
                 ]
         ],
         [
-                title        : 'GroupByKey Python Load test: fanout 8 times with 2GB 10-byte records total',
-                itClass      : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                sdk          : CommonTestProperties.SDK.PYTHON,
-                jobProperties: [
+                title          : 'GroupByKey Python Load test: fanout 8 times with 2GB 10-byte records total',
+                test           : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         job_name             : 'load-tests-python-dataflow-batch-gbk-5-' + now,
                         project              : 'apache-beam-testing',
                         temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -127,7 +118,6 @@
                                 '"value_size": 90}\'',
                         iterations           : 1,
                         fanout               : 8,
-                        max_num_workers      : 5,
                         num_workers          : 5,
                         autoscaling_algorithm: "NONE"
                 ]
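
Note: in every Dataflow configuration above, autoscaling_algorithm is pinned to "NONE" and num_workers to 5, so the worker pool never grows beyond its fixed size; a separate max_num_workers ceiling therefore has no effect, which is presumably why this patch drops it throughout. A minimal Groovy sketch of the resulting worker options (illustration only, not part of the patch; the flag rendering is a hypothetical stand-in for the real plumbing in LoadTestsBuilder):

    // Fixed pool of five workers with autoscaling disabled; no ceiling is needed.
    def workerOptions = [
            num_workers          : 5,
            autoscaling_algorithm: 'NONE'
    ]
    // Rendered as the kind of flags a Python load-test pipeline would receive.
    def flags = workerOptions.collect { k, v -> "--$k=$v".toString() }
    assert flags == ['--num_workers=5', '--autoscaling_algorithm=NONE']
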
diff --git a/.test-infra/jenkins/job_LoadTests_GBK_Python_reiterate.groovy b/.test-infra/jenkins/job_LoadTests_GBK_Python_reiterate.groovy
index e8b7852..5bb1ac4 100644
--- a/.test-infra/jenkins/job_LoadTests_GBK_Python_reiterate.groovy
+++ b/.test-infra/jenkins/job_LoadTests_GBK_Python_reiterate.groovy
@@ -26,10 +26,10 @@
 
 def loadTestConfigurations = { datasetName -> [
         [
-                title        : 'GroupByKey Python Load test: reiterate 4 times 10kB values',
-                itClass      :  'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                jobProperties: [
+                title          : 'GroupByKey Python Load test: reiterate 4 times 10kB values',
+                test           : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         project              : 'apache-beam-testing',
                         job_name             : 'load-tests-python-dataflow-batch-gbk-6-' + now,
                         temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -43,16 +43,15 @@
                                 '"hot_key_fraction": 1}\'',
                         fanout               : 1,
                         iterations           : 4,
-                        max_num_workers      : 5,
                         num_workers          : 5,
                         autoscaling_algorithm: "NONE"
                 ]
         ],
         [
-                title        : 'GroupByKey Python Load test: reiterate 4 times 2MB values',
-                itClass      :  'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                jobProperties: [
+                title          : 'GroupByKey Python Load test: reiterate 4 times 2MB values',
+                test           : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         project              : 'apache-beam-testing',
                         job_name             : 'load-tests-python-dataflow-batch-gbk-7-' + now,
                         temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -66,7 +65,6 @@
                                 '"hot_key_fraction": 1}\'',
                         fanout               : 1,
                         iterations           : 4,
-                        max_num_workers      : 5,
                         num_workers          : 5,
                         autoscaling_algorithm: 'NONE'
                 ]
@@ -79,7 +77,7 @@
 
     def datasetName = loadTestsBuilder.getBigQueryDataset('load_test', triggeringContext)
     for (testConfiguration in loadTestConfigurations(datasetName)) {
-        loadTestsBuilder.loadTest(scope, testConfiguration.title, testConfiguration.runner, CommonTestProperties.SDK.PYTHON, testConfiguration.jobProperties, testConfiguration.itClass)
+        loadTestsBuilder.loadTest(scope, testConfiguration.title, testConfiguration.runner, CommonTestProperties.SDK.PYTHON, testConfiguration.pipelineOptions, testConfiguration.test)
     }
 }
 
diff --git a/.test-infra/jenkins/job_LoadTests_Java_Smoke.groovy b/.test-infra/jenkins/job_LoadTests_Java_Smoke.groovy
index c0bbfeb..62fc180 100644
--- a/.test-infra/jenkins/job_LoadTests_Java_Smoke.groovy
+++ b/.test-infra/jenkins/job_LoadTests_Java_Smoke.groovy
@@ -22,10 +22,10 @@
 
 def smokeTestConfigurations = { datasetName -> [
         [
-                title        : 'GroupByKey load test Direct',
-                itClass      : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
-                runner       : CommonTestProperties.Runner.DIRECT,
-                jobProperties: [
+                title          : 'GroupByKey load test Direct',
+                test           : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
+                runner         : CommonTestProperties.Runner.DIRECT,
+                pipelineOptions: [
                         publishToBigQuery: true,
                         bigQueryDataset  : datasetName,
                         bigQueryTable    : 'direct_gbk',
@@ -36,10 +36,10 @@
                 ]
         ],
         [
-                title        : 'GroupByKey load test Dataflow',
-                itClass      : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                jobProperties: [
+                title          : 'GroupByKey load test Dataflow',
+                test           : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         project          : 'apache-beam-testing',
                         tempLocation     : 'gs://temp-storage-for-perf-tests/smoketests',
                         publishToBigQuery: true,
@@ -52,10 +52,10 @@
                 ]
         ],
         [
-                title        : 'GroupByKey load test Flink',
-                itClass      : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
-                runner       : CommonTestProperties.Runner.FLINK,
-                jobProperties: [
+                title          : 'GroupByKey load test Flink',
+                test           : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
+                runner         : CommonTestProperties.Runner.FLINK,
+                pipelineOptions: [
                         publishToBigQuery: true,
                         bigQueryDataset  : datasetName,
                         bigQueryTable    : 'flink_gbk',
@@ -66,10 +66,10 @@
                 ]
         ],
         [
-                title        : 'GroupByKey load test Spark',
-                itClass      : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
-                runner       : CommonTestProperties.Runner.SPARK,
-                jobProperties: [
+                title          : 'GroupByKey load test Spark',
+                test           : 'org.apache.beam.sdk.loadtests.GroupByKeyLoadTest',
+                runner         : CommonTestProperties.Runner.SPARK,
+                pipelineOptions: [
                         sparkMaster      : 'local[4]',
                         publishToBigQuery: true,
                         bigQueryDataset  : datasetName,
diff --git a/.test-infra/jenkins/job_LoadTests_ParDo_Flink_Python.groovy b/.test-infra/jenkins/job_LoadTests_ParDo_Flink_Python.groovy
new file mode 100644
index 0000000..97c37dd
--- /dev/null
+++ b/.test-infra/jenkins/job_LoadTests_ParDo_Flink_Python.groovy
@@ -0,0 +1,150 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+import CommonJobProperties as commonJobProperties
+import CommonTestProperties
+import LoadTestsBuilder as loadTestsBuilder
+import PhraseTriggeringPostCommitBuilder
+import Flink
+import Docker
+import CronJobBuilder
+
+String now = new Date().format("MMddHHmmss", TimeZone.getTimeZone('UTC'))
+
+def scenarios = { datasetName, sdkHarnessImageTag -> [
+        [
+                title          : 'ParDo Python Load test: 2GB 100 byte records 10 times',
+                test           : 'apache_beam.testing.load_tests.pardo_test:ParDoTest.testParDo',
+                runner         : CommonTestProperties.Runner.PORTABLE,
+                pipelineOptions: [
+                        job_name             : 'load-tests-python-flink-batch-pardo-1-' + now,
+                        project              : 'apache-beam-testing',
+                        publish_to_big_query : false,
+                        metrics_dataset      : datasetName,
+                        metrics_table        : 'python_flink_batch_pardo_1',
+                        input_options        : '\'{' +
+                                '"num_records": 20000000,' +
+                                '"key_size": 10,' +
+                                '"value_size": 90}\'',
+                        iterations           : 10,
+                        number_of_counter_operations: 0,
+                        number_of_counters   : 0,
+                        parallelism          : 5,
+                        job_endpoint         : 'localhost:8099',
+                        environment_config   : sdkHarnessImageTag,
+                        environment_type     : 'DOCKER',
+                ]
+        ],
+        [
+                title          : 'ParDo Python Load test: 2GB 100 byte records 200 times',
+                test           : 'apache_beam.testing.load_tests.pardo_test:ParDoTest.testParDo',
+                runner         : CommonTestProperties.Runner.PORTABLE,
+                pipelineOptions: [
+                        job_name             : 'load-tests-python-flink-batch-pardo-2-' + now,
+                        project              : 'apache-beam-testing',
+                        publish_to_big_query : false,
+                        metrics_dataset      : datasetName,
+                        metrics_table        : 'python_flink_batch_pardo_2',
+                        input_options        : '\'{' +
+                                '"num_records": 20000000,' +
+                                '"key_size": 10,' +
+                                '"value_size": 90}\'',
+                        iterations           : 200,
+                        number_of_counter_operations: 0,
+                        number_of_counters   : 0,
+                        parallelism          : 5,
+                        job_endpoint         : 'localhost:8099',
+                        environment_config   : sdkHarnessImageTag,
+                        environment_type     : 'DOCKER',
+                ]
+        ],
+        [
+                title          : 'ParDo Python Load test: 2GB 100 byte records 10 counters',
+                test           : 'apache_beam.testing.load_tests.pardo_test:ParDoTest.testParDo',
+                runner         : CommonTestProperties.Runner.PORTABLE,
+                pipelineOptions: [
+                        job_name             : 'load-tests-python-flink-batch-pardo-3-' + now,
+                        project              : 'apache-beam-testing',
+                        publish_to_big_query : false,
+                        metrics_dataset      : datasetName,
+                        metrics_table        : 'python_flink_batch_pardo_3',
+                        input_options        : '\'{' +
+                                '"num_records": 20000000,' +
+                                '"key_size": 10,' +
+                                '"value_size": 90}\'',
+                        iterations           : 1,
+                        number_of_counter_operations: 10,
+                        number_of_counters   : 1,
+                        parallelism          : 5,
+                        job_endpoint         : 'localhost:8099',
+                        environment_config   : sdkHarnessImageTag,
+                        environment_type     : 'DOCKER',
+                ]
+        ],
+        [
+                title          : 'ParDo Python Load test: 2GB 100 byte records 100 counters',
+                test           : 'apache_beam.testing.load_tests.pardo_test:ParDoTest.testParDo',
+                runner         : CommonTestProperties.Runner.PORTABLE,
+                pipelineOptions: [
+                        job_name             : 'load-tests-python-flink-batch-pardo-4-' + now,
+                        project              : 'apache-beam-testing',
+                        publish_to_big_query : false,
+                        metrics_dataset      : datasetName,
+                        metrics_table        : 'python_flink_batch_pardo_4',
+                        input_options        : '\'{' +
+                                '"num_records": 20000000,' +
+                                '"key_size": 10,' +
+                                '"value_size": 90}\'',
+                        iterations           : 1,
+                        number_of_counter_operations: 100,
+                        number_of_counters   : 1,
+                        parallelism          : 5,
+                        job_endpoint         : 'localhost:8099',
+                        environment_config   : sdkHarnessImageTag,
+                        environment_type     : 'DOCKER',
+                ]
+        ],
+]}
+
+def loadTest = { scope, triggeringContext ->
+  Docker publisher = new Docker(scope, loadTestsBuilder.DOCKER_CONTAINER_REGISTRY)
+  String pythonHarnessImageTag = publisher.getFullImageName('python2.7_sdk')
+
+  def datasetName = loadTestsBuilder.getBigQueryDataset('load_test', triggeringContext)
+  def numberOfWorkers = 5
+  List<Map> testScenarios = scenarios(datasetName, pythonHarnessImageTag)
+
+  publisher.publish(':sdks:python:container:py2:docker', 'python2.7_sdk')
+  publisher.publish(':runners:flink:1.7:job-server-container:docker', 'flink-job-server')
+  Flink flink = new Flink(scope, 'beam_LoadTests_Python_ParDo_Flink_Batch')
+  flink.setUp([pythonHarnessImageTag], numberOfWorkers, publisher.getFullImageName('flink-job-server'))
+
+  loadTestsBuilder.loadTests(scope, CommonTestProperties.SDK.PYTHON, testScenarios, 'ParDo', 'batch')
+}
+
+PhraseTriggeringPostCommitBuilder.postCommitJob(
+  'beam_LoadTests_Python_ParDo_Flink_Batch',
+  'Run Python Load Tests ParDo Flink Batch',
+  'Load Tests Python ParDo Flink Batch suite',
+  this
+) {
+  loadTest(delegate, CommonTestProperties.TriggeringContext.PR)
+}
+
+CronJobBuilder.cronJob('beam_LoadTests_Python_ParDo_Flink_Batch', 'H 13 * * *', this) {
+  loadTest(delegate, CommonTestProperties.TriggeringContext.POST_COMMIT)
+}
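
Note: the "2GB" in the scenario titles above follows directly from the input_options payload each scenario uses. A quick Groovy sanity check (illustration only):

    import groovy.json.JsonSlurper

    // 20,000,000 records x (10-byte key + 90-byte value) = 2,000,000,000 bytes,
    // i.e. roughly 2 GB, matching the "2GB 100 byte records" wording in the titles.
    def opts = new JsonSlurper().parseText(
            '{"num_records": 20000000, "key_size": 10, "value_size": 90}')
    long totalBytes = (long) opts.num_records * (opts.key_size + opts.value_size)
    assert totalBytes == 2000000000L
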
diff --git a/.test-infra/jenkins/job_LoadTests_ParDo_Java.groovy b/.test-infra/jenkins/job_LoadTests_ParDo_Java.groovy
index 5813800..2969c72 100644
--- a/.test-infra/jenkins/job_LoadTests_ParDo_Java.groovy
+++ b/.test-infra/jenkins/job_LoadTests_ParDo_Java.groovy
@@ -25,10 +25,10 @@
 def commonLoadTestConfig = { jobType, isStreaming, datasetName ->
     [
             [
-            title        : 'Load test: ParDo 2GB 100 byte records 10 times',
-            itClass      : 'org.apache.beam.sdk.loadtests.ParDoLoadTest',
-            runner       : CommonTestProperties.Runner.DATAFLOW,
-            jobProperties: [
+            title          : 'Load test: ParDo 2GB 100 byte records 10 times',
+            test           : 'org.apache.beam.sdk.loadtests.ParDoLoadTest',
+            runner         : CommonTestProperties.Runner.DATAFLOW,
+            pipelineOptions: [
                     project             : 'apache-beam-testing',
                     appName             : "load_tests_Java_Dataflow_${jobType}_ParDo_1",
                     tempLocation        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -45,17 +45,16 @@
                     iterations          : 10,
                     numberOfCounters    : 1,
                     numberOfCounterOperations: 0,
-                    maxNumWorkers       : 5,
                     numWorkers          : 5,
                     autoscalingAlgorithm: "NONE",
                     streaming           : isStreaming
             ]
             ],
             [
-                    title        : 'Load test: ParDo 2GB 100 byte records 200 times',
-                    itClass      : 'org.apache.beam.sdk.loadtests.ParDoLoadTest',
-                    runner       : CommonTestProperties.Runner.DATAFLOW,
-                    jobProperties: [
+                    title          : 'Load test: ParDo 2GB 100 byte records 200 times',
+                    test           : 'org.apache.beam.sdk.loadtests.ParDoLoadTest',
+                    runner         : CommonTestProperties.Runner.DATAFLOW,
+                    pipelineOptions: [
                             project             : 'apache-beam-testing',
                             appName             : "load_tests_Java_Dataflow_${jobType}_ParDo_2",
                             tempLocation        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -72,7 +71,6 @@
                             iterations          : 200,
                             numberOfCounters    : 1,
                             numberOfCounterOperations: 0,
-                            maxNumWorkers       : 5,
                             numWorkers          : 5,
                             autoscalingAlgorithm: "NONE",
                             streaming           : isStreaming
@@ -80,10 +78,10 @@
             ],
             [
 
-                    title        : 'Load test: ParDo 2GB 100 byte records 10 counters',
-                    itClass      : 'org.apache.beam.sdk.loadtests.ParDoLoadTest',
-                    runner       : CommonTestProperties.Runner.DATAFLOW,
-                    jobProperties: [
+                    title          : 'Load test: ParDo 2GB 100 byte records 10 counters',
+                    test           : 'org.apache.beam.sdk.loadtests.ParDoLoadTest',
+                    runner         : CommonTestProperties.Runner.DATAFLOW,
+                    pipelineOptions: [
                             project             : 'apache-beam-testing',
                             appName             : "load_tests_Java_Dataflow_${jobType}_ParDo_3",
                             tempLocation        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -100,7 +98,6 @@
                             iterations          : 1,
                             numberOfCounters    : 1,
                             numberOfCounterOperations: 10,
-                            maxNumWorkers       : 5,
                             numWorkers          : 5,
                             autoscalingAlgorithm: "NONE",
                             streaming           : isStreaming
@@ -108,10 +105,10 @@
 
             ],
             [
-                    title        : 'Load test: ParDo 2GB 100 byte records 100 counters',
-                    itClass      : 'org.apache.beam.sdk.loadtests.ParDoLoadTest',
-                    runner       : CommonTestProperties.Runner.DATAFLOW,
-                    jobProperties: [
+                    title          : 'Load test: ParDo 2GB 100 byte records 100 counters',
+                    test           : 'org.apache.beam.sdk.loadtests.ParDoLoadTest',
+                    runner         : CommonTestProperties.Runner.DATAFLOW,
+                    pipelineOptions: [
                             project             : 'apache-beam-testing',
                             appName             : "load_tests_Java_Dataflow_${jobType}_ParDo_4",
                             tempLocation        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -128,7 +125,6 @@
                             iterations          : 1,
                             numberOfCounters    : 1,
                             numberOfCounterOperations: 100,
-                            maxNumWorkers       : 5,
                             numWorkers          : 5,
                             autoscalingAlgorithm: "NONE",
                             streaming           : isStreaming
@@ -149,8 +145,8 @@
 
     def datasetName = loadTestsBuilder.getBigQueryDataset('load_test', triggeringContext)
     for (testConfiguration in commonLoadTestConfig('streaming', true, datasetName)) {
-        testConfiguration.jobProperties << [inputWindowDurationSec: 1200]
-        loadTestsBuilder.loadTest(scope, testConfiguration.title, testConfiguration.runner, CommonTestProperties.SDK.JAVA, testConfiguration.jobProperties, testConfiguration.itClass)
+        testConfiguration.pipelineOptions << [inputWindowDurationSec: 1200]
+        loadTestsBuilder.loadTest(scope, testConfiguration.title, testConfiguration.runner, CommonTestProperties.SDK.JAVA, testConfiguration.pipelineOptions, testConfiguration.test)
     }
 }
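
Note: the streaming job reuses the batch configurations and appends one extra option with Groovy's << map-merge operator (testConfiguration.pipelineOptions << [inputWindowDurationSec: 1200] above). A minimal illustration of what << does to an options map:

    // << adds (or overwrites) entries in the existing map in place.
    def pipelineOptions = [numWorkers: 5, autoscalingAlgorithm: 'NONE']
    pipelineOptions << [inputWindowDurationSec: 1200]
    assert pipelineOptions == [numWorkers            : 5,
                               autoscalingAlgorithm  : 'NONE',
                               inputWindowDurationSec: 1200]
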
 
diff --git a/.test-infra/jenkins/job_LoadTests_ParDo_Python.groovy b/.test-infra/jenkins/job_LoadTests_ParDo_Python.groovy
index 20f6b92..f55ed6e 100644
--- a/.test-infra/jenkins/job_LoadTests_ParDo_Python.groovy
+++ b/.test-infra/jenkins/job_LoadTests_ParDo_Python.groovy
@@ -24,10 +24,10 @@
 
 def loadTestConfigurations = { datasetName -> [
         [
-                title        : 'ParDo Python Load test: 2GB 100 byte records 10 times',
-                itClass      : 'apache_beam.testing.load_tests.pardo_test:ParDoTest.testParDo',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                jobProperties: [
+                title          : 'ParDo Python Load test: 2GB 100 byte records 10 times',
+                test           : 'apache_beam.testing.load_tests.pardo_test:ParDoTest.testParDo',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         job_name             : 'load-tests-python-dataflow-batch-pardo-1-' + now,
                         project              : 'apache-beam-testing',
                         temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -46,10 +46,10 @@
                 ]
         ],
         [
-                title        : 'ParDo Python Load test: 2GB 100 byte records 200 times',
-                itClass      : 'apache_beam.testing.load_tests.pardo_test:ParDoTest.testParDo',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                jobProperties: [
+                title          : 'ParDo Python Load test: 2GB 100 byte records 200 times',
+                test           : 'apache_beam.testing.load_tests.pardo_test:ParDoTest.testParDo',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         job_name             : 'load-tests-python-dataflow-batch-pardo-2-' + now,
                         project              : 'apache-beam-testing',
                         temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -68,10 +68,10 @@
                 ]
         ],
         [
-                title        : 'ParDo Python Load test: 2GB 100 byte records 10 counters',
-                itClass      : 'apache_beam.testing.load_tests.pardo_test:ParDoTest.testParDo',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                jobProperties: [
+                title          : 'ParDo Python Load test: 2GB 100 byte records 10 counters',
+                test           : 'apache_beam.testing.load_tests.pardo_test:ParDoTest.testParDo',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         job_name             : 'load-tests-python-dataflow-batch-pardo-3-' + now,
                         project              : 'apache-beam-testing',
                         temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -90,10 +90,10 @@
                 ]
         ],
         [
-                title        : 'ParDo Python Load test: 2GB 100 byte records 100 counters',
-                itClass      : 'apache_beam.testing.load_tests.pardo_test:ParDoTest.testParDo',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                jobProperties: [
+                title          : 'ParDo Python Load test: 2GB 100 byte records 100 counters',
+                test           : 'apache_beam.testing.load_tests.pardo_test:ParDoTest.testParDo',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         job_name             : 'load-tests-python-dataflow-batch-pardo-4-' + now,
                         project              : 'apache-beam-testing',
                         temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -119,7 +119,7 @@
 
     def datasetName = loadTestsBuilder.getBigQueryDataset('load_test', triggeringContext)
     for (testConfiguration in loadTestConfigurations(datasetName)) {
-        loadTestsBuilder.loadTest(scope, testConfiguration.title, testConfiguration.runner, CommonTestProperties.SDK.PYTHON, testConfiguration.jobProperties, testConfiguration.itClass)
+        loadTestsBuilder.loadTest(scope, testConfiguration.title, testConfiguration.runner, CommonTestProperties.SDK.PYTHON, testConfiguration.pipelineOptions, testConfiguration.test)
     }
 }
 
diff --git a/.test-infra/jenkins/job_LoadTests_Python_Smoke.groovy b/.test-infra/jenkins/job_LoadTests_Python_Smoke.groovy
index c8f9ac3..e03f7d2 100644
--- a/.test-infra/jenkins/job_LoadTests_Python_Smoke.groovy
+++ b/.test-infra/jenkins/job_LoadTests_Python_Smoke.groovy
@@ -23,11 +23,10 @@
 
 def smokeTestConfigurations = { datasetName -> [
         [
-                title        : 'GroupByKey Python load test Direct',
-                itClass      : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
-                runner       : CommonTestProperties.Runner.DIRECT,
-                sdk          : CommonTestProperties.SDK.PYTHON,
-                jobProperties: [
+                title          : 'GroupByKey Python load test Direct',
+                test           : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner         : CommonTestProperties.Runner.DIRECT,
+                pipelineOptions: [
                         publish_to_big_query: true,
                         project             : 'apache-beam-testing',
                         metrics_dataset     : datasetName,
@@ -39,11 +38,10 @@
                 ]
         ],
         [
-                title        : 'GroupByKey Python load test Dataflow',
-                itClass      : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                sdk          : CommonTestProperties.SDK.PYTHON,
-                jobProperties: [
+                title          : 'GroupByKey Python load test Dataflow',
+                test           : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         job_name            : 'load-tests-python-dataflow-batch-gbk-smoke-' + now,
                         project             : 'apache-beam-testing',
                         temp_location       : 'gs://temp-storage-for-perf-tests/smoketests',
diff --git a/.test-infra/jenkins/job_LoadTests_coGBK_Flink_Python.groovy b/.test-infra/jenkins/job_LoadTests_coGBK_Flink_Python.groovy
index e032bdc..026d197 100644
--- a/.test-infra/jenkins/job_LoadTests_coGBK_Flink_Python.groovy
+++ b/.test-infra/jenkins/job_LoadTests_coGBK_Flink_Python.groovy
@@ -27,10 +27,10 @@
 
 def scenarios = { datasetName, sdkHarnessImageTag -> [
         [
-                title        : 'CoGroupByKey Python Load test: 2GB of 100B records with a single key',
-                itClass      : 'apache_beam.testing.load_tests.co_group_by_key_test:CoGroupByKeyTest.testCoGroupByKey',
-                runner       : CommonTestProperties.Runner.PORTABLE,
-                jobProperties: [
+                title          : 'CoGroupByKey Python Load test: 2GB of 100B records with a single key',
+                test           : 'apache_beam.testing.load_tests.co_group_by_key_test:CoGroupByKeyTest.testCoGroupByKey',
+                runner         : CommonTestProperties.Runner.PORTABLE,
+                pipelineOptions: [
                         project              : 'apache-beam-testing',
                         job_name             : 'load-tests-python-flink-batch-cogbk-1-' + now,
                         temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -57,10 +57,10 @@
                 ]
         ],
         [
-                title        : 'CoGroupByKey Python Load test: 2GB of 100B records with multiple keys',
-                itClass      : 'apache_beam.testing.load_tests.co_group_by_key_test:CoGroupByKeyTest.testCoGroupByKey',
-                runner       : CommonTestProperties.Runner.PORTABLE,
-                jobProperties: [
+                title          : 'CoGroupByKey Python Load test: 2GB of 100B records with multiple keys',
+                test           : 'apache_beam.testing.load_tests.co_group_by_key_test:CoGroupByKeyTest.testCoGroupByKey',
+                runner         : CommonTestProperties.Runner.PORTABLE,
+                pipelineOptions: [
                         project              : 'apache-beam-testing',
                         job_name             : 'load-tests-python-flink-batch-cogbk-2-' + now,
                         temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -87,10 +87,10 @@
                 ]
         ],
         [
-                title        : 'CoGroupByKey Python Load test: reiterate 4 times 10kB values',
-                itClass      : 'apache_beam.testing.load_tests.co_group_by_key_test:CoGroupByKeyTest.testCoGroupByKey',
-                runner       : CommonTestProperties.Runner.PORTABLE,
-                jobProperties: [
+                title          : 'CoGroupByKey Python Load test: reiterate 4 times 10kB values',
+                test           : 'apache_beam.testing.load_tests.co_group_by_key_test:CoGroupByKeyTest.testCoGroupByKey',
+                runner         : CommonTestProperties.Runner.PORTABLE,
+                pipelineOptions: [
                         project              : 'apache-beam-testing',
                         job_name             : 'load-tests-python-flink-batch-cogbk-3-' + now,
                         temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -117,10 +117,10 @@
                 ]
         ],
         [
-                title        : 'CoGroupByKey Python Load test: reiterate 4 times 2MB values',
-                itClass      : 'apache_beam.testing.load_tests.co_group_by_key_test:CoGroupByKeyTest.testCoGroupByKey',
-                runner       : CommonTestProperties.Runner.PORTABLE,
-                jobProperties: [
+                title          : 'CoGroupByKey Python Load test: reiterate 4 times 2MB values',
+                test           : 'apache_beam.testing.load_tests.co_group_by_key_test:CoGroupByKeyTest.testCoGroupByKey',
+                runner         : CommonTestProperties.Runner.PORTABLE,
+                pipelineOptions: [
                         project              : 'apache-beam-testing',
                         job_name             : 'load-tests-python-flink-batch-cogbk-4-' + now,
                         temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
diff --git a/.test-infra/jenkins/job_LoadTests_coGBK_Python.groovy b/.test-infra/jenkins/job_LoadTests_coGBK_Python.groovy
index 4be4433..470602c 100644
--- a/.test-infra/jenkins/job_LoadTests_coGBK_Python.groovy
+++ b/.test-infra/jenkins/job_LoadTests_coGBK_Python.groovy
@@ -26,10 +26,10 @@
 
 def loadTestConfigurations = { datasetName -> [
         [
-                title        : 'CoGroupByKey Python Load test: 2GB of 100B records with a single key',
-                itClass      : 'apache_beam.testing.load_tests.co_group_by_key_test:CoGroupByKeyTest.testCoGroupByKey',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                jobProperties: [
+                title          : 'CoGroupByKey Python Load test: 2GB of 100B records with a single key',
+                test           : 'apache_beam.testing.load_tests.co_group_by_key_test:CoGroupByKeyTest.testCoGroupByKey',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         project              : 'apache-beam-testing',
                         job_name             : 'load-tests-python-dataflow-batch-cogbk-1-' + now,
                         temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -49,16 +49,15 @@
                                 '"num_hot_keys": 1,' +
                                 '"hot_key_fraction": 1}\'',
                         iterations           : 1,
-                        max_num_workers      : 5,
                         num_workers          : 5,
                         autoscaling_algorithm: 'NONE'
                 ]
         ],
         [
-                title        : 'CoGroupByKey Python Load test: 2GB of 100B records with multiple keys',
-                itClass      : 'apache_beam.testing.load_tests.co_group_by_key_test:CoGroupByKeyTest.testCoGroupByKey',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                jobProperties: [
+                title          : 'CoGroupByKey Python Load test: 2GB of 100B records with multiple keys',
+                test           : 'apache_beam.testing.load_tests.co_group_by_key_test:CoGroupByKeyTest.testCoGroupByKey',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         project              : 'apache-beam-testing',
                         job_name             : 'load-tests-python-dataflow-batch-cogbk-2-' + now,
                         temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -78,16 +77,15 @@
                                 '"num_hot_keys": 5,' +
                                 '"hot_key_fraction": 1}\'',
                         iterations           : 1,
-                        max_num_workers      : 5,
                         num_workers          : 5,
                         autoscaling_algorithm: 'NONE'
                 ]
         ],
         [
-                title        : 'CoGroupByKey Python Load test: reiterate 4 times 10kB values',
-                itClass      : 'apache_beam.testing.load_tests.co_group_by_key_test:CoGroupByKeyTest.testCoGroupByKey',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                jobProperties: [
+                title          : 'CoGroupByKey Python Load test: reiterate 4 times 10kB values',
+                test           : 'apache_beam.testing.load_tests.co_group_by_key_test:CoGroupByKeyTest.testCoGroupByKey',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         project              : 'apache-beam-testing',
                         job_name             : 'load-tests-python-dataflow-batch-cogbk-3-' + now,
                         temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -107,16 +105,15 @@
                                 '"num_hot_keys": 200000,' +
                                 '"hot_key_fraction": 1}\'',
                         iterations           : 4,
-                        max_num_workers      : 5,
                         num_workers          : 5,
                         autoscaling_algorithm: 'NONE'
                 ]
         ],
         [
-                title        : 'CoGroupByKey Python Load test: reiterate 4 times 2MB values',
-                itClass      : 'apache_beam.testing.load_tests.co_group_by_key_test:CoGroupByKeyTest.testCoGroupByKey',
-                runner       : CommonTestProperties.Runner.DATAFLOW,
-                jobProperties: [
+                title          : 'CoGroupByKey Python Load test: reiterate 4 times 2MB values',
+                test           : 'apache_beam.testing.load_tests.co_group_by_key_test:CoGroupByKeyTest.testCoGroupByKey',
+                runner         : CommonTestProperties.Runner.DATAFLOW,
+                pipelineOptions: [
                         project              : 'apache-beam-testing',
                         job_name             : 'load-tests-python-dataflow-batch-cogbk-4-' + now,
                         temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -136,7 +133,6 @@
                                 '"num_hot_keys": 1000,' +
                                 '"hot_key_fraction": 1}\'',
                         iterations           : 4,
-                        max_num_workers      : 5,
                         num_workers          : 5,
                         autoscaling_algorithm: 'NONE'
                 ]
@@ -149,7 +145,7 @@
 
     def datasetName = loadTestsBuilder.getBigQueryDataset('load_test', triggeringContext)
     for (testConfiguration in loadTestConfigurations(datasetName)) {
-        loadTestsBuilder.loadTest(scope, testConfiguration.title, testConfiguration.runner, CommonTestProperties.SDK.PYTHON, testConfiguration.jobProperties, testConfiguration.itClass)
+        loadTestsBuilder.loadTest(scope, testConfiguration.title, testConfiguration.runner, CommonTestProperties.SDK.PYTHON, testConfiguration.pipelineOptions, testConfiguration.test)
     }
 }
 
diff --git a/.test-infra/jenkins/job_PerformanceTests_BigQueryIO_Java.groovy b/.test-infra/jenkins/job_PerformanceTests_BigQueryIO_Java.groovy
index 9266af7..7231140 100644
--- a/.test-infra/jenkins/job_PerformanceTests_BigQueryIO_Java.groovy
+++ b/.test-infra/jenkins/job_PerformanceTests_BigQueryIO_Java.groovy
@@ -23,10 +23,10 @@
 def now = new Date().format("MMddHHmmss", TimeZone.getTimeZone('UTC'))
 
 def bqioStreamTest = [
-        title        : 'BigQueryIO Streaming Performance Test Java 10 GB',
-        itClass      : 'org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT',
-        runner       : CommonTestProperties.Runner.DATAFLOW,
-        jobProperties: [
+        title          : 'BigQueryIO Streaming Performance Test Java 10 GB',
+        test           : 'org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT',
+        runner         : CommonTestProperties.Runner.DATAFLOW,
+        pipelineOptions: [
                 jobName               : 'performance-tests-bqio-java-stream-10gb' + now,
                 project               : 'apache-beam-testing',
                 tempLocation          : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -41,17 +41,16 @@
                         '"num_records": 10485760,' +
                         '"key_size": 1,' +
                         '"value_size": 1024}\'',
-                maxNumWorkers         : 5,
                 numWorkers            : 5,
                 autoscalingAlgorithm  : 'NONE',  // Disable autoscaling for the worker pool.
         ]
 ]
 
 def bqioBatchTest = [
-        title        : 'BigQueryIO Batch Performance Test Java 10 GB',
-        itClass      : 'org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT',
-        runner       : CommonTestProperties.Runner.DATAFLOW,
-        jobProperties: [
+        title          : 'BigQueryIO Batch Performance Test Java 10 GB',
+        test           : 'org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT',
+        runner         : CommonTestProperties.Runner.DATAFLOW,
+        pipelineOptions: [
                 jobName               : 'performance-tests-bqio-java-batch-10gb' + now,
                 project               : 'apache-beam-testing',
                 tempLocation          : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -66,7 +65,6 @@
                         '"num_records": 10485760,' +
                         '"key_size": 1,' +
                         '"value_size": 1024}\'',
-                maxNumWorkers         : 5,
                 numWorkers            : 5,
                 autoscalingAlgorithm  : 'NONE',  // Disable autoscaling for the worker pool.
         ]
@@ -81,9 +79,9 @@
                 rootBuildScriptDir(commonJobProperties.checkoutDir)
                 commonJobProperties.setGradleSwitches(delegate)
                 switches("--info")
-                switches("-DintegrationTestPipelineOptions=\'${commonJobProperties.joinPipelineOptions(testConfig.jobProperties)}\'")
+                switches("-DintegrationTestPipelineOptions=\'${commonJobProperties.joinPipelineOptions(testConfig.pipelineOptions)}\'")
                 switches("-DintegrationTestRunner=\'${testConfig.runner}\'")
-                tasks("${testTask} --tests ${testConfig.itClass}")
+                tasks("${testTask} --tests ${testConfig.test}")
             }
         }
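
Note: the switch assembled above passes the renamed pipelineOptions map to Gradle as -DintegrationTestPipelineOptions. A hedged sketch of what commonJobProperties.joinPipelineOptions presumably produces, namely a JSON array of --key=value strings (assumption; joinPipelineOptionsSketch below is a hypothetical stand-in, not the real helper):

    import groovy.json.JsonOutput

    // Hypothetical stand-in for CommonJobProperties.joinPipelineOptions.
    String joinPipelineOptionsSketch(Map options) {
        JsonOutput.toJson(options.collect { k, v -> "--$k=$v".toString() })
    }

    // Prints: ["--numWorkers=5","--autoscalingAlgorithm=NONE"]
    println joinPipelineOptionsSketch([numWorkers: 5, autoscalingAlgorithm: 'NONE'])
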
 
diff --git a/.test-infra/jenkins/job_PerformanceTests_BigQueryIO_Python.groovy b/.test-infra/jenkins/job_PerformanceTests_BigQueryIO_Python.groovy
index 2f1bc86..471b394 100644
--- a/.test-infra/jenkins/job_PerformanceTests_BigQueryIO_Python.groovy
+++ b/.test-infra/jenkins/job_PerformanceTests_BigQueryIO_Python.groovy
@@ -23,10 +23,10 @@
 def now = new Date().format("MMddHHmmss", TimeZone.getTimeZone('UTC'))
 
 def bqio_read_test = [
-        title        : 'BigQueryIO Read Performance Test Python 10 GB',
-        itClass      : 'apache_beam.io.gcp.bigquery_read_perf_test:BigQueryReadPerfTest.test',
-        runner       : CommonTestProperties.Runner.DATAFLOW,
-        jobProperties: [
+        title          : 'BigQueryIO Read Performance Test Python 10 GB',
+        test           : 'apache_beam.io.gcp.bigquery_read_perf_test:BigQueryReadPerfTest.test',
+        runner         : CommonTestProperties.Runner.DATAFLOW,
+        pipelineOptions: [
                 job_name             : 'performance-tests-bqio-read-python-10gb' + now,
                 project              : 'apache-beam-testing',
                 temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -39,17 +39,16 @@
                         '"num_records": 10485760,' +
                         '"key_size": 1,' +
                         '"value_size": 1024}\'',
-                max_num_workers      : 5,
                 num_workers          : 5,
                 autoscaling_algorithm: 'NONE',  // Disable autoscaling for the worker pool.
         ]
 ]
 
 def bqio_write_test = [
-        title        : 'BigQueryIO Write Performance Test Python Batch 10 GB',
-        itClass      : 'apache_beam.io.gcp.bigquery_write_perf_test:BigQueryWritePerfTest.test',
-        runner       : CommonTestProperties.Runner.DATAFLOW,
-        jobProperties: [
+        title          : 'BigQueryIO Write Performance Test Python Batch 10 GB',
+        test           : 'apache_beam.io.gcp.bigquery_write_perf_test:BigQueryWritePerfTest.test',
+        runner         : CommonTestProperties.Runner.DATAFLOW,
+        pipelineOptions: [
                 job_name             : 'performance-tests-bqio-write-python-batch-10gb' + now,
                 project              : 'apache-beam-testing',
                 temp_location        : 'gs://temp-storage-for-perf-tests/loadtests',
@@ -62,7 +61,6 @@
                         '"num_records": 10485760,' +
                         '"key_size": 1,' +
                         '"value_size": 1024}\'',
-                max_num_workers      : 5,
                 num_workers          : 5,
                 autoscaling_algorithm: 'NONE',  // Disable autoscaling for the worker pool.
         ]
@@ -71,7 +69,7 @@
 def executeJob = { scope, testConfig ->
     commonJobProperties.setTopLevelMainJobProperties(scope, 'master', 240)
 
-    loadTestsBuilder.loadTest(scope, testConfig.title, testConfig.runner, CommonTestProperties.SDK.PYTHON, testConfig.jobProperties, testConfig.itClass)
+    loadTestsBuilder.loadTest(scope, testConfig.title, testConfig.runner, CommonTestProperties.SDK.PYTHON, testConfig.pipelineOptions, testConfig.test)
 }
 
 PhraseTriggeringPostCommitBuilder.postCommitJob(
diff --git a/.test-infra/jenkins/job_PerformanceTests_FileBasedIO_IT.groovy b/.test-infra/jenkins/job_PerformanceTests_FileBasedIO_IT.groovy
index 4775808..ddccd5d 100644
--- a/.test-infra/jenkins/job_PerformanceTests_FileBasedIO_IT.groovy
+++ b/.test-infra/jenkins/job_PerformanceTests_FileBasedIO_IT.groovy
@@ -26,9 +26,11 @@
                 githubTitle        : 'Java TextIO Performance Test',
                 githubTriggerPhrase: 'Run Java TextIO Performance Test',
                 pipelineOptions    : [
-                        bigQueryDataset: 'beam_performance',
-                        bigQueryTable  : 'textioit_results',
-                        numberOfRecords: '1000000'
+                        bigQueryDataset     : 'beam_performance',
+                        bigQueryTable       : 'textioit_results',
+                        numberOfRecords     : '1000000',
+                        numWorkers          : '5',
+                        autoscalingAlgorithm: 'NONE'
                 ]
 
         ],
@@ -39,10 +41,12 @@
                 githubTitle        : 'Java CompressedTextIO Performance Test',
                 githubTriggerPhrase: 'Run Java CompressedTextIO Performance Test',
                 pipelineOptions    : [
-                        bigQueryDataset: 'beam_performance',
-                        bigQueryTable  : 'compressed_textioit_results',
-                        numberOfRecords: '1000000',
-                        compressionType: 'GZIP'
+                        bigQueryDataset     : 'beam_performance',
+                        bigQueryTable       : 'compressed_textioit_results',
+                        numberOfRecords     : '1000000',
+                        compressionType     : 'GZIP',
+                        numWorkers          : '5',
+                        autoscalingAlgorithm: 'NONE'
                 ]
         ],
         [
@@ -57,7 +61,9 @@
                         reportGcsPerformanceMetrics: 'true',
                         gcsPerformanceMetrics      : 'true',
                         numberOfRecords            : '1000000',
-                        numberOfShards             : '1000'
+                        numberOfShards             : '1000',
+                        numWorkers                 : '5',
+                        autoscalingAlgorithm       : 'NONE'
                 ]
 
         ],
@@ -68,9 +74,11 @@
                 githubTitle        : 'Java AvroIO Performance Test',
                 githubTriggerPhrase: 'Run Java AvroIO Performance Test',
                 pipelineOptions    : [
-                        numberOfRecords: '1000000',
-                        bigQueryDataset: 'beam_performance',
-                        bigQueryTable  : 'avroioit_results',
+                        numberOfRecords     : '1000000',
+                        bigQueryDataset     : 'beam_performance',
+                        bigQueryTable       : 'avroioit_results',
+                        numWorkers          : '5',
+                        autoscalingAlgorithm: 'NONE'
                 ]
         ],
         [
@@ -80,9 +88,11 @@
                 githubTitle        : 'Java TFRecordIO Performance Test',
                 githubTriggerPhrase: 'Run Java TFRecordIO Performance Test',
                 pipelineOptions    : [
-                        bigQueryDataset: 'beam_performance',
-                        bigQueryTable  : 'tfrecordioit_results',
-                        numberOfRecords: '1000000'
+                        bigQueryDataset     : 'beam_performance',
+                        bigQueryTable       : 'tfrecordioit_results',
+                        numberOfRecords     : '1000000',
+                        numWorkers          : '5',
+                        autoscalingAlgorithm: 'NONE'
                 ]
         ],
         [
@@ -92,10 +102,12 @@
                 githubTitle        : 'Java XmlIO Performance Test',
                 githubTriggerPhrase: 'Run Java XmlIO Performance Test',
                 pipelineOptions    : [
-                        bigQueryDataset: 'beam_performance',
-                        bigQueryTable  : 'xmlioit_results',
-                        numberOfRecords: '100000000',
-                        charset        : 'UTF-8'
+                        bigQueryDataset     : 'beam_performance',
+                        bigQueryTable       : 'xmlioit_results',
+                        numberOfRecords     : '100000000',
+                        charset             : 'UTF-8',
+                        numWorkers          : '5',
+                        autoscalingAlgorithm: 'NONE'
                 ]
         ],
         [
@@ -105,40 +117,46 @@
                 githubTitle        : 'Java ParquetIO Performance Test',
                 githubTriggerPhrase: 'Run Java ParquetIO Performance Test',
                 pipelineOptions    : [
-                        bigQueryDataset: 'beam_performance',
-                        bigQueryTable  : 'parquetioit_results',
-                        numberOfRecords: '100000000'
+                        bigQueryDataset     : 'beam_performance',
+                        bigQueryTable       : 'parquetioit_results',
+                        numberOfRecords     : '100000000',
+                        numWorkers          : '5',
+                        autoscalingAlgorithm: 'NONE'
                 ]
         ],
         [
                 name               : 'beam_PerformanceTests_TextIOIT_HDFS',
-                description        : 'Runs PerfKit tests for TextIOIT on HDFS',
+                description        : 'Runs performance tests for TextIOIT on HDFS',
                 test               : 'org.apache.beam.sdk.io.text.TextIOIT',
                 githubTitle        : 'Java TextIO Performance Test on HDFS',
                 githubTriggerPhrase: 'Run Java TextIO Performance Test HDFS',
                 pipelineOptions    : [
-                        bigQueryDataset: 'beam_performance',
-                        bigQueryTable  : 'textioit_hdfs_results',
-                        numberOfRecords: '1000000'
+                        bigQueryDataset     : 'beam_performance',
+                        bigQueryTable       : 'textioit_hdfs_results',
+                        numberOfRecords     : '1000000',
+                        numWorkers          : '5',
+                        autoscalingAlgorithm: 'NONE'
                 ]
 
         ],
         [
                 name               : 'beam_PerformanceTests_Compressed_TextIOIT_HDFS',
-                description        : 'Runs PerfKit tests for TextIOIT with GZIP compression on HDFS',
+                description        : 'Runs performance tests for TextIOIT with GZIP compression on HDFS',
                 test               : 'org.apache.beam.sdk.io.text.TextIOIT',
                 githubTitle        : 'Java CompressedTextIO Performance Test on HDFS',
                 githubTriggerPhrase: 'Run Java CompressedTextIO Performance Test HDFS',
                 pipelineOptions    : [
-                        bigQueryDataset: 'beam_performance',
-                        bigQueryTable  : 'compressed_textioit_hdfs_results',
-                        numberOfRecords: '1000000',
-                        compressionType: 'GZIP'
+                        bigQueryDataset     : 'beam_performance',
+                        bigQueryTable       : 'compressed_textioit_hdfs_results',
+                        numberOfRecords     : '1000000',
+                        compressionType     : 'GZIP',
+                        numWorkers          : '5',
+                        autoscalingAlgorithm: 'NONE'
                 ]
         ],
         [
                 name               : 'beam_PerformanceTests_ManyFiles_TextIOIT_HDFS',
-                description        : 'Runs PerfKit tests for TextIOIT with many output files on HDFS',
+                description        : 'Runs performance tests for TextIOIT with many output files on HDFS',
                 test               : 'org.apache.beam.sdk.io.text.TextIOIT',
                 githubTitle        : 'Java ManyFilesTextIO Performance Test on HDFS',
                 githubTriggerPhrase: 'Run Java ManyFilesTextIO Performance Test HDFS',
@@ -148,55 +166,65 @@
                         reportGcsPerformanceMetrics: 'true',
                         gcsPerformanceMetrics      : 'true',
                         numberOfRecords            : '1000000',
-                        numberOfShards             : '1000'
+                        numberOfShards             : '1000',
+                        numWorkers                 : '5',
+                        autoscalingAlgorithm       : 'NONE'
                 ]
 
         ],
         [
                 name               : 'beam_PerformanceTests_AvroIOIT_HDFS',
-                description        : 'Runs PerfKit tests for AvroIOIT on HDFS',
+                description        : 'Runs performance tests for AvroIOIT on HDFS',
                 test               : 'org.apache.beam.sdk.io.avro.AvroIOIT',
                 githubTitle        : 'Java AvroIO Performance Test on HDFS',
                 githubTriggerPhrase: 'Run Java AvroIO Performance Test HDFS',
                 pipelineOptions    : [
-                        bigQueryDataset: 'beam_performance',
-                        bigQueryTable  : 'avroioit_hdfs_results',
-                        numberOfRecords: '1000000'
+                        bigQueryDataset     : 'beam_performance',
+                        bigQueryTable       : 'avroioit_hdfs_results',
+                        numberOfRecords     : '1000000',
+                        numWorkers          : '5',
+                        autoscalingAlgorithm: 'NONE'
                 ]
         ],
         [
                 name               : 'beam_PerformanceTests_TFRecordIOIT_HDFS',
-                description        : 'Runs PerfKit tests for beam_PerformanceTests_TFRecordIOIT on HDFS',
+                description        : 'Runs performance tests for beam_PerformanceTests_TFRecordIOIT on HDFS',
                 test               : 'org.apache.beam.sdk.io.tfrecord.TFRecordIOIT',
                 githubTitle        : 'Java TFRecordIO Performance Test on HDFS',
                 githubTriggerPhrase: 'Run Java TFRecordIO Performance Test HDFS',
                 pipelineOptions    : [
-                        numberOfRecords: '1000000'
+                        numberOfRecords     : '1000000',
+                        numWorkers          : '5',
+                        autoscalingAlgorithm: 'NONE'
                 ]
         ],
         [
                 name               : 'beam_PerformanceTests_XmlIOIT_HDFS',
-                description        : 'Runs PerfKit tests for beam_PerformanceTests_XmlIOIT on HDFS',
+                description        : 'Runs performance tests for XmlIOIT on HDFS',
                 test               : 'org.apache.beam.sdk.io.xml.XmlIOIT',
                 githubTitle        : 'Java XmlIOPerformance Test on HDFS',
                 githubTriggerPhrase: 'Run Java XmlIO Performance Test HDFS',
                 pipelineOptions    : [
-                        bigQueryDataset: 'beam_performance',
-                        bigQueryTable  : 'xmlioit_hdfs_results',
-                        numberOfRecords: '100000',
-                        charset        : 'UTF-8'
+                        bigQueryDataset     : 'beam_performance',
+                        bigQueryTable       : 'xmlioit_hdfs_results',
+                        numberOfRecords     : '100000',
+                        charset             : 'UTF-8',
+                        numWorkers          : '5',
+                        autoscalingAlgorithm: 'NONE'
                 ]
         ],
         [
                 name               : 'beam_PerformanceTests_ParquetIOIT_HDFS',
-                description        : 'Runs PerfKit tests for beam_PerformanceTests_ParquetIOIT on HDFS',
+                description        : 'Runs performance tests for ParquetIOIT on HDFS',
                 test               : 'org.apache.beam.sdk.io.parquet.ParquetIOIT',
                 githubTitle        : 'Java ParquetIOPerformance Test on HDFS',
                 githubTriggerPhrase: 'Run Java ParquetIO Performance Test HDFS',
                 pipelineOptions    : [
-                        bigQueryDataset: 'beam_performance',
-                        bigQueryTable  : 'parquetioit_hdfs_results',
-                        numberOfRecords: '1000000'
+                        bigQueryDataset     : 'beam_performance',
+                        bigQueryTable       : 'parquetioit_hdfs_results',
+                        numberOfRecords     : '1000000',
+                        numWorkers          : '5',
+                        autoscalingAlgorithm: 'NONE'
                 ]
         ]
 ]
diff --git a/.test-infra/jenkins/job_PerformanceTests_HadoopFormat.groovy b/.test-infra/jenkins/job_PerformanceTests_HadoopFormat.groovy
index 795e3f4..0311859 100644
--- a/.test-infra/jenkins/job_PerformanceTests_HadoopFormat.groovy
+++ b/.test-infra/jenkins/job_PerformanceTests_HadoopFormat.groovy
@@ -37,18 +37,20 @@
   k8s.loadBalancerIP("postgres-for-dev", postgresHostName)
 
   Map pipelineOptions = [
-          tempRoot            : 'gs://temp-storage-for-perf-tests',
-          project             : 'apache-beam-testing',
-          runner              : 'DataflowRunner',
-          numberOfRecords     : '600000',
-          bigQueryDataset     : 'beam_performance',
-          bigQueryTable       : 'hadoopformatioit_results',
-          postgresUsername    : 'postgres',
-          postgresPassword    : 'uuinkks',
-          postgresDatabaseName: 'postgres',
-          postgresServerName  : "\$${postgresHostName}",
-          postgresSsl         : false,
-          postgresPort        : '5432',
+          tempRoot             : 'gs://temp-storage-for-perf-tests',
+          project              : 'apache-beam-testing',
+          runner               : 'DataflowRunner',
+          numberOfRecords      : '600000',
+          bigQueryDataset      : 'beam_performance',
+          bigQueryTable        : 'hadoopformatioit_results',
+          postgresUsername     : 'postgres',
+          postgresPassword     : 'uuinkks',
+          postgresDatabaseName : 'postgres',
+          postgresServerName   : "\$${postgresHostName}",
+          postgresSsl          : false,
+          postgresPort         : '5432',
+          numWorkers           : '5',
+          autoscalingAlgorithm : 'NONE'
   ]
 
   steps {
diff --git a/.test-infra/jenkins/job_PerformanceTests_JDBC.groovy b/.test-infra/jenkins/job_PerformanceTests_JDBC.groovy
index 1bb7e0b..abc6b98 100644
--- a/.test-infra/jenkins/job_PerformanceTests_JDBC.groovy
+++ b/.test-infra/jenkins/job_PerformanceTests_JDBC.groovy
@@ -38,18 +38,20 @@
   k8s.loadBalancerIP("postgres-for-dev", postgresHostName)
 
   Map pipelineOptions = [
-          tempRoot            : 'gs://temp-storage-for-perf-tests',
-          project             : 'apache-beam-testing',
-          runner              : 'DataflowRunner',
-          numberOfRecords     : '5000000',
-          bigQueryDataset     : 'beam_performance',
-          bigQueryTable       : 'jdbcioit_results',
-          postgresUsername    : 'postgres',
-          postgresPassword    : 'uuinkks',
-          postgresDatabaseName: 'postgres',
-          postgresServerName  : "\$${postgresHostName}",
-          postgresSsl         : false,
-          postgresPort        : '5432'
+          tempRoot             : 'gs://temp-storage-for-perf-tests',
+          project              : 'apache-beam-testing',
+          runner               : 'DataflowRunner',
+          numberOfRecords      : '5000000',
+          bigQueryDataset      : 'beam_performance',
+          bigQueryTable        : 'jdbcioit_results',
+          postgresUsername     : 'postgres',
+          postgresPassword     : 'uuinkks',
+          postgresDatabaseName : 'postgres',
+          postgresServerName   : "\$${postgresHostName}",
+          postgresSsl          : false,
+          postgresPort         : '5432',
+          autoscalingAlgorithm : 'NONE',
+          numWorkers           : '5'
   ]
 
   steps {
diff --git a/.test-infra/jenkins/job_PerformanceTests_MongoDBIO_IT.groovy b/.test-infra/jenkins/job_PerformanceTests_MongoDBIO_IT.groovy
index e9f379a..83e1199 100644
--- a/.test-infra/jenkins/job_PerformanceTests_MongoDBIO_IT.groovy
+++ b/.test-infra/jenkins/job_PerformanceTests_MongoDBIO_IT.groovy
@@ -37,15 +37,17 @@
   k8s.loadBalancerIP("mongo-load-balancer-service", mongoHostName)
 
   Map pipelineOptions = [
-          tempRoot       : 'gs://temp-storage-for-perf-tests',
-          project        : 'apache-beam-testing',
-          numberOfRecords: '10000000',
-          bigQueryDataset: 'beam_performance',
-          bigQueryTable  : 'mongodbioit_results',
-          mongoDBDatabaseName: 'beam',
-          mongoDBHostName: "\$${mongoHostName}",
-          mongoDBPort: 27017,
-          runner: 'DataflowRunner'
+          tempRoot            : 'gs://temp-storage-for-perf-tests',
+          project             : 'apache-beam-testing',
+          numberOfRecords     : '10000000',
+          bigQueryDataset     : 'beam_performance',
+          bigQueryTable       : 'mongodbioit_results',
+          mongoDBDatabaseName : 'beam',
+          mongoDBHostName     : "\$${mongoHostName}",
+          mongoDBPort         : 27017,
+          runner              : 'DataflowRunner',
+          autoscalingAlgorithm: 'NONE',
+          numWorkers          : '5'
   ]
 
   steps {
diff --git a/.test-infra/jenkins/job_PerformanceTests_Python.groovy b/.test-infra/jenkins/job_PerformanceTests_Python.groovy
index 131417f..e10fd41 100644
--- a/.test-infra/jenkins/job_PerformanceTests_Python.groovy
+++ b/.test-infra/jenkins/job_PerformanceTests_Python.groovy
@@ -33,7 +33,7 @@
   // A benchmark defined flag, will pass to benchmark as "--bigqueryTable"
   String resultTable
   // A benchmark defined flag, will pass to benchmark as "--beam_it_class"
-  String itClass
+  String test
   // A benchmark defined flag, will pass to benchmark as "--beam_it_module".
   // It's a Gradle project that defines 'integrationTest' task. This task is executed by Perfkit
   // Beam benchmark launcher and can be added by enablePythonPerformanceTest() defined in
@@ -65,7 +65,7 @@
         jobDescription    : 'Python SDK Performance Test - Run WordCountIT in Py27 with 1Gb files',
         jobTriggerPhrase  : 'Run Python27 WordCountIT Performance Test',
         resultTable       : 'beam_performance.wordcount_py27_pkb_results',
-        itClass           : 'apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it',
+        test              : 'apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it',
         itModule          : ':sdks:python:test-suites:dataflow:py2',
         extraPipelineArgs : dataflowPipelineArgs + [
             input: 'gs://apache-beam-samples/input_small_files/ascii_sort_1MB_input.0000*', // 1Gb
@@ -80,7 +80,7 @@
         jobDescription    : 'Python SDK Performance Test - Run WordCountIT in Py35 with 1Gb files',
         jobTriggerPhrase  : 'Run Python35 WordCountIT Performance Test',
         resultTable       : 'beam_performance.wordcount_py35_pkb_results',
-        itClass           : 'apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it',
+        test              : 'apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it',
         itModule          : ':sdks:python:test-suites:dataflow:py35',
         extraPipelineArgs : dataflowPipelineArgs + [
             input: 'gs://apache-beam-samples/input_small_files/ascii_sort_1MB_input.0000*', // 1Gb
@@ -95,7 +95,7 @@
         jobDescription    : 'Python SDK Performance Test - Run WordCountIT in Py36 with 1Gb files',
         jobTriggerPhrase  : 'Run Python36 WordCountIT Performance Test',
         resultTable       : 'beam_performance.wordcount_py36_pkb_results',
-        itClass           : 'apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it',
+        test              : 'apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it',
         itModule          : ':sdks:python:test-suites:dataflow:py36',
         extraPipelineArgs : dataflowPipelineArgs + [
             input: 'gs://apache-beam-samples/input_small_files/ascii_sort_1MB_input.0000*', // 1Gb
@@ -110,7 +110,7 @@
         jobDescription    : 'Python SDK Performance Test - Run WordCountIT in Py37 with 1Gb files',
         jobTriggerPhrase  : 'Run Python37 WordCountIT Performance Test',
         resultTable       : 'beam_performance.wordcount_py37_pkb_results',
-        itClass           : 'apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it',
+        test              : 'apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it',
         itModule          : ':sdks:python:test-suites:dataflow:py37',
         extraPipelineArgs : dataflowPipelineArgs + [
             input: 'gs://apache-beam-samples/input_small_files/ascii_sort_1MB_input.0000*', // 1Gb
@@ -149,7 +149,7 @@
         beam_sdk                : 'python',
         benchmarks              : testConfig.benchmarkName,
         bigquery_table          : testConfig.resultTable,
-        beam_it_class           : testConfig.itClass,
+        beam_it_class           : testConfig.test,
         beam_it_module          : testConfig.itModule,
         beam_prebuilt           : 'true',   // Python benchmark don't need to prebuild repo before running
         beam_python_sdk_location: testConfig.pythonSdkLocation,
diff --git a/.test-infra/jenkins/job_PostCommit_Java_Nexmark_Flink.groovy b/.test-infra/jenkins/job_PostCommit_Java_Nexmark_Flink.groovy
index 065f74d..038d6b3 100644
--- a/.test-infra/jenkins/job_PostCommit_Java_Nexmark_Flink.groovy
+++ b/.test-infra/jenkins/job_PostCommit_Java_Nexmark_Flink.groovy
@@ -40,7 +40,7 @@
       rootBuildScriptDir(commonJobProperties.checkoutDir)
       tasks(':sdks:java:testing:nexmark:run')
       commonJobProperties.setGradleSwitches(delegate)
-      switches('-Pnexmark.runner=":runners:flink:1.5"' +
+      switches('-Pnexmark.runner=":runners:flink:1.8"' +
               ' -Pnexmark.args="' +
               [NexmarkBigqueryProperties.nexmarkBigQueryArgs,
               '--streaming=false',
@@ -55,7 +55,7 @@
       rootBuildScriptDir(commonJobProperties.checkoutDir)
       tasks(':sdks:java:testing:nexmark:run')
       commonJobProperties.setGradleSwitches(delegate)
-      switches('-Pnexmark.runner=":runners:flink:1.5"' +
+      switches('-Pnexmark.runner=":runners:flink:1.8"' +
               ' -Pnexmark.args="' +
               [NexmarkBigqueryProperties.nexmarkBigQueryArgs,
               '--streaming=true',
@@ -70,7 +70,7 @@
       rootBuildScriptDir(commonJobProperties.checkoutDir)
       tasks(':sdks:java:testing:nexmark:run')
       commonJobProperties.setGradleSwitches(delegate)
-      switches('-Pnexmark.runner=":runners:flink:1.5"' +
+      switches('-Pnexmark.runner=":runners:flink:1.8"' +
               ' -Pnexmark.args="' +
               [NexmarkBigqueryProperties.nexmarkBigQueryArgs,
               '--queryLanguage=sql',
@@ -85,7 +85,7 @@
       rootBuildScriptDir(commonJobProperties.checkoutDir)
       tasks(':sdks:java:testing:nexmark:run')
       commonJobProperties.setGradleSwitches(delegate)
-      switches('-Pnexmark.runner=":runners:flink:1.5"' +
+      switches('-Pnexmark.runner=":runners:flink:1.8"' +
               ' -Pnexmark.args="' +
               [NexmarkBigqueryProperties.nexmarkBigQueryArgs,
               '--queryLanguage=sql',
diff --git a/.test-infra/jenkins/job_PostCommit_Java_PortableValidatesRunner_Flink_Batch.groovy b/.test-infra/jenkins/job_PostCommit_Java_PortableValidatesRunner_Flink_Batch.groovy
index b8a59b3..4da75f9 100644
--- a/.test-infra/jenkins/job_PostCommit_Java_PortableValidatesRunner_Flink_Batch.groovy
+++ b/.test-infra/jenkins/job_PostCommit_Java_PortableValidatesRunner_Flink_Batch.groovy
@@ -36,7 +36,7 @@
   steps {
     gradle {
       rootBuildScriptDir(commonJobProperties.checkoutDir)
-      tasks(':runners:flink:1.5:job-server:validatesPortableRunnerBatch')
+      tasks(':runners:flink:1.8:job-server:validatesPortableRunnerBatch')
       commonJobProperties.setGradleSwitches(delegate)
     }
   }
diff --git a/.test-infra/jenkins/job_PostCommit_Java_PortableValidatesRunner_Flink_Streaming.groovy b/.test-infra/jenkins/job_PostCommit_Java_PortableValidatesRunner_Flink_Streaming.groovy
index 6a48e31..612c154 100644
--- a/.test-infra/jenkins/job_PostCommit_Java_PortableValidatesRunner_Flink_Streaming.groovy
+++ b/.test-infra/jenkins/job_PostCommit_Java_PortableValidatesRunner_Flink_Streaming.groovy
@@ -36,7 +36,7 @@
   steps {
     gradle {
       rootBuildScriptDir(commonJobProperties.checkoutDir)
-      tasks(':runners:flink:1.5:job-server:validatesPortableRunnerStreaming')
+      tasks(':runners:flink:1.8:job-server:validatesPortableRunnerStreaming')
       commonJobProperties.setGradleSwitches(delegate)
     }
   }
diff --git a/.test-infra/jenkins/job_PostCommit_Java_ValidatesRunner_Flink.groovy b/.test-infra/jenkins/job_PostCommit_Java_ValidatesRunner_Flink.groovy
index 2c8b212..d5e6da9 100644
--- a/.test-infra/jenkins/job_PostCommit_Java_ValidatesRunner_Flink.groovy
+++ b/.test-infra/jenkins/job_PostCommit_Java_ValidatesRunner_Flink.groovy
@@ -37,7 +37,7 @@
   steps {
     gradle {
       rootBuildScriptDir(commonJobProperties.checkoutDir)
-      tasks(':runners:flink:1.5:validatesRunner')
+      tasks(':runners:flink:1.8:validatesRunner')
       commonJobProperties.setGradleSwitches(delegate)
     }
   }
diff --git a/.test-infra/jenkins/job_PostCommit_Python35_ValidatesRunner_Flink.groovy b/.test-infra/jenkins/job_PostCommit_Python35_ValidatesRunner_Flink.groovy
new file mode 100644
index 0000000..8056d9c
--- /dev/null
+++ b/.test-infra/jenkins/job_PostCommit_Python35_ValidatesRunner_Flink.groovy
@@ -0,0 +1,38 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+import CommonJobProperties as commonJobProperties
+import PostcommitJobBuilder
+
+// This job runs the suite of Python ValidatesRunner tests against the Flink runner on Python 3.5.
+PostcommitJobBuilder.postCommitJob('beam_PostCommit_Python35_VR_Flink',
+  'Run Python 3.5 Flink ValidatesRunner', 'Run Python 3.5 Flink ValidatesRunner', this) {
+  description('Runs the Python 3.5 ValidatesRunner suite on the Flink runner.')
+
+  // Set common parameters.
+  commonJobProperties.setTopLevelMainJobProperties(delegate)
+
+  // Gradle goals for this job.
+  steps {
+    gradle {
+      rootBuildScriptDir(commonJobProperties.checkoutDir)
+      tasks(':sdks:python:test-suites:portable:py35:flinkValidatesRunner')
+      commonJobProperties.setGradleSwitches(delegate)
+    }
+  }
+}
diff --git a/.test-infra/jenkins/job_PostCommit_Python_MongoDBIO_IT.groovy b/.test-infra/jenkins/job_PostCommit_Python_MongoDBIO_IT.groovy
index fdf3caa..175ad68 100644
--- a/.test-infra/jenkins/job_PostCommit_Python_MongoDBIO_IT.groovy
+++ b/.test-infra/jenkins/job_PostCommit_Python_MongoDBIO_IT.groovy
@@ -32,6 +32,7 @@
     gradle {
       rootBuildScriptDir(commonJobProperties.checkoutDir)
       tasks(':sdks:python:test-suites:direct:py2:mongodbioIT')
+      tasks(':sdks:python:test-suites:direct:py35:mongodbioIT')
       commonJobProperties.setGradleSwitches(delegate)
     }
   }
diff --git a/.test-infra/jenkins/job_PreCommit_Python_ValidatesRunner_Flink.groovy b/.test-infra/jenkins/job_PreCommit_Python_ValidatesRunner_Flink.groovy
index eb34f1e..2812681 100644
--- a/.test-infra/jenkins/job_PreCommit_Python_ValidatesRunner_Flink.groovy
+++ b/.test-infra/jenkins/job_PreCommit_Python_ValidatesRunner_Flink.groovy
@@ -1,4 +1,3 @@
-
 /*
  * Licensed to the Apache Software Foundation (ASF) under one
  * or more contributor license agreements.  See the NOTICE file
@@ -19,10 +18,10 @@
 
 import PrecommitJobBuilder
 
-// This job runs the suite of ValidatesRunner tests against the Flink runner.
+// This job runs the suite of Python ValidatesRunner tests against the Flink runner on Python 2.
 PrecommitJobBuilder builder = new PrecommitJobBuilder(
     scope: this,
-    nameBase: 'Python_PVR_Flink',
+    nameBase: 'Python2_PVR_Flink',
     gradleTask: ':sdks:python:test-suites:portable:py2:flinkValidatesRunner',
     triggerPathPatterns: [
       '^model/.*$',
@@ -39,5 +38,5 @@
     ]
 )
 builder.build {
-    previousNames('beam_PostCommit_Python_VR_Flink')
+    previousNames('beam_PreCommit_Python_PVR_Flink')
 }
diff --git a/.test-infra/metrics/README.md b/.test-infra/metrics/README.md
index a029e63..5503d99 100644
--- a/.test-infra/metrics/README.md
+++ b/.test-infra/metrics/README.md
@@ -17,12 +17,14 @@
     under the License.
 -->
 # BeamMonitoring
-This folder contains resources required to deploy the Beam community metrics
-stack.
+This folder contains resources required to deploy the Beam metrics stack.
+There are two types of metrics in the Beam project:
+* Community metrics
+* Metrics published by tests (IO performance tests, load tests and Nexmark tests)
 
-[Beam community dashboard is available here.](https://s.apache.org/beam-community-metrics)
+Both types of metrics are presented in the [Grafana dashboard available here](https://s.apache.org/beam-community-metrics).
 
-Whole stack can be deployed on your local machine as well.
+## Community metrics
 
 This includes
 * Python scripts for ingesting data from sources (Jenkins, JIRA,
@@ -30,6 +32,15 @@
 * Postgres analytics database
 * [Grafana](https://grafana.com) dashboarding UI
 
+## Test metrics
+Beam uses Prometheus to store metrics published by tests running on Jenkins.
+
+The Prometheus stack consists of the following components:
+* the main Prometheus server
+* Alertmanager
+* Pushgateway
+
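+For example, once the stack is running locally, a test metric can be pushed to the
+Pushgateway with a plain HTTP request (the metric name below is only an illustration;
+actual names are defined by the individual test jobs):
+
+```
+echo "textioit_results_runtime 123" | curl --data-binary @- http://localhost:9091/metrics/job/beam
+```
+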
+Both stacks can be deployed on your local machine.
 All components run within Docker containers. These are composed together via
 docker-compose for local hosting, and Kubernetes for the production instance on
 GCP.
@@ -90,17 +101,21 @@
 machine:
 
 * Grafana: http://localhost:3000
-* Postgres DB: localhost:5432
+* Postgres DB: http://localhost:5432
+* Prometheus: http://localhost:9090
+* Pushgateway: http://localhost:9091
+* Alertmanager: http://localhost:9093
 
 If you're deploying for the first time on your machine, follow the wiki instructions
 on how to manually [configure
 Grafana](https://cwiki.apache.org/confluence/display/BEAM/Community+Metrics#CommunityMetrics-GrafanaUI).
 
-Grafana and Postgres containers persist data to Docker volumes, which will be
+Grafana, Postgres and Prometheus containers persist data to Docker volumes, which will be
 restored on subsequent runs. To start from a clean state, you must also wipe out
 these volumes. (List volumes via `docker volume ls`)
 
 ## Kubernetes setup
 
-Kubernetes deployment instructions are maintained in the
-[wiki](https://cwiki.apache.org/confluence/display/BEAM/Community+Metrics).
+Kubernetes deployment instructions are maintained in the wiki:
+* [Community metrics](https://cwiki.apache.org/confluence/display/BEAM/Community+Metrics)
+* [Test metrics]() <!-- TODO(BEAM-8130): add a link to instructions -->
diff --git a/.test-infra/metrics/apply_configmaps.sh b/.test-infra/metrics/apply_configmaps.sh
new file mode 100755
index 0000000..2094ad4
--- /dev/null
+++ b/.test-infra/metrics/apply_configmaps.sh
@@ -0,0 +1,26 @@
+#!/usr/bin/env bash
+
+#    Licensed to the Apache Software Foundation (ASF) under one or more
+#    contributor license agreements.  See the NOTICE file distributed with
+#    this work for additional information regarding copyright ownership.
+#    The ASF licenses this file to You under the Apache License, Version 2.0
+#    (the "License"); you may not use this file except in compliance with
+#    the License.  You may obtain a copy of the License at
+#
+#       http://www.apache.org/licenses/LICENSE-2.0
+#
+#    Unless required by applicable law or agreed to in writing, software
+#    distributed under the License is distributed on an "AS IS" BASIS,
+#    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#    See the License for the specific language governing permissions and
+#    limitations under the License.
+#
+#    Creates config maps used by Prometheus deployment and deletes old ones.
+
+set -euxo pipefail
+
+kubectl delete configmap prometheus-config --ignore-not-found=true
+kubectl delete configmap alertmanager-config --ignore-not-found=true
+
+kubectl create configmap prometheus-config --from-file=prometheus/prometheus/config
+kubectl create configmap alertmanager-config --from-file=prometheus/alertmanager/config
diff --git a/.test-infra/metrics/beamprometheus-deploy.yaml b/.test-infra/metrics/beamprometheus-deploy.yaml
new file mode 100644
index 0000000..40df19d
--- /dev/null
+++ b/.test-infra/metrics/beamprometheus-deploy.yaml
@@ -0,0 +1,125 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+
+apiVersion: extensions/v1beta1
+kind: Deployment
+metadata:
+  name: prometheus
+  labels:
+    app: prometheus
+spec:
+  replicas: 1
+  template:
+    metadata:
+      labels:
+        app: prometheus
+    spec:
+      containers:
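+      # A single pod runs the three test-metrics components:
+      # Pushgateway (port 9091), the Prometheus server (9090) and Alertmanager (9093).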
+      - image: prom/pushgateway
+        name: pushgateway
+        ports:
+        - containerPort: 9091
+      - image: prom/prometheus
+        name: prometheus
+        securityContext:
+          runAsUser: 0
+        ports:
+          - containerPort: 9090
+        args:
+          - --config.file=/etc/prometheus/prometheus.yml
+          - --web.console.libraries=/etc/prometheus/console_libraries
+          - --web.console.templates=/etc/prometheus/consoles
+          - --storage.tsdb.path=/prometheus
+          - --storage.tsdb.retention.time=365d
+        volumeMounts:
+          - mountPath: /prometheus
+            name: prometheus-storage
+          - mountPath: /etc/prometheus
+            name: prometheus-config
+            readOnly: true
+      - image: prom/alertmanager
+        name: alertmanager
+        ports:
+          - containerPort: 9093
+        volumeMounts:
+          - mountPath: /etc/alertmanager
+            name: alertmanager-config
+            readOnly: true
+      restartPolicy: Always
+      volumes:
+        - name: prometheus-storage
+          persistentVolumeClaim:
+            claimName: prometheus-storage
+        - name: prometheus-config
+          configMap:
+            name: prometheus-config
+        - name: alertmanager-config
+          configMap:
+            name: alertmanager-config
+---
+apiVersion: v1
+kind: Service
+metadata:
+  name: prometheus
+  labels:
+    app: prometheus
+spec:
+  ports:
+  - port: 9090
+    targetPort: 9090
+  selector:
+    app: prometheus
+---
+apiVersion: v1
+kind: Service
+metadata:
+  name: pushgateway
+  labels:
+    app: prometheus
+spec:
+  type: NodePort
+  ports:
+  - port: 9091
+    targetPort: 9091
+    nodePort: 30000
+  selector:
+    app: prometheus
+---
+apiVersion: v1
+kind: Service
+metadata:
+  name: alertmanager
+  labels:
+    app: prometheus
+spec:
+  ports:
+  - port: 9093
+    targetPort: 9093
+  selector:
+    app: prometheus
+---
+apiVersion: v1
+kind: PersistentVolumeClaim
+metadata:
+  name: prometheus-storage
+spec:
+  accessModes:
+  - ReadWriteOnce
+  resources:
+    requests:
+      storage: 10Gi
diff --git a/.test-infra/metrics/docker-compose.yml b/.test-infra/metrics/docker-compose.yml
index f2818cd..3ec1954 100644
--- a/.test-infra/metrics/docker-compose.yml
+++ b/.test-infra/metrics/docker-compose.yml
@@ -86,9 +86,35 @@
       - DB_DBNAME=beam_metrics
       - DB_DBUSERNAME=admin
       - DB_DBPWD=<PGPasswordHere>
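+  # Test-metrics stack for local deployment: Prometheus server, Pushgateway
+  # and Alertmanager, using the configuration files under ./prometheus.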
+  prometheus:
+    image: prom/prometheus
+    ports:
+      - 9090:9090
+    container_name: prometheus
+    volumes:
+      - ./prometheus/prometheus/config:/etc/prometheus:ro
+      - prometheus-storage:/prometheus
+    command:
+      - --config.file=/etc/prometheus/prometheus.yml
+      - --web.console.libraries=/etc/prometheus/console_libraries
+      - --web.console.templates=/etc/prometheus/consoles
+      - --storage.tsdb.path=/prometheus
+      - --storage.tsdb.retention.time=365d
+  pushgateway:
+    image: prom/pushgateway
+    container_name: pushgateway
+    ports:
+      - 9091:9091
+  alertmanager:
+    image: prom/alertmanager
+    container_name: alertmanager
+    ports:
+      - 9093:9093
+    volumes:
+      - ./prometheus/alertmanager/config:/etc/alertmanager:ro
 volumes:
   beam-postgresql-data:
   beam-grafana-libdata:
   beam-grafana-etcdata:
   beam-grafana-logdata:
-
+  prometheus-storage:
diff --git a/.test-infra/metrics/prometheus/alertmanager/config/alertmanager.yml b/.test-infra/metrics/prometheus/alertmanager/config/alertmanager.yml
new file mode 100644
index 0000000..fe0b677
--- /dev/null
+++ b/.test-infra/metrics/prometheus/alertmanager/config/alertmanager.yml
@@ -0,0 +1,41 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+
+# A configuration file for Alertmanager.
+# Can be reloaded at runtime by sending a SIGHUP signal to the Alertmanager
+# process or by sending an HTTP POST request to the /reload endpoint.
+
+global:
+  resolve_timeout: 7d
+
+route:
+  receiver: 'default'
+  group_by: ['alertname']
+  group_wait: 0s
+  group_interval: 3d
+  repeat_interval: 3d
+  routes:
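+    # Route alerts whose "job" label matches "beam" to the emails-and-slack
+    # receiver, grouping them by the "test" label.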
+    - match_re:
+        job: 'beam'
+      receiver: 'emails-and-slack'
+      group_by: ['test']
+
+receivers:
+  - name: 'default'
+  # TODO: Add details about emails-and-slack receiver
+  - name: 'emails-and-slack'
diff --git a/sdks/java/extensions/sql/src/main/resources/org.apache.beam.sdks.java.extensions.sql.repackaged.org.codehaus.commons.compiler.properties b/.test-infra/metrics/prometheus/prometheus/config/prometheus.yml
similarity index 66%
copy from sdks/java/extensions/sql/src/main/resources/org.apache.beam.sdks.java.extensions.sql.repackaged.org.codehaus.commons.compiler.properties
copy to .test-infra/metrics/prometheus/prometheus/config/prometheus.yml
index 72a4eec..c0e5b61 100644
--- a/sdks/java/extensions/sql/src/main/resources/org.apache.beam.sdks.java.extensions.sql.repackaged.org.codehaus.commons.compiler.properties
+++ b/.test-infra/metrics/prometheus/prometheus/config/prometheus.yml
@@ -15,4 +15,26 @@
 #  See the License for the specific language governing permissions and
 # limitations under the License.
 ################################################################################
-compilerFactory=org.apache.beam.sdks.java.extensions.sql.repackaged.org.codehaus.janino.CompilerFactory
+
+# A configuration file for the main Prometheus server.
+# Can be reloaded at runtime by sending a SIGHUP signal to the Prometheus
+# process.
+
+global:
+  scrape_interval:     6h
+  evaluation_interval: 1m
+
+rule_files:
+   - 'rules.yml'
+
+scrape_configs:
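+  # Test metrics are pushed by Jenkins jobs to the Pushgateway and scraped from
+  # there; honor_labels keeps the labels set by the pushing job.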
+  - job_name: 'beam'
+    honor_labels: true
+    honor_timestamps: true
+    static_configs:
+      - targets: ['pushgateway:9091']
+
+alerting:
+  alertmanagers:
+  - static_configs:
+    - targets: ['alertmanager:9093']
diff --git a/.test-infra/metrics/prometheus/prometheus/config/rules.yml b/.test-infra/metrics/prometheus/prometheus/config/rules.yml
new file mode 100644
index 0000000..45bcfa4
--- /dev/null
+++ b/.test-infra/metrics/prometheus/prometheus/config/rules.yml
@@ -0,0 +1,35 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+
+# Defines alerting rules used by Prometheus to detect anomalous behaviours
+# among test results.
+# Can be reloaded at runtime by sending a SIGHUP signal to the Prometheus
+# process.
+
+groups:
+- name: beamTests
+  rules:
+  - alert: TestRegression
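+    # Compares the 1-day average of each metric pushed under job="beam"
+    # (excluding push_time_seconds) with its average over the six preceding days
+    # and fires when the relative increase exceeds 20%.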
+    expr: ((avg_over_time({job="beam",instance="",__name__!="push_time_seconds"}[1d])
+      - avg_over_time({job="beam",instance="",__name__!="push_time_seconds"}[6d] offset 1d))
+      / avg_over_time({job="beam",instance="",__name__!="push_time_seconds"}[6d] offset 1d))
+      > 0.2
+    labels:
+      job: beamAlert
+    annotations:
+      summary: 'Average runtime over the last 24 hours is more than 20% greater than the average over the six previous days'
diff --git a/README.md b/README.md
index 8c67ca5..8d7b9ee 100644
--- a/README.md
+++ b/README.md
@@ -36,7 +36,8 @@
 --- | --- | --- | --- | --- | --- | --- | ---
 Go | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Go/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Go/lastCompletedBuild/) | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Go_VR_Flink/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Go_VR_Flink/lastCompletedBuild/) | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Go_VR_Spark/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Go_VR_Spark/lastCompletedBuild/)
 Java | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Apex/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Apex/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Flink/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Flink/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Flink_Batch/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Flink_Batch/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Flink_Streaming/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Flink_Streaming/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Gearpump/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Gearpump/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Samza/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Samza/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/lastCompletedBuild/)
-Python | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Python2/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python2/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python35/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python35/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python36/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python36/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python37/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python37/lastCompletedBuild/) | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/lastCompletedBuild/) <br> [![Build Status](https://builds.apache.org/job/beam_PostCommit_Py_ValCont/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_ValCont/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Python_PVR_Flink_Cron/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Python_PVR_Flink_Cron/lastCompletedBuild/) | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/lastCompletedBuild/)
+Python | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Python2/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python2/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python35/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python35/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python36/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python36/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python37/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python37/lastCompletedBuild/) | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Py_ValCont/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_ValCont/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/lastCompletedBuild/) | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/lastCompletedBuild/)
+XLang | --- | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_XVR_Flink/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_XVR_Flink/lastCompletedBuild/) | --- | --- | ---
 
 ## Overview
 
diff --git a/build.gradle b/build.gradle
index 4d3b390..ba9a42d 100644
--- a/build.gradle
+++ b/build.gradle
@@ -33,10 +33,6 @@
   id "org.sonarqube" version "2.7"
 }
 
-// Add performanceTest task to this build.gradle file
-// so that running Performance tests using PerfKitBenchmarker is possible.
-createPerformanceTestHarness()
-
 /*************************************************************************************************/
 // Configure the root project
 
diff --git a/buildSrc/src/main/groovy/org/apache/beam/gradle/BeamModulePlugin.groovy b/buildSrc/src/main/groovy/org/apache/beam/gradle/BeamModulePlugin.groovy
index 3d5e2d6..f9299db 100644
--- a/buildSrc/src/main/groovy/org/apache/beam/gradle/BeamModulePlugin.groovy
+++ b/buildSrc/src/main/groovy/org/apache/beam/gradle/BeamModulePlugin.groovy
@@ -130,6 +130,14 @@
 
     /** Controls whether javadoc is exported for this project. */
     boolean exportJavadoc = true
+
+    /**
+     * Automatic-Module-Name header value to be set in the MANIFEST.MF file.
+     * This is a required parameter unless publishing to Maven is disabled for this project.
+     *
+     * @see <a href="https://github.com/GoogleCloudPlatform/cloud-opensource-java/blob/master/library-best-practices/JLBP-20.md">JLBP-20</a>
+     */
+    String automaticModuleName = null
   }
 
   /** A class defining the set of configurable properties accepted by applyPortabilityNature. */
@@ -144,6 +152,17 @@
 
     /** Override the default "beam-" + `dash separated path` archivesBaseName. */
     String archivesBaseName = null;
+
+    /** Controls whether this project is published to Maven. */
+    boolean publish = true
+
+    /**
+     * Automatic-Module-Name header value to be set in the MANIFEST.MF file.
+     * This is a required parameter unless publishing to Maven is disabled for this project.
+     *
+     * @see <a href="https://github.com/GoogleCloudPlatform/cloud-opensource-java/blob/master/library-best-practices/JLBP-20.md">JLBP-20</a>
+     */
+    String automaticModuleName
   }
 
   // A class defining the set of configurable properties for createJavaExamplesArchetypeValidationTask
@@ -171,72 +190,16 @@
 
   // Reads and contains all necessary performance test parameters
   class JavaPerformanceTestConfiguration {
-
-    /* Optional properties (set only if needed in your case): */
-
-    // Path to PerfKitBenchmarker application (pkb.py).
-    // It is only required when running Performance Tests with PerfKitBenchmarker
-    String pkbLocation = System.getProperty('pkbLocation')
-
-    // Data Processing Backend's log level.
-    String logLevel = System.getProperty('logLevel', 'INFO')
-
-    // Path to gradle binary.
-    String gradleBinary = System.getProperty('gradleBinary', './gradlew')
-
-    // If benchmark is official or not.
-    // Official benchmark results are meant to be displayed on PerfKitExplorer dashboards.
-    String isOfficial = System.getProperty('official', 'false')
-
-    // Specifies names of benchmarks to be run by PerfKitBenchmarker.
-    String benchmarks = System.getProperty('benchmarks', 'beam_integration_benchmark')
-
-    // If beam is not "prebuilt" then PerfKitBenchmarker runs the build task before running the tests.
-    String beamPrebuilt = System.getProperty('beamPrebuilt', 'true')
-
-    // Beam's sdk to be used by PerfKitBenchmarker.
-    String beamSdk = System.getProperty('beamSdk', 'java')
-
-    // Timeout (in seconds) after which PerfKitBenchmarker will stop executing the benchmark (and will fail).
-    String timeout = System.getProperty('itTimeout', '1200')
-
-    // Path to kubernetes configuration file.
-    String kubeconfig = System.getProperty('kubeconfig', System.getProperty('user.home') + '/.kube/config')
-
-    // Path to kubernetes executable.
-    String kubectl = System.getProperty('kubectl', 'kubectl')
-
-    // Paths to files with kubernetes infrastructure to setup before the test runs.
-    // PerfKitBenchmarker will have trouble reading 'null' path. It expects empty string if no scripts are expected.
-    String kubernetesScripts = System.getProperty('kubernetesScripts', '')
-
-    // Path to file with 'dynamic' and 'static' pipeline options.
-    // that will be appended by PerfKitBenchmarker to the test running command.
-    // PerfKitBenchmarker will have trouble reading 'null' path. It expects empty string if no config file is expected.
-    String optionsConfigFile = System.getProperty('beamITOptions', '')
-
-    // Any additional properties to be appended to benchmark execution command.
-    String extraProperties = System.getProperty('beamExtraProperties', '')
-
-    // Runner which will be used for running the tests. Possible values: dataflow/direct.
+    // Optional. Runner which will be used for running the tests. Possible values: dataflow/direct.
     // PerfKitBenchmarker will have trouble reading 'null' value. It expects empty string if no config file is expected.
     String runner = System.getProperty('integrationTestRunner', '')
 
-    // Filesystem which will be used for running the tests. Possible values: hdfs.
+    // Optional. Filesystem which will be used for running the tests. Possible values: hdfs.
     // if not specified runner's local filesystem will be used.
     String filesystem = System.getProperty('filesystem')
 
-    /* Always required properties: */
-
-    // Pipeline options to be used by the tested pipeline.
+    // Required. Pipeline options to be used by the tested pipeline.
     String integrationTestPipelineOptions = System.getProperty('integrationTestPipelineOptions')
-
-    // Fully qualified name of the test to be run, eg:
-    // 'org.apache.beam.sdks.java.io.jdbc.JdbcIOIT'.
-    String integrationTest = System.getProperty('integrationTest')
-
-    // Relative path to module where the test is, eg. 'sdks/java/io/jdbc.
-    String itModule = System.getProperty('itModule')
   }
 
   // Reads and contains all necessary performance test parameters
@@ -415,7 +378,7 @@
     def guava_version = "20.0"
     def hadoop_version = "2.7.3"
     def hamcrest_version = "2.1"
-    def jackson_version = "2.9.9"
+    def jackson_version = "2.9.10"
     def jaxb_api_version = "2.2.12"
     def kafka_version = "1.0.0"
     def nemo_version = "0.1"
@@ -520,7 +483,7 @@
         jackson_annotations                         : "com.fasterxml.jackson.core:jackson-annotations:$jackson_version",
         jackson_jaxb_annotations                    : "com.fasterxml.jackson.module:jackson-module-jaxb-annotations:$jackson_version",
         jackson_core                                : "com.fasterxml.jackson.core:jackson-core:$jackson_version",
-        jackson_databind                            : "com.fasterxml.jackson.core:jackson-databind:2.9.9.3",
+        jackson_databind                            : "com.fasterxml.jackson.core:jackson-databind:$jackson_version",
         jackson_dataformat_cbor                     : "com.fasterxml.jackson.dataformat:jackson-dataformat-cbor:$jackson_version",
         jackson_dataformat_yaml                     : "com.fasterxml.jackson.dataformat:jackson-dataformat-yaml:$jackson_version",
         jackson_datatype_joda                       : "com.fasterxml.jackson.datatype:jackson-datatype-joda:$jackson_version",
@@ -560,6 +523,7 @@
         vendored_bytebuddy_1_9_3                    : "org.apache.beam:beam-vendor-bytebuddy-1_9_3:0.1",
         vendored_grpc_1_21_0                        : "org.apache.beam:beam-vendor-grpc-1_21_0:0.1",
         vendored_guava_26_0_jre                     : "org.apache.beam:beam-vendor-guava-26_0-jre:0.1",
+        vendored_calcite_1_20_0                     : "org.apache.beam:beam-vendor-calcite-1_20_0:0.1",
         woodstox_core_asl                           : "org.codehaus.woodstox:woodstox-core-asl:4.4.1",
         zstd_jni                                    : "com.github.luben:zstd-jni:1.3.8-3",
         quickcheck_core                             : "com.pholser:junit-quickcheck-core:$quickcheck_version",
@@ -917,6 +881,8 @@
       }
 
       project.jar {
+        setAutomaticModuleNameHeader(configuration, project)
+
         zip64 true
         into("META-INF/") {
           from "${project.rootProject.projectDir}/LICENSE"
@@ -1278,7 +1244,7 @@
     }
 
     // When applied in a module's build.gradle file, this closure provides task for running
-    // IO integration tests (manually, without PerfKitBenchmarker).
+    // IO integration tests.
     project.ext.enableJavaPerformanceTesting = {
 
       // Use the implicit it parameter of the closure to handle zero argument or one argument map calls.
@@ -1336,7 +1302,7 @@
         }
 
         if (runner?.equalsIgnoreCase('flink')) {
-          testRuntime it.project(path: ":runners:flink:1.5", configuration: 'testRuntime')
+          testRuntime it.project(path: ":runners:flink:1.8", configuration: 'testRuntime')
         }
 
         if (runner?.equalsIgnoreCase('spark')) {
@@ -1361,62 +1327,9 @@
           testRuntime it.project(path: ":sdks:java:io:amazon-web-services", configuration: 'testRuntime')
         }
       }
-
       project.task('packageIntegrationTests', type: Jar)
     }
 
-    // When applied in a module's build gradle file, this closure provides a task
-    // that will involve PerfKitBenchmarker for running integrationTests.
-    project.ext.createPerformanceTestHarness = {
-
-      // Use the implicit it parameter of the closure to handle zero argument or one argument map calls.
-      // See: http://groovy-lang.org/closures.html#implicit-it
-      JavaPerformanceTestConfiguration configuration = it ? it as JavaPerformanceTestConfiguration : new JavaPerformanceTestConfiguration()
-
-      // This task runs PerfKitBenchmarker, which does benchmarking of the IO ITs.
-      // The arguments passed to it allows it to invoke gradle again with the desired benchmark.
-      //
-      // To invoke this, run:
-      //
-      // ./gradlew performanceTest \
-      //  -DpkbLocation="<path to pkb.py>"
-      //  -DintegrationTestPipelineOptions='["--numberOfRecords=1000", "<more options>"]' \
-      //  -DintegrationTest=<io test, eg. org.apache.beam.sdk.io.text.TextIOIT> \
-      //  -DitModule=<directory containing desired test, eg. sdks/java/io/file-based-io-tests> \
-      //  -DintegrationTestRunner=<runner to be used for testing, eg. dataflow>
-      //
-      // There are more options with default values that can be tweaked if needed (see below).
-      project.task('performanceTest', type: Exec) {
-
-        // PerfKitBenchmarker needs to work in the Beam's root directory,
-        // otherwise it requires absolute paths ./gradlew, kubernetes scripts etc.
-        commandLine "${configuration.pkbLocation}",
-                "--dpb_log_level=${configuration.logLevel}",
-                "--gradle_binary=${configuration.gradleBinary}",
-                "--official=${configuration.isOfficial}",
-                "--benchmarks=${configuration.benchmarks}",
-                "--beam_location=${project.rootProject.projectDir}",
-
-                "--beam_prebuilt=${configuration.beamPrebuilt}",
-                "--beam_sdk=${configuration.beamSdk}",
-
-                "--beam_it_timeout=${configuration.timeout}",
-
-                "--kubeconfig=${configuration.kubeconfig}",
-                "--kubectl=${configuration.kubectl}",
-                "--beam_kubernetes_scripts=${configuration.kubernetesScripts}",
-
-                "--beam_it_options=${configuration.integrationTestPipelineOptions}",
-                "--beam_options_config_file=${configuration.optionsConfigFile}",
-
-                "--beam_it_class=${configuration.integrationTest}",
-                "--beam_it_module=${configuration.itModule}",
-
-                "--beam_extra_properties=${configuration.extraProperties}",
-                "--beam_runner=${configuration.runner}"
-      }
-    }
-
     /** ***********************************************************************************************/
 
     project.ext.applyGoNature = {
@@ -1566,7 +1479,9 @@
       project.ext.applyJavaNature(
               exportJavadoc: false,
               enableSpotbugs: false,
+              publish: configuration.publish,
               archivesBaseName: configuration.archivesBaseName,
+              automaticModuleName: configuration.automaticModuleName,
               shadowJarValidationExcludes: it.shadowJarValidationExcludes,
               shadowClosure: GrpcVendoring.shadowClosure() << {
                 // We perform all the code relocations but don't include
@@ -1808,7 +1723,7 @@
           dependsOn setupTask
           // We need flink-job-server-container dependency since Python PortableRunner automatically
           // brings the flink-job-server-container up when --job_endpoint is not specified.
-          dependsOn ':runners:flink:1.5:job-server-container:docker'
+          dependsOn ':runners:flink:1.8:job-server-container:docker'
         }
         mainTask.dependsOn pythonTask
         cleanupTask.mustRunAfter pythonTask
@@ -1990,7 +1905,7 @@
         project.task('portableWordCount' + (isStreaming ? 'Streaming' : 'Batch')) {
           dependsOn = ['installGcpTest']
           mustRunAfter = [
-            ':runners:flink:1.5:job-server-container:docker',
+            ':runners:flink:1.8:job-server-container:docker',
             ':sdks:python:container:py2:docker',
             ':sdks:python:container:py35:docker',
             ':sdks:python:container:py36:docker',
@@ -2045,4 +1960,14 @@
       }
     }
   }
+
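+  // Sets the Automatic-Module-Name manifest header when one is configured and
+  // fails the build if a published module does not declare it.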
+  private void setAutomaticModuleNameHeader(JavaNatureConfiguration configuration, Project project) {
+    if (configuration.publish && !configuration.automaticModuleName) {
+      throw new GradleException("Expected automaticModuleName to be set for a module that is published to a Maven repository.")
+    } else if (configuration.automaticModuleName) {
+      project.jar.manifest {
+        attributes 'Automatic-Module-Name': configuration.automaticModuleName
+      }
+    }
+  }
 }
diff --git a/examples/java/build.gradle b/examples/java/build.gradle
index 7b817bf..3936398 100644
--- a/examples/java/build.gradle
+++ b/examples/java/build.gradle
@@ -19,7 +19,7 @@
 import groovy.json.JsonOutput
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature(exportJavadoc: false)
+applyJavaNature(exportJavadoc: false, automaticModuleName: 'org.apache.beam.examples')
 provideIntegrationTestingDependencies()
 enableJavaPerformanceTesting()
 
@@ -78,7 +78,7 @@
   // https://issues.apache.org/jira/browse/BEAM-3583
   // apexRunnerPreCommit project(":runners:apex")
   directRunnerPreCommit project(path: ":runners:direct-java", configuration: "shadow")
-  flinkRunnerPreCommit project(":runners:flink:1.5")
+  flinkRunnerPreCommit project(":runners:flink:1.8")
   // TODO: Make the netty version used configurable, we add netty-all 4.1.17.Final so it appears on the classpath
   // before 4.1.8.Final defined by Apache Beam
   sparkRunnerPreCommit "io.netty:netty-all:4.1.17.Final"
diff --git a/examples/java/src/main/java/org/apache/beam/examples/complete/game/UserScore.java b/examples/java/src/main/java/org/apache/beam/examples/complete/game/UserScore.java
index db5b722..2938fb0 100644
--- a/examples/java/src/main/java/org/apache/beam/examples/complete/game/UserScore.java
+++ b/examples/java/src/main/java/org/apache/beam/examples/complete/game/UserScore.java
@@ -205,8 +205,11 @@
   public interface Options extends PipelineOptions {
 
     @Description("Path to the data file(s) containing game data.")
-    // The default maps to two large Google Cloud Storage files (each ~12GB) holding two subsequent
-    // day's worth (roughly) of data.
+    /* The default maps to two large Google Cloud Storage files (each ~12GB) holding roughly two
+    subsequent days' worth of data.
+
+    Note: to test locally and quickly, you may want to use the small sample dataset at
+    gs://apache-beam-samples/game/small/gaming_data.csv. You can also download it via the command line:
+    gsutil cp gs://apache-beam-samples/game/small/gaming_data.csv ./destination_folder/gaming_data.csv */
     @Default.String("gs://apache-beam-samples/game/gaming_data*.csv")
     String getInput();
 
diff --git a/examples/java/src/main/java/org/apache/beam/examples/snippets/Snippets.java b/examples/java/src/main/java/org/apache/beam/examples/snippets/Snippets.java
index af30620..e100166 100644
--- a/examples/java/src/main/java/org/apache/beam/examples/snippets/Snippets.java
+++ b/examples/java/src/main/java/org/apache/beam/examples/snippets/Snippets.java
@@ -725,7 +725,7 @@
       return new DynamicSessions(gapDuration);
     }
 
-    // [START CustomSessionWindow4]
+    // [END CustomSessionWindow4]
 
     @Override
     public void mergeWindows(MergeContext c) throws Exception {}
diff --git a/examples/kotlin/build.gradle b/examples/kotlin/build.gradle
index b0fa7f3..7b55870 100644
--- a/examples/kotlin/build.gradle
+++ b/examples/kotlin/build.gradle
@@ -22,7 +22,7 @@
     id 'org.jetbrains.kotlin.jvm' version '1.3.21'
 }
 
-applyJavaNature(exportJavadoc: false)
+applyJavaNature(exportJavadoc: false, automaticModuleName: 'org.apache.beam.examples.kotlin')
 provideIntegrationTestingDependencies()
 enableJavaPerformanceTesting()
 
@@ -81,7 +81,7 @@
   // https://issues.apache.org/jira/browse/BEAM-3583
   // apexRunnerPreCommit project(":runners:apex")
   directRunnerPreCommit project(path: ":runners:direct-java", configuration: "shadow")
-  flinkRunnerPreCommit project(":runners:flink:1.5")
+  flinkRunnerPreCommit project(":runners:flink:1.8")
   // TODO: Make the netty version used configurable, we add netty-all 4.1.17.Final so it appears on the classpath
   // before 4.1.8.Final defined by Apache Beam
   sparkRunnerPreCommit "io.netty:netty-all:4.1.17.Final"
diff --git a/examples/notebooks/documentation/transforms/python/element-wise/filter-py.ipynb b/examples/notebooks/documentation/transforms/python/element-wise/filter-py.ipynb
index 7ffe297..f302d83 100644
--- a/examples/notebooks/documentation/transforms/python/element-wise/filter-py.ipynb
+++ b/examples/notebooks/documentation/transforms/python/element-wise/filter-py.ipynb
@@ -58,14 +58,12 @@
     "localStorage.setItem('language', 'language-py')\n",
     "</script>\n",
     "\n",
-    "</p><table align=\"left\" style=\"margin-right:1em\">\n",
+    "<table align=\"left\" style=\"margin-right:1em\">\n",
     "  <td>\n",
-    "    <a class=\"button\" target=\"_blank\" href=\"https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.core.html#apache_beam.transforms.core.Filter\">\n",
-    "      <img src=\"https://beam.apache.org/images/logos/sdks/python.png\" width=\"32px\" height=\"32px\" alt=\"Pydoc\"/>\n",
-    "      Pydoc\n",
-    "    </a>\n",
+    "    <a class=\"button\" target=\"_blank\" href=\"https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.core.html#apache_beam.transforms.core.Filter\"><img src=\"https://beam.apache.org/images/logos/sdks/python.png\" width=\"32px\" height=\"32px\" alt=\"Pydoc\"/> Pydoc</a>\n",
     "  </td>\n",
     "</table>\n",
+    "\n",
     "<br/><br/><br/>\n",
     "\n",
     "Given a predicate, filter out all elements that don't satisfy that predicate.\n",
@@ -163,12 +161,10 @@
    "source": [
     "<table align=\"left\" style=\"margin-right:1em\">\n",
     "  <td>\n",
-    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py\">\n",
-    "      <img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/>\n",
-    "      View source code\n",
-    "    </a>\n",
+    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/> View source code</a>\n",
     "  </td>\n",
     "</table>\n",
+    "\n",
     "<br/><br/><br/>"
    ]
   },
@@ -217,12 +213,10 @@
    "source": [
     "<table align=\"left\" style=\"margin-right:1em\">\n",
     "  <td>\n",
-    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py\">\n",
-    "      <img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/>\n",
-    "      View source code\n",
-    "    </a>\n",
+    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/> View source code</a>\n",
     "  </td>\n",
     "</table>\n",
+    "\n",
     "<br/><br/><br/>"
    ]
   },
@@ -276,12 +270,10 @@
    "source": [
     "<table align=\"left\" style=\"margin-right:1em\">\n",
     "  <td>\n",
-    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py\">\n",
-    "      <img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/>\n",
-    "      View source code\n",
-    "    </a>\n",
+    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/> View source code</a>\n",
     "  </td>\n",
     "</table>\n",
+    "\n",
     "<br/><br/><br/>"
    ]
   },
@@ -338,12 +330,10 @@
    "source": [
     "<table align=\"left\" style=\"margin-right:1em\">\n",
     "  <td>\n",
-    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py\">\n",
-    "      <img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/>\n",
-    "      View source code\n",
-    "    </a>\n",
+    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/> View source code</a>\n",
     "  </td>\n",
     "</table>\n",
+    "\n",
     "<br/><br/><br/>"
    ]
   },
@@ -402,12 +392,10 @@
    "source": [
     "<table align=\"left\" style=\"margin-right:1em\">\n",
     "  <td>\n",
-    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py\">\n",
-    "      <img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/>\n",
-    "      View source code\n",
-    "    </a>\n",
+    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/> View source code</a>\n",
     "  </td>\n",
     "</table>\n",
+    "\n",
     "<br/><br/><br/>\n",
     "\n",
     "> **Note**: You can pass the `PCollection` as a *list* with `beam.pvalue.AsList(pcollection)`,\n",
@@ -470,12 +458,10 @@
    "source": [
     "<table align=\"left\" style=\"margin-right:1em\">\n",
     "  <td>\n",
-    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py\">\n",
-    "      <img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/>\n",
-    "      View source code\n",
-    "    </a>\n",
+    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/> View source code</a>\n",
     "  </td>\n",
     "</table>\n",
+    "\n",
     "<br/><br/><br/>"
    ]
   },
@@ -494,12 +480,10 @@
     "\n",
     "<table align=\"left\" style=\"margin-right:1em\">\n",
     "  <td>\n",
-    "    <a class=\"button\" target=\"_blank\" href=\"https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.core.html#apache_beam.transforms.core.Filter\">\n",
-    "      <img src=\"https://beam.apache.org/images/logos/sdks/python.png\" width=\"32px\" height=\"32px\" alt=\"Pydoc\"/>\n",
-    "      Pydoc\n",
-    "    </a>\n",
+    "    <a class=\"button\" target=\"_blank\" href=\"https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.core.html#apache_beam.transforms.core.Filter\"><img src=\"https://beam.apache.org/images/logos/sdks/python.png\" width=\"32px\" height=\"32px\" alt=\"Pydoc\"/> Pydoc</a>\n",
     "  </td>\n",
     "</table>\n",
+    "\n",
     "<br/><br/><br/>"
    ]
   },
diff --git a/examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb b/examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb
index 8d5433b..c6e1fb2 100644
--- a/examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb
+++ b/examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb
@@ -58,14 +58,12 @@
     "localStorage.setItem('language', 'language-py')\n",
     "</script>\n",
     "\n",
-    "</p><table align=\"left\" style=\"margin-right:1em\">\n",
+    "<table align=\"left\" style=\"margin-right:1em\">\n",
     "  <td>\n",
-    "    <a class=\"button\" target=\"_blank\" href=\"https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.core.html#apache_beam.transforms.core.FlatMap\">\n",
-    "      <img src=\"https://beam.apache.org/images/logos/sdks/python.png\" width=\"32px\" height=\"32px\" alt=\"Pydoc\"/>\n",
-    "      Pydoc\n",
-    "    </a>\n",
+    "    <a class=\"button\" target=\"_blank\" href=\"https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.core.html#apache_beam.transforms.core.FlatMap\"><img src=\"https://beam.apache.org/images/logos/sdks/python.png\" width=\"32px\" height=\"32px\" alt=\"Pydoc\"/> Pydoc</a>\n",
     "  </td>\n",
     "</table>\n",
+    "\n",
     "<br/><br/><br/>\n",
     "\n",
     "Applies a simple 1-to-many mapping function over each element in the collection.\n",
@@ -158,12 +156,10 @@
    "source": [
     "<table align=\"left\" style=\"margin-right:1em\">\n",
     "  <td>\n",
-    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py\">\n",
-    "      <img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/>\n",
-    "      View source code\n",
-    "    </a>\n",
+    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/> View source code</a>\n",
     "  </td>\n",
     "</table>\n",
+    "\n",
     "<br/><br/><br/>"
    ]
   },
@@ -211,12 +207,10 @@
    "source": [
     "<table align=\"left\" style=\"margin-right:1em\">\n",
     "  <td>\n",
-    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py\">\n",
-    "      <img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/>\n",
-    "      View source code\n",
-    "    </a>\n",
+    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/> View source code</a>\n",
     "  </td>\n",
     "</table>\n",
+    "\n",
     "<br/><br/><br/>"
    ]
   },
@@ -263,12 +257,10 @@
    "source": [
     "<table align=\"left\" style=\"margin-right:1em\">\n",
     "  <td>\n",
-    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py\">\n",
-    "      <img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/>\n",
-    "      View source code\n",
-    "    </a>\n",
+    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/> View source code</a>\n",
     "  </td>\n",
     "</table>\n",
+    "\n",
     "<br/><br/><br/>"
    ]
   },
@@ -319,12 +311,10 @@
    "source": [
     "<table align=\"left\" style=\"margin-right:1em\">\n",
     "  <td>\n",
-    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py\">\n",
-    "      <img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/>\n",
-    "      View source code\n",
-    "    </a>\n",
+    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/> View source code</a>\n",
     "  </td>\n",
     "</table>\n",
+    "\n",
     "<br/><br/><br/>"
    ]
   },
@@ -378,12 +368,10 @@
    "source": [
     "<table align=\"left\" style=\"margin-right:1em\">\n",
     "  <td>\n",
-    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py\">\n",
-    "      <img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/>\n",
-    "      View source code\n",
-    "    </a>\n",
+    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/> View source code</a>\n",
     "  </td>\n",
     "</table>\n",
+    "\n",
     "<br/><br/><br/>"
    ]
   },
@@ -434,12 +422,10 @@
    "source": [
     "<table align=\"left\" style=\"margin-right:1em\">\n",
     "  <td>\n",
-    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py\">\n",
-    "      <img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/>\n",
-    "      View source code\n",
-    "    </a>\n",
+    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/> View source code</a>\n",
     "  </td>\n",
     "</table>\n",
+    "\n",
     "<br/><br/><br/>"
    ]
   },
@@ -493,12 +479,10 @@
    "source": [
     "<table align=\"left\" style=\"margin-right:1em\">\n",
     "  <td>\n",
-    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py\">\n",
-    "      <img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/>\n",
-    "      View source code\n",
-    "    </a>\n",
+    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/> View source code</a>\n",
     "  </td>\n",
     "</table>\n",
+    "\n",
     "<br/><br/><br/>"
    ]
   },
@@ -562,12 +546,10 @@
    "source": [
     "<table align=\"left\" style=\"margin-right:1em\">\n",
     "  <td>\n",
-    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py\">\n",
-    "      <img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/>\n",
-    "      View source code\n",
-    "    </a>\n",
+    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/> View source code</a>\n",
     "  </td>\n",
     "</table>\n",
+    "\n",
     "<br/><br/><br/>\n",
     "\n",
     "> **Note**: You can pass the `PCollection` as a *list* with `beam.pvalue.AsList(pcollection)`,\n",
@@ -635,12 +617,10 @@
    "source": [
     "<table align=\"left\" style=\"margin-right:1em\">\n",
     "  <td>\n",
-    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py\">\n",
-    "      <img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/>\n",
-    "      View source code\n",
-    "    </a>\n",
+    "    <a class=\"button\" target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" width=\"32px\" height=\"32px\" alt=\"View source code\"/> View source code</a>\n",
     "  </td>\n",
     "</table>\n",
+    "\n",
     "<br/><br/><br/>"
    ]
   },
@@ -658,14 +638,12 @@
     "  operation, and includes other abilities such as multiple output collections and side-inputs.\n",
     "* [Map](https://beam.apache.org/documentation/transforms/python/elementwise/map) behaves the same, but produces exactly one output for each input.\n",
     "\n",
-    "<table>\n",
+    "<table align=\"left\" style=\"margin-right:1em\">\n",
     "  <td>\n",
-    "    <a class=\"button\" target=\"_blank\" href=\"https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.core.html#apache_beam.transforms.core.FlatMap\">\n",
-    "      <img src=\"https://beam.apache.org/images/logos/sdks/python.png\" width=\"32px\" height=\"32px\" alt=\"Pydoc\"/>\n",
-    "      Pydoc\n",
-    "    </a>\n",
+    "    <a class=\"button\" target=\"_blank\" href=\"https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.core.html#apache_beam.transforms.core.FlatMap\"><img src=\"https://beam.apache.org/images/logos/sdks/python.png\" width=\"32px\" height=\"32px\" alt=\"Pydoc\"/> Pydoc</a>\n",
     "  </td>\n",
     "</table>\n",
+    "\n",
     "<br/><br/><br/>"
    ]
   },
diff --git a/model/fn-execution/build.gradle b/model/fn-execution/build.gradle
index 82f4e81..42f1fe0 100644
--- a/model/fn-execution/build.gradle
+++ b/model/fn-execution/build.gradle
@@ -17,7 +17,10 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyPortabilityNature(shadowJarValidationExcludes: ["org/apache/beam/model/fnexecution/v1/**"])
+applyPortabilityNature(
+    automaticModuleName: 'org.apache.beam.model.fn.execution',
+    shadowJarValidationExcludes: ["org/apache/beam/model/fnexecution/v1/**"]
+)
 
 description = "Apache Beam :: Model :: Fn Execution"
 ext.summary = "Portable definitions for execution user-defined functions."
diff --git a/model/fn-execution/src/main/proto/beam_fn_api.proto b/model/fn-execution/src/main/proto/beam_fn_api.proto
index 04a59a0..ed2f013 100644
--- a/model/fn-execution/src/main/proto/beam_fn_api.proto
+++ b/model/fn-execution/src/main/proto/beam_fn_api.proto
@@ -45,16 +45,6 @@
 import "google/protobuf/wrappers.proto";
 import "metrics.proto";
 
-/*
- * Constructs that define the pipeline shape.
- *
- * These are mostly unstable due to the missing pieces to be shared with
- * the Runner Api like windowing strategy, display data, .... There are still
- * some modelling questions related to whether a side input is modelled
- * as another field on a PrimitiveTransform or as part of inputs and we
- * still are missing things like the CompositeTransform.
- */
-
 // A descriptor for connecting to a remote port using the Beam Fn Data API.
 // Allows for communication between two environments (for example between the
 // runner and the SDK).
@@ -180,8 +170,8 @@
 // https://docs.google.com/document/d/1tUDb45sStdR8u7-jBkGdw3OGFK7aa2-V7eo86zYSE_4/edit#heading=h.9g3g5weg2u9
 // for further details.
 message BundleApplication {
-  // (Required) The primitive transform to which to pass the element
-  string ptransform_id = 1;
+  // (Required) The transform to which to pass the element
+  string transform_id = 1;
 
   // (Required) Name of the transform's input to which to pass the element.
   string input_id = 2;
@@ -201,15 +191,12 @@
   // (Required) Whether this application potentially produces an unbounded
   // amount of data. Note that this should only be set to BOUNDED if and
   // only if the application is known to produce a finite amount of output.
-  //
-  // Note that this is different from the backlog as the backlog represents
-  // how much work there is currently outstanding.
   org.apache.beam.model.pipeline.v1.IsBounded.Enum is_bounded = 5;
 
   // Contains additional monitoring information related to this application.
   //
   // Each application is able to report information that some runners
-  // will use consume when providing a UI or for making scaling and performance
+  // will use when providing a UI or for making scaling and performance
   // decisions. See https://s.apache.org/beam-bundles-backlog-splitting for
   // details about what types of signals may be useful to report.
   repeated org.apache.beam.model.pipeline.v1.MonitoringInfo monitoring_infos = 6;
@@ -230,7 +217,7 @@
 message ProcessBundleRequest {
   // (Required) A reference to the process bundle descriptor that must be
   // instantiated and executed by the SDK harness.
-  string process_bundle_descriptor_reference = 1;
+  string process_bundle_descriptor_id = 1;
 
   // A cache token which can be used by an SDK to check for the validity
   // of cached elements which have a cache token associated.
@@ -289,7 +276,7 @@
 message ProcessBundleProgressRequest {
   // (Required) A reference to an active process bundle request with the given
   // instruction id.
-  string instruction_reference = 1;
+  string instruction_id = 1;
 }
 
 // DEPRECATED
@@ -298,7 +285,7 @@
   // These metrics are split into processed and active element groups for
   // progress reporting purposes. This allows a Runner to see what is measured,
   // what is estimated and what can be extrapolated to be able to accurately
-  // estimate the backlog of remaining work.
+  // estimate the amount of remaining work.
   message PTransform {
     // Metrics that are measured for processed and active element groups.
     message Measured {
@@ -426,20 +413,7 @@
 message ProcessBundleSplitRequest {
   // (Required) A reference to an active process bundle request with the given
   // instruction id.
-  string instruction_reference = 1;
-
-  // (Required) Specifies that the Runner would like the bundle to split itself
-  // such that it performs no more work than the backlog specified for each
-  // PTransform. The interpretation of how much work should be processed is up
-  // to the PTransform.
-  //
-  // For example, A backlog of "" tells the SDK to perform as little work as
-  // possible, effectively checkpointing when able. The remaining backlog
-  // will be relative to the backlog reported during processing.
-  //
-  // If the backlog is unspecified for a PTransform, the runner would like
-  // the SDK to process all data received for that PTransform.
-  map<string, bytes> backlog_remaining = 2;
+  string instruction_id = 1;
 
   // A message specifying the desired split for a single transform.
   message DesiredSplit {
@@ -498,7 +472,7 @@
   // as some range in an underlying dataset).
   message ChannelSplit {
     // (Required) The grpc read transform reading this channel.
-    string ptransform_id = 1;
+    string transform_id = 1;
 
     // The last element of the input channel that should be entirely considered
     // part of the primary, identified by its absolute index in the (ordered)
@@ -521,7 +495,7 @@
 message FinalizeBundleRequest {
   // (Required) A reference to a completed process bundle request with the given
   // instruction id.
-  string instruction_reference = 1;
+  string instruction_id = 1;
 }
 
 message FinalizeBundleResponse {
@@ -540,7 +514,7 @@
   message Data {
     // (Required) A reference to an active instruction request with the given
     // instruction id.
-    string instruction_reference = 1;
+    string instruction_id = 1;
 
     // (Required) A definition representing a consumer or producer of this data.
     // If received by a harness, this represents the consumer within that
@@ -550,7 +524,7 @@
     // Note that a single element may span multiple Data messages.
     //
     // Note that a sending/receiving pair should share the same identifier.
-    string ptransform_id = 2;
+    string transform_id = 2;
 
     // (Optional) Represents a part of a logical byte stream. Elements within
     // the logical byte stream are encoded in the nested context and
@@ -589,7 +563,7 @@
   // (Required) The associated instruction id of the work that is currently
   // being processed. This allows for the runner to associate any modifications
   // to state to be committed with the appropriate work execution.
-  string instruction_reference = 2;
+  string instruction_id = 2;
 
   // (Required) The state key this request is for.
   StateKey state_key = 3;
@@ -656,7 +630,7 @@
 
   message MultimapSideInput {
     // (Required) The id of the PTransform containing a side input.
-    string ptransform_id = 1;
+    string transform_id = 1;
     // (Required) The id of the side input.
     string side_input_id = 2;
     // (Required) The window (after mapping the currently executing elements
@@ -668,7 +642,7 @@
 
   message BagUserState {
     // (Required) The id of the PTransform containing user state.
-    string ptransform_id = 1;
+    string transform_id = 1;
     // (Required) The id of the user state.
     string user_state_id = 2;
     // (Required) The window encoded in a nested context.
@@ -795,11 +769,11 @@
 
   // (Optional) A reference to the instruction this log statement is associated
   // with.
-  string instruction_reference = 5;
+  string instruction_id = 5;
 
-  // (Optional) A reference to the primitive transform this log statement is
+  // (Optional) A reference to the transform this log statement is
   // associated with.
-  string primitive_transform_reference = 6;
+  string transform_id = 6;
 
   // (Optional) Human-readable name of the function or method being invoked,
   // with optional context such as the class or package name. The format can
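
The Fn API changes above consistently rename *_reference fields to *_id (for example, instruction_reference becomes instruction_id and ptransform_id becomes transform_id) and drop the backlog-based split fields. A minimal sketch, assuming the generated Java classes under org.apache.beam.model.fnexecution.v1 (outer class BeamFnApi), of populating the renamed fields:

    import org.apache.beam.model.fnexecution.v1.BeamFnApi;

    /** Builds Fn API requests using the renamed *_id fields. */
    public class FnApiRenamedFieldsExample {
      public static void main(String[] args) {
        BeamFnApi.ProcessBundleRequest processBundle =
            BeamFnApi.ProcessBundleRequest.newBuilder()
                // Was process_bundle_descriptor_reference before this change.
                .setProcessBundleDescriptorId("descriptor-1")
                .build();

        BeamFnApi.ProcessBundleProgressRequest progress =
            BeamFnApi.ProcessBundleProgressRequest.newBuilder()
                // Was instruction_reference before this change.
                .setInstructionId("instruction-42")
                .build();

        System.out.println(processBundle);
        System.out.println(progress);
      }
    }
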
diff --git a/model/job-management/build.gradle b/model/job-management/build.gradle
index 38c7938..1568d8b 100644
--- a/model/job-management/build.gradle
+++ b/model/job-management/build.gradle
@@ -17,10 +17,12 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyPortabilityNature(shadowJarValidationExcludes:[
-    "org/apache/beam/model/expansion/v1/**",
-    "org/apache/beam/model/jobmanagement/v1/**",
-])
+applyPortabilityNature(
+    automaticModuleName: 'org.apache.beam.model.job.management',
+    shadowJarValidationExcludes: [
+        "org/apache/beam/model/expansion/v1/**",
+        "org/apache/beam/model/jobmanagement/v1/**",
+    ])
 
 description = "Apache Beam :: Model :: Job Management"
 ext.summary = "Portable definitions for submitting pipelines."
diff --git a/model/pipeline/build.gradle b/model/pipeline/build.gradle
index a305985..7698e84 100644
--- a/model/pipeline/build.gradle
+++ b/model/pipeline/build.gradle
@@ -17,7 +17,10 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyPortabilityNature(shadowJarValidationExcludes: ["org/apache/beam/model/pipeline/v1/**"])
+applyPortabilityNature(
+    automaticModuleName: 'org.apache.beam.model.pipeline',
+    shadowJarValidationExcludes: ["org/apache/beam/model/pipeline/v1/**"]
+)
 
 description = "Apache Beam :: Model :: Pipeline"
 ext.summary = "Portable definitions for building pipelines"
diff --git a/model/pipeline/src/main/proto/beam_runner_api.proto b/model/pipeline/src/main/proto/beam_runner_api.proto
index aa3184d..736bcdc 100644
--- a/model/pipeline/src/main/proto/beam_runner_api.proto
+++ b/model/pipeline/src/main/proto/beam_runner_api.proto
@@ -645,61 +645,6 @@
   }
 }
 
-// Experimental: A representation of a Beam Schema.
-message Schema {
-  enum TypeName {
-    BYTE = 0;
-    INT16 = 1;
-    INT32 = 2;
-    INT64 = 3;
-    DECIMAL = 4;
-    FLOAT = 5;
-    DOUBLE = 6;
-    STRING = 7;
-    DATETIME = 8;
-    BOOLEAN = 9;
-    BYTES = 10;
-    ARRAY = 11;
-    MAP = 13;
-    ROW = 14;
-    LOGICAL_TYPE = 15;
-  }
-
-  message LogicalType {
-    string id = 1;
-    string args = 2;
-    FieldType base_type = 3;
-    bytes serialized_class = 4;
-  }
-
-  message MapType {
-    FieldType key_type = 1;
-    FieldType value_type = 2;
-  }
-
-  message FieldType {
-    TypeName type_name = 1;
-    bool nullable = 2;
-    oneof type_info {
-      FieldType collection_element_type = 3;
-      MapType map_type = 4;
-      Schema row_schema = 5;
-      LogicalType logical_type = 6;
-    }
-  }
-
-  message Field {
-    string name = 1;
-    string description = 2;
-    FieldType type = 3;
-    int32 id = 4;
-    int32 encoding_position = 5;
-  }
-
-  repeated Field fields = 1;
-  string id = 2;
-}
-
 // A windowing strategy describes the window function, triggering, allowed
 // lateness, and accumulation mode for a PCollection.
 //
diff --git a/model/pipeline/src/main/proto/schema.proto b/model/pipeline/src/main/proto/schema.proto
new file mode 100644
index 0000000..e420e3c
--- /dev/null
+++ b/model/pipeline/src/main/proto/schema.proto
@@ -0,0 +1,85 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+// ** Experimental **
+// Protocol Buffers describing Beam Schemas, a portable representation for
+// complex types.
+
+syntax = "proto3";
+
+package org.apache.beam.model.pipeline.v1;
+
+option go_package = "pipeline_v1";
+option java_package = "org.apache.beam.model.pipeline.v1";
+option java_outer_classname = "SchemaApi";
+
+message Schema {
+  repeated Field fields = 1;
+  string id = 2;
+}
+
+message Field {
+  string name = 1;
+  string description = 2;
+  FieldType type = 3;
+  int32 id = 4;
+  int32 encoding_position = 5;
+}
+
+message FieldType {
+  bool nullable = 1;
+  oneof type_info {
+    AtomicType atomic_type = 2;
+    ArrayType array_type = 3;
+    MapType map_type = 4;
+    RowType row_type = 5;
+    LogicalType logical_type = 6;
+  }
+}
+
+enum AtomicType {
+  UNSPECIFIED = 0;
+  BYTE = 1;
+  INT16 = 2;
+  INT32 = 3;
+  INT64 = 4;
+  FLOAT = 5;
+  DOUBLE = 6;
+  STRING = 7;
+  BOOLEAN = 8;
+  BYTES = 9;
+}
+
+message ArrayType {
+  FieldType element_type = 1;
+}
+
+message MapType {
+  FieldType key_type = 1;
+  FieldType value_type = 2;
+}
+
+message RowType {
+  Schema schema = 1;
+}
+
+message LogicalType {
+  string urn = 1;
+  bytes payload = 2;
+  FieldType representation = 3;
+}
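
With the schema definitions now in their own schema.proto (java_outer_classname SchemaApi), complex types compose by nesting FieldType messages. A hedged sketch of building a map field whose values are rows, using the generated Java classes:

    import org.apache.beam.model.pipeline.v1.SchemaApi;

    /** Builds a SchemaApi FieldType describing map<string, row<name: string>>. */
    public class MapOfRowFieldTypeExample {
      public static void main(String[] args) {
        SchemaApi.Schema rowSchema =
            SchemaApi.Schema.newBuilder()
                .addFields(
                    SchemaApi.Field.newBuilder()
                        .setName("name")
                        .setType(
                            SchemaApi.FieldType.newBuilder()
                                .setAtomicType(SchemaApi.AtomicType.STRING)))
                .build();

        SchemaApi.FieldType mapOfRow =
            SchemaApi.FieldType.newBuilder()
                .setMapType(
                    SchemaApi.MapType.newBuilder()
                        .setKeyType(
                            SchemaApi.FieldType.newBuilder()
                                .setAtomicType(SchemaApi.AtomicType.STRING))
                        .setValueType(
                            SchemaApi.FieldType.newBuilder()
                                .setRowType(SchemaApi.RowType.newBuilder().setSchema(rowSchema))))
                .build();

        System.out.println(mapOfRow);
      }
    }
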
diff --git a/project-mappings b/project-mappings
index db61653..f8ac258 100644
--- a/project-mappings
+++ b/project-mappings
@@ -94,9 +94,9 @@
 :beam-runners-google-cloud-dataflow-java-examples :runners:google-cloud-dataflow-java:examples
 :beam-runners-google-cloud-dataflow-java :runners:google-cloud-dataflow-java
 :beam-runners-gearpump :runners:gearpump
-:beam-runners-flink_2.11-job-server-container :runners:flink:1.5:job-server-container
-:beam-runners-flink_2.11-job-server :runners:flink:1.5:job-server
-:beam-runners-flink_2.11 :runners:flink:1.5
+:beam-runners-flink_2.11-job-server-container :runners:flink:1.8:job-server-container
+:beam-runners-flink_2.11-job-server :runners:flink:1.8:job-server
+:beam-runners-flink_2.11 :runners:flink:1.8
 :beam-runners-flink-1.7-job-server-container :runners:flink:1.7:job-server-container
 :beam-runners-flink-1.7-job-server :runners:flink:1.7:job-server
 :beam-runners-flink-1.7 :runners:flink:1.7
diff --git a/release/build.gradle b/release/build.gradle
index c5228ba..44e9f98 100644
--- a/release/build.gradle
+++ b/release/build.gradle
@@ -34,7 +34,7 @@
   dependsOn ":runners:google-cloud-dataflow-java:runQuickstartJavaDataflow"
   dependsOn ":runners:apex:runQuickstartJavaApex"
   dependsOn ":runners:spark:runQuickstartJavaSpark"
-  dependsOn ":runners:flink:1.5:runQuickstartJavaFlinkLocal"
+  dependsOn ":runners:flink:1.8:runQuickstartJavaFlinkLocal"
   dependsOn ":runners:direct-java:runMobileGamingJavaDirect"
   dependsOn ":runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow"
 }
diff --git a/release/src/main/python-release/python_release_automation_utils.sh b/release/src/main/python-release/python_release_automation_utils.sh
index 14ac40c..83d6a03 100644
--- a/release/src/main/python-release/python_release_automation_utils.sh
+++ b/release/src/main/python-release/python_release_automation_utils.sh
@@ -322,7 +322,7 @@
 
 # Python RC configurations
 VERSION=$(get_version)
-RC_STAGING_URL="https://dist.apache.org/repos/dist/dev/beam/$VERSION/"
+RC_STAGING_URL="https://dist.apache.org/repos/dist/dev/beam/$VERSION/python"
 
 # Cloud Configurations
 PROJECT_ID='apache-beam-testing'
diff --git a/release/src/main/scripts/run_rc_validation.sh b/release/src/main/scripts/run_rc_validation.sh
index 02464e0..0057f24 100755
--- a/release/src/main/scripts/run_rc_validation.sh
+++ b/release/src/main/scripts/run_rc_validation.sh
@@ -209,7 +209,7 @@
   echo "*************************************************************"
   echo "* Running Java Quickstart with Flink local runner"
   echo "*************************************************************"
-  ./gradlew :runners:flink:1.5:runQuickstartJavaFlinkLocal \
+  ./gradlew :runners:flink:1.8:runQuickstartJavaFlinkLocal \
   -Prepourl=${REPO_URL} \
   -Pver=${RELEASE_VER}
 else
diff --git a/runners/apex/build.gradle b/runners/apex/build.gradle
index ce7b1cf..06a3d33 100644
--- a/runners/apex/build.gradle
+++ b/runners/apex/build.gradle
@@ -19,7 +19,7 @@
 import groovy.json.JsonOutput
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.runners.apex')
 
 description = "Apache Beam :: Runners :: Apex"
 
diff --git a/runners/core-construction-java/build.gradle b/runners/core-construction-java/build.gradle
index 219aa06..e7f4899 100644
--- a/runners/core-construction-java/build.gradle
+++ b/runners/core-construction-java/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.runners.core.construction')
 
 description = "Apache Beam :: Runners :: Core Construction Java"
 ext.summary = """Beam Runners Core provides utilities to aid runner authors interact with a Pipeline
diff --git a/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/SchemaTranslation.java b/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/SchemaTranslation.java
index 26b154d..6d6eb57 100644
--- a/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/SchemaTranslation.java
+++ b/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/SchemaTranslation.java
@@ -19,7 +19,7 @@
 
 import java.util.Map;
 import java.util.UUID;
-import org.apache.beam.model.pipeline.v1.RunnerApi;
+import org.apache.beam.model.pipeline.v1.SchemaApi;
 import org.apache.beam.sdk.schemas.Schema;
 import org.apache.beam.sdk.schemas.Schema.Field;
 import org.apache.beam.sdk.schemas.Schema.FieldType;
@@ -27,37 +27,21 @@
 import org.apache.beam.sdk.schemas.Schema.TypeName;
 import org.apache.beam.sdk.util.SerializableUtils;
 import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.ByteString;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.BiMap;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableBiMap;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Maps;
 
 /** Utility methods for translating schemas. */
 public class SchemaTranslation {
-  private static final BiMap<TypeName, RunnerApi.Schema.TypeName> TYPE_NAME_MAPPING =
-      ImmutableBiMap.<TypeName, RunnerApi.Schema.TypeName>builder()
-          .put(TypeName.BYTE, RunnerApi.Schema.TypeName.BYTE)
-          .put(TypeName.INT16, RunnerApi.Schema.TypeName.INT16)
-          .put(TypeName.INT32, RunnerApi.Schema.TypeName.INT32)
-          .put(TypeName.INT64, RunnerApi.Schema.TypeName.INT64)
-          .put(TypeName.DECIMAL, RunnerApi.Schema.TypeName.DECIMAL)
-          .put(TypeName.FLOAT, RunnerApi.Schema.TypeName.FLOAT)
-          .put(TypeName.DOUBLE, RunnerApi.Schema.TypeName.DOUBLE)
-          .put(TypeName.STRING, RunnerApi.Schema.TypeName.STRING)
-          .put(TypeName.DATETIME, RunnerApi.Schema.TypeName.DATETIME)
-          .put(TypeName.BOOLEAN, RunnerApi.Schema.TypeName.BOOLEAN)
-          .put(TypeName.BYTES, RunnerApi.Schema.TypeName.BYTES)
-          .put(TypeName.ARRAY, RunnerApi.Schema.TypeName.ARRAY)
-          .put(TypeName.MAP, RunnerApi.Schema.TypeName.MAP)
-          .put(TypeName.ROW, RunnerApi.Schema.TypeName.ROW)
-          .put(TypeName.LOGICAL_TYPE, RunnerApi.Schema.TypeName.LOGICAL_TYPE)
-          .build();
 
-  public static RunnerApi.Schema toProto(Schema schema) {
+  private static final String URN_BEAM_LOGICAL_DATETIME = "beam:logical_type:datetime:v1";
+  private static final String URN_BEAM_LOGICAL_DECIMAL = "beam:logical_type:decimal:v1";
+  private static final String URN_BEAM_LOGICAL_JAVASDK = "beam:logical_type:javasdk:v1";
+
+  public static SchemaApi.Schema schemaToProto(Schema schema) {
     String uuid = schema.getUUID() != null ? schema.getUUID().toString() : "";
-    RunnerApi.Schema.Builder builder = RunnerApi.Schema.newBuilder().setId(uuid);
+    SchemaApi.Schema.Builder builder = SchemaApi.Schema.newBuilder().setId(uuid);
     for (Field field : schema.getFields()) {
-      RunnerApi.Schema.Field protoField =
-          toProto(
+      SchemaApi.Field protoField =
+          fieldToProto(
               field,
               schema.indexOf(field.getName()),
               schema.getEncodingPositions().get(field.getName()));
@@ -66,60 +50,103 @@
     return builder.build();
   }
 
-  private static RunnerApi.Schema.Field toProto(Field field, int fieldId, int position) {
-    return RunnerApi.Schema.Field.newBuilder()
+  private static SchemaApi.Field fieldToProto(Field field, int fieldId, int position) {
+    return SchemaApi.Field.newBuilder()
         .setName(field.getName())
         .setDescription(field.getDescription())
-        .setType(toProto(field.getType()))
+        .setType(fieldTypeToProto(field.getType()))
         .setId(fieldId)
         .setEncodingPosition(position)
         .build();
   }
 
-  private static RunnerApi.Schema.FieldType toProto(FieldType fieldType) {
-    RunnerApi.Schema.FieldType.Builder builder =
-        RunnerApi.Schema.FieldType.newBuilder()
-            .setTypeName(TYPE_NAME_MAPPING.get(fieldType.getTypeName()));
+  private static SchemaApi.FieldType fieldTypeToProto(FieldType fieldType) {
+    SchemaApi.FieldType.Builder builder = SchemaApi.FieldType.newBuilder();
     switch (fieldType.getTypeName()) {
       case ROW:
-        builder.setRowSchema(toProto(fieldType.getRowSchema()));
+        builder.setRowType(
+            SchemaApi.RowType.newBuilder().setSchema(schemaToProto(fieldType.getRowSchema())));
         break;
 
       case ARRAY:
-        builder.setCollectionElementType(toProto(fieldType.getCollectionElementType()));
+        builder.setArrayType(
+            SchemaApi.ArrayType.newBuilder()
+                .setElementType(fieldTypeToProto(fieldType.getCollectionElementType())));
         break;
 
       case MAP:
         builder.setMapType(
-            RunnerApi.Schema.MapType.newBuilder()
-                .setKeyType(toProto(fieldType.getMapKeyType()))
-                .setValueType(toProto(fieldType.getMapValueType()))
+            SchemaApi.MapType.newBuilder()
+                .setKeyType(fieldTypeToProto(fieldType.getMapKeyType()))
+                .setValueType(fieldTypeToProto(fieldType.getMapValueType()))
                 .build());
         break;
 
       case LOGICAL_TYPE:
         LogicalType logicalType = fieldType.getLogicalType();
         builder.setLogicalType(
-            RunnerApi.Schema.LogicalType.newBuilder()
-                .setId(logicalType.getIdentifier())
-                .setArgs(logicalType.getArgument())
-                .setBaseType(toProto(logicalType.getBaseType()))
-                .setSerializedClass(
+            SchemaApi.LogicalType.newBuilder()
+                // TODO(BEAM-7855): "javasdk" types should only be a last resort. Types defined in
+                // Beam should have their own URN, and there should be a mechanism for users to
+                // register their own types by URN.
+                .setUrn(URN_BEAM_LOGICAL_JAVASDK)
+                .setPayload(
                     ByteString.copyFrom(SerializableUtils.serializeToByteArray(logicalType)))
+                .setRepresentation(fieldTypeToProto(logicalType.getBaseType()))
                 .build());
         break;
-
-      default:
+        // Special-case for DATETIME and DECIMAL which are logical types in portable representation,
+        // but not yet in Java. (BEAM-7554)
+      case DATETIME:
+        builder.setLogicalType(
+            SchemaApi.LogicalType.newBuilder()
+                .setUrn(URN_BEAM_LOGICAL_DATETIME)
+                .setRepresentation(fieldTypeToProto(FieldType.INT64))
+                .build());
+        break;
+      case DECIMAL:
+        builder.setLogicalType(
+            SchemaApi.LogicalType.newBuilder()
+                .setUrn(URN_BEAM_LOGICAL_DECIMAL)
+                .setRepresentation(fieldTypeToProto(FieldType.BYTES))
+                .build());
+        break;
+      case BYTE:
+        builder.setAtomicType(SchemaApi.AtomicType.BYTE);
+        break;
+      case INT16:
+        builder.setAtomicType(SchemaApi.AtomicType.INT16);
+        break;
+      case INT32:
+        builder.setAtomicType(SchemaApi.AtomicType.INT32);
+        break;
+      case INT64:
+        builder.setAtomicType(SchemaApi.AtomicType.INT64);
+        break;
+      case FLOAT:
+        builder.setAtomicType(SchemaApi.AtomicType.FLOAT);
+        break;
+      case DOUBLE:
+        builder.setAtomicType(SchemaApi.AtomicType.DOUBLE);
+        break;
+      case STRING:
+        builder.setAtomicType(SchemaApi.AtomicType.STRING);
+        break;
+      case BOOLEAN:
+        builder.setAtomicType(SchemaApi.AtomicType.BOOLEAN);
+        break;
+      case BYTES:
+        builder.setAtomicType(SchemaApi.AtomicType.BYTES);
         break;
     }
     builder.setNullable(fieldType.getNullable());
     return builder.build();
   }
 
-  public static Schema fromProto(RunnerApi.Schema protoSchema) {
+  public static Schema fromProto(SchemaApi.Schema protoSchema) {
     Schema.Builder builder = Schema.builder();
     Map<String, Integer> encodingLocationMap = Maps.newHashMap();
-    for (RunnerApi.Schema.Field protoField : protoSchema.getFieldsList()) {
+    for (SchemaApi.Field protoField : protoSchema.getFieldsList()) {
       Field field = fieldFromProto(protoField);
       builder.addField(field);
       encodingLocationMap.put(protoField.getName(), protoField.getEncodingPosition());
@@ -133,41 +160,76 @@
     return schema;
   }
 
-  private static Field fieldFromProto(RunnerApi.Schema.Field protoField) {
+  private static Field fieldFromProto(SchemaApi.Field protoField) {
     return Field.of(protoField.getName(), fieldTypeFromProto(protoField.getType()))
         .withDescription(protoField.getDescription());
   }
 
-  private static FieldType fieldTypeFromProto(RunnerApi.Schema.FieldType protoFieldType) {
-    TypeName typeName = TYPE_NAME_MAPPING.inverse().get(protoFieldType.getTypeName());
-    FieldType fieldType;
-    switch (typeName) {
-      case ROW:
-        fieldType = FieldType.row(fromProto(protoFieldType.getRowSchema()));
-        break;
-      case ARRAY:
-        fieldType = FieldType.array(fieldTypeFromProto(protoFieldType.getCollectionElementType()));
-        break;
-      case MAP:
-        fieldType =
-            FieldType.map(
-                fieldTypeFromProto(protoFieldType.getMapType().getKeyType()),
-                fieldTypeFromProto(protoFieldType.getMapType().getValueType()));
-        break;
-      case LOGICAL_TYPE:
-        LogicalType logicalType =
-            (LogicalType)
-                SerializableUtils.deserializeFromByteArray(
-                    protoFieldType.getLogicalType().getSerializedClass().toByteArray(),
-                    "logicalType");
-        fieldType = FieldType.logicalType(logicalType);
-        break;
-      default:
-        fieldType = FieldType.of(typeName);
-    }
+  private static FieldType fieldTypeFromProto(SchemaApi.FieldType protoFieldType) {
+    FieldType fieldType = fieldTypeFromProtoWithoutNullable(protoFieldType);
+
     if (protoFieldType.getNullable()) {
       fieldType = fieldType.withNullable(true);
     }
+
     return fieldType;
   }
+
+  private static FieldType fieldTypeFromProtoWithoutNullable(SchemaApi.FieldType protoFieldType) {
+    switch (protoFieldType.getTypeInfoCase()) {
+      case ATOMIC_TYPE:
+        switch (protoFieldType.getAtomicType()) {
+          case BYTE:
+            return FieldType.of(TypeName.BYTE);
+          case INT16:
+            return FieldType.of(TypeName.INT16);
+          case INT32:
+            return FieldType.of(TypeName.INT32);
+          case INT64:
+            return FieldType.of(TypeName.INT64);
+          case FLOAT:
+            return FieldType.of(TypeName.FLOAT);
+          case DOUBLE:
+            return FieldType.of(TypeName.DOUBLE);
+          case STRING:
+            return FieldType.of(TypeName.STRING);
+          case BOOLEAN:
+            return FieldType.of(TypeName.BOOLEAN);
+          case BYTES:
+            return FieldType.of(TypeName.BYTES);
+          case UNSPECIFIED:
+            throw new IllegalArgumentException("Encountered UNSPECIFIED AtomicType");
+          default:
+            throw new IllegalArgumentException(
+                "Encountered unknown AtomicType: " + protoFieldType.getAtomicType());
+        }
+      case ROW_TYPE:
+        return FieldType.row(fromProto(protoFieldType.getRowType().getSchema()));
+      case ARRAY_TYPE:
+        return FieldType.array(fieldTypeFromProto(protoFieldType.getArrayType().getElementType()));
+      case MAP_TYPE:
+        return FieldType.map(
+            fieldTypeFromProto(protoFieldType.getMapType().getKeyType()),
+            fieldTypeFromProto(protoFieldType.getMapType().getValueType()));
+      case LOGICAL_TYPE:
+        // Special-case for DATETIME and DECIMAL which are logical types in portable representation,
+        // but not yet in Java. (BEAM-7554)
+        String urn = protoFieldType.getLogicalType().getUrn();
+        if (urn.equals(URN_BEAM_LOGICAL_DATETIME)) {
+          return FieldType.DATETIME;
+        } else if (urn.equals(URN_BEAM_LOGICAL_DECIMAL)) {
+          return FieldType.DECIMAL;
+        } else if (urn.equals(URN_BEAM_LOGICAL_JAVASDK)) {
+          return FieldType.logicalType(
+              (LogicalType)
+                  SerializableUtils.deserializeFromByteArray(
+                      protoFieldType.getLogicalType().getPayload().toByteArray(), "logicalType"));
+        } else {
+          throw new IllegalArgumentException("Encountered unsupported logical type URN: " + urn);
+        }
+      default:
+        throw new IllegalArgumentException(
+            "Unexpected type_info: " + protoFieldType.getTypeInfoCase());
+    }
+  }
 }
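
SchemaTranslation now maps Java SDK field types onto the SchemaApi messages, carrying DATETIME and DECIMAL as URN-identified logical types. A minimal round-trip sketch under those assumptions, using a schema with a map-of-row field:

    import org.apache.beam.model.pipeline.v1.SchemaApi;
    import org.apache.beam.runners.core.construction.SchemaTranslation;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.Schema.Field;
    import org.apache.beam.sdk.schemas.Schema.FieldType;

    /** Round-trips a schema containing a map<string, row> field through its proto form. */
    public class SchemaRoundTripExample {
      public static void main(String[] args) {
        Schema rowSchema = Schema.of(Field.of("name", FieldType.STRING));
        Schema schema =
            Schema.of(
                Field.of("mapOfRow", FieldType.map(FieldType.STRING, FieldType.row(rowSchema))));

        SchemaApi.Schema proto = SchemaTranslation.schemaToProto(schema);
        Schema decoded = SchemaTranslation.fromProto(proto);

        // Expected to print true; the new SchemaTranslationTest asserts the same round-trip property.
        System.out.println(schema.equals(decoded));
      }
    }
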
diff --git a/runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/SchemaTranslationTest.java b/runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/SchemaTranslationTest.java
new file mode 100644
index 0000000..2020814
--- /dev/null
+++ b/runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/SchemaTranslationTest.java
@@ -0,0 +1,89 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.runners.core.construction;
+
+import static org.hamcrest.Matchers.equalTo;
+import static org.junit.Assert.assertThat;
+
+import org.apache.beam.model.pipeline.v1.SchemaApi;
+import org.apache.beam.sdk.schemas.LogicalTypes;
+import org.apache.beam.sdk.schemas.Schema;
+import org.apache.beam.sdk.schemas.Schema.Field;
+import org.apache.beam.sdk.schemas.Schema.FieldType;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
+import org.junit.Test;
+import org.junit.runner.RunWith;
+import org.junit.runners.Parameterized;
+import org.junit.runners.Parameterized.Parameter;
+import org.junit.runners.Parameterized.Parameters;
+
+/** Tests for {@link SchemaTranslation}. */
+public class SchemaTranslationTest {
+
+  /** Tests round-trip proto encodings for {@link Schema}. */
+  @RunWith(Parameterized.class)
+  public static class ToFromProtoTest {
+    @Parameters(name = "{index}: {0}")
+    public static Iterable<Schema> data() {
+      return ImmutableList.<Schema>builder()
+          .add(Schema.of(Field.of("string", FieldType.STRING)))
+          .add(
+              Schema.of(
+                  Field.of("boolean", FieldType.BOOLEAN),
+                  Field.of("byte", FieldType.BYTE),
+                  Field.of("int16", FieldType.INT16),
+                  Field.of("int32", FieldType.INT32),
+                  Field.of("int64", FieldType.INT64)))
+          .add(
+              Schema.of(
+                  Field.of(
+                      "row",
+                      FieldType.row(
+                          Schema.of(
+                              Field.of("foo", FieldType.STRING),
+                              Field.of("bar", FieldType.DOUBLE),
+                              Field.of("baz", FieldType.BOOLEAN))))))
+          .add(
+              Schema.of(
+                  Field.of(
+                      "array(array(int64))",
+                      FieldType.array(FieldType.array(FieldType.INT64.withNullable(true))))))
+          .add(
+              Schema.of(
+                  Field.of("nullable", FieldType.STRING.withNullable(true)),
+                  Field.of("non_nullable", FieldType.STRING.withNullable(false))))
+          .add(
+              Schema.of(
+                  Field.of("decimal", FieldType.DECIMAL), Field.of("datetime", FieldType.DATETIME)))
+          .add(
+              Schema.of(Field.of("logical", FieldType.logicalType(LogicalTypes.FixedBytes.of(24)))))
+          .build();
+    }
+
+    @Parameter(0)
+    public Schema schema;
+
+    @Test
+    public void toAndFromProto() throws Exception {
+      SchemaApi.Schema schemaProto = SchemaTranslation.schemaToProto(schema);
+
+      Schema decodedSchema = SchemaTranslation.fromProto(schemaProto);
+      assertThat(decodedSchema, equalTo(schema));
+    }
+  }
+}
diff --git a/runners/core-java/build.gradle b/runners/core-java/build.gradle
index 3a347f6..99795ce 100644
--- a/runners/core-java/build.gradle
+++ b/runners/core-java/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.runners.core')
 
 description = "Apache Beam :: Runners :: Core Java"
 ext.summary = "Beam Runners Core provides utilities to aid runner authors."
diff --git a/runners/direct-java/build.gradle b/runners/direct-java/build.gradle
index 5c3f7dd..b8836a8 100644
--- a/runners/direct-java/build.gradle
+++ b/runners/direct-java/build.gradle
@@ -28,17 +28,19 @@
                         ":runners:java-fn-execution",
                         ":sdks:java:fn-execution"]
 
-applyJavaNature(shadowClosure: {
-  dependencies {
-    dependOnProjects.each {
-      include(project(path: it, configuration: "shadow"))
-    }
-  }
-  relocate "org.apache.beam.runners.core", getJavaRelocatedPath("runners.core")
-  relocate "org.apache.beam.runners.fnexecution", getJavaRelocatedPath("runners.fnexecution")
-  relocate "org.apache.beam.sdk.fn", getJavaRelocatedPath("sdk.fn")
-  relocate "org.apache.beam.runners.local", getJavaRelocatedPath("runners.local")
-})
+applyJavaNature(
+        automaticModuleName: 'org.apache.beam.runners.direct',
+        shadowClosure: {
+          dependencies {
+            dependOnProjects.each {
+              include(project(path: it, configuration: "shadow"))
+            }
+          }
+          relocate "org.apache.beam.runners.core", getJavaRelocatedPath("runners.core")
+          relocate "org.apache.beam.runners.fnexecution", getJavaRelocatedPath("runners.fnexecution")
+          relocate "org.apache.beam.sdk.fn", getJavaRelocatedPath("sdk.fn")
+          relocate "org.apache.beam.runners.local", getJavaRelocatedPath("runners.local")
+        })
 
 description = "Apache Beam :: Runners :: Direct Java"
 
diff --git a/runners/extensions-java/metrics/build.gradle b/runners/extensions-java/metrics/build.gradle
index 6c07a13..022b15c 100644
--- a/runners/extensions-java/metrics/build.gradle
+++ b/runners/extensions-java/metrics/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature(exportJavadoc: false)
+applyJavaNature(exportJavadoc: false, automaticModuleName: 'org.apache.beam.runners.extensions.metrics')
 
 description = "Apache Beam :: Runners :: Extensions Java :: Metrics"
 ext.summary = "Beam Runners Extensions Metrics provides implementations of runners core metrics APIs."
diff --git a/runners/flink/1.5/build.gradle b/runners/flink/1.5/build.gradle
deleted file mode 100644
index b063395..0000000
--- a/runners/flink/1.5/build.gradle
+++ /dev/null
@@ -1,34 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * License); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an AS IS BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-def basePath = '..'
-
-/* All properties required for loading the Flink build script. */
-project.ext {
-  // Set the version of all Flink-related dependencies here.
-  flink_version = '1.5.6'
-  // Main source directory and Flink version specific code.
-  main_source_dirs = ["$basePath/src/main/java", "./src/main/java"]
-  test_source_dirs = ["$basePath/src/test/java", "./src/test/java"]
-  main_resources_dirs = ["$basePath/src/main/resources"]
-  test_resources_dirs = ["$basePath/src/test/resources"]
-  archives_base_name = 'beam-runners-flink_2.11'
-}
-
-// Load the main build script which contains all build logic.
-apply from: "$basePath/flink_runner.gradle"
diff --git a/runners/flink/1.5/job-server-container/build.gradle b/runners/flink/1.5/job-server-container/build.gradle
deleted file mode 100644
index afdb68a..0000000
--- a/runners/flink/1.5/job-server-container/build.gradle
+++ /dev/null
@@ -1,26 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * License); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an AS IS BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-def basePath = '../../job-server-container'
-
-project.ext {
-  resource_path = basePath
-}
-
-// Load the main build script which contains all build logic.
-apply from: "$basePath/flink_job_server_container.gradle"
diff --git a/runners/flink/1.5/job-server/build.gradle b/runners/flink/1.5/job-server/build.gradle
deleted file mode 100644
index fbba7a3..0000000
--- a/runners/flink/1.5/job-server/build.gradle
+++ /dev/null
@@ -1,31 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * License); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an AS IS BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-def basePath = '../../job-server'
-
-project.ext {
-  // Look for the source code in the parent module
-  main_source_dirs = ["$basePath/src/main/java"]
-  test_source_dirs = ["$basePath/src/test/java"]
-  main_resources_dirs = ["$basePath/src/main/resources"]
-  test_resources_dirs = ["$basePath/src/test/resources"]
-  archives_base_name = 'beam-runners-flink_2.11-job-server'
-}
-
-// Load the main build script which contains all build logic.
-apply from: "$basePath/flink_job_server.gradle"
diff --git a/runners/flink/1.6/build.gradle b/runners/flink/1.6/build.gradle
deleted file mode 100644
index e8541cc..0000000
--- a/runners/flink/1.6/build.gradle
+++ /dev/null
@@ -1,34 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * License); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an AS IS BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-def basePath = '..'
-
-/* All properties required for loading the Flink build script */
-project.ext {
-  // Set the version of all Flink-related dependencies here.
-  flink_version = '1.6.4'
-  // Main source directory and Flink version specific code.
-  main_source_dirs = ["$basePath/src/main/java", "../1.5/src/main/java"]
-  test_source_dirs = ["$basePath/src/test/java", "../1.5/src/test/java"]
-  main_resources_dirs = ["$basePath/src/main/resources"]
-  test_resources_dirs = ["$basePath/src/test/resources"]
-  archives_base_name = 'beam-runners-flink-1.6'
-}
-
-// Load the main build script which contains all build logic.
-apply from: "$basePath/flink_runner.gradle"
diff --git a/runners/flink/1.6/job-server-container/build.gradle b/runners/flink/1.6/job-server-container/build.gradle
deleted file mode 100644
index afdb68a..0000000
--- a/runners/flink/1.6/job-server-container/build.gradle
+++ /dev/null
@@ -1,26 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * License); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an AS IS BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-def basePath = '../../job-server-container'
-
-project.ext {
-  resource_path = basePath
-}
-
-// Load the main build script which contains all build logic.
-apply from: "$basePath/flink_job_server_container.gradle"
diff --git a/runners/flink/1.6/job-server/build.gradle b/runners/flink/1.6/job-server/build.gradle
deleted file mode 100644
index 39f1810..0000000
--- a/runners/flink/1.6/job-server/build.gradle
+++ /dev/null
@@ -1,31 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * License); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an AS IS BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-def basePath = '../../job-server'
-
-project.ext {
-  // Look for the source code in the parent module
-  main_source_dirs = ["$basePath/src/main/java"]
-  test_source_dirs = ["$basePath/src/test/java"]
-  main_resources_dirs = ["$basePath/src/main/resources"]
-  test_resources_dirs = ["$basePath/src/test/resources"]
-  archives_base_name = 'beam-runners-flink-1.6-job-server'
-}
-
-// Load the main build script which contains all build logic.
-apply from: "$basePath/flink_job_server.gradle"
diff --git a/runners/flink/1.7/build.gradle b/runners/flink/1.7/build.gradle
index 9029153..4013247 100644
--- a/runners/flink/1.7/build.gradle
+++ b/runners/flink/1.7/build.gradle
@@ -23,8 +23,8 @@
   // Set the version of all Flink-related dependencies here.
   flink_version = '1.7.2'
   // Main source directory and Flink version specific code.
-  main_source_dirs = ["$basePath/src/main/java", "../1.5/src/main/java"]
-  test_source_dirs = ["$basePath/src/test/java", "../1.5/src/test/java"]
+  main_source_dirs = ["$basePath/src/main/java", "./src/main/java"]
+  test_source_dirs = ["$basePath/src/test/java", "./src/test/java"]
   main_resources_dirs = ["$basePath/src/main/resources"]
   test_resources_dirs = ["$basePath/src/test/resources"]
   archives_base_name = 'beam-runners-flink-1.7'
diff --git a/runners/flink/1.5/src/main/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializer.java b/runners/flink/1.7/src/main/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializer.java
similarity index 100%
rename from runners/flink/1.5/src/main/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializer.java
rename to runners/flink/1.7/src/main/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializer.java
diff --git a/runners/flink/1.5/src/main/java/org/apache/beam/runners/flink/translation/types/EncodedValueSerializer.java b/runners/flink/1.7/src/main/java/org/apache/beam/runners/flink/translation/types/EncodedValueSerializer.java
similarity index 100%
rename from runners/flink/1.5/src/main/java/org/apache/beam/runners/flink/translation/types/EncodedValueSerializer.java
rename to runners/flink/1.7/src/main/java/org/apache/beam/runners/flink/translation/types/EncodedValueSerializer.java
diff --git a/runners/flink/1.5/src/test/java/org/apache/beam/runners/flink/streaming/FlinkBroadcastStateInternalsTest.java b/runners/flink/1.7/src/test/java/org/apache/beam/runners/flink/streaming/FlinkBroadcastStateInternalsTest.java
similarity index 100%
rename from runners/flink/1.5/src/test/java/org/apache/beam/runners/flink/streaming/FlinkBroadcastStateInternalsTest.java
rename to runners/flink/1.7/src/test/java/org/apache/beam/runners/flink/streaming/FlinkBroadcastStateInternalsTest.java
diff --git a/runners/flink/1.5/src/test/java/org/apache/beam/runners/flink/streaming/FlinkStateInternalsTest.java b/runners/flink/1.7/src/test/java/org/apache/beam/runners/flink/streaming/FlinkStateInternalsTest.java
similarity index 96%
rename from runners/flink/1.5/src/test/java/org/apache/beam/runners/flink/streaming/FlinkStateInternalsTest.java
rename to runners/flink/1.7/src/test/java/org/apache/beam/runners/flink/streaming/FlinkStateInternalsTest.java
index edc44f6..a80a483 100644
--- a/runners/flink/1.5/src/test/java/org/apache/beam/runners/flink/streaming/FlinkStateInternalsTest.java
+++ b/runners/flink/1.7/src/test/java/org/apache/beam/runners/flink/streaming/FlinkStateInternalsTest.java
@@ -118,7 +118,7 @@
     assertThat(stateInternals.watermarkHold(), is(noHold));
   }
 
-  private KeyedStateBackend<ByteBuffer> createStateBackend() throws Exception {
+  public static KeyedStateBackend<ByteBuffer> createStateBackend() throws Exception {
     MemoryStateBackend backend = new MemoryStateBackend();
 
     AbstractKeyedStateBackend<ByteBuffer> keyedStateBackend =
@@ -136,7 +136,8 @@
     return keyedStateBackend;
   }
 
-  private void changeKey(KeyedStateBackend<ByteBuffer> keyedStateBackend) throws CoderException {
+  private static void changeKey(KeyedStateBackend<ByteBuffer> keyedStateBackend)
+      throws CoderException {
     keyedStateBackend.setCurrentKey(
         ByteBuffer.wrap(
             CoderUtils.encodeToByteArray(StringUtf8Coder.of(), UUID.randomUUID().toString())));
diff --git a/runners/flink/1.5/src/test/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializerTest.java b/runners/flink/1.7/src/test/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializerTest.java
similarity index 100%
rename from runners/flink/1.5/src/test/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializerTest.java
rename to runners/flink/1.7/src/test/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializerTest.java
diff --git a/runners/flink/1.8/src/test/java/org/apache/beam/runners/flink/streaming/FlinkStateInternalsTest.java b/runners/flink/1.8/src/test/java/org/apache/beam/runners/flink/streaming/FlinkStateInternalsTest.java
index 82d2c91..ff4b220 100644
--- a/runners/flink/1.8/src/test/java/org/apache/beam/runners/flink/streaming/FlinkStateInternalsTest.java
+++ b/runners/flink/1.8/src/test/java/org/apache/beam/runners/flink/streaming/FlinkStateInternalsTest.java
@@ -121,7 +121,7 @@
     assertThat(stateInternals.watermarkHold(), is(noHold));
   }
 
-  private KeyedStateBackend<ByteBuffer> createStateBackend() throws Exception {
+  public static KeyedStateBackend<ByteBuffer> createStateBackend() throws Exception {
     MemoryStateBackend backend = new MemoryStateBackend();
 
     AbstractKeyedStateBackend<ByteBuffer> keyedStateBackend =
@@ -143,7 +143,8 @@
     return keyedStateBackend;
   }
 
-  private void changeKey(KeyedStateBackend<ByteBuffer> keyedStateBackend) throws CoderException {
+  public static void changeKey(KeyedStateBackend<ByteBuffer> keyedStateBackend)
+      throws CoderException {
     keyedStateBackend.setCurrentKey(
         ByteBuffer.wrap(
             CoderUtils.encodeToByteArray(StringUtf8Coder.of(), UUID.randomUUID().toString())));
diff --git a/runners/flink/flink_runner.gradle b/runners/flink/flink_runner.gradle
index 9b3d967..893f153 100644
--- a/runners/flink/flink_runner.gradle
+++ b/runners/flink/flink_runner.gradle
@@ -27,7 +27,8 @@
 
 apply plugin: 'org.apache.beam.module'
 applyJavaNature(
-    archivesBaseName: project.hasProperty('archives_base_name') ? archives_base_name : archivesBaseName
+    automaticModuleName: 'org.apache.beam.runners.flink',
+    archivesBaseName: (project.hasProperty('archives_base_name') ? archives_base_name : archivesBaseName)
 )
 
 description = "Apache Beam :: Runners :: Flink $flink_version"
@@ -83,14 +84,7 @@
   }
   // TODO Running tests of all Flink versions in parallel can be too harsh on Jenkins memory
   // Run them serially for now, to avoid "Exit code 137", i.e. Jenkins host killing the Gradle test process
-  if (project.path == ":runners:flink:1.6") {
-    mustRunAfter(":runners:flink:1.5:test")
-  } else if (project.path == ":runners:flink:1.7") {
-    mustRunAfter(":runners:flink:1.5:test")
-    mustRunAfter(":runners:flink:1.6:test")
-  } else if (project.path == ":runners:flink:1.8") {
-    mustRunAfter(":runners:flink:1.5:test")
-    mustRunAfter(":runners:flink:1.6:test")
+  if (project.path == ":runners:flink:1.8") {
     mustRunAfter(":runners:flink:1.7:test")
   }
 }
@@ -100,12 +94,13 @@
 }
 
 dependencies {
+  compileOnly project(":sdks:java:build-tools")
   compile library.java.vendored_guava_26_0_jre
   compile project(path: ":sdks:java:core", configuration: "shadow")
   compile project(":runners:core-java")
   compile project(":runners:core-construction-java")
   compile project(":runners:java-fn-execution")
-  compile project(":sdks:java:build-tools")
+  compile project(":sdks:java:extensions:google-cloud-platform-core")
   compile library.java.vendored_grpc_1_21_0
   compile library.java.jackson_annotations
   compile library.java.slf4j_api
@@ -190,5 +185,5 @@
   dependsOn validatesRunnerStreaming
 }
 
-// Generates :runners:flink:1.5:runQuickstartJavaFlinkLocal
+// Generates :runners:flink:1.8:runQuickstartJavaFlinkLocal
 createJavaExamplesArchetypeValidationTask(type: 'Quickstart', runner: 'FlinkLocal')
diff --git a/runners/flink/job-server/flink_job_server.gradle b/runners/flink/job-server/flink_job_server.gradle
index 3789007..ee856e7 100644
--- a/runners/flink/job-server/flink_job_server.gradle
+++ b/runners/flink/job-server/flink_job_server.gradle
@@ -30,6 +30,7 @@
 mainClassName = "org.apache.beam.runners.flink.FlinkJobServerDriver"
 
 applyJavaNature(
+  automaticModuleName: 'org.apache.beam.runners.flink.jobserver',
   archivesBaseName: project.hasProperty('archives_base_name') ? archives_base_name : archivesBaseName,
   validateShadowJar: false,
   exportJavadoc: false,
@@ -188,7 +189,8 @@
         "--flink_job_server_jar ${shadowJar.archivePath}",
         "--env_dir ${project.rootProject.buildDir}/gradleenv/${project.path.hashCode()}",
         "--python_root_dir ${project.rootDir}/sdks/python",
-        "--python_version 3.5"
+        "--python_version 3.5",
+        "--python_container_image apachebeam/python3.5_sdk:${project['python_sdk_version']}",
       ]
       args "-c", "../../job-server/test_pipeline_jar.sh ${options.join(' ')}"
     }
diff --git a/runners/flink/job-server/test_pipeline_jar.sh b/runners/flink/job-server/test_pipeline_jar.sh
index c59facf..9db6b79 100755
--- a/runners/flink/job-server/test_pipeline_jar.sh
+++ b/runners/flink/job-server/test_pipeline_jar.sh
@@ -43,6 +43,11 @@
         shift # past argument
         shift # past value
         ;;
+    --python_container_image)
+        PYTHON_CONTAINER_IMAGE="$2"
+        shift # past argument
+        shift # past value
+        ;;
     *)    # unknown option
         echo "Unknown option: $1"
         exit 1
@@ -57,10 +62,8 @@
 command -v docker
 docker -v
 
-CONTAINER=$USER-docker-apache.bintray.io/beam/python$PYTHON_VERSION
-TAG=latest
 # Verify container has already been built
-docker images $CONTAINER:$TAG | grep $TAG
+docker images --format "{{.Repository}}:{{.Tag}}" | grep $PYTHON_CONTAINER_IMAGE
 
 # Set up Python environment
 virtualenv -p python$PYTHON_VERSION $ENV_DIR
@@ -102,7 +105,7 @@
   --parallelism 1 \
   --sdk_worker_parallelism 1 \
   --environment_type DOCKER \
-  --environment_config=$CONTAINER:$TAG \
+  --environment_config=$PYTHON_CONTAINER_IMAGE \
 ) || TEST_EXIT_CODE=$? # don't fail fast here; clean up before exiting
 
 if [[ "$TEST_EXIT_CODE" -eq 0 ]]; then
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkBatchPortablePipelineTranslator.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkBatchPortablePipelineTranslator.java
index 21ed404..0eca69d 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkBatchPortablePipelineTranslator.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkBatchPortablePipelineTranslator.java
@@ -327,6 +327,7 @@
 
     final FlinkExecutableStageFunction<InputT> function =
         new FlinkExecutableStageFunction<>(
+            context.getPipelineOptions(),
             stagePayload,
             context.getJobInfo(),
             outputMap,
@@ -601,7 +602,7 @@
       String collectionId) {
     TypeInformation<WindowedValue<?>> outputType = new CoderTypeInformation<>(outputCoder);
     FlinkExecutableStagePruningFunction pruningFunction =
-        new FlinkExecutableStagePruningFunction(unionTag);
+        new FlinkExecutableStagePruningFunction(unionTag, context.getPipelineOptions());
     FlatMapOperator<RawUnionValue, WindowedValue<?>> pruningOperator =
         new FlatMapOperator<>(
             taggedDataset,
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkBatchTransformTranslators.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkBatchTransformTranslators.java
index bc41841..229eca5 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkBatchTransformTranslators.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkBatchTransformTranslators.java
@@ -640,7 +640,7 @@
       TypeInformation<WindowedValue<T>> outputType = context.getTypeInfo(collection);
 
       FlinkMultiOutputPruningFunction<T> pruningFunction =
-          new FlinkMultiOutputPruningFunction<>(integerTag);
+          new FlinkMultiOutputPruningFunction<>(integerTag, context.getPipelineOptions());
 
       FlatMapOperator<WindowedValue<RawUnionValue>, WindowedValue<T>> pruningOperator =
           new FlatMapOperator<>(taggedDataSet, outputType, pruningFunction, collection.getName());
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkJobServerDriver.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkJobServerDriver.java
index 2f3f981..0c283d8 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkJobServerDriver.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkJobServerDriver.java
@@ -21,7 +21,9 @@
 import org.apache.beam.runners.fnexecution.ServerFactory;
 import org.apache.beam.runners.fnexecution.jobsubmission.JobInvoker;
 import org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver;
+import org.apache.beam.sdk.extensions.gcp.options.GcsOptions;
 import org.apache.beam.sdk.io.FileSystems;
+import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.options.PipelineOptionsFactory;
 import org.kohsuke.args4j.CmdLineException;
 import org.kohsuke.args4j.CmdLineParser;
@@ -59,8 +61,11 @@
 
   public static void main(String[] args) throws Exception {
     // TODO: Expose the fileSystem related options.
+    PipelineOptions options = PipelineOptionsFactory.create();
+    // Limit the GCS upload buffer size to reduce memory usage during parallel artifact uploads.
+    options.as(GcsOptions.class).setGcsUploadBufferSizeBytes(1024 * 1024);
     // Register standard file systems.
-    FileSystems.setDefaultPipelineOptions(PipelineOptionsFactory.create());
+    FileSystems.setDefaultPipelineOptions(options);
     fromParams(args).run();
   }
 
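The driver change above caps the GCS upload buffer before registering file systems, so that parallel artifact uploads do not each allocate the larger default buffer. A minimal, self-contained sketch of the same pattern, using only classes the driver already imports (the class name is hypothetical):

import org.apache.beam.sdk.extensions.gcp.options.GcsOptions;
import org.apache.beam.sdk.io.FileSystems;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class JobServerOptionsSketch {
  public static void main(String[] args) {
    PipelineOptions options = PipelineOptionsFactory.create();
    // Cap each upload stream at 1 MiB to bound memory while staging artifacts in parallel.
    options.as(GcsOptions.class).setGcsUploadBufferSizeBytes(1024 * 1024);
    // Register standard file systems with these options before anything else touches them.
    FileSystems.setDefaultPipelineOptions(options);
  }
}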
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkStreamingPortablePipelineTranslator.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkStreamingPortablePipelineTranslator.java
index 241fbbd..92b07a4 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkStreamingPortablePipelineTranslator.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkStreamingPortablePipelineTranslator.java
@@ -44,6 +44,7 @@
 import org.apache.beam.runners.core.construction.ReadTranslation;
 import org.apache.beam.runners.core.construction.RehydratedComponents;
 import org.apache.beam.runners.core.construction.RunnerPCollectionView;
+import org.apache.beam.runners.core.construction.SerializablePipelineOptions;
 import org.apache.beam.runners.core.construction.TestStreamTranslation;
 import org.apache.beam.runners.core.construction.WindowingStrategyTranslation;
 import org.apache.beam.runners.core.construction.graph.ExecutableStage;
@@ -68,8 +69,8 @@
 import org.apache.beam.sdk.coders.Coder;
 import org.apache.beam.sdk.coders.IterableCoder;
 import org.apache.beam.sdk.coders.KvCoder;
-import org.apache.beam.sdk.coders.LengthPrefixCoder;
 import org.apache.beam.sdk.coders.VoidCoder;
+import org.apache.beam.sdk.io.FileSystems;
 import org.apache.beam.sdk.io.UnboundedSource;
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.testing.TestStream;
@@ -101,9 +102,10 @@
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Sets;
 import org.apache.flink.api.common.JobExecutionResult;
 import org.apache.flink.api.common.functions.FlatMapFunction;
-import org.apache.flink.api.common.functions.MapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
 import org.apache.flink.api.common.typeinfo.TypeInformation;
 import org.apache.flink.api.java.functions.KeySelector;
+import org.apache.flink.configuration.Configuration;
 import org.apache.flink.streaming.api.datastream.DataStream;
 import org.apache.flink.streaming.api.datastream.DataStreamSource;
 import org.apache.flink.streaming.api.datastream.KeyedStream;
@@ -400,7 +402,9 @@
 
     DataStream<WindowedValue<SingletonKeyedWorkItem<K, V>>> workItemStream =
         inputDataStream
-            .flatMap(new FlinkStreamingTransformTranslators.ToKeyedWorkItem<>())
+            .flatMap(
+                new FlinkStreamingTransformTranslators.ToKeyedWorkItem<>(
+                    context.getPipelineOptions()))
             .returns(workItemTypeInfo)
             .name("ToKeyedWorkItem");
 
@@ -518,12 +522,12 @@
         source =
             nonDedupSource
                 .keyBy(new FlinkStreamingTransformTranslators.ValueWithRecordIdKeySelector<>())
-                .transform("deduping", outputTypeInfo, new DedupingOperator<>())
+                .transform("deduping", outputTypeInfo, new DedupingOperator<>(pipelineOptions))
                 .uid(format("%s/__deduplicated__", transformName));
       } else {
         source =
             nonDedupSource
-                .flatMap(new FlinkStreamingTransformTranslators.StripIdsMap<>())
+                .flatMap(new FlinkStreamingTransformTranslators.StripIdsMap<>(pipelineOptions))
                 .returns(outputTypeInfo);
       }
     } catch (Exception e) {
@@ -676,11 +680,6 @@
                 valueCoder.getClass().getSimpleName()));
       }
       keyCoder = ((KvCoder) valueCoder).getKeyCoder();
-      if (keyCoder instanceof LengthPrefixCoder) {
-        // Remove any unnecessary length prefixes which add more payload
-        // but also are not expected for state requests inside the operator.
-        keyCoder = ((LengthPrefixCoder) keyCoder).getValueCoder();
-      }
       keySelector = new KvToByteBufferKeySelector(keyCoder);
       inputDataStream = inputDataStream.keyBy(keySelector);
     }
@@ -920,7 +919,7 @@
           sideInput.getKey().getTransformId() + "-" + sideInput.getKey().getLocalName();
       WindowedValueCoder<KV<Void, Object>> kvCoder = kvCoders.get(intTag);
       DataStream<WindowedValue<KV<Void, Object>>> keyedSideInputStream =
-          sideInputStream.map(new ToVoidKeyValue());
+          sideInputStream.map(new ToVoidKeyValue(context.getPipelineOptions()));
 
       SingleOutputStreamOperator<WindowedValue<KV<Void, Iterable<Object>>>> viewStream =
           addGBK(
@@ -934,7 +933,9 @@
 
       DataStream<RawUnionValue> unionValueStream =
           viewStream
-              .map(new FlinkStreamingTransformTranslators.ToRawUnion<>(intTag))
+              .map(
+                  new FlinkStreamingTransformTranslators.ToRawUnion<>(
+                      intTag, context.getPipelineOptions()))
               .returns(unionTypeInformation);
 
       if (sideInputUnion == null) {
@@ -960,7 +961,21 @@
   }
 
   private static class ToVoidKeyValue<T>
-      implements MapFunction<WindowedValue<T>, WindowedValue<KV<Void, T>>> {
+      extends RichMapFunction<WindowedValue<T>, WindowedValue<KV<Void, T>>> {
+
+    private final SerializablePipelineOptions options;
+
+    public ToVoidKeyValue(PipelineOptions pipelineOptions) {
+      this.options = new SerializablePipelineOptions(pipelineOptions);
+    }
+
+    @Override
+    public void open(Configuration parameters) {
+      // Initialize FileSystems for any coders which may want to use the FileSystem,
+      // see https://issues.apache.org/jira/browse/BEAM-8303
+      FileSystems.setDefaultPipelineOptions(options.get());
+    }
+
     @Override
     public WindowedValue<KV<Void, T>> map(WindowedValue<T> value) {
       return value.withValue(KV.of(null, value.getValue()));
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkStreamingTransformTranslators.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkStreamingTransformTranslators.java
index 88ed53a..785bf2b 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkStreamingTransformTranslators.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkStreamingTransformTranslators.java
@@ -36,6 +36,7 @@
 import org.apache.beam.runners.core.construction.PTransformTranslation;
 import org.apache.beam.runners.core.construction.ParDoTranslation;
 import org.apache.beam.runners.core.construction.ReadTranslation;
+import org.apache.beam.runners.core.construction.SerializablePipelineOptions;
 import org.apache.beam.runners.core.construction.SplittableParDo;
 import org.apache.beam.runners.core.construction.TransformPayloadTranslatorRegistrar;
 import org.apache.beam.runners.core.construction.UnboundedReadFromBoundedSource.BoundedToUnboundedSourceAdapter;
@@ -57,7 +58,9 @@
 import org.apache.beam.sdk.coders.KvCoder;
 import org.apache.beam.sdk.coders.VoidCoder;
 import org.apache.beam.sdk.io.BoundedSource;
+import org.apache.beam.sdk.io.FileSystems;
 import org.apache.beam.sdk.io.UnboundedSource;
+import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.runners.AppliedPTransform;
 import org.apache.beam.sdk.testing.TestStream;
 import org.apache.beam.sdk.transforms.Combine;
@@ -90,8 +93,8 @@
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Maps;
 import org.apache.flink.api.common.functions.FlatMapFunction;
-import org.apache.flink.api.common.functions.MapFunction;
 import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
 import org.apache.flink.api.common.functions.StoppableFunction;
 import org.apache.flink.api.common.typeinfo.TypeInformation;
 import org.apache.flink.api.java.functions.KeySelector;
@@ -224,10 +227,16 @@
           source =
               nonDedupSource
                   .keyBy(new ValueWithRecordIdKeySelector<>())
-                  .transform("deduping", outputTypeInfo, new DedupingOperator<>())
+                  .transform(
+                      "deduping",
+                      outputTypeInfo,
+                      new DedupingOperator<>(context.getPipelineOptions()))
                   .uid(format("%s/__deduplicated__", fullName));
         } else {
-          source = nonDedupSource.flatMap(new StripIdsMap<>()).returns(outputTypeInfo);
+          source =
+              nonDedupSource
+                  .flatMap(new StripIdsMap<>(context.getPipelineOptions()))
+                  .returns(outputTypeInfo);
         }
       } catch (Exception e) {
         throw new RuntimeException("Error while translating UnboundedSource: " + rawSource, e);
@@ -253,7 +262,20 @@
   }
 
   public static class StripIdsMap<T>
-      implements FlatMapFunction<WindowedValue<ValueWithRecordId<T>>, WindowedValue<T>> {
+      extends RichFlatMapFunction<WindowedValue<ValueWithRecordId<T>>, WindowedValue<T>> {
+
+    private final SerializablePipelineOptions options;
+
+    StripIdsMap(PipelineOptions options) {
+      this.options = new SerializablePipelineOptions(options);
+    }
+
+    @Override
+    public void open(Configuration parameters) {
+      // Initialize FileSystems for any coders which may want to use the FileSystem,
+      // see https://issues.apache.org/jira/browse/BEAM-8303
+      FileSystems.setDefaultPipelineOptions(options.get());
+    }
 
     @Override
     public void flatMap(
@@ -332,11 +354,20 @@
   }
 
   /** Wraps each element in a {@link RawUnionValue} with the given tag id. */
-  public static class ToRawUnion<T> implements MapFunction<T, RawUnionValue> {
+  public static class ToRawUnion<T> extends RichMapFunction<T, RawUnionValue> {
     private final int intTag;
+    private final SerializablePipelineOptions options;
 
-    public ToRawUnion(int intTag) {
+    ToRawUnion(int intTag, PipelineOptions pipelineOptions) {
       this.intTag = intTag;
+      this.options = new SerializablePipelineOptions(pipelineOptions);
+    }
+
+    @Override
+    public void open(Configuration parameters) {
+      // Initialize FileSystems for any coders which may want to use the FileSystem,
+      // see https://issues.apache.org/jira/browse/BEAM-8303
+      FileSystems.setDefaultPipelineOptions(options.get());
     }
 
     @Override
@@ -385,7 +416,9 @@
       final int intTag = tagToIntMapping.get(tag);
       DataStream<Object> sideInputStream = context.getInputDataStream(sideInput);
       DataStream<RawUnionValue> unionValueStream =
-          sideInputStream.map(new ToRawUnion<>(intTag)).returns(unionTypeInformation);
+          sideInputStream
+              .map(new ToRawUnion<>(intTag, context.getPipelineOptions()))
+              .returns(unionTypeInformation);
 
       if (sideInputUnion == null) {
         sideInputUnion = unionValueStream;
@@ -854,7 +887,7 @@
 
       DataStream<WindowedValue<SingletonKeyedWorkItem<K, InputT>>> workItemStream =
           inputDataStream
-              .flatMap(new ToKeyedWorkItem<>())
+              .flatMap(new ToKeyedWorkItem<>(context.getPipelineOptions()))
               .returns(workItemTypeInfo)
               .name("ToKeyedWorkItem");
 
@@ -954,7 +987,7 @@
 
       DataStream<WindowedValue<SingletonKeyedWorkItem<K, InputT>>> workItemStream =
           inputDataStream
-              .flatMap(new ToKeyedWorkItem<>())
+              .flatMap(new ToKeyedWorkItem<>(context.getPipelineOptions()))
               .returns(workItemTypeInfo)
               .name("ToKeyedWorkItem");
 
@@ -1089,7 +1122,7 @@
 
       DataStream<WindowedValue<SingletonKeyedWorkItem<K, InputT>>> workItemStream =
           inputDataStream
-              .flatMap(new ToKeyedWorkItemInGlobalWindow<>())
+              .flatMap(new ToKeyedWorkItemInGlobalWindow<>(context.getPipelineOptions()))
               .returns(workItemTypeInfo)
               .name("ToKeyedWorkItem");
 
@@ -1105,6 +1138,19 @@
       extends RichFlatMapFunction<
           WindowedValue<KV<K, InputT>>, WindowedValue<SingletonKeyedWorkItem<K, InputT>>> {
 
+    private final SerializablePipelineOptions options;
+
+    ToKeyedWorkItemInGlobalWindow(PipelineOptions options) {
+      this.options = new SerializablePipelineOptions(options);
+    }
+
+    @Override
+    public void open(Configuration parameters) {
+      // Initialize FileSystems for any coders which may want to use the FileSystem,
+      // see https://issues.apache.org/jira/browse/BEAM-8303
+      FileSystems.setDefaultPipelineOptions(options.get());
+    }
+
     @Override
     public void flatMap(
         WindowedValue<KV<K, InputT>> inWithMultipleWindows,
@@ -1199,6 +1245,19 @@
       extends RichFlatMapFunction<
           WindowedValue<KV<K, InputT>>, WindowedValue<SingletonKeyedWorkItem<K, InputT>>> {
 
+    private final SerializablePipelineOptions options;
+
+    ToKeyedWorkItem(PipelineOptions options) {
+      this.options = new SerializablePipelineOptions(options);
+    }
+
+    @Override
+    public void open(Configuration parameters) {
+      // Initialize FileSystems for any coders which may want to use the FileSystem,
+      // see https://issues.apache.org/jira/browse/BEAM-8303
+      FileSystems.setDefaultPipelineOptions(options.get());
+    }
+
     @Override
     public void flatMap(
         WindowedValue<KV<K, InputT>> inWithMultipleWindows,
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkDoFnFunction.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkDoFnFunction.java
index 6004008..56e147c 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkDoFnFunction.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkDoFnFunction.java
@@ -27,6 +27,7 @@
 import org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate;
 import org.apache.beam.runners.flink.translation.utils.FlinkClassloading;
 import org.apache.beam.sdk.coders.Coder;
+import org.apache.beam.sdk.io.FileSystems;
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.transforms.DoFn;
 import org.apache.beam.sdk.transforms.DoFnSchemaInformation;
@@ -143,7 +144,11 @@
   }
 
   @Override
-  public void open(Configuration parameters) throws Exception {
+  public void open(Configuration parameters) {
+    // Note that SerializablePipelineOptions already initializes FileSystems in its readObject()
+    // deserialization method. However, that is a workaround, and we want to initialize the
+    // options explicitly where they are needed.
+    FileSystems.setDefaultPipelineOptions(serializedOptions.get());
     doFnInvoker = DoFnInvokers.tryInvokeSetupFor(doFn);
   }
 
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkExecutableStageFunction.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkExecutableStageFunction.java
index 1b011c6..19a6aec 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkExecutableStageFunction.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkExecutableStageFunction.java
@@ -29,6 +29,7 @@
 import org.apache.beam.model.pipeline.v1.RunnerApi;
 import org.apache.beam.runners.core.InMemoryTimerInternals;
 import org.apache.beam.runners.core.TimerInternals;
+import org.apache.beam.runners.core.construction.SerializablePipelineOptions;
 import org.apache.beam.runners.core.construction.graph.ExecutableStage;
 import org.apache.beam.runners.flink.metrics.FlinkMetricContainer;
 import org.apache.beam.runners.fnexecution.control.BundleProgressHandler;
@@ -47,7 +48,7 @@
 import org.apache.beam.sdk.coders.Coder;
 import org.apache.beam.sdk.fn.data.FnDataReceiver;
 import org.apache.beam.sdk.io.FileSystems;
-import org.apache.beam.sdk.options.PipelineOptionsFactory;
+import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.transforms.join.RawUnionValue;
 import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
 import org.apache.beam.sdk.util.WindowedValue;
@@ -79,6 +80,8 @@
   // Main constructor fields. All must be Serializable because Flink distributes Functions to
   // task managers via java serialization.
 
+  // Pipeline options for initializing the FileSystems
+  private final SerializablePipelineOptions pipelineOptions;
   // The executable stage this function will run.
   private final RunnerApi.ExecutableStagePayload stagePayload;
   // Pipeline options. Used for provisioning api.
@@ -104,11 +107,13 @@
   private transient Object currentTimerKey;
 
   public FlinkExecutableStageFunction(
+      PipelineOptions pipelineOptions,
       RunnerApi.ExecutableStagePayload stagePayload,
       JobInfo jobInfo,
       Map<String, Integer> outputMap,
       FlinkExecutableStageContextFactory contextFactory,
       Coder windowCoder) {
+    this.pipelineOptions = new SerializablePipelineOptions(pipelineOptions);
     this.stagePayload = stagePayload;
     this.jobInfo = jobInfo;
     this.outputMap = outputMap;
@@ -120,8 +125,7 @@
   @Override
   public void open(Configuration parameters) throws Exception {
     // Register standard file systems.
-    // TODO Use actual pipeline options.
-    FileSystems.setDefaultPipelineOptions(PipelineOptionsFactory.create());
+    FileSystems.setDefaultPipelineOptions(pipelineOptions.get());
     executableStage = ExecutableStage.fromPayload(stagePayload);
     runtimeContext = getRuntimeContext();
     container = new FlinkMetricContainer(getRuntimeContext());
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkExecutableStagePruningFunction.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkExecutableStagePruningFunction.java
index 12d5a51..639f297 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkExecutableStagePruningFunction.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkExecutableStagePruningFunction.java
@@ -17,23 +17,36 @@
  */
 package org.apache.beam.runners.flink.translation.functions;
 
+import org.apache.beam.runners.core.construction.SerializablePipelineOptions;
+import org.apache.beam.sdk.io.FileSystems;
+import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.transforms.join.RawUnionValue;
 import org.apache.beam.sdk.util.WindowedValue;
-import org.apache.flink.api.common.functions.FlatMapFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.configuration.Configuration;
 import org.apache.flink.util.Collector;
 
 /** A Flink function that demultiplexes output from a {@link FlinkExecutableStageFunction}. */
 public class FlinkExecutableStagePruningFunction
-    implements FlatMapFunction<RawUnionValue, WindowedValue<?>> {
+    extends RichFlatMapFunction<RawUnionValue, WindowedValue<?>> {
 
   private final int unionTag;
+  private final SerializablePipelineOptions options;
 
   /**
    * Creates a {@link FlinkExecutableStagePruningFunction} that extracts elements of the given union
    * tag.
    */
-  public FlinkExecutableStagePruningFunction(int unionTag) {
+  public FlinkExecutableStagePruningFunction(int unionTag, PipelineOptions pipelineOptions) {
     this.unionTag = unionTag;
+    this.options = new SerializablePipelineOptions(pipelineOptions);
+  }
+
+  @Override
+  public void open(Configuration parameters) {
+    // Initialize FileSystems for any coders which may want to use the FileSystem,
+    // see https://issues.apache.org/jira/browse/BEAM-8303
+    FileSystems.setDefaultPipelineOptions(options.get());
   }
 
   @Override
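The change to this function is representative of the pattern applied across the batch and streaming translators in this diff: plain FlatMapFunction/MapFunction implementations become Rich* functions that carry a SerializablePipelineOptions and re-register FileSystems in open(), so coders that resolve file systems also work on the task managers (BEAM-8303). A stripped-down sketch of the pattern, with a hypothetical class name:

import org.apache.beam.runners.core.construction.SerializablePipelineOptions;
import org.apache.beam.sdk.io.FileSystems;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.configuration.Configuration;

public class FileSystemsAwareIdentity<T> extends RichMapFunction<T, T> {

  // Options are captured in a serializable wrapper so Flink can ship the function to task managers.
  private final SerializablePipelineOptions options;

  public FileSystemsAwareIdentity(PipelineOptions pipelineOptions) {
    this.options = new SerializablePipelineOptions(pipelineOptions);
  }

  @Override
  public void open(Configuration parameters) {
    // Initialize FileSystems for any coders which may want to use them, see BEAM-8303.
    FileSystems.setDefaultPipelineOptions(options.get());
  }

  @Override
  public T map(T value) {
    return value;
  }
}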
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkMergingNonShuffleReduceFunction.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkMergingNonShuffleReduceFunction.java
index b9af5ad..b34649f 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkMergingNonShuffleReduceFunction.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkMergingNonShuffleReduceFunction.java
@@ -19,6 +19,7 @@
 
 import java.util.Map;
 import org.apache.beam.runners.core.construction.SerializablePipelineOptions;
+import org.apache.beam.sdk.io.FileSystems;
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.transforms.CombineFnBase;
 import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
@@ -28,6 +29,7 @@
 import org.apache.beam.sdk.values.PCollectionView;
 import org.apache.beam.sdk.values.WindowingStrategy;
 import org.apache.flink.api.common.functions.RichGroupReduceFunction;
+import org.apache.flink.configuration.Configuration;
 import org.apache.flink.util.Collector;
 
 /**
@@ -64,6 +66,13 @@
   }
 
   @Override
+  public void open(Configuration parameters) {
+    // Initialize FileSystems for any coders which may want to use the FileSystem,
+    // see https://issues.apache.org/jira/browse/BEAM-8303
+    FileSystems.setDefaultPipelineOptions(serializedOptions.get());
+  }
+
+  @Override
   public void reduce(
       Iterable<WindowedValue<KV<K, InputT>>> elements, Collector<WindowedValue<KV<K, OutputT>>> out)
       throws Exception {
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkMultiOutputPruningFunction.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkMultiOutputPruningFunction.java
index 27801e3..787b172 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkMultiOutputPruningFunction.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkMultiOutputPruningFunction.java
@@ -17,9 +17,14 @@
  */
 package org.apache.beam.runners.flink.translation.functions;
 
+import org.apache.beam.runners.core.construction.SerializablePipelineOptions;
+import org.apache.beam.sdk.io.FileSystems;
+import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.transforms.join.RawUnionValue;
 import org.apache.beam.sdk.util.WindowedValue;
 import org.apache.flink.api.common.functions.FlatMapFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.configuration.Configuration;
 import org.apache.flink.util.Collector;
 
 /**
@@ -28,12 +33,21 @@
  * FlinkDoFnFunction}.
  */
 public class FlinkMultiOutputPruningFunction<T>
-    implements FlatMapFunction<WindowedValue<RawUnionValue>, WindowedValue<T>> {
+    extends RichFlatMapFunction<WindowedValue<RawUnionValue>, WindowedValue<T>> {
 
   private final int ourOutputTag;
+  private final SerializablePipelineOptions options;
 
-  public FlinkMultiOutputPruningFunction(int ourOutputTag) {
+  public FlinkMultiOutputPruningFunction(int ourOutputTag, PipelineOptions options) {
     this.ourOutputTag = ourOutputTag;
+    this.options = new SerializablePipelineOptions(options);
+  }
+
+  @Override
+  public void open(Configuration parameters) {
+    // Initialize FileSystems for any coders which may want to use the FileSystem,
+    // see https://issues.apache.org/jira/browse/BEAM-8303
+    FileSystems.setDefaultPipelineOptions(options.get());
   }
 
   @Override
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkPartialReduceFunction.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkPartialReduceFunction.java
index 94f6778..b073304 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkPartialReduceFunction.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkPartialReduceFunction.java
@@ -19,6 +19,7 @@
 
 import java.util.Map;
 import org.apache.beam.runners.core.construction.SerializablePipelineOptions;
+import org.apache.beam.sdk.io.FileSystems;
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.transforms.CombineFnBase;
 import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
@@ -28,6 +29,7 @@
 import org.apache.beam.sdk.values.PCollectionView;
 import org.apache.beam.sdk.values.WindowingStrategy;
 import org.apache.flink.api.common.functions.RichGroupCombineFunction;
+import org.apache.flink.configuration.Configuration;
 import org.apache.flink.util.Collector;
 
 /**
@@ -63,6 +65,13 @@
   }
 
   @Override
+  public void open(Configuration parameters) {
+    // Initialize FileSystems for any coders which may want to use the FileSystem,
+    // see https://issues.apache.org/jira/browse/BEAM-8303
+    FileSystems.setDefaultPipelineOptions(serializedOptions.get());
+  }
+
+  @Override
   public void combine(
       Iterable<WindowedValue<KV<K, InputT>>> elements, Collector<WindowedValue<KV<K, AccumT>>> out)
       throws Exception {
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkReduceFunction.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkReduceFunction.java
index 36cfd69..8ebf63c 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkReduceFunction.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkReduceFunction.java
@@ -19,6 +19,7 @@
 
 import java.util.Map;
 import org.apache.beam.runners.core.construction.SerializablePipelineOptions;
+import org.apache.beam.sdk.io.FileSystems;
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.transforms.CombineFnBase;
 import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
@@ -28,6 +29,7 @@
 import org.apache.beam.sdk.values.PCollectionView;
 import org.apache.beam.sdk.values.WindowingStrategy;
 import org.apache.flink.api.common.functions.RichGroupReduceFunction;
+import org.apache.flink.configuration.Configuration;
 import org.apache.flink.util.Collector;
 
 /**
@@ -65,6 +67,13 @@
   }
 
   @Override
+  public void open(Configuration parameters) {
+    // Initialize FileSystems for any coders which may want to use the FileSystem,
+    // see https://issues.apache.org/jira/browse/BEAM-8303
+    FileSystems.setDefaultPipelineOptions(serializedOptions.get());
+  }
+
+  @Override
   public void reduce(
       Iterable<WindowedValue<KV<K, AccumT>>> elements, Collector<WindowedValue<KV<K, OutputT>>> out)
       throws Exception {
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkStatefulDoFnFunction.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkStatefulDoFnFunction.java
index 9a8e085..6017f54 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkStatefulDoFnFunction.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkStatefulDoFnFunction.java
@@ -36,6 +36,7 @@
 import org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate;
 import org.apache.beam.runners.flink.translation.utils.FlinkClassloading;
 import org.apache.beam.sdk.coders.Coder;
+import org.apache.beam.sdk.io.FileSystems;
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.transforms.DoFn;
 import org.apache.beam.sdk.transforms.DoFnSchemaInformation;
@@ -214,7 +215,11 @@
   }
 
   @Override
-  public void open(Configuration parameters) throws Exception {
+  public void open(Configuration parameters) {
+    // Note that SerializablePipelineOptions already initializes FileSystems in its readObject()
+    // deserialization method. However, that is a workaround, and we want to initialize the
+    // options explicitly where they are needed.
+    FileSystems.setDefaultPipelineOptions(serializedOptions.get());
     doFnInvoker = DoFnInvokers.tryInvokeSetupFor(dofn);
   }
 
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperator.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperator.java
index 6fd5e85..34489c5 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperator.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperator.java
@@ -312,8 +312,7 @@
       Output<StreamRecord<WindowedValue<OutputT>>> output) {
 
     // make sure that FileSystems is initialized correctly
-    FlinkPipelineOptions options = serializedOptions.get().as(FlinkPipelineOptions.class);
-    FileSystems.setDefaultPipelineOptions(options);
+    FileSystems.setDefaultPipelineOptions(serializedOptions.get());
 
     super.setup(containingTask, config, output);
   }
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/ExecutableStageDoFnOperator.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/ExecutableStageDoFnOperator.java
index 3113c95..6fccdae 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/ExecutableStageDoFnOperator.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/ExecutableStageDoFnOperator.java
@@ -31,6 +31,8 @@
 import java.util.List;
 import java.util.Locale;
 import java.util.Map;
+import java.util.Optional;
+import java.util.UUID;
 import java.util.concurrent.LinkedBlockingQueue;
 import java.util.concurrent.locks.Lock;
 import java.util.concurrent.locks.ReentrantLock;
@@ -71,6 +73,7 @@
 import org.apache.beam.runners.fnexecution.state.StateRequestHandlers;
 import org.apache.beam.sdk.coders.Coder;
 import org.apache.beam.sdk.coders.VoidCoder;
+import org.apache.beam.sdk.fn.IdGenerator;
 import org.apache.beam.sdk.fn.data.FnDataReceiver;
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.state.BagState;
@@ -86,6 +89,7 @@
 import org.apache.beam.sdk.values.TupleTag;
 import org.apache.beam.sdk.values.WindowingStrategy;
 import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.ByteString;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Charsets;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions;
 import org.apache.beam.vendor.sdk.v2.sdk.extensions.protobuf.ByteStringCoder;
 import org.apache.flink.api.common.state.ListStateDescriptor;
@@ -238,7 +242,10 @@
           StateRequestHandlers.forBagUserStateHandlerFactory(
               stageBundleFactory.getProcessBundleDescriptor(),
               new BagUserStateFactory(
-                  keyedStateInternals, getKeyedStateBackend(), stateBackendLock));
+                  () -> UUID.randomUUID().toString(),
+                  keyedStateInternals,
+                  getKeyedStateBackend(),
+                  stateBackendLock));
     } else {
       userStateRequestHandler = StateRequestHandler.unsupported();
     }
@@ -250,31 +257,37 @@
     return StateRequestHandlers.delegateBasedUponType(handlerMap);
   }
 
-  private static class BagUserStateFactory<K extends ByteString, V, W extends BoundedWindow>
+  static class BagUserStateFactory<K extends ByteString, V, W extends BoundedWindow>
       implements StateRequestHandlers.BagUserStateHandlerFactory<K, V, W> {
 
     private final StateInternals stateInternals;
     private final KeyedStateBackend<ByteBuffer> keyedStateBackend;
     private final Lock stateBackendLock;
+    /** Holds the valid cache token for user state for this operator. */
+    private final ByteString cacheToken;
 
-    private BagUserStateFactory(
+    BagUserStateFactory(
+        IdGenerator cacheTokenGenerator,
         StateInternals stateInternals,
         KeyedStateBackend<ByteBuffer> keyedStateBackend,
         Lock stateBackendLock) {
-
       this.stateInternals = stateInternals;
       this.keyedStateBackend = keyedStateBackend;
       this.stateBackendLock = stateBackendLock;
+      this.cacheToken = ByteString.copyFrom(cacheTokenGenerator.getId().getBytes(Charsets.UTF_8));
     }
 
     @Override
     public StateRequestHandlers.BagUserStateHandler<K, V, W> forUserState(
+        // The transform id is not used because multiple operators with state will not
+        // be fused together. See GreedyPCollectionFusers.
         String pTransformId,
         String userStateId,
         Coder<K> keyCoder,
         Coder<V> valueCoder,
         Coder<W> windowCoder) {
       return new StateRequestHandlers.BagUserStateHandler<K, V, W>() {
+
         @Override
         public Iterable<V> get(K key, W window) {
           try {
@@ -291,6 +304,7 @@
             }
             BagState<V> bagState =
                 stateInternals.state(namespace, StateTags.bag(userStateId, valueCoder));
+
             return bagState.read();
           } finally {
             stateBackendLock.unlock();
@@ -343,12 +357,15 @@
           }
         }
 
+        @Override
+        public Optional<ByteString> getCacheToken() {
+          // The cache token remains valid for the lifetime of the operator.
+          return Optional.of(cacheToken);
+        }
+
         private void prepareStateBackend(K key) {
-          // Key for state request is shipped already encoded as ByteString,
-          // this is mostly a wrapping with ByteBuffer. We still follow the
-          // usual key encoding procedure.
-          // final ByteBuffer encodedKey = FlinkKeyUtils.encodeKey(key, keyCoder);
-          final ByteBuffer encodedKey = ByteBuffer.wrap(key.toByteArray());
+          // Key for state request is shipped encoded with NESTED context.
+          ByteBuffer encodedKey = FlinkKeyUtils.fromEncodedKey(key);
           keyedStateBackend.setCurrentKey(encodedKey);
         }
       };
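The BagUserStateFactory above now mints a single cache token per operator instance (from the injected IdGenerator, here a random UUID) and returns it from getCacheToken() for every user-state request, which lets the SDK harness cache state for the operator's lifetime. A hedged, standalone sketch of just the token handling, with a hypothetical holder class:

import java.util.Optional;
import java.util.UUID;
import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.ByteString;
import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Charsets;

public class CacheTokenHolder {

  // Minted once per instance; handed back unchanged with every state response.
  private final ByteString cacheToken =
      ByteString.copyFrom(UUID.randomUUID().toString().getBytes(Charsets.UTF_8));

  public Optional<ByteString> getCacheToken() {
    return Optional.of(cacheToken);
  }
}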
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/FlinkKeyUtils.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/FlinkKeyUtils.java
index 61eaae8..ccd10d4 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/FlinkKeyUtils.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/FlinkKeyUtils.java
@@ -32,6 +32,7 @@
 import org.apache.beam.sdk.coders.Coder;
 import org.apache.beam.sdk.coders.StructuredCoder;
 import org.apache.beam.sdk.util.CoderUtils;
+import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.ByteString;
 
 /**
  * Utility functions for dealing with key encoding. Beam requires keys to be compared in binary
@@ -44,7 +45,7 @@
     checkNotNull(keyCoder, "Provided coder must not be null");
     final byte[] keyBytes;
     try {
-      keyBytes = CoderUtils.encodeToByteArray(keyCoder, key);
+      keyBytes = CoderUtils.encodeToByteArray(keyCoder, key, Coder.Context.NESTED);
     } catch (Exception e) {
       throw new RuntimeException(String.format(Locale.ENGLISH, "Failed to encode key: %s", key), e);
     }
@@ -52,14 +53,14 @@
   }
 
   /** Decodes a key from a ByteBuffer containing a byte array. */
-  static <K> K decodeKey(ByteBuffer byteBuffer, Coder<K> keyCoder) {
+  public static <K> K decodeKey(ByteBuffer byteBuffer, Coder<K> keyCoder) {
     checkNotNull(byteBuffer, "Provided ByteBuffer must not be null");
     checkNotNull(keyCoder, "Provided coder must not be null");
     checkState(byteBuffer.hasArray(), "ByteBuffer key must contain an array.");
     @SuppressWarnings("ByteBufferBackingArray")
     final byte[] keyBytes = byteBuffer.array();
     try {
-      return CoderUtils.decodeFromByteArray(keyCoder, keyBytes);
+      return CoderUtils.decodeFromByteArray(keyCoder, keyBytes, Coder.Context.NESTED);
     } catch (Exception e) {
       throw new RuntimeException(
           String.format(
@@ -68,6 +69,10 @@
     }
   }
 
+  static ByteBuffer fromEncodedKey(ByteString encodedKey) {
+    return ByteBuffer.wrap(encodedKey.toByteArray());
+  }
+
   /** The Coder for the Runner's encoded representation of a key. */
   static class ByteBufferCoder extends StructuredCoder<ByteBuffer> {
 
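With both encodeKey and decodeKey now on the NESTED context, a key round-trips through FlinkKeyUtils exactly as CoderUtils encodes it with Coder.Context.NESTED. A small illustrative sketch, using the classes imported in this file (not part of the patch):

    ByteBuffer encoded = FlinkKeyUtils.encodeKey("hello world", StringUtf8Coder.of());
    // Same bytes as encoding directly with the NESTED context ...
    byte[] expected =
        CoderUtils.encodeToByteArray(StringUtf8Coder.of(), "hello world", Coder.Context.NESTED);
    assert java.util.Arrays.equals(encoded.array(), expected);
    // ... and decodeKey restores the original value.
    assert "hello world".equals(FlinkKeyUtils.decodeKey(encoded, StringUtf8Coder.of()));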
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/DedupingOperator.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/DedupingOperator.java
index 60d937e..5677e1a 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/DedupingOperator.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/DedupingOperator.java
@@ -18,6 +18,9 @@
 package org.apache.beam.runners.flink.translation.wrappers.streaming.io;
 
 import java.nio.ByteBuffer;
+import org.apache.beam.runners.core.construction.SerializablePipelineOptions;
+import org.apache.beam.sdk.io.FileSystems;
+import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.util.WindowedValue;
 import org.apache.beam.sdk.values.ValueWithRecordId;
 import org.apache.flink.api.common.state.ValueState;
@@ -40,6 +43,7 @@
         Triggerable<ByteBuffer, VoidNamespace> {
 
   private static final long MAX_RETENTION_SINCE_ACCESS = Duration.standardMinutes(10L).getMillis();
+  private final SerializablePipelineOptions options;
 
   // we keep the time when we last saw an element id for cleanup
   private ValueStateDescriptor<Long> dedupingStateDescriptor =
@@ -47,6 +51,10 @@
 
   private transient InternalTimerService<VoidNamespace> timerService;
 
+  public DedupingOperator(PipelineOptions options) {
+    this.options = new SerializablePipelineOptions(options);
+  }
+
   @Override
   public void initializeState(StateInitializationContext context) throws Exception {
     super.initializeState(context);
@@ -56,6 +64,13 @@
   }
 
   @Override
+  public void open() {
+    // Initialize FileSystems for any coders which may want to use the FileSystem,
+    // see https://issues.apache.org/jira/browse/BEAM-8303
+    FileSystems.setDefaultPipelineOptions(options.get());
+  }
+
+  @Override
   public void processElement(StreamRecord<WindowedValue<ValueWithRecordId<T>>> streamRecord)
       throws Exception {
 
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java
index 0c0c371..28cc507 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java
@@ -31,6 +31,7 @@
 import org.apache.beam.sdk.coders.Coder;
 import org.apache.beam.sdk.coders.KvCoder;
 import org.apache.beam.sdk.coders.SerializableCoder;
+import org.apache.beam.sdk.io.FileSystems;
 import org.apache.beam.sdk.io.UnboundedSource;
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
@@ -172,6 +173,7 @@
   /** Initialize and restore state before starting execution of the source. */
   @Override
   public void open(Configuration parameters) throws Exception {
+    FileSystems.setDefaultPipelineOptions(serializedOptions.get());
     runtimeContext = (StreamingRuntimeContext) getRuntimeContext();
 
     // figure out which split sources we're responsible for
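The same initialization pattern now appears in both DedupingOperator and UnboundedSourceWrapper. A hedged sketch of how any other Flink operator whose coders may need a FileSystem (BEAM-8303) could apply it, assuming the options were captured at construction time in a SerializablePipelineOptions field named serializedOptions:

  @Override
  public void open() throws Exception {
    super.open();
    // Register the pipeline's filesystems before any coder or state access may need them.
    FileSystems.setDefaultPipelineOptions(serializedOptions.get());
  }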
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/state/FlinkStateInternals.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/state/FlinkStateInternals.java
index 2eb4508..8d979fd 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/state/FlinkStateInternals.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/state/FlinkStateInternals.java
@@ -28,8 +28,8 @@
 import org.apache.beam.runners.core.StateNamespace;
 import org.apache.beam.runners.core.StateTag;
 import org.apache.beam.runners.flink.translation.types.CoderTypeSerializer;
+import org.apache.beam.runners.flink.translation.wrappers.streaming.FlinkKeyUtils;
 import org.apache.beam.sdk.coders.Coder;
-import org.apache.beam.sdk.coders.CoderException;
 import org.apache.beam.sdk.coders.InstantCoder;
 import org.apache.beam.sdk.coders.VoidCoder;
 import org.apache.beam.sdk.state.BagState;
@@ -48,7 +48,6 @@
 import org.apache.beam.sdk.transforms.CombineWithContext;
 import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
 import org.apache.beam.sdk.transforms.windowing.TimestampCombiner;
-import org.apache.beam.sdk.util.CoderUtils;
 import org.apache.beam.sdk.util.CombineContextFactory;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
@@ -105,14 +104,7 @@
   @Override
   public K getKey() {
     ByteBuffer keyBytes = flinkStateBackend.getCurrentKey();
-    byte[] bytes = new byte[keyBytes.remaining()];
-    keyBytes.get(bytes);
-    keyBytes.position(keyBytes.position() - bytes.length);
-    try {
-      return CoderUtils.decodeFromByteArray(keyCoder, bytes);
-    } catch (CoderException e) {
-      throw new RuntimeException("Error decoding key.", e);
-    }
+    return FlinkKeyUtils.decodeKey(keyBytes, keyCoder);
   }
 
   @Override
diff --git a/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/functions/FlinkExecutableStageFunctionTest.java b/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/functions/FlinkExecutableStageFunctionTest.java
index 192f17e..d655152 100644
--- a/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/functions/FlinkExecutableStageFunctionTest.java
+++ b/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/functions/FlinkExecutableStageFunctionTest.java
@@ -41,6 +41,7 @@
 import org.apache.beam.runners.fnexecution.provisioning.JobInfo;
 import org.apache.beam.runners.fnexecution.state.StateRequestHandler;
 import org.apache.beam.sdk.fn.data.FnDataReceiver;
+import org.apache.beam.sdk.options.PipelineOptionsFactory;
 import org.apache.beam.sdk.transforms.join.RawUnionValue;
 import org.apache.beam.sdk.util.WindowedValue;
 import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.Struct;
@@ -257,7 +258,13 @@
         Mockito.mock(FlinkExecutableStageContextFactory.class);
     when(contextFactory.get(any())).thenReturn(stageContext);
     FlinkExecutableStageFunction<Integer> function =
-        new FlinkExecutableStageFunction<>(stagePayload, jobInfo, outputMap, contextFactory, null);
+        new FlinkExecutableStageFunction<>(
+            PipelineOptionsFactory.create(),
+            stagePayload,
+            jobInfo,
+            outputMap,
+            contextFactory,
+            null);
     function.setRuntimeContext(runtimeContext);
     Whitebox.setInternalState(function, "stateRequestHandler", stateRequestHandler);
     return function;
diff --git a/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DedupingOperatorTest.java b/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DedupingOperatorTest.java
index 3a2c4a3..dbc6ea6 100644
--- a/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DedupingOperatorTest.java
+++ b/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DedupingOperatorTest.java
@@ -24,6 +24,7 @@
 import java.nio.ByteBuffer;
 import java.nio.charset.StandardCharsets;
 import org.apache.beam.runners.flink.translation.wrappers.streaming.io.DedupingOperator;
+import org.apache.beam.sdk.options.PipelineOptionsFactory;
 import org.apache.beam.sdk.util.WindowedValue;
 import org.apache.beam.sdk.values.ValueWithRecordId;
 import org.apache.flink.api.common.typeinfo.TypeInformation;
@@ -100,7 +101,7 @@
   private KeyedOneInputStreamOperatorTestHarness<
           ByteBuffer, WindowedValue<ValueWithRecordId<String>>, WindowedValue<String>>
       getDebupingHarness() throws Exception {
-    DedupingOperator<String> operator = new DedupingOperator<>();
+    DedupingOperator<String> operator = new DedupingOperator<>(PipelineOptionsFactory.create());
 
     return new KeyedOneInputStreamOperatorTestHarness<>(
         operator,
diff --git a/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperatorTest.java b/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperatorTest.java
index ec4a2b9..57f7694 100644
--- a/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperatorTest.java
+++ b/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperatorTest.java
@@ -1631,10 +1631,10 @@
     assertThat(
         stripStreamRecordFromWindowedValue(testHarness.getOutput()),
         contains(
-            WindowedValue.valueInGlobalWindow(KV.of("key2", "c")),
-            WindowedValue.valueInGlobalWindow(KV.of("key2", "d")),
             WindowedValue.valueInGlobalWindow(KV.of("key", "a")),
             WindowedValue.valueInGlobalWindow(KV.of("key", "b")),
+            WindowedValue.valueInGlobalWindow(KV.of("key2", "c")),
+            WindowedValue.valueInGlobalWindow(KV.of("key2", "d")),
             WindowedValue.valueInGlobalWindow(KV.of("key3", "finishBundle"))));
 
     doFnOperator = doFnOperatorSupplier.get();
@@ -1652,10 +1652,10 @@
     assertThat(
         stripStreamRecordFromWindowedValue(testHarness.getOutput()),
         contains(
-            WindowedValue.valueInGlobalWindow(KV.of("key2", "c")),
-            WindowedValue.valueInGlobalWindow(KV.of("key2", "d")),
             WindowedValue.valueInGlobalWindow(KV.of("key", "a")),
             WindowedValue.valueInGlobalWindow(KV.of("key", "b")),
+            WindowedValue.valueInGlobalWindow(KV.of("key2", "c")),
+            WindowedValue.valueInGlobalWindow(KV.of("key2", "d")),
             WindowedValue.valueInGlobalWindow(KV.of("key3", "finishBundle"))));
 
     // repeat to see if elements are evicted
@@ -1665,10 +1665,10 @@
     assertThat(
         stripStreamRecordFromWindowedValue(testHarness.getOutput()),
         contains(
-            WindowedValue.valueInGlobalWindow(KV.of("key2", "c")),
-            WindowedValue.valueInGlobalWindow(KV.of("key2", "d")),
             WindowedValue.valueInGlobalWindow(KV.of("key", "a")),
             WindowedValue.valueInGlobalWindow(KV.of("key", "b")),
+            WindowedValue.valueInGlobalWindow(KV.of("key2", "c")),
+            WindowedValue.valueInGlobalWindow(KV.of("key2", "d")),
             WindowedValue.valueInGlobalWindow(KV.of("key3", "finishBundle"))));
   }
 
diff --git a/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/wrappers/streaming/ExecutableStageDoFnOperatorTest.java b/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/wrappers/streaming/ExecutableStageDoFnOperatorTest.java
index f9718ce..8134b24 100644
--- a/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/wrappers/streaming/ExecutableStageDoFnOperatorTest.java
+++ b/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/wrappers/streaming/ExecutableStageDoFnOperatorTest.java
@@ -22,6 +22,7 @@
 import static org.hamcrest.Matchers.hasSize;
 import static org.hamcrest.Matchers.instanceOf;
 import static org.hamcrest.Matchers.is;
+import static org.hamcrest.Matchers.iterableWithSize;
 import static org.hamcrest.collection.IsIterableContainingInOrder.contains;
 import static org.junit.Assert.assertEquals;
 import static org.junit.Assert.assertFalse;
@@ -29,6 +30,7 @@
 import static org.junit.Assert.assertNotNull;
 import static org.junit.Assert.assertTrue;
 import static org.mockito.Matchers.any;
+import static org.mockito.Matchers.anyString;
 import static org.mockito.Mockito.doAnswer;
 import static org.mockito.Mockito.doThrow;
 import static org.mockito.Mockito.verify;
@@ -37,6 +39,7 @@
 
 import java.nio.ByteBuffer;
 import java.nio.charset.StandardCharsets;
+import java.util.Arrays;
 import java.util.Collection;
 import java.util.Collections;
 import java.util.HashMap;
@@ -45,6 +48,7 @@
 import java.util.concurrent.atomic.AtomicBoolean;
 import java.util.concurrent.locks.Lock;
 import javax.annotation.Nullable;
+import org.apache.beam.model.fnexecution.v1.BeamFnApi;
 import org.apache.beam.model.pipeline.v1.RunnerApi;
 import org.apache.beam.model.pipeline.v1.RunnerApi.Components;
 import org.apache.beam.model.pipeline.v1.RunnerApi.ExecutableStagePayload;
@@ -58,8 +62,10 @@
 import org.apache.beam.runners.core.TimerInternals;
 import org.apache.beam.runners.flink.FlinkPipelineOptions;
 import org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate;
+import org.apache.beam.runners.flink.streaming.FlinkStateInternalsTest;
 import org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageContextFactory;
 import org.apache.beam.runners.flink.translation.types.CoderTypeInformation;
+import org.apache.beam.runners.flink.translation.utils.NoopLock;
 import org.apache.beam.runners.fnexecution.control.BundleProgressHandler;
 import org.apache.beam.runners.fnexecution.control.ExecutableStageContext;
 import org.apache.beam.runners.fnexecution.control.OutputReceiverFactory;
@@ -68,11 +74,13 @@
 import org.apache.beam.runners.fnexecution.control.StageBundleFactory;
 import org.apache.beam.runners.fnexecution.provisioning.JobInfo;
 import org.apache.beam.runners.fnexecution.state.StateRequestHandler;
+import org.apache.beam.runners.fnexecution.state.StateRequestHandlers;
 import org.apache.beam.sdk.coders.Coder;
 import org.apache.beam.sdk.coders.KvCoder;
 import org.apache.beam.sdk.coders.StringUtf8Coder;
 import org.apache.beam.sdk.coders.VarIntCoder;
 import org.apache.beam.sdk.coders.VoidCoder;
+import org.apache.beam.sdk.fn.IdGenerator;
 import org.apache.beam.sdk.fn.data.FnDataReceiver;
 import org.apache.beam.sdk.options.PipelineOptionsFactory;
 import org.apache.beam.sdk.state.BagState;
@@ -81,14 +89,18 @@
 import org.apache.beam.sdk.transforms.windowing.GlobalWindow;
 import org.apache.beam.sdk.transforms.windowing.IntervalWindow;
 import org.apache.beam.sdk.transforms.windowing.PaneInfo;
+import org.apache.beam.sdk.util.CoderUtils;
 import org.apache.beam.sdk.util.WindowedValue;
 import org.apache.beam.sdk.values.KV;
 import org.apache.beam.sdk.values.TupleTag;
 import org.apache.beam.sdk.values.WindowingStrategy;
+import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.ByteString;
 import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.Struct;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Charsets;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Iterables;
+import org.apache.beam.vendor.sdk.v2.sdk.extensions.protobuf.ByteStringCoder;
 import org.apache.commons.lang3.SerializationUtils;
 import org.apache.commons.lang3.mutable.MutableObject;
 import org.apache.flink.api.common.cache.DistributedCache;
@@ -170,6 +182,7 @@
     when(runtimeContext.getDistributedCache()).thenReturn(distributedCache);
     when(stageContext.getStageBundleFactory(any())).thenReturn(stageBundleFactory);
     when(processBundleDescriptor.getTimerSpecs()).thenReturn(Collections.emptyMap());
+    when(processBundleDescriptor.getBagUserStateSpecs()).thenReturn(Collections.emptyMap());
     when(stageBundleFactory.getProcessBundleDescriptor()).thenReturn(processBundleDescriptor);
   }
 
@@ -640,6 +653,134 @@
   }
 
   @Test
+  public void testCacheTokenHandling() throws Exception {
+    InMemoryStateInternals test = InMemoryStateInternals.forKey("test");
+    KeyedStateBackend<ByteBuffer> stateBackend = FlinkStateInternalsTest.createStateBackend();
+
+    // For user state, the cache token is valid for the lifetime of the operator
+    for (String expectedToken : new String[] {"first token", "second token"}) {
+      final IdGenerator cacheTokenGenerator = () -> expectedToken;
+      ExecutableStageDoFnOperator.BagUserStateFactory<ByteString, Integer, GlobalWindow>
+          bagUserStateFactory =
+              new ExecutableStageDoFnOperator.BagUserStateFactory<>(
+                  cacheTokenGenerator, test, stateBackend, NoopLock.get());
+
+      ByteString key1 = ByteString.copyFrom("key1", Charsets.UTF_8);
+      ByteString key2 = ByteString.copyFrom("key2", Charsets.UTF_8);
+
+      Map<String, Map<String, ProcessBundleDescriptors.BagUserStateSpec>> userStateMapMock =
+          Mockito.mock(Map.class);
+      Map<String, ProcessBundleDescriptors.BagUserStateSpec> transformMap = Mockito.mock(Map.class);
+
+      final String userState1 = "userstate1";
+      ProcessBundleDescriptors.BagUserStateSpec bagUserStateSpec1 = mockBagUserState(userState1);
+      when(transformMap.get(userState1)).thenReturn(bagUserStateSpec1);
+
+      final String userState2 = "userstate2";
+      ProcessBundleDescriptors.BagUserStateSpec bagUserStateSpec2 = mockBagUserState(userState2);
+      when(transformMap.get(userState2)).thenReturn(bagUserStateSpec2);
+
+      when(userStateMapMock.get(anyString())).thenReturn(transformMap);
+      when(processBundleDescriptor.getBagUserStateSpecs()).thenReturn(userStateMapMock);
+      StateRequestHandler stateRequestHandler =
+          StateRequestHandlers.forBagUserStateHandlerFactory(
+              processBundleDescriptor, bagUserStateFactory);
+
+      // There should be no cache token available before any requests have been made
+      assertThat(stateRequestHandler.getCacheTokens(), iterableWithSize(0));
+
+      // Make a request to generate initial cache token
+      stateRequestHandler.handle(getRequest(key1, userState1));
+      BeamFnApi.ProcessBundleRequest.CacheToken cacheTokenStruct =
+          Iterables.getOnlyElement(stateRequestHandler.getCacheTokens());
+      assertThat(cacheTokenStruct.hasUserState(), is(true));
+      ByteString cacheToken = cacheTokenStruct.getToken();
+      final ByteString expectedCacheToken =
+          ByteString.copyFrom(expectedToken.getBytes(Charsets.UTF_8));
+      assertThat(cacheToken, is(expectedCacheToken));
+
+      List<RequestGenerator> generators =
+          Arrays.asList(
+              ExecutableStageDoFnOperatorTest::getRequest,
+              ExecutableStageDoFnOperatorTest::getAppend,
+              ExecutableStageDoFnOperatorTest::getClear);
+
+      for (RequestGenerator req : generators) {
+        // For every state request, the token remains unchanged
+        stateRequestHandler.handle(req.makeRequest(key1, userState1));
+        assertThat(
+            Iterables.getOnlyElement(stateRequestHandler.getCacheTokens()).getToken(),
+            is(expectedCacheToken));
+
+        // The token is still valid for another key in the same key range
+        stateRequestHandler.handle(req.makeRequest(key2, userState1));
+        assertThat(
+            Iterables.getOnlyElement(stateRequestHandler.getCacheTokens()).getToken(),
+            is(expectedCacheToken));
+
+        // The token is still valid for another state cell in the same key range
+        stateRequestHandler.handle(req.makeRequest(key2, userState2));
+        assertThat(
+            Iterables.getOnlyElement(stateRequestHandler.getCacheTokens()).getToken(),
+            is(expectedCacheToken));
+      }
+    }
+  }
+
+  private interface RequestGenerator {
+    BeamFnApi.StateRequest makeRequest(ByteString key, String userStateId) throws Exception;
+  }
+
+  private static BeamFnApi.StateRequest getRequest(ByteString key, String userStateId)
+      throws Exception {
+    BeamFnApi.StateRequest.Builder builder = stateRequest(key, userStateId);
+    builder.setGet(BeamFnApi.StateGetRequest.newBuilder().build());
+    return builder.build();
+  }
+
+  private static BeamFnApi.StateRequest getAppend(ByteString key, String userStateId)
+      throws Exception {
+    BeamFnApi.StateRequest.Builder builder = stateRequest(key, userStateId);
+    builder.setAppend(BeamFnApi.StateAppendRequest.newBuilder().build());
+    return builder.build();
+  }
+
+  private static BeamFnApi.StateRequest getClear(ByteString key, String userStateId)
+      throws Exception {
+    BeamFnApi.StateRequest.Builder builder = stateRequest(key, userStateId);
+    builder.setClear(BeamFnApi.StateClearRequest.newBuilder().build());
+    return builder.build();
+  }
+
+  private static BeamFnApi.StateRequest.Builder stateRequest(ByteString key, String userStateId)
+      throws Exception {
+    return BeamFnApi.StateRequest.newBuilder()
+        .setStateKey(
+            BeamFnApi.StateKey.newBuilder()
+                .setBagUserState(
+                    BeamFnApi.StateKey.BagUserState.newBuilder()
+                        .setTransformId("transform")
+                        .setKey(key)
+                        .setUserStateId(userStateId)
+                        .setWindow(
+                            ByteString.copyFrom(
+                                CoderUtils.encodeToByteArray(
+                                    GlobalWindow.Coder.INSTANCE, GlobalWindow.INSTANCE)))
+                        .build()));
+  }
+
+  private static ProcessBundleDescriptors.BagUserStateSpec mockBagUserState(String userStateId) {
+    ProcessBundleDescriptors.BagUserStateSpec bagUserStateMock =
+        Mockito.mock(ProcessBundleDescriptors.BagUserStateSpec.class);
+    when(bagUserStateMock.keyCoder()).thenReturn(ByteStringCoder.of());
+    when(bagUserStateMock.valueCoder()).thenReturn(ByteStringCoder.of());
+    when(bagUserStateMock.transformId()).thenReturn("transformId");
+    when(bagUserStateMock.userStateId()).thenReturn(userStateId);
+    when(bagUserStateMock.windowCoder()).thenReturn(GlobalWindow.Coder.INSTANCE);
+    return bagUserStateMock;
+  }
+
+  @Test
   public void testSerialization() {
     WindowedValue.ValueOnlyWindowedValueCoder<Integer> coder =
         WindowedValue.getValueOnlyCoder(VarIntCoder.of());
diff --git a/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/wrappers/streaming/FlinkKeyUtilsTest.java b/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/wrappers/streaming/FlinkKeyUtilsTest.java
index 06b5d01..274b2bf 100644
--- a/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/wrappers/streaming/FlinkKeyUtilsTest.java
+++ b/runners/flink/src/test/java/org/apache/beam/runners/flink/translation/wrappers/streaming/FlinkKeyUtilsTest.java
@@ -21,11 +21,13 @@
 import static org.hamcrest.MatcherAssert.assertThat;
 import static org.hamcrest.core.Is.is;
 
-import com.google.protobuf.ByteString;
 import java.nio.ByteBuffer;
+import org.apache.beam.sdk.coders.Coder;
 import org.apache.beam.sdk.coders.StringUtf8Coder;
 import org.apache.beam.sdk.coders.VoidCoder;
-import org.apache.beam.sdk.extensions.protobuf.ByteStringCoder;
+import org.apache.beam.sdk.util.CoderUtils;
+import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.ByteString;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Charsets;
 import org.junit.Test;
 
 /** Tests for {@link FlinkKeyUtils}. */
@@ -52,12 +54,20 @@
   @Test
   @SuppressWarnings("ByteBufferBackingArray")
   public void testCoderContext() throws Exception {
-    byte[] bytes = {1, 1, 1};
-    ByteString key = ByteString.copyFrom(bytes);
-    ByteStringCoder coder = ByteStringCoder.of();
+    String input = "hello world";
+    Coder<String> coder = StringUtf8Coder.of();
 
-    ByteBuffer encoded = FlinkKeyUtils.encodeKey(key, coder);
-    // Ensure outer context is used where no length encoding is used.
-    assertThat(encoded.array(), is(bytes));
+    ByteBuffer encoded = FlinkKeyUtils.encodeKey(input, coder);
+    // Ensure NESTED context is used
+    assertThat(
+        encoded.array(), is(CoderUtils.encodeToByteArray(coder, input, Coder.Context.NESTED)));
+  }
+
+  @Test
+  @SuppressWarnings("ByteBufferBackingArray")
+  public void testFromEncodedKey() {
+    ByteString input = ByteString.copyFrom("hello world".getBytes(Charsets.UTF_8));
+    ByteBuffer encodedKey = FlinkKeyUtils.fromEncodedKey(input);
+    assertThat(encodedKey.array(), is(input.toByteArray()));
   }
 }
diff --git a/runners/gearpump/build.gradle b/runners/gearpump/build.gradle
index 221a83f..c1744f7 100644
--- a/runners/gearpump/build.gradle
+++ b/runners/gearpump/build.gradle
@@ -19,7 +19,7 @@
 import groovy.json.JsonOutput
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.runners.gearpump')
 
 description = "Apache Beam :: Runners :: Gearpump"
 
diff --git a/runners/google-cloud-dataflow-java/build.gradle b/runners/google-cloud-dataflow-java/build.gradle
index 831498a..1569d29 100644
--- a/runners/google-cloud-dataflow-java/build.gradle
+++ b/runners/google-cloud-dataflow-java/build.gradle
@@ -19,7 +19,7 @@
 import groovy.json.JsonOutput
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.runners.dataflow')
 
 description = "Apache Beam :: Runners :: Google Cloud Dataflow"
 
diff --git a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowPipelineTranslator.java b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowPipelineTranslator.java
index c0c5e55..41e5cbb 100644
--- a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowPipelineTranslator.java
+++ b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowPipelineTranslator.java
@@ -74,6 +74,7 @@
 import org.apache.beam.sdk.coders.IterableCoder;
 import org.apache.beam.sdk.extensions.gcp.options.GcpOptions;
 import org.apache.beam.sdk.io.Read;
+import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.options.StreamingOptions;
 import org.apache.beam.sdk.runners.AppliedPTransform;
 import org.apache.beam.sdk.runners.TransformHierarchy;
@@ -121,10 +122,17 @@
   private static final Logger LOG = LoggerFactory.getLogger(DataflowPipelineTranslator.class);
   private static final ObjectMapper MAPPER = new ObjectMapper();
 
-  private static byte[] serializeWindowingStrategy(WindowingStrategy<?, ?> windowingStrategy) {
+  private static byte[] serializeWindowingStrategy(
+      WindowingStrategy<?, ?> windowingStrategy, PipelineOptions options) {
     try {
       SdkComponents sdkComponents = SdkComponents.create();
-      sdkComponents.registerEnvironment(Environments.JAVA_SDK_HARNESS_ENVIRONMENT);
+
+      String workerHarnessContainerImageURL =
+          DataflowRunner.getContainerImageForJob(options.as(DataflowPipelineOptions.class));
+      RunnerApi.Environment defaultEnvironmentForDataflow =
+          Environments.createDockerEnvironment(workerHarnessContainerImageURL);
+      sdkComponents.registerEnvironment(defaultEnvironmentForDataflow);
+
       return WindowingStrategyTranslation.toMessageProto(windowingStrategy, sdkComponents)
           .toByteArray();
     } catch (Exception e) {
@@ -164,7 +172,13 @@
 
     // Capture the sdkComponents for look up during step translations
     SdkComponents sdkComponents = SdkComponents.create();
-    sdkComponents.registerEnvironment(Environments.JAVA_SDK_HARNESS_ENVIRONMENT);
+
+    String workerHarnessContainerImageURL =
+        DataflowRunner.getContainerImageForJob(options.as(DataflowPipelineOptions.class));
+    RunnerApi.Environment defaultEnvironmentForDataflow =
+        Environments.createDockerEnvironment(workerHarnessContainerImageURL);
+    sdkComponents.registerEnvironment(defaultEnvironmentForDataflow);
+
     RunnerApi.Pipeline pipelineProto = PipelineTranslation.toProto(pipeline, sdkComponents, true);
 
     LOG.debug("Portable pipeline proto:\n{}", TextFormat.printToString(pipelineProto));
@@ -754,7 +768,8 @@
             WindowingStrategy<?, ?> windowingStrategy = input.getWindowingStrategy();
             stepContext.addInput(
                 PropertyNames.WINDOWING_STRATEGY,
-                byteArrayToJsonString(serializeWindowingStrategy(windowingStrategy)));
+                byteArrayToJsonString(
+                    serializeWindowingStrategy(windowingStrategy, context.getPipelineOptions())));
             stepContext.addInput(
                 PropertyNames.IS_MERGING_WINDOW_FN,
                 !windowingStrategy.getWindowFn().isNonMerging());
@@ -898,7 +913,8 @@
             stepContext.addInput(PropertyNames.DISALLOW_COMBINER_LIFTING, !allowCombinerLifting);
             stepContext.addInput(
                 PropertyNames.SERIALIZED_FN,
-                byteArrayToJsonString(serializeWindowingStrategy(windowingStrategy)));
+                byteArrayToJsonString(
+                    serializeWindowingStrategy(windowingStrategy, context.getPipelineOptions())));
             stepContext.addInput(
                 PropertyNames.IS_MERGING_WINDOW_FN,
                 !windowingStrategy.getWindowFn().isNonMerging());
@@ -1039,7 +1055,8 @@
             stepContext.addOutput(PropertyNames.OUTPUT, context.getOutput(transform));
 
             WindowingStrategy<?, ?> strategy = context.getOutput(transform).getWindowingStrategy();
-            byte[] serializedBytes = serializeWindowingStrategy(strategy);
+            byte[] serializedBytes =
+                serializeWindowingStrategy(strategy, context.getPipelineOptions());
             String serializedJson = byteArrayToJsonString(serializedBytes);
             stepContext.addInput(PropertyNames.SERIALIZED_FN, serializedJson);
           }
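Both translator code paths above now register a Docker-based default environment derived from the job's worker harness image. The registration, condensed for illustration (assumes options is a PipelineOptions carrying DataflowPipelineOptions; not part of the patch):

    String image =
        DataflowRunner.getContainerImageForJob(options.as(DataflowPipelineOptions.class));
    RunnerApi.Environment defaultEnvironment = Environments.createDockerEnvironment(image);
    SdkComponents sdkComponents = SdkComponents.create();
    sdkComponents.registerEnvironment(defaultEnvironment);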
diff --git a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/options/DataflowPipelineWorkerPoolOptions.java b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/options/DataflowPipelineWorkerPoolOptions.java
index da0c6f5..eb313e6 100644
--- a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/options/DataflowPipelineWorkerPoolOptions.java
+++ b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/options/DataflowPipelineWorkerPoolOptions.java
@@ -22,6 +22,7 @@
 import javax.annotation.Nullable;
 import org.apache.beam.runners.dataflow.DataflowRunnerInfo;
 import org.apache.beam.sdk.annotations.Experimental;
+import org.apache.beam.sdk.extensions.gcp.options.GcpOptions;
 import org.apache.beam.sdk.options.Default;
 import org.apache.beam.sdk.options.DefaultValueFactory;
 import org.apache.beam.sdk.options.Description;
@@ -30,7 +31,7 @@
 
 /** Options that are used to configure the Dataflow pipeline worker pool. */
 @Description("Options that are used to configure the Dataflow pipeline worker pool.")
-public interface DataflowPipelineWorkerPoolOptions extends PipelineOptions {
+public interface DataflowPipelineWorkerPoolOptions extends GcpOptions {
   /**
    * Number of workers to use when executing the Dataflow job. Note that selection of an autoscaling
    * algorithm other then {@code NONE} will affect the size of the worker pool. If left unspecified,
@@ -167,20 +168,6 @@
   void setSubnetwork(String value);
 
   /**
-   * GCE <a href="https://developers.google.com/compute/docs/zones" >availability zone</a> for
-   * launching workers.
-   *
-   * <p>Default is up to the Dataflow service.
-   */
-  @Description(
-      "GCE availability zone for launching workers. See "
-          + "https://developers.google.com/compute/docs/zones for a list of valid options. "
-          + "Default is up to the Dataflow service.")
-  String getZone();
-
-  void setZone(String value);
-
-  /**
    * Machine type to create Dataflow worker VMs as.
    *
    * <p>See <a href="https://cloud.google.com/compute/docs/machine-types">GCE machine types</a> for
diff --git a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/SchemaCoderCloudObjectTranslator.java b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/SchemaCoderCloudObjectTranslator.java
index 2395f12..2ff8c77 100644
--- a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/SchemaCoderCloudObjectTranslator.java
+++ b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/SchemaCoderCloudObjectTranslator.java
@@ -18,7 +18,7 @@
 package org.apache.beam.runners.dataflow.util;
 
 import java.io.IOException;
-import org.apache.beam.model.pipeline.v1.RunnerApi;
+import org.apache.beam.model.pipeline.v1.SchemaApi;
 import org.apache.beam.runners.core.construction.SchemaTranslation;
 import org.apache.beam.runners.core.construction.SdkComponents;
 import org.apache.beam.sdk.schemas.Schema;
@@ -52,7 +52,7 @@
         base,
         SCHEMA,
         StringUtils.byteArrayToJsonString(
-            SchemaTranslation.toProto(target.getSchema()).toByteArray()));
+            SchemaTranslation.schemaToProto(target.getSchema()).toByteArray()));
     return base;
   }
 
@@ -72,8 +72,8 @@
                   StringUtils.jsonStringToByteArray(
                       Structs.getString(cloudObject, FROM_ROW_FUNCTION)),
                   "fromRowFunction");
-      RunnerApi.Schema protoSchema =
-          RunnerApi.Schema.parseFrom(
+      SchemaApi.Schema protoSchema =
+          SchemaApi.Schema.parseFrom(
               StringUtils.jsonStringToByteArray(Structs.getString(cloudObject, SCHEMA)));
       Schema schema = SchemaTranslation.fromProto(protoSchema);
       return SchemaCoder.of(schema, toRowFunction, fromRowFunction);
diff --git a/runners/google-cloud-dataflow-java/src/test/java/org/apache/beam/runners/dataflow/DataflowPipelineTranslatorTest.java b/runners/google-cloud-dataflow-java/src/test/java/org/apache/beam/runners/dataflow/DataflowPipelineTranslatorTest.java
index f8cc7b6..85b1e22 100644
--- a/runners/google-cloud-dataflow-java/src/test/java/org/apache/beam/runners/dataflow/DataflowPipelineTranslatorTest.java
+++ b/runners/google-cloud-dataflow-java/src/test/java/org/apache/beam/runners/dataflow/DataflowPipelineTranslatorTest.java
@@ -50,6 +50,8 @@
 import java.util.Set;
 import org.apache.beam.model.pipeline.v1.RunnerApi;
 import org.apache.beam.model.pipeline.v1.RunnerApi.Components;
+import org.apache.beam.model.pipeline.v1.RunnerApi.DockerPayload;
+import org.apache.beam.model.pipeline.v1.RunnerApi.Environment;
 import org.apache.beam.model.pipeline.v1.RunnerApi.ParDoPayload;
 import org.apache.beam.runners.core.construction.PTransformTranslation;
 import org.apache.beam.runners.core.construction.ParDoTranslation;
@@ -956,6 +958,34 @@
     assertEquals(expectedFn2DisplayData, ImmutableSet.copyOf(fn2displayData));
   }
 
+  /**
+   * Tests that when the {@link DataflowPipelineOptions#setWorkerHarnessContainerImage(String)}
+   * pipeline option is set, {@link DataflowRunner} sets that value as the {@link
+   * DockerPayload#getContainerImage()} of the default {@link Environment} used when generating the
+   * model pipeline proto.
+   */
+  @Test
+  public void testSetWorkerHarnessContainerImageInPipelineProto() throws Exception {
+    DataflowPipelineOptions options = buildPipelineOptions();
+    String containerImage = "gcr.io/IMAGE/foo";
+    options.as(DataflowPipelineOptions.class).setWorkerHarnessContainerImage(containerImage);
+
+    JobSpecification specification =
+        DataflowPipelineTranslator.fromOptions(options)
+            .translate(
+                Pipeline.create(options),
+                DataflowRunner.fromOptions(options),
+                Collections.emptyList());
+    RunnerApi.Pipeline pipelineProto = specification.getPipelineProto();
+
+    assertEquals(1, pipelineProto.getComponents().getEnvironmentsCount());
+    Environment defaultEnvironment =
+        Iterables.getOnlyElement(pipelineProto.getComponents().getEnvironmentsMap().values());
+
+    DockerPayload payload = DockerPayload.parseFrom(defaultEnvironment.getPayload());
+    assertEquals(DataflowRunner.getContainerImageForJob(options), payload.getContainerImage());
+  }
+
   private static void assertAllStepOutputsHaveUniqueIds(Job job) throws Exception {
     List<String> outputIds = new ArrayList<>();
     for (Step step : job.getSteps()) {
diff --git a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/DataflowOperationContext.java b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/DataflowOperationContext.java
index af3cc95..18ce7c5 100644
--- a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/DataflowOperationContext.java
+++ b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/DataflowOperationContext.java
@@ -30,6 +30,7 @@
 import org.apache.beam.runners.core.SimpleDoFnRunner;
 import org.apache.beam.runners.core.metrics.ExecutionStateTracker;
 import org.apache.beam.runners.core.metrics.ExecutionStateTracker.ExecutionState;
+import org.apache.beam.runners.dataflow.worker.MetricsToCounterUpdateConverter.Kind;
 import org.apache.beam.runners.dataflow.worker.counters.CounterFactory;
 import org.apache.beam.runners.dataflow.worker.counters.NameContext;
 import org.apache.beam.runners.dataflow.worker.logging.DataflowWorkerLoggingInitializer;
@@ -295,7 +296,7 @@
           .setStructuredNameAndMetadata(
               new CounterStructuredNameAndMetadata()
                   .setName(name)
-                  .setMetadata(new CounterMetadata().setKind("SUM")))
+                  .setMetadata(new CounterMetadata().setKind(Kind.SUM.toString())))
           .setCumulative(isCumulative)
           .setInteger(longToSplitInt(value));
     }
diff --git a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/FnApiWindowMappingFn.java b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/FnApiWindowMappingFn.java
index 488bfbf..7e298e7 100644
--- a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/FnApiWindowMappingFn.java
+++ b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/FnApiWindowMappingFn.java
@@ -228,7 +228,7 @@
               .setInstructionId(processRequestInstructionId)
               .setProcessBundle(
                   ProcessBundleRequest.newBuilder()
-                      .setProcessBundleDescriptorReference(registerIfRequired()))
+                      .setProcessBundleDescriptorId(registerIfRequired()))
               .build();
 
       ConcurrentLinkedQueue<WindowedValue<KV<byte[], TargetWindowT>>> outputValue =
diff --git a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/MetricsToCounterUpdateConverter.java b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/MetricsToCounterUpdateConverter.java
index 398c2ec..710fef1 100644
--- a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/MetricsToCounterUpdateConverter.java
+++ b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/MetricsToCounterUpdateConverter.java
@@ -52,6 +52,7 @@
   /** Well-defined {@code kind} strings for use in {@link CounterUpdate} protos. */
   public enum Kind {
     DISTRIBUTION("DISTRIBUTION"),
+    MEAN("MEAN"),
     SUM("SUM");
 
     private final String kind;
diff --git a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/StreamingDataflowWorker.java b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/StreamingDataflowWorker.java
index 070212e..a603d16 100644
--- a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/StreamingDataflowWorker.java
+++ b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/StreamingDataflowWorker.java
@@ -78,6 +78,7 @@
 import org.apache.beam.runners.dataflow.worker.apiary.FixMultiOutputInfosOnParDoInstructions;
 import org.apache.beam.runners.dataflow.worker.counters.Counter;
 import org.apache.beam.runners.dataflow.worker.counters.CounterSet;
+import org.apache.beam.runners.dataflow.worker.counters.CounterUpdateAggregators;
 import org.apache.beam.runners.dataflow.worker.counters.DataflowCounterUpdateExtractor;
 import org.apache.beam.runners.dataflow.worker.counters.NameContext;
 import org.apache.beam.runners.dataflow.worker.graph.CloneAmbiguousFlattensFunction;
@@ -129,6 +130,7 @@
 import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.ByteString;
 import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.TextFormat;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.annotations.VisibleForTesting;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.MoreObjects;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Optional;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Splitter;
@@ -198,6 +200,8 @@
   /** Maximum number of failure stacktraces to report in each update sent to backend. */
   private static final int MAX_FAILURES_TO_REPORT_IN_UPDATE = 1000;
 
+  private final AtomicLong counterAggregationErrorCount = new AtomicLong();
+
   /** Returns whether an exception was caused by a {@link OutOfMemoryError}. */
   private static boolean isOutOfMemoryError(Throwable t) {
     while (t != null) {
@@ -1891,6 +1895,8 @@
       counterUpdates.addAll(
           deltaCounters.extractModifiedDeltaUpdates(DataflowCounterUpdateExtractor.INSTANCE));
       if (hasExperiment(options, "beam_fn_api")) {
+        Map<Object, List<CounterUpdate>> fnApiCounters = new HashMap<>();
+
         while (!this.pendingMonitoringInfos.isEmpty()) {
           final CounterUpdate item = this.pendingMonitoringInfos.poll();
 
@@ -1900,16 +1906,49 @@
           // WorkItem.
           if (item.getCumulative()) {
             item.setCumulative(false);
+            // Group counterUpdates by counterUpdateKey so they can be aggregated before sending
+            // them to the Dataflow service.
+            fnApiCounters
+                .computeIfAbsent(getCounterUpdateKey(item), k -> new ArrayList<>())
+                .add(item);
           } else {
             // In current world all counters coming from FnAPI are cumulative.
             // This is a safety check in case new counter type appears in FnAPI.
             throw new UnsupportedOperationException(
                 "FnApi counters are expected to provide cumulative values."
-                    + " Please, update convertion to delta logic"
+                    + " Please, update conversion to delta logic"
                     + " if non-cumulative counter type is required.");
           }
+        }
 
-          counterUpdates.add(item);
+        // Aggregate counterUpdates with the same counterUpdateKey into a single CounterUpdate
+        // where possible, to avoid excessive I/O when reporting to the Dataflow service.
+        for (List<CounterUpdate> counterUpdateList : fnApiCounters.values()) {
+          if (counterUpdateList.isEmpty()) {
+            continue;
+          }
+          List<CounterUpdate> aggregatedCounterUpdateList =
+              CounterUpdateAggregators.aggregate(counterUpdateList);
+
+          // Log a warning if we encounter enough non-aggregatable counter updates, since this
+          // can lead to a severe performance penalty if the Dataflow service cannot handle the
+          // updates.
+          if (aggregatedCounterUpdateList.size() > 10) {
+            CounterUpdate head = aggregatedCounterUpdateList.get(0);
+            this.counterAggregationErrorCount.getAndIncrement();
+            // Log the warning only when the error count is a power of 2 to avoid spamming.
+            if (this.counterAggregationErrorCount.get() > 10
+                && Long.bitCount(this.counterAggregationErrorCount.get()) == 1) {
+              LOG.warn(
+                  "Found non-aggregated counter updates of size {} with kind {}, this will likely "
+                      + "cause performance degradation and excessive GC if size is large.",
+                  counterUpdateList.size(),
+                  MoreObjects.firstNonNull(
+                      head.getNameAndKind(), head.getStructuredNameAndMetadata()));
+            }
+          }
+
+          counterUpdates.addAll(aggregatedCounterUpdateList);
         }
       }
     }
diff --git a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/counters/CounterUpdateAggregator.java b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/counters/CounterUpdateAggregator.java
new file mode 100644
index 0000000..c6e393e
--- /dev/null
+++ b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/counters/CounterUpdateAggregator.java
@@ -0,0 +1,38 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.runners.dataflow.worker.counters;
+
+import com.google.api.services.dataflow.model.CounterUpdate;
+import java.util.List;
+
+/**
+ * CounterUpdateAggregator performs aggregation over a list of CounterUpdates and returns the
+ * combined result.
+ */
+interface CounterUpdateAggregator {
+
+  /**
+   * Implementations of the aggregate function should take the list of CounterUpdates and return a
+   * single combined CounterUpdate object. Reporting the aggregated result to Dataflow should have
+   * the same effect as reporting the elements of the list individually.
+   *
+   * @param counterUpdates CounterUpdates to aggregate.
+   * @return Aggregated CounterUpdate.
+   */
+  CounterUpdate aggregate(List<CounterUpdate> counterUpdates);
+}
diff --git a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/counters/CounterUpdateAggregators.java b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/counters/CounterUpdateAggregators.java
new file mode 100644
index 0000000..32f99e7
--- /dev/null
+++ b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/counters/CounterUpdateAggregators.java
@@ -0,0 +1,75 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.runners.dataflow.worker.counters;
+
+import com.google.api.services.dataflow.model.CounterUpdate;
+import com.google.common.collect.ImmutableMap;
+import java.util.Collections;
+import java.util.List;
+import java.util.Map;
+import org.apache.beam.runners.dataflow.worker.MetricsToCounterUpdateConverter.Kind;
+
+public class CounterUpdateAggregators {
+
+  private static final Map<String, CounterUpdateAggregator> aggregators =
+      ImmutableMap.of(
+          Kind.SUM.toString(), new SumCounterUpdateAggregator(),
+          Kind.MEAN.toString(), new MeanCounterUpdateAggregator(),
+          Kind.DISTRIBUTION.toString(), new DistributionCounterUpdateAggregator());
+
+  private static String getCounterUpdateKind(CounterUpdate counterUpdate) {
+    if (counterUpdate.getStructuredNameAndMetadata() != null
+        && counterUpdate.getStructuredNameAndMetadata().getMetadata() != null) {
+      return counterUpdate.getStructuredNameAndMetadata().getMetadata().getKind();
+    }
+    if (counterUpdate.getNameAndKind() != null) {
+      return counterUpdate.getNameAndKind().getKind();
+    }
+    throw new IllegalArgumentException(
+        "CounterUpdate must have either StructuredNameAndMetadata or NameAndKind.");
+  }
+
+  /**
+   * Tries to aggregate a List of CounterUpdates. The first CounterUpdate entry of the List is
+   * examined with {@link #getCounterUpdateKind(CounterUpdate)} to identify the CounterUpdate kind
+   * and find a suitable {@link CounterUpdateAggregator}; if there is no suitable aggregator, the
+   * original list is returned.
+   *
+   * <p>Note that this method assumes the CounterUpdate elements in this list have the same {@link
+   * com.google.api.services.dataflow.model.CounterStructuredNameAndMetadata
+   * StructuredNameAndMetadata} or {@link com.google.api.services.dataflow.model.NameAndKind
+   * NameAndKind}, and that the value type is the same across all the elements.
+   *
+   * @param counterUpdates List of CounterUpdates to be aggregated.
+   * @return A singleton list containing the combined CounterUpdate if it is possible to aggregate
+   *     the elements, otherwise the original list.
+   */
+  public static List<CounterUpdate> aggregate(List<CounterUpdate> counterUpdates) {
+    if (counterUpdates == null || counterUpdates.isEmpty()) {
+      return counterUpdates;
+    }
+    CounterUpdate first = counterUpdates.get(0);
+    String kind = getCounterUpdateKind(first);
+    if (aggregators.containsKey(kind)) {
+      // Return a list containing the single combined CounterUpdate.
+      return Collections.singletonList(aggregators.get(kind).aggregate(counterUpdates));
+    }
+    // Not able to aggregate the counter updates; return them unchanged.
+    return counterUpdates;
+  }
+}
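A hedged usage sketch of the new aggregation entry point; the counter name and values are illustrative, and it assumes SumCounterUpdateAggregator adds the integer values of the grouped updates (not part of the patch):

    List<CounterUpdate> updates = new ArrayList<>();
    for (long value : new long[] {1, 2, 3}) {
      updates.add(
          new CounterUpdate()
              .setNameAndKind(
                  new NameAndKind().setName("elements-processed").setKind(Kind.SUM.toString()))
              .setInteger(DataflowCounterUpdateExtractor.longToSplitInt(value)));
    }
    List<CounterUpdate> aggregated = CounterUpdateAggregators.aggregate(updates);
    // Expected: a single CounterUpdate whose integer value is 6; updates of a kind without a
    // registered aggregator would be returned unchanged.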
diff --git a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/counters/DistributionCounterUpdateAggregator.java b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/counters/DistributionCounterUpdateAggregator.java
new file mode 100644
index 0000000..b850864
--- /dev/null
+++ b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/counters/DistributionCounterUpdateAggregator.java
@@ -0,0 +1,65 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.runners.dataflow.worker.counters;
+
+import static org.apache.beam.runners.dataflow.worker.counters.DataflowCounterUpdateExtractor.longToSplitInt;
+import static org.apache.beam.runners.dataflow.worker.counters.DataflowCounterUpdateExtractor.splitIntToLong;
+
+import com.google.api.services.dataflow.model.CounterUpdate;
+import com.google.api.services.dataflow.model.DistributionUpdate;
+import java.util.List;
+
+public class DistributionCounterUpdateAggregator implements CounterUpdateAggregator {
+
+  @Override
+  public CounterUpdate aggregate(List<CounterUpdate> counterUpdates) {
+
+    if (counterUpdates == null || counterUpdates.isEmpty()) {
+      return null;
+    }
+    if (counterUpdates.stream().anyMatch(c -> c.getDistribution() == null)) {
+      throw new UnsupportedOperationException(
+          "Aggregating DISTRIBUTION counter updates over non-distribution type is not implemented.");
+    }
+    CounterUpdate initial = counterUpdates.remove(0);
+    return counterUpdates.stream()
+        .reduce(
+            initial,
+            (first, second) ->
+                first.setDistribution(
+                    new DistributionUpdate()
+                        .setCount(
+                            longToSplitInt(
+                                splitIntToLong(first.getDistribution().getCount())
+                                    + splitIntToLong(second.getDistribution().getCount())))
+                        .setMax(
+                            longToSplitInt(
+                                Math.max(
+                                    splitIntToLong(first.getDistribution().getMax()),
+                                    splitIntToLong(second.getDistribution().getMax()))))
+                        .setMin(
+                            longToSplitInt(
+                                Math.min(
+                                    splitIntToLong(first.getDistribution().getMin()),
+                                    splitIntToLong(second.getDistribution().getMin()))))
+                        .setSum(
+                            longToSplitInt(
+                                splitIntToLong(first.getDistribution().getSum())
+                                    + splitIntToLong(second.getDistribution().getSum())))));
+  }
+}
diff --git a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/counters/MeanCounterUpdateAggregator.java b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/counters/MeanCounterUpdateAggregator.java
new file mode 100644
index 0000000..4cc2a46
--- /dev/null
+++ b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/counters/MeanCounterUpdateAggregator.java
@@ -0,0 +1,55 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.runners.dataflow.worker.counters;
+
+import static org.apache.beam.runners.dataflow.worker.counters.DataflowCounterUpdateExtractor.longToSplitInt;
+import static org.apache.beam.runners.dataflow.worker.counters.DataflowCounterUpdateExtractor.splitIntToLong;
+
+import com.google.api.services.dataflow.model.CounterUpdate;
+import com.google.api.services.dataflow.model.IntegerMean;
+import java.util.List;
+
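+/**
+ * Aggregates a list of {@code MEAN} {@link CounterUpdate}s into a single update by summing the
+ * {@link IntegerMean} counts and sums of all updates.
+ */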
+public class MeanCounterUpdateAggregator implements CounterUpdateAggregator {
+
+  @Override
+  public CounterUpdate aggregate(List<CounterUpdate> counterUpdates) {
+    if (counterUpdates == null || counterUpdates.isEmpty()) {
+      return null;
+    }
+    if (counterUpdates.stream().anyMatch(c -> c.getIntegerMean() == null)) {
+      throw new UnsupportedOperationException(
+          "Aggregating MEAN counter updates over non-integerMean type is not implemented.");
+    }
+
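+    // Use the first update as the reduction seed and fold the remaining updates into it by
+    // summing the mean's count and sum; note that remove(0) mutates the input list.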
+    CounterUpdate initial = counterUpdates.remove(0);
+    return counterUpdates.stream()
+        .reduce(
+            initial,
+            (first, second) ->
+                first.setIntegerMean(
+                    new IntegerMean()
+                        .setCount(
+                            longToSplitInt(
+                                splitIntToLong(first.getIntegerMean().getCount())
+                                    + splitIntToLong(second.getIntegerMean().getCount())))
+                        .setSum(
+                            longToSplitInt(
+                                splitIntToLong(first.getIntegerMean().getSum())
+                                    + splitIntToLong(second.getIntegerMean().getSum())))));
+  }
+}
diff --git a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/counters/SumCounterUpdateAggregator.java b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/counters/SumCounterUpdateAggregator.java
new file mode 100644
index 0000000..bff2489
--- /dev/null
+++ b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/counters/SumCounterUpdateAggregator.java
@@ -0,0 +1,47 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.runners.dataflow.worker.counters;
+
+import static org.apache.beam.runners.dataflow.worker.counters.DataflowCounterUpdateExtractor.longToSplitInt;
+import static org.apache.beam.runners.dataflow.worker.counters.DataflowCounterUpdateExtractor.splitIntToLong;
+
+import com.google.api.services.dataflow.model.CounterUpdate;
+import java.util.List;
+
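+/**
+ * Aggregates a list of {@code SUM} {@link CounterUpdate}s into a single update by summing the
+ * integer values of all updates.
+ */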
+public class SumCounterUpdateAggregator implements CounterUpdateAggregator {
+
+  @Override
+  public CounterUpdate aggregate(List<CounterUpdate> counterUpdates) {
+    if (counterUpdates == null || counterUpdates.isEmpty()) {
+      return null;
+    }
+    if (counterUpdates.stream().anyMatch(c -> c.getInteger() == null)) {
+      throw new UnsupportedOperationException(
+          "Aggregating SUM counter updates over non-integer type is not implemented.");
+    }
+
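+    // Use the first update as the reduction seed and fold the remaining updates into it by
+    // summing the integer values; note that remove(0) mutates the input list.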
+    CounterUpdate initial = counterUpdates.remove(0);
+    return counterUpdates.stream()
+        .reduce(
+            initial,
+            (first, second) ->
+                first.setInteger(
+                    longToSplitInt(
+                        splitIntToLong(first.getInteger()) + splitIntToLong(second.getInteger()))));
+  }
+}
diff --git a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/ElementCountMonitoringInfoToCounterUpdateTransformer.java b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/ElementCountMonitoringInfoToCounterUpdateTransformer.java
index 86325a5..6dfc20a 100644
--- a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/ElementCountMonitoringInfoToCounterUpdateTransformer.java
+++ b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/ElementCountMonitoringInfoToCounterUpdateTransformer.java
@@ -25,6 +25,7 @@
 import org.apache.beam.model.pipeline.v1.MetricsApi.MonitoringInfo;
 import org.apache.beam.runners.core.metrics.MonitoringInfoConstants;
 import org.apache.beam.runners.core.metrics.SpecMonitoringInfoValidator;
+import org.apache.beam.runners.dataflow.worker.MetricsToCounterUpdateConverter.Kind;
 import org.apache.beam.runners.dataflow.worker.counters.DataflowCounterUpdateExtractor;
 import org.apache.beam.runners.dataflow.worker.counters.NameContext;
 import org.slf4j.Logger;
@@ -102,7 +103,7 @@
 
     String counterName = pcollectionName + "-ElementCount";
     NameAndKind name = new NameAndKind();
-    name.setName(counterName).setKind("SUM");
+    name.setName(counterName).setKind(Kind.SUM.toString());
 
     return new CounterUpdate()
         .setNameAndKind(name)
diff --git a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/MSecMonitoringInfoToCounterUpdateTransformer.java b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/MSecMonitoringInfoToCounterUpdateTransformer.java
index b0c3936..feaf1b4 100644
--- a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/MSecMonitoringInfoToCounterUpdateTransformer.java
+++ b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/MSecMonitoringInfoToCounterUpdateTransformer.java
@@ -29,6 +29,7 @@
 import org.apache.beam.runners.core.metrics.MonitoringInfoConstants;
 import org.apache.beam.runners.core.metrics.SpecMonitoringInfoValidator;
 import org.apache.beam.runners.dataflow.worker.DataflowExecutionContext.DataflowStepContext;
+import org.apache.beam.runners.dataflow.worker.MetricsToCounterUpdateConverter.Kind;
 import org.apache.beam.runners.dataflow.worker.counters.DataflowCounterUpdateExtractor;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.annotations.VisibleForTesting;
 import org.slf4j.Logger;
@@ -134,7 +135,7 @@
                 .setName(counterName)
                 .setOriginalStepName(stepContext.getNameContext().originalName())
                 .setExecutionStepName(stepContext.getNameContext().stageName()))
-        .setMetadata(new CounterMetadata().setKind("SUM"));
+        .setMetadata(new CounterMetadata().setKind(Kind.SUM.toString()));
 
     return new CounterUpdate()
         .setStructuredNameAndMetadata(name)
diff --git a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/MeanByteCountMonitoringInfoToCounterUpdateTransformer.java b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/MeanByteCountMonitoringInfoToCounterUpdateTransformer.java
index bd332ec..20eeaba 100644
--- a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/MeanByteCountMonitoringInfoToCounterUpdateTransformer.java
+++ b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/MeanByteCountMonitoringInfoToCounterUpdateTransformer.java
@@ -29,6 +29,7 @@
 import org.apache.beam.model.pipeline.v1.MetricsApi.MonitoringInfo;
 import org.apache.beam.runners.core.metrics.MonitoringInfoConstants;
 import org.apache.beam.runners.core.metrics.SpecMonitoringInfoValidator;
+import org.apache.beam.runners.dataflow.worker.MetricsToCounterUpdateConverter.Kind;
 import org.apache.beam.runners.dataflow.worker.counters.NameContext;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
@@ -108,7 +109,7 @@
 
     String counterName = pcollectionName + "-MeanByteCount";
     NameAndKind name = new NameAndKind();
-    name.setName(counterName).setKind("MEAN");
+    name.setName(counterName).setKind(Kind.MEAN.toString());
 
     return new CounterUpdate()
         .setNameAndKind(name)
diff --git a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/RegisterAndProcessBundleOperation.java b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/RegisterAndProcessBundleOperation.java
index ce20527..73bbf4b 100644
--- a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/RegisterAndProcessBundleOperation.java
+++ b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/RegisterAndProcessBundleOperation.java
@@ -308,7 +308,7 @@
               .setInstructionId(getProcessBundleInstructionId())
               .setProcessBundle(
                   ProcessBundleRequest.newBuilder()
-                      .setProcessBundleDescriptorReference(
+                      .setProcessBundleDescriptorId(
                           registerRequest.getProcessBundleDescriptor(0).getId()))
               .build();
 
@@ -386,7 +386,7 @@
         InstructionRequest.newBuilder()
             .setInstructionId(idGenerator.getId())
             .setProcessBundleProgress(
-                ProcessBundleProgressRequest.newBuilder().setInstructionReference(processBundleId))
+                ProcessBundleProgressRequest.newBuilder().setInstructionId(processBundleId))
             .build();
 
     return instructionRequestHandler
@@ -499,36 +499,36 @@
         stateRequest.getStateKey().getMultimapSideInput();
 
     SideInputReader sideInputReader =
-        ptransformIdToSideInputReader.get(multimapSideInputStateKey.getPtransformId());
+        ptransformIdToSideInputReader.get(multimapSideInputStateKey.getTransformId());
     checkState(
         sideInputReader != null,
-        String.format("Unknown PTransform '%s'", multimapSideInputStateKey.getPtransformId()));
+        String.format("Unknown PTransform '%s'", multimapSideInputStateKey.getTransformId()));
 
     PCollectionView<Materializations.MultimapView<Object, Object>> view =
         (PCollectionView<Materializations.MultimapView<Object, Object>>)
             ptransformIdToSideInputIdToPCollectionView.get(
-                multimapSideInputStateKey.getPtransformId(),
+                multimapSideInputStateKey.getTransformId(),
                 multimapSideInputStateKey.getSideInputId());
     checkState(
         view != null,
         String.format(
             "Unknown side input '%s' on PTransform '%s'",
             multimapSideInputStateKey.getSideInputId(),
-            multimapSideInputStateKey.getPtransformId()));
+            multimapSideInputStateKey.getTransformId()));
     checkState(
         Materializations.MULTIMAP_MATERIALIZATION_URN.equals(
             view.getViewFn().getMaterialization().getUrn()),
         String.format(
             "Unknown materialization for side input '%s' on PTransform '%s' with urn '%s'",
             multimapSideInputStateKey.getSideInputId(),
-            multimapSideInputStateKey.getPtransformId(),
+            multimapSideInputStateKey.getTransformId(),
             view.getViewFn().getMaterialization().getUrn()));
     checkState(
         view.getCoderInternal() instanceof KvCoder,
         String.format(
             "Materialization of side input '%s' on PTransform '%s' expects %s but received %s.",
             multimapSideInputStateKey.getSideInputId(),
-            multimapSideInputStateKey.getPtransformId(),
+            multimapSideInputStateKey.getTransformId(),
             KvCoder.class.getSimpleName(),
             view.getCoderInternal().getClass().getSimpleName()));
     Coder<Object> keyCoder = ((KvCoder) view.getCoderInternal()).getKeyCoder();
@@ -547,7 +547,7 @@
           String.format(
               "Unable to decode window for side input '%s' on PTransform '%s'.",
               multimapSideInputStateKey.getSideInputId(),
-              multimapSideInputStateKey.getPtransformId()),
+              multimapSideInputStateKey.getTransformId()),
           e);
     }
 
@@ -560,7 +560,7 @@
           String.format(
               "Unable to decode user key for side input '%s' on PTransform '%s'.",
               multimapSideInputStateKey.getSideInputId(),
-              multimapSideInputStateKey.getPtransformId()),
+              multimapSideInputStateKey.getTransformId()),
           e);
     }
 
@@ -578,7 +578,7 @@
           String.format(
               "Unable to encode values for side input '%s' on PTransform '%s'.",
               multimapSideInputStateKey.getSideInputId(),
-              multimapSideInputStateKey.getPtransformId()),
+              multimapSideInputStateKey.getTransformId()),
           e);
     }
   }
@@ -587,10 +587,10 @@
       StateRequest stateRequest) {
     StateKey.BagUserState bagUserStateKey = stateRequest.getStateKey().getBagUserState();
     DataflowStepContext userStepContext =
-        ptransformIdToUserStepContext.get(bagUserStateKey.getPtransformId());
+        ptransformIdToUserStepContext.get(bagUserStateKey.getTransformId());
     checkState(
         userStepContext != null,
-        String.format("Unknown PTransform id '%s'", bagUserStateKey.getPtransformId()));
+        String.format("Unknown PTransform id '%s'", bagUserStateKey.getTransformId()));
     // TODO: We should not be required to hold onto a pointer to the bag states for the
     // user. InMemoryStateInternals assumes that the Java garbage collector does the clean-up work
     // but instead StateInternals should hold its own references and write out any data and
diff --git a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/UserDistributionMonitoringInfoToCounterUpdateTransformer.java b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/UserDistributionMonitoringInfoToCounterUpdateTransformer.java
index cf98d50..abf5a87 100644
--- a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/UserDistributionMonitoringInfoToCounterUpdateTransformer.java
+++ b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/UserDistributionMonitoringInfoToCounterUpdateTransformer.java
@@ -30,6 +30,7 @@
 import org.apache.beam.runners.core.metrics.MonitoringInfoConstants;
 import org.apache.beam.runners.core.metrics.SpecMonitoringInfoValidator;
 import org.apache.beam.runners.dataflow.worker.DataflowExecutionContext.DataflowStepContext;
+import org.apache.beam.runners.dataflow.worker.MetricsToCounterUpdateConverter.Kind;
 import org.apache.beam.runners.dataflow.worker.MetricsToCounterUpdateConverter.Origin;
 import org.apache.beam.runners.dataflow.worker.counters.DataflowCounterUpdateExtractor;
 import org.slf4j.Logger;
@@ -113,7 +114,7 @@
                 .setName(counterName)
                 .setOriginalStepName(stepContext.getNameContext().originalName())
                 .setOriginNamespace(counterNamespace))
-        .setMetadata(new CounterMetadata().setKind("DISTRIBUTION"));
+        .setMetadata(new CounterMetadata().setKind(Kind.DISTRIBUTION.toString()));
 
     return new CounterUpdate()
         .setStructuredNameAndMetadata(name)
diff --git a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/UserMonitoringInfoToCounterUpdateTransformer.java b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/UserMonitoringInfoToCounterUpdateTransformer.java
index 470a1fa..5babdd3 100644
--- a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/UserMonitoringInfoToCounterUpdateTransformer.java
+++ b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/fn/control/UserMonitoringInfoToCounterUpdateTransformer.java
@@ -28,6 +28,7 @@
 import org.apache.beam.runners.core.metrics.MonitoringInfoConstants;
 import org.apache.beam.runners.core.metrics.SpecMonitoringInfoValidator;
 import org.apache.beam.runners.dataflow.worker.DataflowExecutionContext.DataflowStepContext;
+import org.apache.beam.runners.dataflow.worker.MetricsToCounterUpdateConverter.Kind;
 import org.apache.beam.runners.dataflow.worker.MetricsToCounterUpdateConverter.Origin;
 import org.apache.beam.runners.dataflow.worker.counters.DataflowCounterUpdateExtractor;
 import org.slf4j.Logger;
@@ -109,7 +110,7 @@
                 .setName(counterName)
                 .setOriginalStepName(stepContext.getNameContext().originalName())
                 .setOriginNamespace(counterNamespace))
-        .setMetadata(new CounterMetadata().setKind("SUM"));
+        .setMetadata(new CounterMetadata().setKind(Kind.SUM.toString()));
 
     return new CounterUpdate()
         .setStructuredNameAndMetadata(name)
diff --git a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/logging/DataflowWorkerLoggingHandler.java b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/logging/DataflowWorkerLoggingHandler.java
index 0b45e9f..1af00f1 100644
--- a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/logging/DataflowWorkerLoggingHandler.java
+++ b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/logging/DataflowWorkerLoggingHandler.java
@@ -196,13 +196,13 @@
       writeIfNotEmpty("thread", logEntry.getThread());
       writeIfNotEmpty("job", DataflowWorkerLoggingMDC.getJobId());
       // TODO: Write the stage execution information by translating the currently executing
-      // instruction reference to a stage.
+      // instruction id to a stage.
       // writeIfNotNull("stage", ...);
-      writeIfNotEmpty("step", logEntry.getPrimitiveTransformReference());
+      writeIfNotEmpty("step", logEntry.getTransformId());
       writeIfNotEmpty("worker", DataflowWorkerLoggingMDC.getWorkerId());
       // Id should match to id in //depot/google3/third_party/cloud/dataflow/worker/agent/sdk.go
       writeIfNotEmpty("portability_worker_id", DataflowWorkerLoggingMDC.getSdkHarnessId());
-      writeIfNotEmpty("work", logEntry.getInstructionReference());
+      writeIfNotEmpty("work", logEntry.getInstructionId());
       writeIfNotEmpty("logger", logEntry.getLogLocation());
       // TODO: Figure out a way to get exceptions transported across Beam Fn Logging API
       writeIfNotEmpty("exception", logEntry.getTrace());
diff --git a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/BatchModeExecutionContextTest.java b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/BatchModeExecutionContextTest.java
index 6d58f4d..898396d 100644
--- a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/BatchModeExecutionContextTest.java
+++ b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/BatchModeExecutionContextTest.java
@@ -34,6 +34,7 @@
 import org.apache.beam.runners.core.metrics.ExecutionStateTracker;
 import org.apache.beam.runners.core.metrics.ExecutionStateTracker.ExecutionState;
 import org.apache.beam.runners.dataflow.worker.BatchModeExecutionContext.BatchModeExecutionState;
+import org.apache.beam.runners.dataflow.worker.MetricsToCounterUpdateConverter.Kind;
 import org.apache.beam.runners.dataflow.worker.counters.NameContext;
 import org.apache.beam.runners.dataflow.worker.profiler.ScopedProfiler.NoopProfileScope;
 import org.apache.beam.runners.dataflow.worker.profiler.ScopedProfiler.ProfileScope;
@@ -78,7 +79,7 @@
                             .setOriginNamespace("namespace")
                             .setName("some-counter")
                             .setOriginalStepName("originalName"))
-                    .setMetadata(new CounterMetadata().setKind("SUM")))
+                    .setMetadata(new CounterMetadata().setKind(Kind.SUM.toString())))
             .setCumulative(true)
             .setInteger(longToSplitInt(42));
 
@@ -102,7 +103,7 @@
                             .setOriginNamespace("namespace")
                             .setName("uncommitted-counter")
                             .setOriginalStepName("originalName"))
-                    .setMetadata(new CounterMetadata().setKind("SUM")))
+                    .setMetadata(new CounterMetadata().setKind(Kind.SUM.toString())))
             .setCumulative(true)
             .setInteger(longToSplitInt(64));
 
@@ -145,7 +146,7 @@
                             .setOriginNamespace("namespace")
                             .setName("some-distribution")
                             .setOriginalStepName("originalName"))
-                    .setMetadata(new CounterMetadata().setKind("DISTRIBUTION")))
+                    .setMetadata(new CounterMetadata().setKind(Kind.DISTRIBUTION.toString())))
             .setCumulative(true)
             .setDistribution(
                 new DistributionUpdate()
@@ -249,7 +250,7 @@
                         .setOrigin("SYSTEM")
                         .setName(counterName)
                         .setExecutionStepName(stageName))
-                .setMetadata(new CounterMetadata().setKind("SUM")))
+                .setMetadata(new CounterMetadata().setKind(Kind.SUM.toString())))
         .setCumulative(true)
         .setInteger(longToSplitInt(value));
   }
@@ -265,7 +266,7 @@
                         .setName(counterName)
                         .setOriginalStepName(originalStepName)
                         .setExecutionStepName(stageName))
-                .setMetadata(new CounterMetadata().setKind("SUM")))
+                .setMetadata(new CounterMetadata().setKind(Kind.SUM.toString())))
         .setCumulative(true)
         .setInteger(longToSplitInt(value));
   }
diff --git a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/FnApiWindowMappingFnTest.java b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/FnApiWindowMappingFnTest.java
index 08143ed..81d4a67 100644
--- a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/FnApiWindowMappingFnTest.java
+++ b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/FnApiWindowMappingFnTest.java
@@ -159,8 +159,7 @@
                 .build());
       } else if (RequestCase.PROCESS_BUNDLE.equals(request.getRequestCase())) {
         assertEquals(
-            processBundleDescriptorId,
-            request.getProcessBundle().getProcessBundleDescriptorReference());
+            processBundleDescriptorId, request.getProcessBundle().getProcessBundleDescriptorId());
         return CompletableFuture.completedFuture(
             InstructionResponse.newBuilder()
                 .setInstructionId(request.getInstructionId())
diff --git a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/IsmSideInputReaderTest.java b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/IsmSideInputReaderTest.java
index 19a9b10..db981a9 100644
--- a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/IsmSideInputReaderTest.java
+++ b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/IsmSideInputReaderTest.java
@@ -71,6 +71,7 @@
 import org.apache.beam.runners.dataflow.util.RandomAccessData;
 import org.apache.beam.runners.dataflow.worker.DataflowOperationContext.DataflowExecutionState;
 import org.apache.beam.runners.dataflow.worker.ExperimentContext.Experiment;
+import org.apache.beam.runners.dataflow.worker.MetricsToCounterUpdateConverter.Kind;
 import org.apache.beam.runners.dataflow.worker.counters.Counter;
 import org.apache.beam.runners.dataflow.worker.counters.CounterName;
 import org.apache.beam.runners.dataflow.worker.counters.CounterSet;
@@ -1331,7 +1332,7 @@
         new CounterUpdate()
             .setStructuredNameAndMetadata(
                 new CounterStructuredNameAndMetadata()
-                    .setMetadata(new CounterMetadata().setKind("SUM"))
+                    .setMetadata(new CounterMetadata().setKind(Kind.SUM.toString()))
                     .setName(
                         new CounterStructuredName()
                             .setOrigin("SYSTEM")
diff --git a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/StreamingModeExecutionContextTest.java b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/StreamingModeExecutionContextTest.java
index bfaf5c8..03d4376 100644
--- a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/StreamingModeExecutionContextTest.java
+++ b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/StreamingModeExecutionContextTest.java
@@ -47,6 +47,7 @@
 import org.apache.beam.runners.core.metrics.ExecutionStateTracker;
 import org.apache.beam.runners.core.metrics.ExecutionStateTracker.ExecutionState;
 import org.apache.beam.runners.dataflow.worker.DataflowExecutionContext.DataflowExecutionStateTracker;
+import org.apache.beam.runners.dataflow.worker.MetricsToCounterUpdateConverter.Kind;
 import org.apache.beam.runners.dataflow.worker.StreamingModeExecutionContext.StreamingModeExecutionState;
 import org.apache.beam.runners.dataflow.worker.StreamingModeExecutionContext.StreamingModeExecutionStateRegistry;
 import org.apache.beam.runners.dataflow.worker.counters.CounterSet;
@@ -278,7 +279,7 @@
                         .setName(counterName)
                         .setOriginalStepName(originalStepName)
                         .setExecutionStepName(stageName))
-                .setMetadata(new CounterMetadata().setKind("SUM")))
+                .setMetadata(new CounterMetadata().setKind(Kind.SUM.toString())))
         .setCumulative(false)
         .setInteger(longToSplitInt(value));
   }
@@ -292,7 +293,7 @@
                         .setOrigin("SYSTEM")
                         .setName(counterName)
                         .setExecutionStepName(stageName))
-                .setMetadata(new CounterMetadata().setKind("SUM")))
+                .setMetadata(new CounterMetadata().setKind(Kind.SUM.toString())))
         .setCumulative(false)
         .setInteger(longToSplitInt(value));
   }
diff --git a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/StreamingStepMetricsContainerTest.java b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/StreamingStepMetricsContainerTest.java
index 8217cc6..c743478 100644
--- a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/StreamingStepMetricsContainerTest.java
+++ b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/StreamingStepMetricsContainerTest.java
@@ -98,7 +98,7 @@
                                 .setOriginNamespace("ns")
                                 .setName("name2")
                                 .setOriginalStepName("s2"))
-                        .setMetadata(new CounterMetadata().setKind("SUM")))
+                        .setMetadata(new CounterMetadata().setKind(Kind.SUM.toString())))
                 .setCumulative(false)
                 .setInteger(longToSplitInt(12))));
 
diff --git a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/WorkItemStatusClientTest.java b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/WorkItemStatusClientTest.java
index b1bcd55..26f8502 100644
--- a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/WorkItemStatusClientTest.java
+++ b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/WorkItemStatusClientTest.java
@@ -53,6 +53,7 @@
 import org.apache.beam.runners.core.metrics.ExecutionStateTracker.ExecutionState;
 import org.apache.beam.runners.core.metrics.MetricsContainerImpl;
 import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
+import org.apache.beam.runners.dataflow.worker.MetricsToCounterUpdateConverter.Kind;
 import org.apache.beam.runners.dataflow.worker.SourceTranslationUtils.DataflowReaderPosition;
 import org.apache.beam.runners.dataflow.worker.WorkerCustomSources.BoundedSourceSplit;
 import org.apache.beam.runners.dataflow.worker.counters.CounterName;
@@ -346,7 +347,7 @@
   public void populateCounterUpdatesWithOutputCounters() throws Exception {
     final CounterUpdate counter =
         new CounterUpdate()
-            .setNameAndKind(new NameAndKind().setName("some-counter").setKind("SUM"))
+            .setNameAndKind(new NameAndKind().setName("some-counter").setKind(Kind.SUM.toString()))
             .setCumulative(true)
             .setInteger(DataflowCounterUpdateExtractor.longToSplitInt(42));
 
@@ -368,7 +369,7 @@
   public void populateCounterUpdatesWithMetricsAndCounters() throws Exception {
     final CounterUpdate expectedCounter =
         new CounterUpdate()
-            .setNameAndKind(new NameAndKind().setName("some-counter").setKind("SUM"))
+            .setNameAndKind(new NameAndKind().setName("some-counter").setKind(Kind.SUM.toString()))
             .setCumulative(true)
             .setInteger(DataflowCounterUpdateExtractor.longToSplitInt(42));
 
@@ -385,7 +386,7 @@
                             .setOriginNamespace("namespace")
                             .setName("some-counter")
                             .setOriginalStepName("step"))
-                    .setMetadata(new CounterMetadata().setKind("SUM")))
+                    .setMetadata(new CounterMetadata().setKind(Kind.SUM.toString())))
             .setCumulative(true)
             .setInteger(DataflowCounterUpdateExtractor.longToSplitInt(42));
 
@@ -422,7 +423,7 @@
                             .setOrigin("SYSTEM")
                             .setName("start-msecs")
                             .setOriginalStepName("step"))
-                    .setMetadata(new CounterMetadata().setKind("SUM")))
+                    .setMetadata(new CounterMetadata().setKind(Kind.SUM.toString())))
             .setCumulative(true)
             .setInteger(DataflowCounterUpdateExtractor.longToSplitInt(42));
 
diff --git a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/counters/CounterUpdateAggregatorsTest.java b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/counters/CounterUpdateAggregatorsTest.java
new file mode 100644
index 0000000..17e6aa0
--- /dev/null
+++ b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/counters/CounterUpdateAggregatorsTest.java
@@ -0,0 +1,96 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.runners.dataflow.worker.counters;
+
+import static org.apache.beam.runners.dataflow.worker.counters.DataflowCounterUpdateExtractor.longToSplitInt;
+import static org.apache.beam.runners.dataflow.worker.counters.DataflowCounterUpdateExtractor.splitIntToLong;
+import static org.junit.Assert.assertEquals;
+
+import com.google.api.services.dataflow.model.CounterMetadata;
+import com.google.api.services.dataflow.model.CounterStructuredNameAndMetadata;
+import com.google.api.services.dataflow.model.CounterUpdate;
+import com.google.api.services.dataflow.model.DistributionUpdate;
+import com.google.api.services.dataflow.model.IntegerMean;
+import java.util.ArrayList;
+import java.util.List;
+import org.apache.beam.runners.dataflow.worker.MetricsToCounterUpdateConverter.Kind;
+import org.junit.Test;
+
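+/** Tests for {@link CounterUpdateAggregators}. */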
+public class CounterUpdateAggregatorsTest {
+
+  @Test
+  public void testAggregateSum() {
+    List<CounterUpdate> sumUpdates = new ArrayList<>();
+    for (int i = 0; i < 10; i++) {
+      sumUpdates.add(
+          new CounterUpdate()
+              .setStructuredNameAndMetadata(
+                  new CounterStructuredNameAndMetadata()
+                      .setMetadata(new CounterMetadata().setKind(Kind.SUM.toString())))
+              .setInteger(longToSplitInt((long) i)));
+    }
+    List<CounterUpdate> aggregated = CounterUpdateAggregators.aggregate(sumUpdates);
+    assertEquals(1, aggregated.size());
+    CounterUpdate combined = aggregated.get(0);
+    assertEquals(45L, splitIntToLong(combined.getInteger()));
+  }
+
+  @Test
+  public void testAggregateMean() {
+    List<CounterUpdate> meanUpdates = new ArrayList<>();
+    for (int i = 0; i < 10; i++) {
+      meanUpdates.add(
+          new CounterUpdate()
+              .setStructuredNameAndMetadata(
+                  new CounterStructuredNameAndMetadata()
+                      .setMetadata(new CounterMetadata().setKind(Kind.MEAN.toString())))
+              .setIntegerMean(
+                  new IntegerMean().setSum(longToSplitInt((long) i)).setCount(longToSplitInt(1L))));
+    }
+    List<CounterUpdate> aggregated = CounterUpdateAggregators.aggregate(meanUpdates);
+    assertEquals(1, aggregated.size());
+    CounterUpdate combined = aggregated.get(0);
+    assertEquals(45L, splitIntToLong(combined.getIntegerMean().getSum()));
+    assertEquals(10L, splitIntToLong(combined.getIntegerMean().getCount()));
+  }
+
+  @Test
+  public void testAggregateDistribution() {
+    List<CounterUpdate> distributionUpdates = new ArrayList<>();
+    for (int i = 0; i < 10; i++) {
+      distributionUpdates.add(
+          new CounterUpdate()
+              .setStructuredNameAndMetadata(
+                  new CounterStructuredNameAndMetadata()
+                      .setMetadata(new CounterMetadata().setKind(Kind.DISTRIBUTION.toString())))
+              .setDistribution(
+                  new DistributionUpdate()
+                      .setSum(longToSplitInt((long) i))
+                      .setMax(longToSplitInt((long) i))
+                      .setMin(longToSplitInt((long) i))
+                      .setCount(longToSplitInt((long) 1))));
+    }
+    List<CounterUpdate> aggregated = CounterUpdateAggregators.aggregate(distributionUpdates);
+    assertEquals(1, aggregated.size());
+    CounterUpdate combined = aggregated.get(0);
+    assertEquals(45L, splitIntToLong(combined.getDistribution().getSum()));
+    assertEquals(10L, splitIntToLong(combined.getDistribution().getCount()));
+    assertEquals(9L, splitIntToLong(combined.getDistribution().getMax()));
+    assertEquals(0L, splitIntToLong(combined.getDistribution().getMin()));
+  }
+}
diff --git a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/counters/DistributionCounterUpdateAggregatorTest.java b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/counters/DistributionCounterUpdateAggregatorTest.java
new file mode 100644
index 0000000..8dac8a7
--- /dev/null
+++ b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/counters/DistributionCounterUpdateAggregatorTest.java
@@ -0,0 +1,72 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.runners.dataflow.worker.counters;
+
+import static org.apache.beam.runners.dataflow.worker.counters.DataflowCounterUpdateExtractor.longToSplitInt;
+import static org.apache.beam.runners.dataflow.worker.counters.DataflowCounterUpdateExtractor.splitIntToLong;
+import static org.junit.Assert.assertEquals;
+
+import com.google.api.services.dataflow.model.CounterMetadata;
+import com.google.api.services.dataflow.model.CounterStructuredNameAndMetadata;
+import com.google.api.services.dataflow.model.CounterUpdate;
+import com.google.api.services.dataflow.model.DistributionUpdate;
+import java.util.ArrayList;
+import java.util.List;
+import org.apache.beam.runners.dataflow.worker.MetricsToCounterUpdateConverter.Kind;
+import org.junit.Before;
+import org.junit.Test;
+
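+/** Tests for {@link DistributionCounterUpdateAggregator}. */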
+public class DistributionCounterUpdateAggregatorTest {
+
+  private List<CounterUpdate> counterUpdates;
+  private DistributionCounterUpdateAggregator aggregator;
+
+  @Before
+  public void setUp() {
+    counterUpdates = new ArrayList<>();
+    aggregator = new DistributionCounterUpdateAggregator();
+    for (int i = 0; i < 10; i++) {
+      counterUpdates.add(
+          new CounterUpdate()
+              .setStructuredNameAndMetadata(
+                  new CounterStructuredNameAndMetadata()
+                      .setMetadata(new CounterMetadata().setKind(Kind.DISTRIBUTION.toString())))
+              .setDistribution(
+                  new DistributionUpdate()
+                      .setSum(longToSplitInt((long) i))
+                      .setMax(longToSplitInt((long) i))
+                      .setMin(longToSplitInt((long) i))
+                      .setCount(longToSplitInt((long) 1))));
+    }
+  }
+
+  @Test
+  public void testAggregate() {
+    CounterUpdate combined = aggregator.aggregate(counterUpdates);
+    assertEquals(45L, splitIntToLong(combined.getDistribution().getSum()));
+    assertEquals(10L, splitIntToLong(combined.getDistribution().getCount()));
+    assertEquals(9L, splitIntToLong(combined.getDistribution().getMax()));
+    assertEquals(0L, splitIntToLong(combined.getDistribution().getMin()));
+  }
+
+  @Test(expected = UnsupportedOperationException.class)
+  public void testAggregateWithNullDistribution() {
+    counterUpdates.get(0).setDistribution(null);
+    aggregator.aggregate(counterUpdates);
+  }
+}
diff --git a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/counters/MeanCounterUpdateAggregatorTest.java b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/counters/MeanCounterUpdateAggregatorTest.java
new file mode 100644
index 0000000..9ea7a31
--- /dev/null
+++ b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/counters/MeanCounterUpdateAggregatorTest.java
@@ -0,0 +1,66 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.runners.dataflow.worker.counters;
+
+import static org.apache.beam.runners.dataflow.worker.counters.DataflowCounterUpdateExtractor.longToSplitInt;
+import static org.apache.beam.runners.dataflow.worker.counters.DataflowCounterUpdateExtractor.splitIntToLong;
+import static org.junit.Assert.assertEquals;
+
+import com.google.api.services.dataflow.model.CounterMetadata;
+import com.google.api.services.dataflow.model.CounterStructuredNameAndMetadata;
+import com.google.api.services.dataflow.model.CounterUpdate;
+import com.google.api.services.dataflow.model.IntegerMean;
+import java.util.ArrayList;
+import java.util.List;
+import org.apache.beam.runners.dataflow.worker.MetricsToCounterUpdateConverter.Kind;
+import org.junit.Before;
+import org.junit.Test;
+
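+/** Tests for {@link MeanCounterUpdateAggregator}. */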
+public class MeanCounterUpdateAggregatorTest {
+
+  private List<CounterUpdate> counterUpdates;
+  private MeanCounterUpdateAggregator aggregator;
+
+  @Before
+  public void setUp() {
+    counterUpdates = new ArrayList<>();
+    aggregator = new MeanCounterUpdateAggregator();
+    for (int i = 0; i < 10; i++) {
+      counterUpdates.add(
+          new CounterUpdate()
+              .setStructuredNameAndMetadata(
+                  new CounterStructuredNameAndMetadata()
+                      .setMetadata(new CounterMetadata().setKind(Kind.MEAN.toString())))
+              .setIntegerMean(
+                  new IntegerMean().setSum(longToSplitInt((long) i)).setCount(longToSplitInt(1L))));
+    }
+  }
+
+  @Test
+  public void testAggregate() {
+    CounterUpdate combined = aggregator.aggregate(counterUpdates);
+    assertEquals(45L, splitIntToLong(combined.getIntegerMean().getSum()));
+    assertEquals(10L, splitIntToLong(combined.getIntegerMean().getCount()));
+  }
+
+  @Test(expected = UnsupportedOperationException.class)
+  public void testAggregateWithNullIntegerMean() {
+    counterUpdates.get(0).setIntegerMean(null);
+    aggregator.aggregate(counterUpdates);
+  }
+}
diff --git a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/counters/SumCounterUpdateAggregatorTest.java b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/counters/SumCounterUpdateAggregatorTest.java
new file mode 100644
index 0000000..e30354f
--- /dev/null
+++ b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/counters/SumCounterUpdateAggregatorTest.java
@@ -0,0 +1,62 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.runners.dataflow.worker.counters;
+
+import static org.apache.beam.runners.dataflow.worker.counters.DataflowCounterUpdateExtractor.longToSplitInt;
+import static org.apache.beam.runners.dataflow.worker.counters.DataflowCounterUpdateExtractor.splitIntToLong;
+import static org.junit.Assert.assertEquals;
+
+import com.google.api.services.dataflow.model.CounterMetadata;
+import com.google.api.services.dataflow.model.CounterStructuredNameAndMetadata;
+import com.google.api.services.dataflow.model.CounterUpdate;
+import java.util.ArrayList;
+import java.util.List;
+import org.apache.beam.runners.dataflow.worker.MetricsToCounterUpdateConverter.Kind;
+import org.junit.Before;
+import org.junit.Test;
+
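+/** Tests for {@link SumCounterUpdateAggregator}. */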
+public class SumCounterUpdateAggregatorTest {
+  private List<CounterUpdate> counterUpdates;
+  private SumCounterUpdateAggregator aggregator;
+
+  @Before
+  public void setUp() {
+    counterUpdates = new ArrayList<>();
+    aggregator = new SumCounterUpdateAggregator();
+    for (int i = 0; i < 10; i++) {
+      counterUpdates.add(
+          new CounterUpdate()
+              .setStructuredNameAndMetadata(
+                  new CounterStructuredNameAndMetadata()
+                      .setMetadata(new CounterMetadata().setKind(Kind.SUM.toString())))
+              .setInteger(longToSplitInt((long) i)));
+    }
+  }
+
+  @Test
+  public void testAggregate() {
+    CounterUpdate combined = aggregator.aggregate(counterUpdates);
+    assertEquals(45L, splitIntToLong(combined.getInteger()));
+  }
+
+  @Test(expected = UnsupportedOperationException.class)
+  public void testAggregateWithNullInteger() {
+    counterUpdates.get(0).setInteger(null);
+    aggregator.aggregate(counterUpdates);
+  }
+}
diff --git a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/fn/control/RegisterAndProcessBundleOperationTest.java b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/fn/control/RegisterAndProcessBundleOperationTest.java
index a646894..321a236 100644
--- a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/fn/control/RegisterAndProcessBundleOperationTest.java
+++ b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/fn/control/RegisterAndProcessBundleOperationTest.java
@@ -233,8 +233,7 @@
         BeamFnApi.InstructionRequest.newBuilder()
             .setInstructionId("778")
             .setProcessBundle(
-                BeamFnApi.ProcessBundleRequest.newBuilder()
-                    .setProcessBundleDescriptorReference("555"))
+                BeamFnApi.ProcessBundleRequest.newBuilder().setProcessBundleDescriptorId("555"))
             .build());
     operation.finish();
 
@@ -245,8 +244,7 @@
         BeamFnApi.InstructionRequest.newBuilder()
             .setInstructionId("779")
             .setProcessBundle(
-                BeamFnApi.ProcessBundleRequest.newBuilder()
-                    .setProcessBundleDescriptorReference("555"))
+                BeamFnApi.ProcessBundleRequest.newBuilder().setProcessBundleDescriptorId("555"))
             .build());
     operation.finish();
   }
@@ -516,8 +514,7 @@
         BeamFnApi.InstructionRequest.newBuilder()
             .setInstructionId("778")
             .setProcessBundle(
-                BeamFnApi.ProcessBundleRequest.newBuilder()
-                    .setProcessBundleDescriptorReference("555"))
+                BeamFnApi.ProcessBundleRequest.newBuilder().setProcessBundleDescriptorId("555"))
             .build());
   }
 
@@ -549,7 +546,7 @@
                                   StateKey.newBuilder()
                                       .setBagUserState(
                                           StateKey.BagUserState.newBuilder()
-                                              .setPtransformId("testPTransformId")
+                                              .setTransformId("testPTransformId")
                                               .setWindow(ByteString.EMPTY)
                                               .setUserStateId("testUserStateId")))
                               .buildPartial();
@@ -657,7 +654,7 @@
                           StateKey.newBuilder()
                               .setMultimapSideInput(
                                   StateKey.MultimapSideInput.newBuilder()
-                                      .setPtransformId("testPTransformId")
+                                      .setTransformId("testPTransformId")
                                       .setSideInputId("testSideInputId")
                                       .setWindow(
                                           ByteString.copyFrom(
diff --git a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/fn/data/BeamFnDataGrpcServiceTest.java b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/fn/data/BeamFnDataGrpcServiceTest.java
index 965bce6..9c2b57a 100644
--- a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/fn/data/BeamFnDataGrpcServiceTest.java
+++ b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/fn/data/BeamFnDataGrpcServiceTest.java
@@ -77,7 +77,7 @@
 @RunWith(JUnit4.class)
 @SuppressWarnings("FutureReturnValueIgnored")
 public class BeamFnDataGrpcServiceTest {
-  private static final String PTRANSFORM_ID = "888";
+  private static final String TRANSFORM_ID = "888";
   private static final Coder<WindowedValue<String>> CODER =
       LengthPrefixCoder.of(WindowedValue.getValueOnlyCoder(StringUtf8Coder.of()));
   private static final String DEFAULT_CLIENT = "";
@@ -129,7 +129,7 @@
       CloseableFnDataReceiver<WindowedValue<String>> consumer =
           service
               .getDataService(DEFAULT_CLIENT)
-              .send(LogicalEndpoint.of(Integer.toString(i), PTRANSFORM_ID), CODER);
+              .send(LogicalEndpoint.of(Integer.toString(i), TRANSFORM_ID), CODER);
 
       consumer.accept(valueInGlobalWindow("A" + i));
       consumer.accept(valueInGlobalWindow("B" + i));
@@ -202,7 +202,7 @@
         CloseableFnDataReceiver<WindowedValue<String>> consumer =
             service
                 .getDataService(Integer.toString(client))
-                .send(LogicalEndpoint.of(instructionId, PTRANSFORM_ID), CODER);
+                .send(LogicalEndpoint.of(instructionId, TRANSFORM_ID), CODER);
 
         consumer.accept(valueInGlobalWindow("A" + instructionId));
         consumer.accept(valueInGlobalWindow("B" + instructionId));
@@ -235,7 +235,7 @@
     CountDownLatch waitForInboundElements = new CountDownLatch(1);
 
     for (int i = 0; i < 3; ++i) {
-      String instructionReference = Integer.toString(i);
+      String instructionId = Integer.toString(i);
       executorService.submit(
           () -> {
             ManagedChannel channel =
@@ -243,7 +243,7 @@
             StreamObserver<BeamFnApi.Elements> outboundObserver =
                 BeamFnDataGrpc.newStub(channel)
                     .data(TestStreams.withOnNext(clientInboundElements::add).build());
-            outboundObserver.onNext(elementsWithData(instructionReference));
+            outboundObserver.onNext(elementsWithData(instructionId));
             waitForInboundElements.await();
             outboundObserver.onCompleted();
             return null;
@@ -259,7 +259,7 @@
           service
               .getDataService(DEFAULT_CLIENT)
               .receive(
-                  LogicalEndpoint.of(Integer.toString(i), PTRANSFORM_ID),
+                  LogicalEndpoint.of(Integer.toString(i), TRANSFORM_ID),
                   CODER,
                   serverInboundValue::add));
     }
@@ -284,8 +284,8 @@
     return BeamFnApi.Elements.newBuilder()
         .addData(
             BeamFnApi.Elements.Data.newBuilder()
-                .setInstructionReference(id)
-                .setPtransformId(PTRANSFORM_ID)
+                .setInstructionId(id)
+                .setTransformId(TRANSFORM_ID)
                 .setData(
                     ByteString.copyFrom(encodeToByteArray(CODER, valueInGlobalWindow("A" + id)))
                         .concat(
@@ -295,9 +295,7 @@
                             ByteString.copyFrom(
                                 encodeToByteArray(CODER, valueInGlobalWindow("C" + id))))))
         .addData(
-            BeamFnApi.Elements.Data.newBuilder()
-                .setInstructionReference(id)
-                .setPtransformId(PTRANSFORM_ID))
+            BeamFnApi.Elements.Data.newBuilder().setInstructionId(id).setTransformId(TRANSFORM_ID))
         .build();
   }
 
diff --git a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/fn/logging/BeamFnLoggingServiceTest.java b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/fn/logging/BeamFnLoggingServiceTest.java
index 588778b..55b81e0 100644
--- a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/fn/logging/BeamFnLoggingServiceTest.java
+++ b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/fn/logging/BeamFnLoggingServiceTest.java
@@ -82,7 +82,7 @@
 
       Collection<Callable<Void>> tasks = new ArrayList<>();
       for (int i = 1; i <= 3; ++i) {
-        int instructionReference = i;
+        int instructionId = i;
         tasks.add(
             () -> {
               CountDownLatch waitForServerHangup = new CountDownLatch(1);
@@ -95,8 +95,7 @@
                           TestStreams.withOnNext(BeamFnLoggingServiceTest::discardMessage)
                               .withOnCompleted(waitForServerHangup::countDown)
                               .build());
-              outboundObserver.onNext(
-                  createLogsWithIds(instructionReference, -instructionReference));
+              outboundObserver.onNext(createLogsWithIds(instructionId, -instructionId));
               outboundObserver.onCompleted();
               waitForServerHangup.await();
               return null;
@@ -128,7 +127,7 @@
             GrpcContextHeaderAccessorProvider.getHeaderAccessor())) {
       server = createServer(service, service.getApiServiceDescriptor());
       for (int i = 1; i <= 3; ++i) {
-        int instructionReference = i;
+        int instructionId = i;
         tasks.add(
             () -> {
               CountDownLatch waitForTermination = new CountDownLatch(1);
@@ -141,9 +140,8 @@
                           TestStreams.withOnNext(BeamFnLoggingServiceTest::discardMessage)
                               .withOnError(waitForTermination::countDown)
                               .build());
-              outboundObserver.onNext(
-                  createLogsWithIds(instructionReference, -instructionReference));
-              outboundObserver.onError(new RuntimeException("Client " + instructionReference));
+              outboundObserver.onNext(createLogsWithIds(instructionId, -instructionId));
+              outboundObserver.onError(new RuntimeException("Client " + instructionId));
               waitForTermination.await();
               return null;
             });
@@ -167,7 +165,7 @@
       server = createServer(service, service.getApiServiceDescriptor());
 
       for (int i = 1; i <= 3; ++i) {
-        long instructionReference = i;
+        long instructionId = i;
         futures.add(
             executorService.submit(
                 () -> {
@@ -181,7 +179,7 @@
                               TestStreams.withOnNext(BeamFnLoggingServiceTest::discardMessage)
                                   .withOnCompleted(waitForServerHangup::countDown)
                                   .build());
-                  outboundObserver.onNext(createLogsWithIds(instructionReference));
+                  outboundObserver.onNext(createLogsWithIds(instructionId));
                   waitForServerHangup.await();
                   return null;
                 }));
@@ -210,7 +208,7 @@
   }
 
   private BeamFnApi.LogEntry createLogWithId(long id) {
-    return BeamFnApi.LogEntry.newBuilder().setInstructionReference(Long.toString(id)).build();
+    return BeamFnApi.LogEntry.newBuilder().setInstructionId(Long.toString(id)).build();
   }
 
   private Server createServer(
diff --git a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/logging/DataflowWorkerLoggingHandlerTest.java b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/logging/DataflowWorkerLoggingHandlerTest.java
index 4d6dd99..568fcff 100644
--- a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/logging/DataflowWorkerLoggingHandlerTest.java
+++ b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/logging/DataflowWorkerLoggingHandlerTest.java
@@ -348,7 +348,7 @@
         .setLogLocation("LoggerName")
         .setSeverity(BeamFnApi.LogEntry.Severity.Enum.INFO)
         .setMessage(message)
-        .setInstructionReference("1")
+        .setInstructionId("1")
         .setThread("2")
         .setTimestamp(Timestamp.newBuilder().setSeconds(0).setNanos(1 * 1000000))
         .build();
diff --git a/runners/google-cloud-dataflow-java/worker/windmill/build.gradle b/runners/google-cloud-dataflow-java/worker/windmill/build.gradle
index f348cb9..53a2882 100644
--- a/runners/google-cloud-dataflow-java/worker/windmill/build.gradle
+++ b/runners/google-cloud-dataflow-java/worker/windmill/build.gradle
@@ -18,6 +18,7 @@
 
 plugins { id 'org.apache.beam.module' }
 applyPortabilityNature(
+    publish: false,
     shadowJarValidationExcludes: ["org/apache/beam/runners/dataflow/worker/windmill/**"],
     archivesBaseName: 'beam-runners-google-cloud-dataflow-java-windmill'
 )
diff --git a/runners/java-fn-execution/build.gradle b/runners/java-fn-execution/build.gradle
index 7bf3cd1..f032d8f 100644
--- a/runners/java-fn-execution/build.gradle
+++ b/runners/java-fn-execution/build.gradle
@@ -16,7 +16,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.runners.fnexecution')
 
 description = "Apache Beam :: Runners :: Java Fn Execution"
 
diff --git a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/AbstractArtifactRetrievalService.java b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/AbstractArtifactRetrievalService.java
new file mode 100644
index 0000000..93ae657
--- /dev/null
+++ b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/AbstractArtifactRetrievalService.java
@@ -0,0 +1,199 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.runners.fnexecution.artifact;
+
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.nio.charset.StandardCharsets;
+import java.util.List;
+import java.util.concurrent.ExecutionException;
+import java.util.concurrent.TimeUnit;
+import org.apache.beam.model.jobmanagement.v1.ArtifactApi;
+import org.apache.beam.model.jobmanagement.v1.ArtifactApi.ArtifactMetadata;
+import org.apache.beam.model.jobmanagement.v1.ArtifactApi.ProxyManifest;
+import org.apache.beam.model.jobmanagement.v1.ArtifactRetrievalServiceGrpc;
+import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.ByteString;
+import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.util.JsonFormat;
+import org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Status;
+import org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException;
+import org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.StreamObserver;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Strings;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.Cache;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.CacheBuilder;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.hash.Hasher;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.hash.Hashing;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.io.ByteStreams;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * An {@link ArtifactRetrievalService} that handles everything aside from actually opening the
+ * backing resources.
+ */
+public abstract class AbstractArtifactRetrievalService
+    extends ArtifactRetrievalServiceGrpc.ArtifactRetrievalServiceImplBase
+    implements ArtifactRetrievalService {
+  private static final Logger LOG = LoggerFactory.getLogger(AbstractArtifactRetrievalService.class);
+
+  private static final int ARTIFACT_CHUNK_SIZE_BYTES = 2 << 20; // 2MB
+
+  public AbstractArtifactRetrievalService() {
+    this(
+        CacheBuilder.newBuilder()
+            .expireAfterAccess(1, TimeUnit.HOURS /* arbitrary */)
+            .maximumSize(100 /* arbitrary */)
+            .build());
+  }
+
+  public AbstractArtifactRetrievalService(Cache<String, ArtifactApi.ProxyManifest> manifestCache) {
+    this.manifestCache = manifestCache;
+  }
+
+  public abstract InputStream openManifest(String retrievalToken) throws IOException;
+
+  public abstract InputStream openUri(String retrievalToken, String uri) throws IOException;
+
+  private final Cache<String, ArtifactApi.ProxyManifest> manifestCache;
+
+  public ArtifactApi.ProxyManifest getManifestProxy(String retrievalToken)
+      throws IOException, ExecutionException {
+    return manifestCache.get(
+        retrievalToken,
+        () -> {
+          try (InputStream stream = openManifest(retrievalToken)) {
+            return loadManifest(stream, retrievalToken);
+          }
+        });
+  }
+
+  @Override
+  public void getManifest(
+      ArtifactApi.GetManifestRequest request,
+      StreamObserver<ArtifactApi.GetManifestResponse> responseObserver) {
+    final String token = request.getRetrievalToken();
+    if (Strings.isNullOrEmpty(token)) {
+      throw new StatusRuntimeException(
+          Status.INVALID_ARGUMENT.withDescription("Empty artifact token"));
+    }
+
+    LOG.info("GetManifest for {}", token);
+    try {
+      ArtifactApi.ProxyManifest proxyManifest = getManifestProxy(token);
+      ArtifactApi.GetManifestResponse response =
+          ArtifactApi.GetManifestResponse.newBuilder()
+              .setManifest(proxyManifest.getManifest())
+              .build();
+      LOG.info(
+          "GetManifest for {} -> {} artifacts",
+          token,
+          proxyManifest.getManifest().getArtifactCount());
+      responseObserver.onNext(response);
+      responseObserver.onCompleted();
+    } catch (Exception e) {
+      LOG.warn("GetManifest for {} failed.", token, e);
+      responseObserver.onError(e);
+    }
+  }
+
+  @Override
+  public void getArtifact(
+      ArtifactApi.GetArtifactRequest request,
+      StreamObserver<ArtifactApi.ArtifactChunk> responseObserver) {
+    LOG.debug("GetArtifact {}", request);
+    String name = request.getName();
+    try {
+      ArtifactApi.ProxyManifest proxyManifest = getManifestProxy(request.getRetrievalToken());
+      // look for file at URI specified by proxy manifest location
+      ArtifactApi.ProxyManifest.Location location =
+          proxyManifest.getLocationList().stream()
+              .filter(loc -> loc.getName().equals(name))
+              .findFirst()
+              .orElseThrow(
+                  () ->
+                      new StatusRuntimeException(
+                          Status.NOT_FOUND.withDescription(
+                              String.format("Artifact location not found in manifest: %s", name))));
+
+      List<ArtifactMetadata> existingArtifacts = proxyManifest.getManifest().getArtifactList();
+      ArtifactMetadata metadata =
+          existingArtifacts.stream()
+              .filter(meta -> meta.getName().equals(name))
+              .findFirst()
+              .orElseThrow(
+                  () ->
+                      new StatusRuntimeException(
+                          Status.NOT_FOUND.withDescription(
+                              String.format("Artifact metadata not found in manifest: %s", name))));
+
+      Hasher hasher = Hashing.sha256().newHasher();
+      byte[] data = new byte[ARTIFACT_CHUNK_SIZE_BYTES];
+      try (InputStream stream = openUri(request.getRetrievalToken(), location.getUri())) {
+        int len;
+        while ((len = stream.read(data)) != -1) {
+          hasher.putBytes(data, 0, len);
+          responseObserver.onNext(
+              ArtifactApi.ArtifactChunk.newBuilder()
+                  .setData(ByteString.copyFrom(data, 0, len))
+                  .build());
+        }
+      }
+      if (metadata.getSha256() != null && !metadata.getSha256().isEmpty()) {
+        String expected = metadata.getSha256();
+        String actual = hasher.hash().toString();
+        if (!actual.equals(expected)) {
+          throw new StatusRuntimeException(
+              Status.DATA_LOSS.withDescription(
+                  String.format(
+                      "Artifact %s is corrupt: expected sha256 %s, actual %s",
+                      name, expected, actual)));
+        }
+      }
+      responseObserver.onCompleted();
+    } catch (IOException | ExecutionException e) {
+      LOG.info("GetArtifact {} failed", request, e);
+      responseObserver.onError(e);
+    }
+  }
+
+  @Override
+  public void close() throws Exception {}
+
+  static ProxyManifest loadManifest(InputStream stream, String manifestName) throws IOException {
+    ProxyManifest.Builder manifestBuilder = ProxyManifest.newBuilder();
+    String contents = new String(ByteStreams.toByteArray(stream), StandardCharsets.UTF_8);
+    JsonFormat.parser().merge(contents, manifestBuilder);
+    ProxyManifest proxyManifest = manifestBuilder.build();
+    checkArgument(
+        proxyManifest.hasManifest(),
+        String.format("Invalid ProxyManifest at %s: doesn't have a Manifest", manifestName));
+    checkArgument(
+        proxyManifest.getLocationCount() == proxyManifest.getManifest().getArtifactCount(),
+        String.format(
+            "Invalid ProxyManifestat %s: %d locations but %d artifacts",
+            manifestName,
+            proxyManifest.getLocationCount(),
+            proxyManifest.getManifest().getArtifactCount()));
+    LOG.info(
+        "Manifest at {} has {} artifact locations",
+        manifestName,
+        proxyManifest.getManifest().getArtifactCount());
+    return proxyManifest;
+  }
+}
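
For orientation, a minimal sketch (illustration only, not part of this change) of how a concrete subclass might wire the two abstract hooks above; the class name and the assumption that retrieval tokens and manifest location URIs are plain local file paths are hypothetical:

    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    // Hypothetical subclass: the retrieval token is the path of the ProxyManifest JSON
    // file, and each manifest location URI is a local file path.
    public class LocalFileArtifactRetrievalService extends AbstractArtifactRetrievalService {
      @Override
      public InputStream openManifest(String retrievalToken) throws IOException {
        return new FileInputStream(retrievalToken);
      }

      @Override
      public InputStream openUri(String retrievalToken, String uri) throws IOException {
        return new FileInputStream(uri);
      }
    }
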
diff --git a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/AbstractArtifactStagingService.java b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/AbstractArtifactStagingService.java
new file mode 100644
index 0000000..25f09a3
--- /dev/null
+++ b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/AbstractArtifactStagingService.java
@@ -0,0 +1,227 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.runners.fnexecution.artifact;
+
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkNotNull;
+
+import java.io.IOException;
+import java.nio.channels.WritableByteChannel;
+import java.nio.charset.Charset;
+import java.nio.charset.StandardCharsets;
+import org.apache.beam.model.jobmanagement.v1.ArtifactApi.ArtifactMetadata;
+import org.apache.beam.model.jobmanagement.v1.ArtifactApi.CommitManifestRequest;
+import org.apache.beam.model.jobmanagement.v1.ArtifactApi.CommitManifestResponse;
+import org.apache.beam.model.jobmanagement.v1.ArtifactApi.ProxyManifest;
+import org.apache.beam.model.jobmanagement.v1.ArtifactApi.ProxyManifest.Location;
+import org.apache.beam.model.jobmanagement.v1.ArtifactApi.PutArtifactMetadata;
+import org.apache.beam.model.jobmanagement.v1.ArtifactApi.PutArtifactRequest;
+import org.apache.beam.model.jobmanagement.v1.ArtifactApi.PutArtifactResponse;
+import org.apache.beam.model.jobmanagement.v1.ArtifactStagingServiceGrpc.ArtifactStagingServiceImplBase;
+import org.apache.beam.runners.fnexecution.FnService;
+import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.ByteString;
+import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.util.JsonFormat;
+import org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Status;
+import org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException;
+import org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.StreamObserver;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.hash.Hasher;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.hash.Hashing;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * An {@link ArtifactStagingServiceImplBase} that handles everything aside from actually opening the
+ * backing resources.
+ */
+public abstract class AbstractArtifactStagingService extends ArtifactStagingServiceImplBase
+    implements FnService {
+
+  private static final Logger LOG = LoggerFactory.getLogger(AbstractArtifactStagingService.class);
+
+  private static final Charset CHARSET = StandardCharsets.UTF_8;
+
+  public abstract String getArtifactUri(String stagingSessionToken, String encodedFileName)
+      throws Exception;
+
+  public abstract WritableByteChannel openUri(String uri) throws IOException;
+
+  public abstract void removeUri(String uri) throws IOException;
+
+  public abstract WritableByteChannel openManifest(String stagingSessionToken) throws Exception;
+
+  public abstract void removeArtifacts(String stagingSessionToken) throws Exception;
+
+  public abstract String getRetrievalToken(String stagingSessionToken) throws Exception;
+
+  @Override
+  public StreamObserver<PutArtifactRequest> putArtifact(
+      StreamObserver<PutArtifactResponse> responseObserver) {
+    return new PutArtifactStreamObserver(responseObserver);
+  }
+
+  @Override
+  public void commitManifest(
+      CommitManifestRequest request, StreamObserver<CommitManifestResponse> responseObserver) {
+    try {
+      String stagingSessionToken = request.getStagingSessionToken();
+      ProxyManifest.Builder proxyManifestBuilder =
+          ProxyManifest.newBuilder().setManifest(request.getManifest());
+      for (ArtifactMetadata artifactMetadata : request.getManifest().getArtifactList()) {
+        proxyManifestBuilder.addLocation(
+            Location.newBuilder()
+                .setName(artifactMetadata.getName())
+                .setUri(getArtifactUri(stagingSessionToken, encodedFileName(artifactMetadata)))
+                .build());
+      }
+      try (WritableByteChannel manifestWritableByteChannel = openManifest(stagingSessionToken)) {
+        manifestWritableByteChannel.write(
+            CHARSET.encode(JsonFormat.printer().print(proxyManifestBuilder.build())));
+      }
+      // TODO: Validate integrity of staged files.
+      responseObserver.onNext(
+          CommitManifestResponse.newBuilder()
+              .setRetrievalToken(getRetrievalToken(stagingSessionToken))
+              .build());
+      responseObserver.onCompleted();
+    } catch (Exception e) {
+      // TODO: Cleanup all the artifacts.
+      LOG.error("Unable to commit manifest.", e);
+      responseObserver.onError(e);
+    }
+  }
+
+  @Override
+  public void close() throws Exception {
+    // Nothing to close here.
+  }
+
+  private String encodedFileName(ArtifactMetadata artifactMetadata) {
+    return "artifact_"
+        + Hashing.sha256().hashString(artifactMetadata.getName(), CHARSET).toString();
+  }
+
+  private class PutArtifactStreamObserver implements StreamObserver<PutArtifactRequest> {
+
+    private final StreamObserver<PutArtifactResponse> outboundObserver;
+    private PutArtifactMetadata metadata;
+    private String artifactId;
+    private WritableByteChannel artifactWritableByteChannel;
+    private Hasher hasher;
+
+    PutArtifactStreamObserver(StreamObserver<PutArtifactResponse> outboundObserver) {
+      this.outboundObserver = outboundObserver;
+    }
+
+    @Override
+    public void onNext(PutArtifactRequest putArtifactRequest) {
+      // Create the directory structure for storing artifacts in the first call.
+      if (metadata == null) {
+        checkNotNull(putArtifactRequest);
+        checkNotNull(putArtifactRequest.getMetadata());
+        metadata = putArtifactRequest.getMetadata();
+        LOG.debug("stored metadata: {}", metadata);
+        // Check the base path exists or create the base path
+        try {
+          artifactId =
+              getArtifactUri(
+                  putArtifactRequest.getMetadata().getStagingSessionToken(),
+                  encodedFileName(metadata.getMetadata()));
+          LOG.debug(
+              "Going to stage artifact {} to {}.", metadata.getMetadata().getName(), artifactId);
+          artifactWritableByteChannel = openUri(artifactId);
+          hasher = Hashing.sha256().newHasher();
+        } catch (Exception e) {
+          String message =
+              String.format(
+                  "Failed to begin staging artifact %s", metadata.getMetadata().getName());
+          LOG.error(message, e);
+          outboundObserver.onError(
+              new StatusRuntimeException(Status.DATA_LOSS.withDescription(message).withCause(e)));
+        }
+      } else {
+        try {
+          ByteString data = putArtifactRequest.getData().getData();
+          artifactWritableByteChannel.write(data.asReadOnlyByteBuffer());
+          hasher.putBytes(data.toByteArray());
+        } catch (IOException e) {
+          String message =
+              String.format(
+                  "Failed to write chunk of artifact %s to %s",
+                  metadata.getMetadata().getName(), artifactId);
+          LOG.error(message, e);
+          outboundObserver.onError(
+              new StatusRuntimeException(Status.DATA_LOSS.withDescription(message).withCause(e)));
+        }
+      }
+    }
+
+    @Override
+    public void onError(Throwable throwable) {
+      // Delete the artifact.
+      LOG.error("Staging artifact failed for " + artifactId, throwable);
+      try {
+        if (artifactWritableByteChannel != null) {
+          artifactWritableByteChannel.close();
+        }
+        if (artifactId != null) {
+          removeUri(artifactId);
+        }
+
+      } catch (IOException e) {
+        outboundObserver.onError(
+            new StatusRuntimeException(
+                Status.DATA_LOSS.withDescription(
+                    String.format("Failed to clean up artifact file %s", artifactId))));
+        return;
+      }
+      outboundObserver.onError(
+          new StatusRuntimeException(
+              Status.DATA_LOSS
+                  .withDescription(String.format("Failed to stage artifact %s", artifactId))
+                  .withCause(throwable)));
+    }
+
+    @Override
+    public void onCompleted() {
+      // Close the stream.
+      LOG.debug("Staging artifact completed for " + artifactId);
+      if (artifactWritableByteChannel != null) {
+        try {
+          artifactWritableByteChannel.close();
+        } catch (IOException e) {
+          onError(e);
+          return;
+        }
+      }
+      String expectedSha256 = metadata.getMetadata().getSha256();
+      if (expectedSha256 != null && !expectedSha256.isEmpty()) {
+        String actualSha256 = hasher.hash().toString();
+        if (!actualSha256.equals(expectedSha256)) {
+          outboundObserver.onError(
+              new StatusRuntimeException(
+                  Status.INVALID_ARGUMENT.withDescription(
+                      String.format(
+                          "Artifact %s is corrupt: expected sah256 %s, but has sha256 %s",
+                          metadata.getMetadata().getName(), expectedSha256, actualSha256))));
+          return;
+        }
+      }
+      outboundObserver.onNext(PutArtifactResponse.newBuilder().build());
+      outboundObserver.onCompleted();
+    }
+  }
+}
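
For reference, a rough client-side sketch of the message ordering PutArtifactStreamObserver expects: the first PutArtifactRequest carries the PutArtifactMetadata, later requests carry data chunks. The stub, response observer, token, artifact name, and byte array below are placeholders, not part of this change:

    // 'stub', 'responseObserver', and 'bytes' are assumed placeholders.
    ArtifactApi.PutArtifactRequest first =
        ArtifactApi.PutArtifactRequest.newBuilder()
            .setMetadata(
                ArtifactApi.PutArtifactMetadata.newBuilder()
                    .setStagingSessionToken("<staging-session-token>")
                    .setMetadata(ArtifactApi.ArtifactMetadata.newBuilder().setName("my-artifact.jar")))
            .build();
    ArtifactApi.PutArtifactRequest chunk =
        ArtifactApi.PutArtifactRequest.newBuilder()
            .setData(ArtifactApi.ArtifactChunk.newBuilder().setData(ByteString.copyFrom(bytes)))
            .build();
    StreamObserver<ArtifactApi.PutArtifactRequest> upload = stub.putArtifact(responseObserver);
    upload.onNext(first);   // metadata first, as onNext() above requires
    upload.onNext(chunk);   // then one or more data chunks
    upload.onCompleted();   // triggers the sha256 check and the PutArtifactResponse
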
diff --git a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/BeamFileSystemArtifactRetrievalService.java b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/BeamFileSystemArtifactRetrievalService.java
index ff7e9ba..14a123e 100644
--- a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/BeamFileSystemArtifactRetrievalService.java
+++ b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/BeamFileSystemArtifactRetrievalService.java
@@ -17,34 +17,17 @@
  */
 package org.apache.beam.runners.fnexecution.artifact;
 
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
-
 import java.io.IOException;
 import java.io.InputStream;
 import java.nio.channels.Channels;
-import java.nio.charset.StandardCharsets;
-import java.util.List;
-import java.util.concurrent.ExecutionException;
 import java.util.concurrent.TimeUnit;
 import org.apache.beam.model.jobmanagement.v1.ArtifactApi;
-import org.apache.beam.model.jobmanagement.v1.ArtifactApi.ArtifactMetadata;
 import org.apache.beam.model.jobmanagement.v1.ArtifactApi.ProxyManifest;
-import org.apache.beam.model.jobmanagement.v1.ArtifactRetrievalServiceGrpc;
 import org.apache.beam.sdk.io.FileSystems;
 import org.apache.beam.sdk.io.fs.ResourceId;
-import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.ByteString;
-import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.util.JsonFormat;
-import org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Status;
-import org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException;
-import org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.StreamObserver;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.annotations.VisibleForTesting;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Strings;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.Cache;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.CacheBuilder;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.CacheLoader;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LoadingCache;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.hash.Hasher;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.hash.Hashing;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.io.ByteStreams;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
@@ -53,125 +36,45 @@
  * the artifact layout and retrieval token format produced by {@link
  * BeamFileSystemArtifactStagingService}.
  */
-public class BeamFileSystemArtifactRetrievalService
-    extends ArtifactRetrievalServiceGrpc.ArtifactRetrievalServiceImplBase
-    implements ArtifactRetrievalService {
+public class BeamFileSystemArtifactRetrievalService extends AbstractArtifactRetrievalService {
   private static final Logger LOG =
       LoggerFactory.getLogger(BeamFileSystemArtifactRetrievalService.class);
 
-  private static final int ARTIFACT_CHUNK_SIZE_BYTES = 2 << 20; // 2MB
+  private static final Cache<String, ArtifactApi.ProxyManifest> MANIFEST_CACHE =
+      CacheBuilder.newBuilder()
+          .expireAfterAccess(1, TimeUnit.HOURS /* arbitrary */)
+          .maximumSize(100 /* arbitrary */)
+          .build();
+
+  public BeamFileSystemArtifactRetrievalService() {
+    super(MANIFEST_CACHE);
+  }
 
   public static BeamFileSystemArtifactRetrievalService create() {
     return new BeamFileSystemArtifactRetrievalService();
   }
 
   @Override
-  public void getManifest(
-      ArtifactApi.GetManifestRequest request,
-      StreamObserver<ArtifactApi.GetManifestResponse> responseObserver) {
-    final String token = request.getRetrievalToken();
-    if (Strings.isNullOrEmpty(token)) {
-      throw new StatusRuntimeException(
-          Status.INVALID_ARGUMENT.withDescription("Empty artifact token"));
-    }
-
-    LOG.info("GetManifest for {}", token);
-    try {
-      ArtifactApi.ProxyManifest proxyManifest = MANIFEST_CACHE.get(token);
-      ArtifactApi.GetManifestResponse response =
-          ArtifactApi.GetManifestResponse.newBuilder()
-              .setManifest(proxyManifest.getManifest())
-              .build();
-      LOG.info(
-          "GetManifest for {} -> {} artifacts",
-          token,
-          proxyManifest.getManifest().getArtifactCount());
-      responseObserver.onNext(response);
-      responseObserver.onCompleted();
-    } catch (Exception e) {
-      LOG.info("GetManifest for {} failed", token, e);
-      responseObserver.onError(e);
-    }
+  public InputStream openUri(String retrievalToken, String uri) throws IOException {
+    ResourceId artifactResourceId = FileSystems.matchNewResource(uri, false /* is directory */);
+    return Channels.newInputStream(FileSystems.open(artifactResourceId));
   }
 
   @Override
-  public void getArtifact(
-      ArtifactApi.GetArtifactRequest request,
-      StreamObserver<ArtifactApi.ArtifactChunk> responseObserver) {
-    LOG.debug("GetArtifact {}", request);
-    String name = request.getName();
+  public InputStream openManifest(String retrievalToken) throws IOException {
+    ResourceId manifestResourceId = getManifestLocationFromToken(retrievalToken);
     try {
-      ArtifactApi.ProxyManifest proxyManifest = MANIFEST_CACHE.get(request.getRetrievalToken());
-      // look for file at URI specified by proxy manifest location
-      ArtifactApi.ProxyManifest.Location location =
-          proxyManifest.getLocationList().stream()
-              .filter(loc -> loc.getName().equals(name))
-              .findFirst()
-              .orElseThrow(
-                  () ->
-                      new StatusRuntimeException(
-                          Status.NOT_FOUND.withDescription(
-                              String.format("Artifact location not found in manifest: %s", name))));
-
-      List<ArtifactMetadata> existingArtifacts = proxyManifest.getManifest().getArtifactList();
-      ArtifactMetadata metadata =
-          existingArtifacts.stream()
-              .filter(meta -> meta.getName().equals(name))
-              .findFirst()
-              .orElseThrow(
-                  () ->
-                      new StatusRuntimeException(
-                          Status.NOT_FOUND.withDescription(
-                              String.format("Artifact metadata not found in manifest: %s", name))));
-
-      ResourceId artifactResourceId =
-          FileSystems.matchNewResource(location.getUri(), false /* is directory */);
-      LOG.debug("Artifact {} located in {}", name, artifactResourceId);
-      Hasher hasher = Hashing.sha256().newHasher();
-      byte[] data = new byte[ARTIFACT_CHUNK_SIZE_BYTES];
-      try (InputStream stream = Channels.newInputStream(FileSystems.open(artifactResourceId))) {
-        int len;
-        while ((len = stream.read(data)) != -1) {
-          hasher.putBytes(data, 0, len);
-          responseObserver.onNext(
-              ArtifactApi.ArtifactChunk.newBuilder()
-                  .setData(ByteString.copyFrom(data, 0, len))
-                  .build());
-        }
-      }
-      if (metadata.getSha256() != null && !metadata.getSha256().isEmpty()) {
-        String expected = metadata.getSha256();
-        String actual = hasher.hash().toString();
-        if (!actual.equals(expected)) {
-          throw new StatusRuntimeException(
-              Status.DATA_LOSS.withDescription(
-                  String.format(
-                      "Artifact %s is corrupt: expected sha256 %s, actual %s",
-                      name, expected, actual)));
-        }
-      }
-      responseObserver.onCompleted();
-    } catch (IOException | ExecutionException e) {
-      LOG.info("GetArtifact {} failed", request, e);
-      responseObserver.onError(e);
+      return Channels.newInputStream(FileSystems.open(manifestResourceId));
+    } catch (IOException e) {
+      LOG.warn(
+          "GetManifest for {} failed. Make sure the artifact staging directory (configurable "
+              + "via --artifacts-dir argument to the job server) is accessible to workers.",
+          retrievalToken,
+          e);
+      throw e;
     }
   }
 
-  @Override
-  public void close() throws Exception {}
-
-  private static final LoadingCache<String, ArtifactApi.ProxyManifest> MANIFEST_CACHE =
-      CacheBuilder.newBuilder()
-          .expireAfterAccess(1, TimeUnit.HOURS /* arbitrary */)
-          .maximumSize(100 /* arbitrary */)
-          .build(
-              new CacheLoader<String, ProxyManifest>() {
-                @Override
-                public ProxyManifest load(String retrievalToken) throws Exception {
-                  return loadManifest(retrievalToken);
-                }
-              });
-
   @VisibleForTesting
   static ProxyManifest loadManifest(String retrievalToken) throws IOException {
     LOG.info("Loading manifest for retrieval token {}", retrievalToken);
@@ -181,27 +84,9 @@
   }
 
   static ProxyManifest loadManifest(ResourceId manifestResourceId) throws IOException {
-    ProxyManifest.Builder manifestBuilder = ProxyManifest.newBuilder();
-    try (InputStream stream = Channels.newInputStream(FileSystems.open(manifestResourceId))) {
-      String contents = new String(ByteStreams.toByteArray(stream), StandardCharsets.UTF_8);
-      JsonFormat.parser().merge(contents, manifestBuilder);
-    }
-    ProxyManifest proxyManifest = manifestBuilder.build();
-    checkArgument(
-        proxyManifest.hasManifest(),
-        String.format("Invalid ProxyManifest at %s: doesn't have a Manifest", manifestResourceId));
-    checkArgument(
-        proxyManifest.getLocationCount() == proxyManifest.getManifest().getArtifactCount(),
-        String.format(
-            "Invalid ProxyManifestat %s: %d locations but %d artifacts",
-            manifestResourceId,
-            proxyManifest.getLocationCount(),
-            proxyManifest.getManifest().getArtifactCount()));
-    LOG.info(
-        "Manifest at {} has {} artifact locations",
-        manifestResourceId,
-        proxyManifest.getManifest().getArtifactCount());
-    return proxyManifest;
+    return loadManifest(
+        Channels.newInputStream(FileSystems.open(manifestResourceId)),
+        manifestResourceId.toString());
   }
 
   private static ResourceId getManifestLocationFromToken(String retrievalToken) {
diff --git a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/BeamFileSystemArtifactStagingService.java b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/BeamFileSystemArtifactStagingService.java
index 5dd11b7..c9baa17 100644
--- a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/BeamFileSystemArtifactStagingService.java
+++ b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/BeamFileSystemArtifactStagingService.java
@@ -17,38 +17,22 @@
  */
 package org.apache.beam.runners.fnexecution.artifact;
 
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkNotNull;
-
 import com.fasterxml.jackson.core.JsonProcessingException;
 import com.fasterxml.jackson.databind.ObjectMapper;
 import java.io.IOException;
 import java.io.Serializable;
 import java.nio.channels.WritableByteChannel;
-import java.nio.charset.Charset;
-import java.nio.charset.StandardCharsets;
 import java.util.Collections;
-import org.apache.beam.model.jobmanagement.v1.ArtifactApi.ArtifactMetadata;
-import org.apache.beam.model.jobmanagement.v1.ArtifactApi.CommitManifestRequest;
-import org.apache.beam.model.jobmanagement.v1.ArtifactApi.CommitManifestResponse;
 import org.apache.beam.model.jobmanagement.v1.ArtifactApi.ProxyManifest;
 import org.apache.beam.model.jobmanagement.v1.ArtifactApi.ProxyManifest.Location;
-import org.apache.beam.model.jobmanagement.v1.ArtifactApi.PutArtifactMetadata;
-import org.apache.beam.model.jobmanagement.v1.ArtifactApi.PutArtifactRequest;
-import org.apache.beam.model.jobmanagement.v1.ArtifactApi.PutArtifactResponse;
 import org.apache.beam.model.jobmanagement.v1.ArtifactStagingServiceGrpc.ArtifactStagingServiceImplBase;
-import org.apache.beam.runners.fnexecution.FnService;
 import org.apache.beam.sdk.io.FileSystems;
 import org.apache.beam.sdk.io.fs.MoveOptions.StandardMoveOptions;
 import org.apache.beam.sdk.io.fs.ResolveOptions.StandardResolveOptions;
 import org.apache.beam.sdk.io.fs.ResourceId;
 import org.apache.beam.sdk.util.MimeTypes;
-import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.ByteString;
-import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.util.JsonFormat;
 import org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Status;
 import org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException;
-import org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.StreamObserver;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.hash.Hasher;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.hash.Hashing;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
@@ -66,86 +50,37 @@
  *
  * <p>The manifest file is encoded in {@link ProxyManifest}.
  */
-public class BeamFileSystemArtifactStagingService extends ArtifactStagingServiceImplBase
-    implements FnService {
+public class BeamFileSystemArtifactStagingService extends AbstractArtifactStagingService {
 
   private static final Logger LOG =
       LoggerFactory.getLogger(BeamFileSystemArtifactStagingService.class);
   private static final ObjectMapper MAPPER = new ObjectMapper();
   // Use UTF8 for all text encoding.
-  private static final Charset CHARSET = StandardCharsets.UTF_8;
   public static final String MANIFEST = "MANIFEST";
   public static final String ARTIFACTS = "artifacts";
 
   @Override
-  public StreamObserver<PutArtifactRequest> putArtifact(
-      StreamObserver<PutArtifactResponse> responseObserver) {
-    return new PutArtifactStreamObserver(responseObserver);
+  public String getArtifactUri(String stagingSession, String encodedFileName) throws Exception {
+    StagingSessionToken stagingSessionToken = StagingSessionToken.decode(stagingSession);
+    ResourceId artifactDirResourceId = getArtifactDirResourceId(stagingSessionToken);
+    return artifactDirResourceId
+        .resolve(encodedFileName, StandardResolveOptions.RESOLVE_FILE)
+        .toString();
   }
 
   @Override
-  public void commitManifest(
-      CommitManifestRequest request, StreamObserver<CommitManifestResponse> responseObserver) {
-    try {
-      StagingSessionToken stagingSessionToken =
-          StagingSessionToken.decode(request.getStagingSessionToken());
-      ResourceId manifestResourceId = getManifestFileResourceId(stagingSessionToken);
-      ResourceId artifactDirResourceId = getArtifactDirResourceId(stagingSessionToken);
-      ProxyManifest.Builder proxyManifestBuilder =
-          ProxyManifest.newBuilder().setManifest(request.getManifest());
-      for (ArtifactMetadata artifactMetadata : request.getManifest().getArtifactList()) {
-        proxyManifestBuilder.addLocation(
-            Location.newBuilder()
-                .setName(artifactMetadata.getName())
-                .setUri(
-                    artifactDirResourceId
-                        .resolve(
-                            encodedFileName(artifactMetadata), StandardResolveOptions.RESOLVE_FILE)
-                        .toString())
-                .build());
-      }
-      try (WritableByteChannel manifestWritableByteChannel =
-          FileSystems.create(manifestResourceId, MimeTypes.TEXT)) {
-        manifestWritableByteChannel.write(
-            CHARSET.encode(JsonFormat.printer().print(proxyManifestBuilder.build())));
-      }
-      // TODO: Validate integrity of staged files.
-      responseObserver.onNext(
-          CommitManifestResponse.newBuilder()
-              .setRetrievalToken(manifestResourceId.toString())
-              .build());
-      responseObserver.onCompleted();
-    } catch (Exception e) {
-      // TODO: Cleanup all the artifacts.
-      LOG.error("Unable to commit manifest.", e);
-      responseObserver.onError(e);
-    }
+  public WritableByteChannel openUri(String uri) throws IOException {
+    return FileSystems.create(FileSystems.matchNewResource(uri, false), MimeTypes.BINARY);
   }
 
   @Override
-  public void close() throws Exception {
-    // Nothing to close here.
+  public void removeUri(String uri) throws IOException {
+    FileSystems.delete(
+        Collections.singletonList(FileSystems.matchNewResource(uri, false)),
+        StandardMoveOptions.IGNORE_MISSING_FILES);
   }
 
-  /**
-   * Generate a stagingSessionToken compatible with {@link BeamFileSystemArtifactStagingService}.
-   *
-   * @param sessionId Unique sessionId for artifact staging.
-   * @param basePath Base path to upload artifacts.
-   * @return Encoded stagingSessionToken.
-   */
-  public static String generateStagingSessionToken(String sessionId, String basePath) {
-    StagingSessionToken stagingSessionToken = new StagingSessionToken();
-    stagingSessionToken.setSessionId(sessionId);
-    stagingSessionToken.setBasePath(basePath);
-    return stagingSessionToken.encode();
-  }
-
-  private String encodedFileName(ArtifactMetadata artifactMetadata) {
-    return "artifact_"
-        + Hashing.sha256().hashString(artifactMetadata.getName(), CHARSET).toString();
-  }
-
+  @Override
   public void removeArtifacts(String stagingSessionToken) throws Exception {
     StagingSessionToken parsedToken = StagingSessionToken.decode(stagingSessionToken);
     ResourceId dir = getJobDirResourceId(parsedToken);
@@ -176,6 +111,19 @@
     LOG.info("Removed dir {}", dir);
   }
 
+  @Override
+  public WritableByteChannel openManifest(String stagingSession) throws Exception {
+    return FileSystems.create(
+        getManifestFileResourceId(StagingSessionToken.decode(stagingSession)), MimeTypes.TEXT);
+  }
+
+  @Override
+  public String getRetrievalToken(String stagingSession) throws Exception {
+    StagingSessionToken stagingSessionToken = StagingSessionToken.decode(stagingSession);
+    ResourceId manifestResourceId = getManifestFileResourceId(stagingSessionToken);
+    return manifestResourceId.toString();
+  }
+
   private ResourceId getJobDirResourceId(StagingSessionToken stagingSessionToken) {
     ResourceId baseResourceId;
     // Get or Create the base path
@@ -196,126 +144,25 @@
         .resolve(ARTIFACTS, StandardResolveOptions.RESOLVE_DIRECTORY);
   }
 
-  private class PutArtifactStreamObserver implements StreamObserver<PutArtifactRequest> {
-
-    private final StreamObserver<PutArtifactResponse> outboundObserver;
-    private PutArtifactMetadata metadata;
-    private ResourceId artifactId;
-    private WritableByteChannel artifactWritableByteChannel;
-    private Hasher hasher;
-
-    PutArtifactStreamObserver(StreamObserver<PutArtifactResponse> outboundObserver) {
-      this.outboundObserver = outboundObserver;
-    }
-
-    @Override
-    public void onNext(PutArtifactRequest putArtifactRequest) {
-      // Create the directory structure for storing artifacts in the first call.
-      if (metadata == null) {
-        checkNotNull(putArtifactRequest);
-        checkNotNull(putArtifactRequest.getMetadata());
-        metadata = putArtifactRequest.getMetadata();
-        LOG.debug("stored metadata: {}", metadata);
-        // Check the base path exists or create the base path
-        try {
-          ResourceId artifactsDirId =
-              getArtifactDirResourceId(
-                  StagingSessionToken.decode(
-                      putArtifactRequest.getMetadata().getStagingSessionToken()));
-          artifactId =
-              artifactsDirId.resolve(
-                  encodedFileName(metadata.getMetadata()), StandardResolveOptions.RESOLVE_FILE);
-          LOG.debug(
-              "Going to stage artifact {} to {}.", metadata.getMetadata().getName(), artifactId);
-          artifactWritableByteChannel = FileSystems.create(artifactId, MimeTypes.BINARY);
-          hasher = Hashing.sha256().newHasher();
-        } catch (Exception e) {
-          String message =
-              String.format(
-                  "Failed to begin staging artifact %s", metadata.getMetadata().getName());
-          LOG.error(message, e);
-          outboundObserver.onError(
-              new StatusRuntimeException(Status.DATA_LOSS.withDescription(message).withCause(e)));
-        }
-      } else {
-        try {
-          ByteString data = putArtifactRequest.getData().getData();
-          artifactWritableByteChannel.write(data.asReadOnlyByteBuffer());
-          hasher.putBytes(data.toByteArray());
-        } catch (IOException e) {
-          String message =
-              String.format(
-                  "Failed to write chunk of artifact %s to %s",
-                  metadata.getMetadata().getName(), artifactId);
-          LOG.error(message, e);
-          outboundObserver.onError(
-              new StatusRuntimeException(Status.DATA_LOSS.withDescription(message).withCause(e)));
-        }
-      }
-    }
-
-    @Override
-    public void onError(Throwable throwable) {
-      // Delete the artifact.
-      LOG.error("Staging artifact failed for " + artifactId, throwable);
-      try {
-        if (artifactWritableByteChannel != null) {
-          artifactWritableByteChannel.close();
-        }
-        if (artifactId != null) {
-          FileSystems.delete(
-              Collections.singletonList(artifactId), StandardMoveOptions.IGNORE_MISSING_FILES);
-        }
-
-      } catch (IOException e) {
-        outboundObserver.onError(
-            new StatusRuntimeException(
-                Status.DATA_LOSS.withDescription(
-                    String.format("Failed to clean up artifact file %s", artifactId))));
-        return;
-      }
-      outboundObserver.onError(
-          new StatusRuntimeException(
-              Status.DATA_LOSS
-                  .withDescription(String.format("Failed to stage artifact %s", artifactId))
-                  .withCause(throwable)));
-    }
-
-    @Override
-    public void onCompleted() {
-      // Close the stream.
-      LOG.debug("Staging artifact completed for " + artifactId);
-      if (artifactWritableByteChannel != null) {
-        try {
-          artifactWritableByteChannel.close();
-        } catch (IOException e) {
-          onError(e);
-          return;
-        }
-      }
-      String expectedSha256 = metadata.getMetadata().getSha256();
-      if (expectedSha256 != null && !expectedSha256.isEmpty()) {
-        String actualSha256 = hasher.hash().toString();
-        if (!actualSha256.equals(expectedSha256)) {
-          outboundObserver.onError(
-              new StatusRuntimeException(
-                  Status.INVALID_ARGUMENT.withDescription(
-                      String.format(
-                          "Artifact %s is corrupt: expected sah256 %s, but has sha256 %s",
-                          metadata.getMetadata().getName(), expectedSha256, actualSha256))));
-          return;
-        }
-      }
-      outboundObserver.onNext(PutArtifactResponse.newBuilder().build());
-      outboundObserver.onCompleted();
-    }
+  /**
+   * Generate a stagingSessionToken compatible with {@link BeamFileSystemArtifactStagingService}.
+   *
+   * @param sessionId Unique sessionId for artifact staging.
+   * @param basePath Base path to upload artifacts.
+   * @return Encoded stagingSessionToken.
+   */
+  public static String generateStagingSessionToken(String sessionId, String basePath) {
+    StagingSessionToken stagingSessionToken = new StagingSessionToken();
+    stagingSessionToken.setSessionId(sessionId);
+    stagingSessionToken.setBasePath(basePath);
+    return stagingSessionToken.encode();
   }
 
   /**
    * Serializable StagingSessionToken used to stage files with {@link
    * BeamFileSystemArtifactStagingService}.
    */
-  private static class StagingSessionToken implements Serializable {
+  protected static class StagingSessionToken implements Serializable {
 
     private String sessionId;
     private String basePath;
diff --git a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/ClassLoaderArtifactRetrievalService.java b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/ClassLoaderArtifactRetrievalService.java
new file mode 100644
index 0000000..5f4b90b
--- /dev/null
+++ b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/ClassLoaderArtifactRetrievalService.java
@@ -0,0 +1,58 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.runners.fnexecution.artifact;
+
+import java.io.IOException;
+import java.io.InputStream;
+
+/**
+ * An {@link ArtifactRetrievalService} that loads artifacts as {@link ClassLoader} resources.
+ *
+ * <p>The retrieval token should be a path to a JSON-formatted ProxyManifest accessible via {@link
+ * ClassLoader#getResource(String)} whose resource locations also point to paths loadable via {@link
+ * ClassLoader#getResource(String)}.
+ */
+public class ClassLoaderArtifactRetrievalService extends AbstractArtifactRetrievalService {
+
+  private final ClassLoader classLoader;
+
+  public ClassLoaderArtifactRetrievalService() {
+    this(ClassLoaderArtifactRetrievalService.class.getClassLoader());
+  }
+
+  public ClassLoaderArtifactRetrievalService(ClassLoader classLoader) {
+    this.classLoader = classLoader;
+  }
+
+  @Override
+  public InputStream openManifest(String retrievalToken) throws IOException {
+    return openUri(retrievalToken, retrievalToken);
+  }
+
+  @Override
+  public InputStream openUri(String retrievalToken, String uri) throws IOException {
+    if (uri.charAt(0) == '/') {
+      uri = uri.substring(1);
+    }
+    InputStream result = classLoader.getResourceAsStream(uri);
+    if (result == null) {
+      throw new IOException("Unable to load " + uri + " with " + classLoader);
+    }
+    return result;
+  }
+}
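
A small usage sketch, assuming a ProxyManifest JSON has been bundled on the classpath at a hypothetical location such as artifacts/MANIFEST.json; a leading '/' in the token is tolerated because openUri() strips it:

    ClassLoaderArtifactRetrievalService retrieval = new ClassLoaderArtifactRetrievalService();
    // The retrieval token doubles as the manifest's classpath resource path.
    try (InputStream manifest = retrieval.openManifest("/artifacts/MANIFEST.json")) {
      // 'manifest' streams the JSON-formatted ProxyManifest loaded as a resource.
    }
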
diff --git a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/JavaFilesystemArtifactStagingService.java b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/JavaFilesystemArtifactStagingService.java
new file mode 100644
index 0000000..774efaf
--- /dev/null
+++ b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/artifact/JavaFilesystemArtifactStagingService.java
@@ -0,0 +1,93 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.runners.fnexecution.artifact;
+
+import java.io.File;
+import java.io.IOException;
+import java.nio.channels.Channels;
+import java.nio.channels.WritableByteChannel;
+import java.nio.file.FileSystem;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.util.Comparator;
+import java.util.stream.Stream;
+import org.apache.beam.model.jobmanagement.v1.ArtifactStagingServiceGrpc;
+
+/**
+ * An {@link ArtifactStagingServiceGrpc.ArtifactStagingServiceImplBase} that stages artifacts into
+ * a Java {@link FileSystem}.
+ */
+public class JavaFilesystemArtifactStagingService extends AbstractArtifactStagingService {
+
+  public static final String MANIFEST = "MANIFEST.json";
+  public static final String ARTIFACTS = "ARTIFACTS";
+
+  private final FileSystem fileSystem;
+  private final Path artifactRootDir;
+
+  public JavaFilesystemArtifactStagingService(FileSystem fileSystem, String artifactRootDir) {
+    this.fileSystem = fileSystem;
+    this.artifactRootDir = fileSystem.getPath(artifactRootDir);
+  }
+
+  @Override
+  public String getArtifactUri(String stagingSessionToken, String encodedFileName)
+      throws Exception {
+    return artifactRootDir
+        .resolve(stagingSessionToken)
+        .resolve(ARTIFACTS)
+        .resolve(encodedFileName)
+        .toString();
+  }
+
+  @Override
+  public WritableByteChannel openUri(String uri) throws IOException {
+    Path parent = fileSystem.getPath(uri).getParent();
+    if (parent == null) {
+      throw new RuntimeException("Provided URI did not have a parent: " + uri);
+    }
+    Files.createDirectories(parent);
+    return Channels.newChannel(Files.newOutputStream(fileSystem.getPath(uri)));
+  }
+
+  @Override
+  public void removeUri(String uri) throws IOException {
+    Files.deleteIfExists(fileSystem.getPath(uri));
+  }
+
+  @Override
+  public WritableByteChannel openManifest(String stagingSessionToken) throws Exception {
+    return openUri(getManifestUri(stagingSessionToken));
+  }
+
+  @Override
+  public void removeArtifacts(String stagingSessionToken) throws Exception {
+    try (Stream<Path> paths = Files.walk(artifactRootDir.resolve(stagingSessionToken))) {
+      paths.sorted(Comparator.reverseOrder()).map(Path::toFile).forEach(File::delete);
+    }
+  }
+
+  @Override
+  public String getRetrievalToken(String stagingSessionToken) throws Exception {
+    return getManifestUri(stagingSessionToken);
+  }
+
+  private String getManifestUri(String stagingSessionToken) {
+    return artifactRootDir.resolve(stagingSessionToken).resolve(MANIFEST).toString();
+  }
+}
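
A brief usage sketch against the default Java filesystem; the root directory and staging session token below are arbitrary placeholders, and both calls declare throws Exception:

    // Stages artifacts under /tmp/beam-artifacts/<session>/ARTIFACTS and writes the
    // manifest to /tmp/beam-artifacts/<session>/MANIFEST.json (paths are illustrative).
    JavaFilesystemArtifactStagingService staging =
        new JavaFilesystemArtifactStagingService(
            java.nio.file.FileSystems.getDefault(), "/tmp/beam-artifacts");
    String artifactUri = staging.getArtifactUri("session-1", "artifact_abc123");
    String retrievalToken = staging.getRetrievalToken("session-1");  // the MANIFEST.json path
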
diff --git a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClient.java b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClient.java
index 8c8541a..b3f1c2e 100644
--- a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClient.java
+++ b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClient.java
@@ -138,7 +138,8 @@
                   .setInstructionId(bundleId)
                   .setProcessBundle(
                       BeamFnApi.ProcessBundleRequest.newBuilder()
-                          .setProcessBundleDescriptorReference(processBundleDescriptor.getId()))
+                          .setProcessBundleDescriptorId(processBundleDescriptor.getId())
+                          .addAllCacheTokens(stateRequestHandler.getCacheTokens()))
                   .build());
       LOG.debug(
           "Sent {} with ID {} for {} with ID {}",
diff --git a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/data/GrpcDataService.java b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/data/GrpcDataService.java
index 50e0dd5..69d378f 100644
--- a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/data/GrpcDataService.java
+++ b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/data/GrpcDataService.java
@@ -132,7 +132,7 @@
     LOG.debug(
         "Registering receiver for instruction {} and transform {}",
         inputLocation.getInstructionId(),
-        inputLocation.getPTransformId());
+        inputLocation.getTransformId());
     final BeamFnDataInboundObserver<T> observer =
         BeamFnDataInboundObserver.forConsumer(coder, listener);
     if (connectedClient.isDone()) {
@@ -165,7 +165,7 @@
     LOG.debug(
         "Creating sender for instruction {} and transform {}",
         outputLocation.getInstructionId(),
-        outputLocation.getPTransformId());
+        outputLocation.getTransformId());
     try {
       return BeamFnDataBufferingOutboundObserver.forLocation(
           outputLocation, coder, connectedClient.get(3, TimeUnit.MINUTES).getOutboundObserver());
diff --git a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/state/GrpcStateService.java b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/state/GrpcStateService.java
index 9081778..9c72d81 100644
--- a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/state/GrpcStateService.java
+++ b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/state/GrpcStateService.java
@@ -125,7 +125,7 @@
     @Override
     public void onNext(StateRequest request) {
       StateRequestHandler handler =
-          requestHandlers.getOrDefault(request.getInstructionReference(), this::handlerNotFound);
+          requestHandlers.getOrDefault(request.getInstructionId(), this::handlerNotFound);
       try {
         CompletionStage<StateResponse.Builder> result = handler.handle(request);
         result.whenComplete(
@@ -156,8 +156,7 @@
           StateResponse.newBuilder()
               .setError(
                   String.format(
-                      "Unknown process bundle instruction id '%s'",
-                      request.getInstructionReference())));
+                      "Unknown process bundle instruction id '%s'", request.getInstructionId())));
       return result;
     }
 
diff --git a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/state/InMemoryBagUserStateFactory.java b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/state/InMemoryBagUserStateFactory.java
index 61ec24d..f840864 100644
--- a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/state/InMemoryBagUserStateFactory.java
+++ b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/state/InMemoryBagUserStateFactory.java
@@ -20,6 +20,8 @@
 import java.util.ArrayList;
 import java.util.Iterator;
 import java.util.List;
+import java.util.Optional;
+import java.util.UUID;
 import org.apache.beam.runners.core.InMemoryStateInternals;
 import org.apache.beam.runners.core.StateInternals;
 import org.apache.beam.runners.core.StateNamespace;
@@ -29,6 +31,8 @@
 import org.apache.beam.sdk.coders.Coder;
 import org.apache.beam.sdk.state.BagState;
 import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
+import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.ByteString;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Charsets;
 
 /**
  * Holds user state in memory. Only one key is active at a time due to the GroupReduceFunction being
@@ -70,6 +74,7 @@
 
     private final StateTag<BagState<V>> stateTag;
     private final Coder<W> windowCoder;
+    private final ByteString cacheToken;
 
     /* Lazily initialized state internals upon first access */
     private volatile StateInternals stateInternals;
@@ -77,6 +82,7 @@
     InMemorySingleKeyBagState(String userStateId, Coder<V> valueCoder, Coder<W> windowCoder) {
       this.windowCoder = windowCoder;
       this.stateTag = StateTags.bag(userStateId, valueCoder);
+      this.cacheToken = ByteString.copyFrom(UUID.randomUUID().toString().getBytes(Charsets.UTF_8));
     }
 
     @Override
@@ -105,6 +111,11 @@
       bagState.clear();
     }
 
+    @Override
+    public Optional<ByteString> getCacheToken() {
+      return Optional.of(cacheToken);
+    }
+
     private void initStateInternals(K key) {
       if (stateInternals == null) {
         stateInternals = InMemoryStateInternals.forKey(key);
diff --git a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/state/StateRequestHandler.java b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/state/StateRequestHandler.java
index d085893..1ca0313 100644
--- a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/state/StateRequestHandler.java
+++ b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/state/StateRequestHandler.java
@@ -17,7 +17,9 @@
  */
 package org.apache.beam.runners.fnexecution.state;
 
+import java.util.Collections;
 import java.util.concurrent.CompletionStage;
+import org.apache.beam.model.fnexecution.v1.BeamFnApi;
 import org.apache.beam.model.fnexecution.v1.BeamFnApi.StateRequest;
 import org.apache.beam.model.fnexecution.v1.BeamFnApi.StateResponse;
 
@@ -37,6 +39,11 @@
    */
   CompletionStage<StateResponse.Builder> handle(StateRequest request) throws Exception;
 
+  /** Retrieves a list of valid cache tokens. */
+  default Iterable<BeamFnApi.ProcessBundleRequest.CacheToken> getCacheTokens() {
+    return Collections.emptyList();
+  }
+
   static StateRequestHandler unsupported() {
     return request -> {
       throw new UnsupportedOperationException(
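(Illustrative note, not part of the patch: the new default `getCacheTokens()` above lets any `StateRequestHandler` advertise cache tokens without forcing existing implementations to change. A hypothetical handler overriding it is sketched below; the token value and the stubbed `handle` body are placeholders, and the `CacheToken` user-state shape mirrors the one used later in `StateRequestHandlers`.)

```java
import java.util.Collections;
import java.util.UUID;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CompletionStage;
import org.apache.beam.model.fnexecution.v1.BeamFnApi;
import org.apache.beam.model.fnexecution.v1.BeamFnApi.StateRequest;
import org.apache.beam.model.fnexecution.v1.BeamFnApi.StateResponse;
import org.apache.beam.runners.fnexecution.state.StateRequestHandler;
import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.ByteString;
import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Charsets;

/** Hypothetical handler, shown only to illustrate the new getCacheTokens() hook. */
public class CachingStateRequestHandler implements StateRequestHandler {
  // Placeholder token; a real handler would rotate this whenever its cached state is invalidated.
  private final ByteString token =
      ByteString.copyFrom(UUID.randomUUID().toString().getBytes(Charsets.UTF_8));

  @Override
  public CompletionStage<StateResponse.Builder> handle(StateRequest request) {
    // Real state handling would go here; this stub just returns an empty response.
    return CompletableFuture.completedFuture(StateResponse.newBuilder());
  }

  @Override
  public Iterable<BeamFnApi.ProcessBundleRequest.CacheToken> getCacheTokens() {
    // Advertise a user-state cache token in the same shape StateRequestHandlers builds.
    return Collections.singletonList(
        BeamFnApi.ProcessBundleRequest.CacheToken.newBuilder()
            .setUserState(
                BeamFnApi.ProcessBundleRequest.CacheToken.UserState.getDefaultInstance())
            .setToken(token)
            .build());
  }
}
```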
diff --git a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/state/StateRequestHandlers.java b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/state/StateRequestHandlers.java
index 5b40e61..4d8f816 100644
--- a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/state/StateRequestHandlers.java
+++ b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/state/StateRequestHandlers.java
@@ -21,13 +21,17 @@
 
 import java.util.ArrayList;
 import java.util.EnumMap;
+import java.util.HashSet;
 import java.util.Iterator;
 import java.util.List;
 import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
 import java.util.concurrent.CompletableFuture;
 import java.util.concurrent.CompletionStage;
 import java.util.concurrent.ConcurrentHashMap;
 import javax.annotation.concurrent.ThreadSafe;
+import org.apache.beam.model.fnexecution.v1.BeamFnApi;
 import org.apache.beam.model.fnexecution.v1.BeamFnApi.StateAppendResponse;
 import org.apache.beam.model.fnexecution.v1.BeamFnApi.StateClearResponse;
 import org.apache.beam.model.fnexecution.v1.BeamFnApi.StateGetResponse;
@@ -137,6 +141,11 @@
 
     /** Clears the bag user state for the given key and window. */
     void clear(K key, W window);
+
+    /** Returns the currently valid cache token. */
+    default Optional<ByteString> getCacheToken() {
+      return Optional.empty();
+    }
   }
 
   /**
@@ -155,20 +164,12 @@
 
     /** Throws a {@link UnsupportedOperationException} on the first access. */
     static <K, V, W extends BoundedWindow> BagUserStateHandlerFactory<K, V, W> unsupported() {
-      return new BagUserStateHandlerFactory<K, V, W>() {
-        @Override
-        public BagUserStateHandler<K, V, W> forUserState(
-            String pTransformId,
-            String userStateId,
-            Coder<K> keyCoder,
-            Coder<V> valueCoder,
-            Coder<W> windowCoder) {
-          throw new UnsupportedOperationException(
-              String.format(
-                  "The %s does not support handling sides inputs for PTransform %s with user state "
-                      + "id %s.",
-                  BagUserStateHandler.class.getSimpleName(), pTransformId, userStateId));
-        }
+      return (pTransformId, userStateId, keyCoder, valueCoder, windowCoder) -> {
+        throw new UnsupportedOperationException(
+            String.format(
+                "The %s does not support handling sides inputs for PTransform %s with user state "
+                    + "id %s.",
+                BagUserStateHandler.class.getSimpleName(), pTransformId, userStateId));
       };
     }
   }
@@ -205,6 +206,19 @@
           .handle(request);
     }
 
+    @Override
+    public Iterable<BeamFnApi.ProcessBundleRequest.CacheToken> getCacheTokens() {
+      // Use loops here due to the horrible performance of Java Streams:
+      // https://medium.com/@milan.mimica/slow-like-a-stream-fast-like-a-loop-524f70391182
+      Set<BeamFnApi.ProcessBundleRequest.CacheToken> cacheTokens = new HashSet<>();
+      for (StateRequestHandler handler : handlers.values()) {
+        for (BeamFnApi.ProcessBundleRequest.CacheToken cacheToken : handler.getCacheTokens()) {
+          cacheTokens.add(cacheToken);
+        }
+      }
+      return cacheTokens;
+    }
+
     private CompletionStage<StateResponse.Builder> handlerNotFound(StateRequest request) {
       CompletableFuture<StateResponse.Builder> rval = new CompletableFuture<>();
       rval.completeExceptionally(new IllegalStateException());
@@ -236,14 +250,14 @@
 
     private final Map<String, Map<String, SideInputSpec>> sideInputSpecs;
     private final SideInputHandlerFactory sideInputHandlerFactory;
-    private final ConcurrentHashMap<SideInputSpec, SideInputHandler> cache;
+    private final ConcurrentHashMap<SideInputSpec, SideInputHandler> handlerCache;
 
     StateRequestHandlerToSideInputHandlerFactoryAdapter(
         Map<String, Map<String, SideInputSpec>> sideInputSpecs,
         SideInputHandlerFactory sideInputHandlerFactory) {
       this.sideInputSpecs = sideInputSpecs;
       this.sideInputHandlerFactory = sideInputHandlerFactory;
-      this.cache = new ConcurrentHashMap<>();
+      this.handlerCache = new ConcurrentHashMap<>();
     }
 
     @Override
@@ -258,8 +272,9 @@
 
         StateKey.MultimapSideInput stateKey = request.getStateKey().getMultimapSideInput();
         SideInputSpec<?, ?, ?> referenceSpec =
-            sideInputSpecs.get(stateKey.getPtransformId()).get(stateKey.getSideInputId());
-        SideInputHandler<?, ?> handler = cache.computeIfAbsent(referenceSpec, this::createHandler);
+            sideInputSpecs.get(stateKey.getTransformId()).get(stateKey.getSideInputId());
+        SideInputHandler<?, ?> handler =
+            handlerCache.computeIfAbsent(referenceSpec, this::createHandler);
 
         switch (request.getRequestCase()) {
           case GET:
@@ -289,7 +304,7 @@
       StateKey.MultimapSideInput stateKey = request.getStateKey().getMultimapSideInput();
 
       SideInputSpec<K, V, W> sideInputReferenceSpec =
-          sideInputSpecs.get(stateKey.getPtransformId()).get(stateKey.getSideInputId());
+          sideInputSpecs.get(stateKey.getTransformId()).get(stateKey.getSideInputId());
 
       W window = sideInputReferenceSpec.windowCoder().decode(stateKey.getWindow().newInput());
 
@@ -347,14 +362,14 @@
 
     private final ExecutableProcessBundleDescriptor processBundleDescriptor;
     private final BagUserStateHandlerFactory handlerFactory;
-    private final ConcurrentHashMap<BagUserStateSpec, BagUserStateHandler> cache;
+    private final ConcurrentHashMap<BagUserStateSpec, BagUserStateHandler> handlerCache;
 
     ByteStringStateRequestHandlerToBagUserStateHandlerFactoryAdapter(
         ExecutableProcessBundleDescriptor processBundleDescriptor,
         BagUserStateHandlerFactory handlerFactory) {
       this.processBundleDescriptor = processBundleDescriptor;
       this.handlerFactory = handlerFactory;
-      this.cache = new ConcurrentHashMap<>();
+      this.handlerCache = new ConcurrentHashMap<>();
     }
 
     @Override
@@ -371,7 +386,7 @@
         BagUserStateSpec<Object, Object, BoundedWindow> referenceSpec =
             processBundleDescriptor
                 .getBagUserStateSpecs()
-                .get(stateKey.getPtransformId())
+                .get(stateKey.getTransformId())
                 .get(stateKey.getUserStateId());
 
         // Note that by using the ByteStringCoder, we simplify the issue of encoding/decoding the
@@ -390,7 +405,7 @@
             ByteStringCoder.class.getSimpleName());
 
         BagUserStateHandler<ByteString, ByteString, BoundedWindow> handler =
-            cache.computeIfAbsent(referenceSpec, this::createHandler);
+            handlerCache.computeIfAbsent(referenceSpec, this::createHandler);
 
         ByteString key = stateKey.getKey();
         BoundedWindow window = referenceSpec.windowCoder().decode(stateKey.getWindow().newInput());
@@ -414,6 +429,24 @@
       }
     }
 
+    @Override
+    public Iterable<BeamFnApi.ProcessBundleRequest.CacheToken> getCacheTokens() {
+      // Use a loop here due to the horrible performance of Java Streams:
+      // https://medium.com/@milan.mimica/slow-like-a-stream-fast-like-a-loop-524f70391182
+      Set<BeamFnApi.ProcessBundleRequest.CacheToken> cacheTokens = new HashSet<>();
+      for (BagUserStateHandler handler : handlerCache.values()) {
+        if (handler.getCacheToken().isPresent()) {
+          cacheTokens.add(
+              BeamFnApi.ProcessBundleRequest.CacheToken.newBuilder()
+                  .setUserState(
+                      BeamFnApi.ProcessBundleRequest.CacheToken.UserState.getDefaultInstance())
+                  .setToken((ByteString) handler.getCacheToken().get())
+                  .build());
+        }
+      }
+      return cacheTokens;
+    }
+
     private static <W extends BoundedWindow>
         CompletionStage<StateResponse.Builder> handleGetRequest(
             StateRequest request,
diff --git a/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/ServerFactoryTest.java b/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/ServerFactoryTest.java
index 65e9269..0972d7b 100644
--- a/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/ServerFactoryTest.java
+++ b/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/ServerFactoryTest.java
@@ -57,11 +57,11 @@
 
   private static final BeamFnApi.Elements CLIENT_DATA =
       BeamFnApi.Elements.newBuilder()
-          .addData(BeamFnApi.Elements.Data.newBuilder().setInstructionReference("1"))
+          .addData(BeamFnApi.Elements.Data.newBuilder().setInstructionId("1"))
           .build();
   private static final BeamFnApi.Elements SERVER_DATA =
       BeamFnApi.Elements.newBuilder()
-          .addData(BeamFnApi.Elements.Data.newBuilder().setInstructionReference("1"))
+          .addData(BeamFnApi.Elements.Data.newBuilder().setInstructionId("1"))
           .build();
 
   @Test
diff --git a/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/artifact/ClassLoaderArtifactServiceTest.java b/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/artifact/ClassLoaderArtifactServiceTest.java
new file mode 100644
index 0000000..65d54a9
--- /dev/null
+++ b/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/artifact/ClassLoaderArtifactServiceTest.java
@@ -0,0 +1,406 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.runners.fnexecution.artifact;
+
+import java.io.ByteArrayOutputStream;
+import java.io.FileOutputStream;
+import java.io.IOException;
+import java.net.URI;
+import java.net.URL;
+import java.net.URLClassLoader;
+import java.nio.charset.Charset;
+import java.nio.file.FileSystem;
+import java.nio.file.FileSystems;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.CompletableFuture;
+import java.util.concurrent.ExecutionException;
+import java.util.concurrent.TimeUnit;
+import java.util.concurrent.TimeoutException;
+import java.util.zip.ZipEntry;
+import java.util.zip.ZipOutputStream;
+import org.apache.beam.model.jobmanagement.v1.ArtifactApi;
+import org.apache.beam.model.jobmanagement.v1.ArtifactRetrievalServiceGrpc;
+import org.apache.beam.model.jobmanagement.v1.ArtifactStagingServiceGrpc;
+import org.apache.beam.runners.fnexecution.GrpcFnServer;
+import org.apache.beam.runners.fnexecution.InProcessServerFactory;
+import org.apache.beam.vendor.grpc.v1p21p0.com.google.protobuf.ByteString;
+import org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ManagedChannel;
+import org.apache.beam.vendor.grpc.v1p21p0.io.grpc.inprocess.InProcessChannelBuilder;
+import org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.StreamObserver;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Charsets;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
+import org.junit.Assert;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.TemporaryFolder;
+import org.junit.runner.RunWith;
+import org.junit.runners.JUnit4;
+
+/**
+ * Tests for {@link ClassLoaderArtifactRetrievalService} and {@link
+ * JavaFilesystemArtifactStagingService}.
+ */
+@RunWith(JUnit4.class)
+public class ClassLoaderArtifactServiceTest {
+
+  @Rule public TemporaryFolder tempFolder = new TemporaryFolder();
+
+  private static final int ARTIFACT_CHUNK_SIZE = 100;
+
+  private static final Charset BIJECTIVE_CHARSET = Charsets.ISO_8859_1;
+
+  public interface ArtifactServicePair extends AutoCloseable {
+
+    String getStagingToken(String nonce);
+
+    ArtifactStagingServiceGrpc.ArtifactStagingServiceStub createStagingStub() throws Exception;
+
+    ArtifactStagingServiceGrpc.ArtifactStagingServiceBlockingStub createStagingBlockingStub()
+        throws Exception;
+
+    ArtifactRetrievalServiceGrpc.ArtifactRetrievalServiceStub createRetrievalStub()
+        throws Exception;
+
+    ArtifactRetrievalServiceGrpc.ArtifactRetrievalServiceBlockingStub createRetrievalBlockingStub()
+        throws Exception;
+  }
+
+  /**
+   * An ArtifactServicePair that loads artifacts into a jar file and then serves them up via a
+   * ClassLoader reading out of that jar.
+   */
+  private ArtifactServicePair classLoaderService() throws IOException {
+    return new ArtifactServicePair() {
+
+      Path jarPath = Paths.get(tempFolder.newFile("jar.jar").getPath());
+
+      // These are initialized when the staging service is requested.
+      FileSystem jarFilesystem;
+      JavaFilesystemArtifactStagingService stagingService;
+      GrpcFnServer<JavaFilesystemArtifactStagingService> stagingServer;
+      ClassLoaderArtifactRetrievalService retrievalService;
+      GrpcFnServer<ClassLoaderArtifactRetrievalService> retrievalServer;
+
+      // The stubs are initialized when the corresponding service is started; starting the
+      // retrieval service closes the jar file created above.
+      ArtifactStagingServiceGrpc.ArtifactStagingServiceStub stagingStub;
+      ArtifactStagingServiceGrpc.ArtifactStagingServiceBlockingStub stagingBlockingStub;
+      ArtifactRetrievalServiceGrpc.ArtifactRetrievalServiceStub retrievalStub;
+      ArtifactRetrievalServiceGrpc.ArtifactRetrievalServiceBlockingStub retrievalBlockingStub;
+
+      @Override
+      public void close() throws Exception {
+        if (stagingServer != null) {
+          stagingServer.close();
+        }
+        if (stagingService != null) {
+          stagingService.close();
+        }
+        if (retrievalServer != null) {
+          retrievalServer.close();
+        }
+        if (retrievalService != null) {
+          retrievalService.close();
+        }
+      }
+
+      @Override
+      public String getStagingToken(String nonce) {
+        return "/path/to/subdir" + nonce.hashCode();
+      }
+
+      private void startStagingService() throws Exception {
+        try (FileOutputStream fileOut = new FileOutputStream(jarPath.toString())) {
+          try (ZipOutputStream zipOut = new ZipOutputStream(fileOut)) {
+            ZipEntry zipEntry = new ZipEntry("someFile");
+            zipOut.putNextEntry(zipEntry);
+            zipOut.write(new byte[] {'s', 't', 'u', 'f', 'f'});
+            zipOut.closeEntry();
+          }
+        }
+        jarFilesystem =
+            FileSystems.newFileSystem(
+                URI.create("jar:file:" + jarPath.toString()), ImmutableMap.of());
+        stagingService =
+            new JavaFilesystemArtifactStagingService(jarFilesystem, "/path/to/root");
+        stagingServer =
+            GrpcFnServer.allocatePortAndCreateFor(stagingService, InProcessServerFactory.create());
+        ManagedChannel stagingChannel =
+            InProcessChannelBuilder.forName(stagingServer.getApiServiceDescriptor().getUrl())
+                .build();
+        stagingStub = ArtifactStagingServiceGrpc.newStub(stagingChannel);
+        stagingBlockingStub = ArtifactStagingServiceGrpc.newBlockingStub(stagingChannel);
+      }
+
+      @Override
+      public ArtifactStagingServiceGrpc.ArtifactStagingServiceStub createStagingStub()
+          throws Exception {
+        if (stagingStub == null) {
+          startStagingService();
+        }
+        return stagingStub;
+      }
+
+      @Override
+      public ArtifactStagingServiceGrpc.ArtifactStagingServiceBlockingStub
+          createStagingBlockingStub() throws Exception {
+        if (stagingBlockingStub == null) {
+          startStagingService();
+        }
+        return stagingBlockingStub;
+      }
+
+      public void startupRetrievalService() throws Exception {
+        jarFilesystem.close();
+        retrievalService =
+            new ClassLoaderArtifactRetrievalService(
+                new URLClassLoader(new URL[] {jarPath.toUri().toURL()}));
+        retrievalServer =
+            GrpcFnServer.allocatePortAndCreateFor(
+                retrievalService, InProcessServerFactory.create());
+        ManagedChannel retrievalChannel =
+            InProcessChannelBuilder.forName(retrievalServer.getApiServiceDescriptor().getUrl())
+                .build();
+        retrievalStub = ArtifactRetrievalServiceGrpc.newStub(retrievalChannel);
+        retrievalBlockingStub = ArtifactRetrievalServiceGrpc.newBlockingStub(retrievalChannel);
+      }
+
+      @Override
+      public ArtifactRetrievalServiceGrpc.ArtifactRetrievalServiceStub createRetrievalStub()
+          throws Exception {
+        if (retrievalStub == null) {
+          startupRetrievalService();
+        }
+        return retrievalStub;
+      }
+
+      @Override
+      public ArtifactRetrievalServiceGrpc.ArtifactRetrievalServiceBlockingStub
+          createRetrievalBlockingStub() throws Exception {
+        if (retrievalBlockingStub == null) {
+          startupRetrievalService();
+        }
+        return retrievalBlockingStub;
+      }
+    };
+  }
+
+  private ArtifactApi.ArtifactMetadata putArtifact(
+      ArtifactStagingServiceGrpc.ArtifactStagingServiceStub stagingStub,
+      String stagingSessionToken,
+      String name,
+      String contents)
+      throws InterruptedException, ExecutionException, TimeoutException {
+    ArtifactApi.ArtifactMetadata metadata =
+        ArtifactApi.ArtifactMetadata.newBuilder().setName(name).build();
+    CompletableFuture<Void> complete = new CompletableFuture<>();
+    StreamObserver<ArtifactApi.PutArtifactRequest> outputStreamObserver =
+        stagingStub.putArtifact(
+            new StreamObserver<ArtifactApi.PutArtifactResponse>() {
+
+              @Override
+              public void onNext(ArtifactApi.PutArtifactResponse putArtifactResponse) {
+                // Do nothing.
+              }
+
+              @Override
+              public void onError(Throwable th) {
+                complete.completeExceptionally(th);
+              }
+
+              @Override
+              public void onCompleted() {
+                complete.complete(null);
+              }
+            });
+    outputStreamObserver.onNext(
+        ArtifactApi.PutArtifactRequest.newBuilder()
+            .setMetadata(
+                ArtifactApi.PutArtifactMetadata.newBuilder()
+                    .setMetadata(metadata)
+                    .setStagingSessionToken(stagingSessionToken))
+            .build());
+
+    byte[] byteContents = contents.getBytes(BIJECTIVE_CHARSET);
+    for (int start = 0; start < byteContents.length; start += ARTIFACT_CHUNK_SIZE) {
+      outputStreamObserver.onNext(
+          ArtifactApi.PutArtifactRequest.newBuilder()
+              .setData(
+                  ArtifactApi.ArtifactChunk.newBuilder()
+                      .setData(
+                          ByteString.copyFrom(
+                              byteContents,
+                              start,
+                              Math.min(byteContents.length - start, ARTIFACT_CHUNK_SIZE)))
+                      .build())
+              .build());
+    }
+    outputStreamObserver.onCompleted();
+    complete.get(10, TimeUnit.SECONDS);
+    return metadata;
+  }
+
+  private String commitManifest(
+      ArtifactStagingServiceGrpc.ArtifactStagingServiceBlockingStub stagingStub,
+      String stagingToken,
+      List<ArtifactApi.ArtifactMetadata> artifacts) {
+    return stagingStub
+        .commitManifest(
+            ArtifactApi.CommitManifestRequest.newBuilder()
+                .setStagingSessionToken(stagingToken)
+                .setManifest(ArtifactApi.Manifest.newBuilder().addAllArtifact(artifacts))
+                .build())
+        .getRetrievalToken();
+  }
+
+  private String getArtifact(
+      ArtifactRetrievalServiceGrpc.ArtifactRetrievalServiceStub retrievalStub,
+      String retrievalToken,
+      String name)
+      throws ExecutionException, InterruptedException {
+    CompletableFuture<String> result = new CompletableFuture<>();
+    retrievalStub.getArtifact(
+        ArtifactApi.GetArtifactRequest.newBuilder()
+            .setRetrievalToken(retrievalToken)
+            .setName(name)
+            .build(),
+        new StreamObserver<ArtifactApi.ArtifactChunk>() {
+
+          private ByteArrayOutputStream all = new ByteArrayOutputStream();
+
+          @Override
+          public void onNext(ArtifactApi.ArtifactChunk artifactChunk) {
+            try {
+              all.write(artifactChunk.getData().toByteArray());
+            } catch (IOException exn) {
+              Assert.fail("ByteArrayOutputStream threw exception: " + exn);
+            }
+          }
+
+          @Override
+          public void onError(Throwable th) {
+            result.completeExceptionally(th);
+          }
+
+          @Override
+          public void onCompleted() {
+            result.complete(new String(all.toByteArray(), BIJECTIVE_CHARSET));
+          }
+        });
+    return result.get();
+  }
+
+  private String stageArtifacts(
+      ArtifactServicePair service, String stagingToken, Map<String, String> artifacts)
+      throws Exception {
+    ArtifactStagingServiceGrpc.ArtifactStagingServiceStub stagingStub = service.createStagingStub();
+    ArtifactStagingServiceGrpc.ArtifactStagingServiceBlockingStub stagingBlockingStub =
+        service.createStagingBlockingStub();
+    List<ArtifactApi.ArtifactMetadata> artifactMetadatas = new ArrayList<>();
+    for (Map.Entry<String, String> entry : artifacts.entrySet()) {
+      artifactMetadatas.add(
+          putArtifact(stagingStub, stagingToken, entry.getKey(), entry.getValue()));
+    }
+    return commitManifest(stagingBlockingStub, stagingToken, artifactMetadatas);
+  }
+
+  private void checkArtifacts(
+      ArtifactServicePair service, String retrievalToken, Map<String, String> artifacts)
+      throws Exception {
+    ArtifactRetrievalServiceGrpc.ArtifactRetrievalServiceStub retrievalStub =
+        service.createRetrievalStub();
+    ArtifactRetrievalServiceGrpc.ArtifactRetrievalServiceBlockingStub retrievalBlockingStub =
+        service.createRetrievalBlockingStub();
+    ArtifactApi.Manifest manifest =
+        retrievalBlockingStub
+            .getManifest(
+                ArtifactApi.GetManifestRequest.newBuilder()
+                    .setRetrievalToken(retrievalToken)
+                    .build())
+            .getManifest();
+    Assert.assertEquals(manifest.getArtifactCount(), artifacts.size());
+    for (ArtifactApi.ArtifactMetadata artifact : manifest.getArtifactList()) {
+      String contents = getArtifact(retrievalStub, retrievalToken, artifact.getName());
+      Assert.assertEquals(artifacts.get(artifact.getName()), contents);
+    }
+  }
+
+  private void runTest(ArtifactServicePair service, Map<String, String> artifacts)
+      throws Exception {
+    checkArtifacts(
+        service, stageArtifacts(service, service.getStagingToken("nonce"), artifacts), artifacts);
+  }
+
+  private Map<String, String> identityMap(String... keys) {
+    ImmutableMap.Builder<String, String> builder = ImmutableMap.builder();
+    for (String key : keys) {
+      builder.put(key, key);
+    }
+    return builder.build();
+  }
+
+  @Test
+  public void testBasic() throws Exception {
+    try (ArtifactServicePair service = classLoaderService()) {
+      runTest(service, ImmutableMap.of("a", "Aa", "b", "Bbb", "c", "C"));
+    }
+  }
+
+  @Test
+  public void testOddFilenames() throws Exception {
+    try (ArtifactServicePair service = classLoaderService()) {
+      runTest(
+          service,
+          identityMap(
+              "some whitespace\n\t",
+              "some whitespace\n",
+              "nullTerminated\0",
+              "nullTerminated\0\0",
+              "../../../../../../../slashes",
+              "..\\..\\..\\..\\..\\..\\..\\backslashes",
+              "/private"));
+    }
+  }
+
+  @Test
+  public void testMultipleChunks() throws Exception {
+    try (ArtifactServicePair service = classLoaderService()) {
+      byte[] contents = new byte[ARTIFACT_CHUNK_SIZE * 9 / 2];
+      for (int i = 0; i < contents.length; i++) {
+        contents[i] = (byte) (i * i + Integer.MAX_VALUE / (i + 1));
+      }
+      runTest(service, ImmutableMap.of("filename", new String(contents, BIJECTIVE_CHARSET)));
+    }
+  }
+
+  @Test
+  public void testMultipleTokens() throws Exception {
+    try (ArtifactServicePair service = classLoaderService()) {
+      Map<String, String> artifacts1 = ImmutableMap.of("a", "a1", "b", "b");
+      Map<String, String> artifacts2 = ImmutableMap.of("a", "a2", "c", "c");
+      String token1 = stageArtifacts(service, service.getStagingToken("1"), artifacts1);
+      String token2 = stageArtifacts(service, service.getStagingToken("2"), artifacts2);
+      checkArtifacts(service, token1, artifacts1);
+      checkArtifacts(service, token2, artifacts2);
+    }
+  }
+}
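(Illustrative note, not part of the patch: the test above stages artifacts into a jar through `JavaFilesystemArtifactStagingService` and then serves them back through a `ClassLoader` over that jar. A minimal, standalone sketch of that jar-as-ClassLoader round trip using plain JDK APIs, with arbitrary file and entry names:)

```java
import java.io.InputStream;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

/** Illustrative only: the jar-as-ClassLoader round trip the test above relies on. */
public class JarClassLoaderSketch {
  public static void main(String[] args) throws Exception {
    // Write a single entry into a fresh jar (a jar is just a zip archive).
    Path jarPath = Files.createTempFile("artifacts", ".jar");
    try (ZipOutputStream zipOut = new ZipOutputStream(Files.newOutputStream(jarPath))) {
      zipOut.putNextEntry(new ZipEntry("someFile"));
      zipOut.write("stuff".getBytes(StandardCharsets.ISO_8859_1));
      zipOut.closeEntry();
    }
    // Once the jar is closed, its entries are visible as classpath resources.
    try (URLClassLoader loader = new URLClassLoader(new URL[] {jarPath.toUri().toURL()});
        InputStream in = loader.getResourceAsStream("someFile")) {
      byte[] buf = new byte[16];
      int read = in.read(buf);
      System.out.println(new String(buf, 0, read, StandardCharsets.ISO_8859_1)); // prints "stuff"
    }
  }
}
```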
diff --git a/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/control/DefaultJobBundleFactoryTest.java b/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/control/DefaultJobBundleFactoryTest.java
index 274f079..4db1b64 100644
--- a/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/control/DefaultJobBundleFactoryTest.java
+++ b/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/control/DefaultJobBundleFactoryTest.java
@@ -23,6 +23,7 @@
 import static org.mockito.Mockito.verifyNoMoreInteractions;
 import static org.mockito.Mockito.when;
 
+import java.util.Collections;
 import java.util.Map;
 import java.util.concurrent.CompletableFuture;
 import org.apache.beam.model.fnexecution.v1.BeamFnApi.InstructionResponse;
@@ -262,6 +263,7 @@
             serverInfo)) {
       OutputReceiverFactory orf = mock(OutputReceiverFactory.class);
       StateRequestHandler srh = mock(StateRequestHandler.class);
+      when(srh.getCacheTokens()).thenReturn(Collections.emptyList());
       StageBundleFactory sbf = bundleFactory.forStage(getExecutableStage(environmentA));
       Thread.sleep(10); // allow environment to expire
       sbf.getBundle(orf, srh, BundleProgressHandler.ignored()).close();
diff --git a/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClientTest.java b/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClientTest.java
index 871ee43..5bc0d1c 100644
--- a/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClientTest.java
+++ b/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClientTest.java
@@ -18,11 +18,12 @@
 package org.apache.beam.runners.fnexecution.control;
 
 import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Iterables.getOnlyElement;
+import static org.hamcrest.MatcherAssert.assertThat;
 import static org.hamcrest.Matchers.containsInAnyOrder;
+import static org.hamcrest.Matchers.is;
 import static org.junit.Assert.assertEquals;
 import static org.junit.Assert.assertNotSame;
 import static org.junit.Assert.assertSame;
-import static org.junit.Assert.assertThat;
 import static org.junit.Assert.fail;
 import static org.mockito.Matchers.any;
 import static org.mockito.Matchers.eq;
@@ -35,6 +36,7 @@
 import java.util.ArrayList;
 import java.util.Collection;
 import java.util.Collections;
+import java.util.List;
 import java.util.Map;
 import java.util.concurrent.CompletableFuture;
 import java.util.concurrent.ExecutionException;
@@ -81,7 +83,9 @@
 import org.junit.rules.ExpectedException;
 import org.junit.runner.RunWith;
 import org.junit.runners.JUnit4;
+import org.mockito.ArgumentCaptor;
 import org.mockito.Mock;
+import org.mockito.Mockito;
 import org.mockito.MockitoAnnotations;
 
 /** Unit tests for {@link SdkHarnessClient}. */
@@ -334,6 +338,7 @@
     when(mockStateDelegator.registerForProcessBundleInstructionId(any(), any()))
         .thenReturn(mockStateRegistration);
     StateRequestHandler mockStateHandler = mock(StateRequestHandler.class);
+    when(mockStateHandler.getCacheTokens()).thenReturn(Collections.emptyList());
     BundleProgressHandler mockProgressHandler = mock(BundleProgressHandler.class);
 
     CompletableFuture<InstructionResponse> processBundleResponseFuture = new CompletableFuture<>();
@@ -429,6 +434,7 @@
     when(mockStateDelegator.registerForProcessBundleInstructionId(any(), any()))
         .thenReturn(mockStateRegistration);
     StateRequestHandler mockStateHandler = mock(StateRequestHandler.class);
+    when(mockStateHandler.getCacheTokens()).thenReturn(Collections.emptyList());
     BundleProgressHandler mockProgressHandler = mock(BundleProgressHandler.class);
 
     CompletableFuture<InstructionResponse> processBundleResponseFuture = new CompletableFuture<>();
@@ -529,6 +535,7 @@
     when(mockStateDelegator.registerForProcessBundleInstructionId(any(), any()))
         .thenReturn(mockStateRegistration);
     StateRequestHandler mockStateHandler = mock(StateRequestHandler.class);
+    when(mockStateHandler.getCacheTokens()).thenReturn(Collections.emptyList());
     BundleProgressHandler mockProgressHandler = mock(BundleProgressHandler.class);
 
     CompletableFuture<InstructionResponse> processBundleResponseFuture = new CompletableFuture<>();
@@ -574,6 +581,51 @@
     }
   }
 
+  @Test
+  public void verifyCacheTokensAreUsedInNewBundleRequest() {
+    CompletableFuture<InstructionResponse> registerResponseFuture = new CompletableFuture<>();
+    when(fnApiControlClient.handle(any(BeamFnApi.InstructionRequest.class)))
+        .thenReturn(registerResponseFuture);
+
+    ProcessBundleDescriptor descriptor1 =
+        ProcessBundleDescriptor.newBuilder().setId("descriptor1").build();
+
+    Map<String, RemoteInputDestination> remoteInputs =
+        Collections.singletonMap(
+            "inputPC",
+            RemoteInputDestination.of(
+                FullWindowedValueCoder.of(VarIntCoder.of(), GlobalWindow.Coder.INSTANCE),
+                SDK_GRPC_READ_TRANSFORM));
+
+    BundleProcessor processor1 = sdkHarnessClient.getProcessor(descriptor1, remoteInputs);
+    when(dataService.send(any(), any())).thenReturn(mock(CloseableFnDataReceiver.class));
+
+    StateRequestHandler stateRequestHandler = Mockito.mock(StateRequestHandler.class);
+    List<BeamFnApi.ProcessBundleRequest.CacheToken> cacheTokens =
+        Collections.singletonList(
+            BeamFnApi.ProcessBundleRequest.CacheToken.newBuilder().getDefaultInstanceForType());
+    when(stateRequestHandler.getCacheTokens()).thenReturn(cacheTokens);
+
+    processor1.newBundle(
+        ImmutableMap.of(SDK_GRPC_WRITE_TRANSFORM, mock(RemoteOutputReceiver.class)),
+        stateRequestHandler,
+        BundleProgressHandler.ignored());
+
+    // Retrieve the requests made to the FnApiControlClient
+    ArgumentCaptor<BeamFnApi.InstructionRequest> reqCaptor =
+        ArgumentCaptor.forClass(BeamFnApi.InstructionRequest.class);
+    Mockito.verify(fnApiControlClient, Mockito.times(2)).handle(reqCaptor.capture());
+    List<BeamFnApi.InstructionRequest> requests = reqCaptor.getAllValues();
+
+    // Verify that the cache tokens are included in the ProcessBundleRequest
+    assertThat(
+        requests.get(0).getRequestCase(), is(BeamFnApi.InstructionRequest.RequestCase.REGISTER));
+    assertThat(
+        requests.get(1).getRequestCase(),
+        is(BeamFnApi.InstructionRequest.RequestCase.PROCESS_BUNDLE));
+    assertThat(requests.get(1).getProcessBundle().getCacheTokensList(), is(cacheTokens));
+  }
+
   private static class TestFn extends DoFn<String, String> {
     @ProcessElement
     public void processElement(ProcessContext context) {
diff --git a/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/data/GrpcDataServiceTest.java b/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/data/GrpcDataServiceTest.java
index f83f3d4..a4458c0 100644
--- a/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/data/GrpcDataServiceTest.java
+++ b/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/data/GrpcDataServiceTest.java
@@ -57,7 +57,7 @@
 /** Tests for {@link GrpcDataService}. */
 @RunWith(JUnit4.class)
 public class GrpcDataServiceTest {
-  private static final String PTRANSFORM_ID = "888";
+  private static final String TRANSFORM_ID = "888";
   private static final Coder<WindowedValue<String>> CODER =
       LengthPrefixCoder.of(WindowedValue.getValueOnlyCoder(StringUtf8Coder.of()));
 
@@ -91,7 +91,7 @@
 
       for (int i = 0; i < 3; ++i) {
         CloseableFnDataReceiver<WindowedValue<String>> consumer =
-            service.send(LogicalEndpoint.of(Integer.toString(i), PTRANSFORM_ID), CODER);
+            service.send(LogicalEndpoint.of(Integer.toString(i), TRANSFORM_ID), CODER);
 
         consumer.accept(WindowedValue.valueInGlobalWindow("A" + i));
         consumer.accept(WindowedValue.valueInGlobalWindow("B" + i));
@@ -121,7 +121,7 @@
         GrpcFnServer.allocatePortAndCreateFor(service, InProcessServerFactory.create())) {
       Collection<Future<Void>> clientFutures = new ArrayList<>();
       for (int i = 0; i < 3; ++i) {
-        final String instructionReference = Integer.toString(i);
+        final String instructionId = Integer.toString(i);
         clientFutures.add(
             executorService.submit(
                 () -> {
@@ -131,7 +131,7 @@
                   StreamObserver<Elements> outboundObserver =
                       BeamFnDataGrpc.newStub(channel)
                           .data(TestStreams.withOnNext(clientInboundElements::add).build());
-                  outboundObserver.onNext(elementsWithData(instructionReference));
+                  outboundObserver.onNext(elementsWithData(instructionId));
                   waitForInboundElements.await();
                   outboundObserver.onCompleted();
                   return null;
@@ -145,7 +145,7 @@
         serverInboundValues.add(serverInboundValue);
         readFutures.add(
             service.receive(
-                LogicalEndpoint.of(Integer.toString(i), PTRANSFORM_ID),
+                LogicalEndpoint.of(Integer.toString(i), TRANSFORM_ID),
                 CODER,
                 serverInboundValue::add));
       }
@@ -172,8 +172,8 @@
     return BeamFnApi.Elements.newBuilder()
         .addData(
             BeamFnApi.Elements.Data.newBuilder()
-                .setInstructionReference(id)
-                .setPtransformId(PTRANSFORM_ID)
+                .setInstructionId(id)
+                .setTransformId(TRANSFORM_ID)
                 .setData(
                     ByteString.copyFrom(
                             encodeToByteArray(CODER, WindowedValue.valueInGlobalWindow("A" + id)))
@@ -186,9 +186,7 @@
                                 encodeToByteArray(
                                     CODER, WindowedValue.valueInGlobalWindow("C" + id))))))
         .addData(
-            BeamFnApi.Elements.Data.newBuilder()
-                .setInstructionReference(id)
-                .setPtransformId(PTRANSFORM_ID))
+            BeamFnApi.Elements.Data.newBuilder().setInstructionId(id).setTransformId(TRANSFORM_ID))
         .build();
   }
 }
diff --git a/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/logging/GrpcLoggingServiceTest.java b/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/logging/GrpcLoggingServiceTest.java
index 6733e76..3bfda79 100644
--- a/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/logging/GrpcLoggingServiceTest.java
+++ b/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/logging/GrpcLoggingServiceTest.java
@@ -62,7 +62,7 @@
 
       Collection<Callable<Void>> tasks = new ArrayList<>();
       for (int i = 1; i <= 3; ++i) {
-        final int instructionReference = i;
+        final int instructionId = i;
         tasks.add(
             () -> {
               CountDownLatch waitForServerHangup = new CountDownLatch(1);
@@ -74,8 +74,7 @@
                           TestStreams.withOnNext(messageDiscarder)
                               .withOnCompleted(new CountDown(waitForServerHangup))
                               .build());
-              outboundObserver.onNext(
-                  createLogsWithIds(instructionReference, -instructionReference));
+              outboundObserver.onNext(createLogsWithIds(instructionId, -instructionId));
               outboundObserver.onCompleted();
               waitForServerHangup.await();
               return null;
@@ -105,7 +104,7 @@
 
       Collection<Callable<Void>> tasks = new ArrayList<>();
       for (int i = 1; i <= 3; ++i) {
-        final int instructionReference = i;
+        final int instructionId = i;
         tasks.add(
             () -> {
               CountDownLatch waitForTermination = new CountDownLatch(1);
@@ -118,9 +117,8 @@
                           TestStreams.withOnNext(messageDiscarder)
                               .withOnError(new CountDown(waitForTermination))
                               .build());
-              outboundObserver.onNext(
-                  createLogsWithIds(instructionReference, -instructionReference));
-              outboundObserver.onError(new RuntimeException("Client " + instructionReference));
+              outboundObserver.onNext(createLogsWithIds(instructionId, -instructionId));
+              outboundObserver.onError(new RuntimeException("Client " + instructionId));
               waitForTermination.await();
               return null;
             });
@@ -141,7 +139,7 @@
         GrpcFnServer.allocatePortAndCreateFor(service, InProcessServerFactory.create())) {
 
       for (int i = 1; i <= 3; ++i) {
-        final long instructionReference = i;
+        final long instructionId = i;
         futures.add(
             executorService.submit(
                 () -> {
@@ -156,7 +154,7 @@
                                 TestStreams.withOnNext(messageDiscarder)
                                     .withOnCompleted(new CountDown(waitForServerHangup))
                                     .build());
-                    outboundObserver.onNext(createLogsWithIds(instructionReference));
+                    outboundObserver.onNext(createLogsWithIds(instructionId));
                     waitForServerHangup.await();
                     return null;
                   }
@@ -181,7 +179,7 @@
   }
 
   private BeamFnApi.LogEntry createLogWithId(long id) {
-    return BeamFnApi.LogEntry.newBuilder().setInstructionReference(Long.toString(id)).build();
+    return BeamFnApi.LogEntry.newBuilder().setInstructionId(Long.toString(id)).build();
   }
 
   private static class CollectionAppendingLogWriter implements LogWriter {
diff --git a/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/state/GrpcStateServiceTest.java b/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/state/GrpcStateServiceTest.java
index 235ada7..f8b3f29 100644
--- a/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/state/GrpcStateServiceTest.java
+++ b/runners/java-fn-execution/src/test/java/org/apache/beam/runners/fnexecution/state/GrpcStateServiceTest.java
@@ -75,7 +75,7 @@
 
     // send state request
     BeamFnApi.StateRequest request =
-        BeamFnApi.StateRequest.newBuilder().setInstructionReference(bundleInstructionId).build();
+        BeamFnApi.StateRequest.newBuilder().setInstructionId(bundleInstructionId).build();
     requestObserver.onNext(request);
 
     // assert behavior
@@ -113,7 +113,7 @@
 
     // send state request
     BeamFnApi.StateRequest request =
-        BeamFnApi.StateRequest.newBuilder().setInstructionReference(bundleInstructionId).build();
+        BeamFnApi.StateRequest.newBuilder().setInstructionId(bundleInstructionId).build();
     requestObserver.onNext(request);
 
     // wait for response
diff --git a/runners/jet-experimental/build.gradle b/runners/jet/build.gradle
similarity index 98%
rename from runners/jet-experimental/build.gradle
rename to runners/jet/build.gradle
index edd6973..b97b016 100644
--- a/runners/jet-experimental/build.gradle
+++ b/runners/jet/build.gradle
@@ -19,7 +19,7 @@
 import groovy.json.JsonOutput
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.runners.jet')
 
 description = "Apache Beam :: Runners :: Hazelcast Jet"
 
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/DAGBuilder.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/DAGBuilder.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/DAGBuilder.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/DAGBuilder.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/FailedRunningPipelineResults.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/FailedRunningPipelineResults.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/FailedRunningPipelineResults.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/FailedRunningPipelineResults.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/JetGraphVisitor.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/JetGraphVisitor.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/JetGraphVisitor.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/JetGraphVisitor.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/JetPipelineOptions.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/JetPipelineOptions.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/JetPipelineOptions.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/JetPipelineOptions.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/JetPipelineResult.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/JetPipelineResult.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/JetPipelineResult.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/JetPipelineResult.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/JetRunner.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/JetRunner.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/JetRunner.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/JetRunner.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/JetRunnerRegistrar.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/JetRunnerRegistrar.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/JetRunnerRegistrar.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/JetRunnerRegistrar.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/JetTransformTranslator.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/JetTransformTranslator.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/JetTransformTranslator.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/JetTransformTranslator.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/JetTransformTranslators.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/JetTransformTranslators.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/JetTransformTranslators.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/JetTransformTranslators.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/JetTranslationContext.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/JetTranslationContext.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/JetTranslationContext.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/JetTranslationContext.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/Utils.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/Utils.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/Utils.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/Utils.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/metrics/AbstractMetric.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/metrics/AbstractMetric.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/metrics/AbstractMetric.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/metrics/AbstractMetric.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/metrics/CounterImpl.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/metrics/CounterImpl.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/metrics/CounterImpl.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/metrics/CounterImpl.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/metrics/DistributionImpl.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/metrics/DistributionImpl.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/metrics/DistributionImpl.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/metrics/DistributionImpl.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/metrics/GaugeImpl.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/metrics/GaugeImpl.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/metrics/GaugeImpl.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/metrics/GaugeImpl.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/metrics/JetMetricResults.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/metrics/JetMetricResults.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/metrics/JetMetricResults.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/metrics/JetMetricResults.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/metrics/JetMetricsContainer.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/metrics/JetMetricsContainer.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/metrics/JetMetricsContainer.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/metrics/JetMetricsContainer.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/metrics/package-info.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/metrics/package-info.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/metrics/package-info.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/metrics/package-info.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/package-info.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/package-info.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/package-info.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/package-info.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/AbstractParDoP.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/processors/AbstractParDoP.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/AbstractParDoP.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/processors/AbstractParDoP.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/AssignWindowP.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/processors/AssignWindowP.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/AssignWindowP.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/processors/AssignWindowP.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/BoundedSourceP.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/processors/BoundedSourceP.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/BoundedSourceP.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/processors/BoundedSourceP.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/FlattenP.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/processors/FlattenP.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/FlattenP.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/processors/FlattenP.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/ImpulseP.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/processors/ImpulseP.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/ImpulseP.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/processors/ImpulseP.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/ParDoP.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/processors/ParDoP.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/ParDoP.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/processors/ParDoP.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/StatefulParDoP.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/processors/StatefulParDoP.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/StatefulParDoP.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/processors/StatefulParDoP.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/UnboundedSourceP.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/processors/UnboundedSourceP.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/UnboundedSourceP.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/processors/UnboundedSourceP.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/ViewP.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/processors/ViewP.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/ViewP.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/processors/ViewP.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/WindowGroupP.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/processors/WindowGroupP.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/WindowGroupP.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/processors/WindowGroupP.java
diff --git a/runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/package-info.java b/runners/jet/src/main/java/org/apache/beam/runners/jet/processors/package-info.java
similarity index 100%
rename from runners/jet-experimental/src/main/java/org/apache/beam/runners/jet/processors/package-info.java
rename to runners/jet/src/main/java/org/apache/beam/runners/jet/processors/package-info.java
diff --git a/runners/jet-experimental/src/test/java/org/apache/beam/runners/jet/JetTestRunnerRegistrar.java b/runners/jet/src/test/java/org/apache/beam/runners/jet/JetTestRunnerRegistrar.java
similarity index 100%
rename from runners/jet-experimental/src/test/java/org/apache/beam/runners/jet/JetTestRunnerRegistrar.java
rename to runners/jet/src/test/java/org/apache/beam/runners/jet/JetTestRunnerRegistrar.java
diff --git a/runners/jet-experimental/src/test/java/org/apache/beam/runners/jet/TestJetRunner.java b/runners/jet/src/test/java/org/apache/beam/runners/jet/TestJetRunner.java
similarity index 100%
rename from runners/jet-experimental/src/test/java/org/apache/beam/runners/jet/TestJetRunner.java
rename to runners/jet/src/test/java/org/apache/beam/runners/jet/TestJetRunner.java
diff --git a/runners/jet-experimental/src/test/java/org/apache/beam/runners/jet/TestStreamP.java b/runners/jet/src/test/java/org/apache/beam/runners/jet/TestStreamP.java
similarity index 100%
rename from runners/jet-experimental/src/test/java/org/apache/beam/runners/jet/TestStreamP.java
rename to runners/jet/src/test/java/org/apache/beam/runners/jet/TestStreamP.java
diff --git a/runners/local-java/build.gradle b/runners/local-java/build.gradle
index dac870d..343327a 100644
--- a/runners/local-java/build.gradle
+++ b/runners/local-java/build.gradle
@@ -19,6 +19,7 @@
 plugins { id 'org.apache.beam.module' }
 
 applyJavaNature(
+    automaticModuleName: 'org.apache.beam.runners.local',
     archivesBaseName: 'beam-runners-local-java-core'
 )
 
diff --git a/runners/reference/java/build.gradle b/runners/reference/java/build.gradle
index 42be346..35d4dff 100644
--- a/runners/reference/java/build.gradle
+++ b/runners/reference/java/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.runners.reference')
 
 description = "Apache Beam :: Runners :: Reference :: Java"
 ext.summary = """A Java implementation of the Beam Model which utilizes the portability
diff --git a/runners/samza/build.gradle b/runners/samza/build.gradle
index f80f88e..209db64 100644
--- a/runners/samza/build.gradle
+++ b/runners/samza/build.gradle
@@ -19,7 +19,7 @@
 import groovy.json.JsonOutput
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature(exportJavadoc: false)
+applyJavaNature(exportJavadoc: false, automaticModuleName: 'org.apache.beam.runners.samza')
 
 description = "Apache Beam :: Runners :: Samza"
 
diff --git a/runners/samza/job-server/build.gradle b/runners/samza/job-server/build.gradle
index c7bea45..34c177f 100644
--- a/runners/samza/job-server/build.gradle
+++ b/runners/samza/job-server/build.gradle
@@ -24,6 +24,7 @@
 mainClassName = "org.apache.beam.runners.samza.SamzaJobServerDriver"
 
 applyJavaNature(
+    automaticModuleName: 'org.apache.beam.runners.samza.jobserver',
     validateShadowJar: false,
     exportJavadoc: false,
     shadowClosure: {
diff --git a/runners/spark/build.gradle b/runners/spark/build.gradle
index 4940ce9..9b8ff6e 100644
--- a/runners/spark/build.gradle
+++ b/runners/spark/build.gradle
@@ -19,7 +19,7 @@
 import groovy.json.JsonOutput
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.runners.spark')
 
 description = "Apache Beam :: Runners :: Spark"
 
@@ -59,6 +59,7 @@
   compile project(":runners:core-construction-java")
   compile project(":runners:core-java")
   compile project(":runners:java-fn-execution")
+  compile project(":sdks:java:extensions:google-cloud-platform-core")
   compile library.java.jackson_annotations
   compile library.java.slf4j_api
   compile library.java.joda_time
diff --git a/runners/spark/job-server/build.gradle b/runners/spark/job-server/build.gradle
index 9782973..4c7ee60 100644
--- a/runners/spark/job-server/build.gradle
+++ b/runners/spark/job-server/build.gradle
@@ -28,6 +28,7 @@
 mainClassName = "org.apache.beam.runners.spark.SparkJobServerDriver"
 
 applyJavaNature(
+  automaticModuleName: 'org.apache.beam.runners.spark.jobserver',
   validateShadowJar: false,
   exportJavadoc: false,
   shadowClosure: {
diff --git a/runners/spark/src/main/java/org/apache/beam/runners/spark/SparkJobServerDriver.java b/runners/spark/src/main/java/org/apache/beam/runners/spark/SparkJobServerDriver.java
index 0589045..f0302f1 100644
--- a/runners/spark/src/main/java/org/apache/beam/runners/spark/SparkJobServerDriver.java
+++ b/runners/spark/src/main/java/org/apache/beam/runners/spark/SparkJobServerDriver.java
@@ -20,7 +20,9 @@
 import org.apache.beam.runners.fnexecution.ServerFactory;
 import org.apache.beam.runners.fnexecution.jobsubmission.JobInvoker;
 import org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver;
+import org.apache.beam.sdk.extensions.gcp.options.GcsOptions;
 import org.apache.beam.sdk.io.FileSystems;
+import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.options.PipelineOptionsFactory;
 import org.kohsuke.args4j.CmdLineException;
 import org.kohsuke.args4j.CmdLineParser;
@@ -51,7 +53,11 @@
   }
 
   public static void main(String[] args) {
-    FileSystems.setDefaultPipelineOptions(PipelineOptionsFactory.create());
+    PipelineOptions options = PipelineOptionsFactory.create();
+    // Limit the GCS upload buffer size to reduce memory usage during parallel artifact uploads.
+    options.as(GcsOptions.class).setGcsUploadBufferSizeBytes(1024 * 1024);
+    // Register standard file systems.
+    FileSystems.setDefaultPipelineOptions(options);
     fromParams(args).run();
   }
 
diff --git a/sdks/go/pkg/beam/core/runtime/exec/datasource.go b/sdks/go/pkg/beam/core/runtime/exec/datasource.go
index e1da517..79eae96 100644
--- a/sdks/go/pkg/beam/core/runtime/exec/datasource.go
+++ b/sdks/go/pkg/beam/core/runtime/exec/datasource.go
@@ -39,8 +39,8 @@
 
 	source   DataManager
 	state    StateReader
-	count    int64
-	splitPos int64
+	index    int64
+	splitIdx int64
 	start    time.Time
 
 	mu sync.Mutex
@@ -62,8 +62,8 @@
 	n.source = data.Data
 	n.state = data.State
 	n.start = time.Now()
-	n.count = 0
-	n.splitPos = math.MaxInt64
+	n.index = -1
+	n.splitIdx = math.MaxInt64
 	n.mu.Unlock()
 	return n.Out.StartBundle(ctx, id, data)
 }
@@ -94,7 +94,7 @@
 	}
 
 	for {
-		if n.IncrementCountAndCheckSplit(ctx) {
+		if n.incrementIndexAndCheckSplit() {
 			return nil
 		}
 		ws, t, err := DecodeWindowedValueHeader(wc, r)
@@ -201,16 +201,14 @@
 	return buf, nil
 }
 
-// FinishBundle resets the source and metric counters.
+// FinishBundle resets the source.
 func (n *DataSource) FinishBundle(ctx context.Context) error {
 	n.mu.Lock()
 	defer n.mu.Unlock()
-	log.Infof(ctx, "DataSource: %d elements in %d ns", n.count, time.Now().Sub(n.start))
+	log.Infof(ctx, "DataSource: %d elements in %d ns", n.index, time.Now().Sub(n.start))
 	n.source = nil
-	err := n.Out.FinishBundle(ctx)
-	n.count = 0
-	n.splitPos = math.MaxInt64
-	return err
+	n.splitIdx = 0 // Ensure errors are returned for split requests if this plan is re-used.
+	return n.Out.FinishBundle(ctx)
 }
 
 // Down resets the source.
@@ -223,15 +221,15 @@
 	return fmt.Sprintf("DataSource[%v, %v] Coder:%v Out:%v", n.SID, n.Name, n.Coder, n.Out.ID())
 }
 
-// IncrementCountAndCheckSplit increments DataSource.count by one and checks if
+// incrementIndexAndCheckSplit increments DataSource.index by one and checks if
 // the caller should abort further element processing, and finish the bundle.
-// Returns true if the new value of count is greater than or equal to the split
-// point, and false otherwise.
-func (n *DataSource) IncrementCountAndCheckSplit(ctx context.Context) bool {
+// Returns true if the new value of index is greater than or equal to the split
+// index, and false otherwise.
+func (n *DataSource) incrementIndexAndCheckSplit() bool {
 	b := false
 	n.mu.Lock()
-	n.count++
-	if n.count >= n.splitPos {
+	n.index++
+	if n.index >= n.splitIdx {
 		b = true
 	}
 	n.mu.Unlock()
@@ -250,13 +248,19 @@
 		return ProgressReportSnapshot{}
 	}
 	n.mu.Lock()
-	c := n.count
+	// The count is the number of "completely processed elements"
+	// which matches the index of the currently processing element.
+	c := n.index
 	n.mu.Unlock()
-	return ProgressReportSnapshot{n.SID.PtransformID, n.Name, c}
+	// Do not send negative progress reports; the index is initialized to -1.
+	if c < 0 {
+		c = 0
+	}
+	return ProgressReportSnapshot{ID: n.SID.PtransformID, Name: n.Name, Count: c}
 }
 
-// Split takes a sorted set of potential split points, selects and actuates
-// split on an appropriate split point, and returns the selected split point
+// Split takes a sorted set of potential split indices, selects and actuates
+// split on an appropriate split index, and returns the selected split index
 // if successful. Returns an error when unable to split.
 func (n *DataSource) Split(splits []int64, frac float32) (int64, error) {
 	if splits == nil {
@@ -266,19 +270,19 @@
 		return 0, fmt.Errorf("failed to split at requested splits: {%v}, DataSource not initialized", splits)
 	}
 	n.mu.Lock()
-	c := n.count
+	c := n.index
 	// Find the smallest split index that we haven't yet processed, and set
-	// the promised split position to this value.
+	// the promised split index to this value.
 	for _, s := range splits {
-		if s > 0 && s >= c && s < n.splitPos  {
-			n.splitPos = s
-			fs := n.splitPos
+		if s > 0 && s >= c && s < n.splitIdx {
+			n.splitIdx = s
+			fs := n.splitIdx
 			n.mu.Unlock()
 			return fs, nil
 		}
 	}
 	n.mu.Unlock()
-	// If we can't find a suitable split point from the requested choices,
+	// If we can't find a suitable split index from the requested choices,
 	// return an error.
 	return 0, fmt.Errorf("failed to split at requested splits: {%v}, DataSource at index: %v", splits, c)
 }
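
The datasource.go change above replaces the count/splitPos pair with index/splitIdx: the index starts at -1, is incremented before each element, and processing halts once it reaches the promised split index. Below is a minimal, self-contained sketch of that gate; the names and types are illustrative stand-ins, not the actual Beam code.

```go
// gate.go: illustrative sketch (not Beam code) of the index/splitIdx gate
// used by DataSource above.
package main

import (
	"fmt"
	"math"
	"sync"
)

type gate struct {
	mu       sync.Mutex
	index    int64 // index of the element currently being processed
	splitIdx int64 // first element index that belongs to the residual
}

func newGate() *gate {
	// index starts at -1 so that the first increment lands on element 0.
	return &gate{index: -1, splitIdx: math.MaxInt64}
}

// next bumps the index and reports whether the caller should stop
// before processing the next element.
func (g *gate) next() bool {
	g.mu.Lock()
	defer g.mu.Unlock()
	g.index++
	return g.index >= g.splitIdx
}

// split promises to stop at the smallest requested index that has not
// yet been processed, mirroring DataSource.Split above.
func (g *gate) split(splits []int64) (int64, error) {
	g.mu.Lock()
	defer g.mu.Unlock()
	for _, s := range splits {
		if s > 0 && s >= g.index && s < g.splitIdx {
			g.splitIdx = s
			return s, nil
		}
	}
	return 0, fmt.Errorf("no usable split in %v", splits)
}

func main() {
	g := newGate()
	if idx, err := g.split([]int64{0, 3}); err == nil {
		fmt.Println("promised split at", idx) // 0 is rejected, 3 is accepted
	}
	for i := 0; ; i++ {
		if g.next() {
			break // index 3 belongs to the residual, stop here
		}
		fmt.Println("processed element", i) // prints 0, 1, 2
	}
}
```

With a promised split at index 3, the loop processes elements 0 through 2 and then stops, which matches the "first element of the residual" contract documented in plan.go further down.
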
diff --git a/sdks/go/pkg/beam/core/runtime/exec/datasource_test.go b/sdks/go/pkg/beam/core/runtime/exec/datasource_test.go
index 286d4e8..0fa6d23 100644
--- a/sdks/go/pkg/beam/core/runtime/exec/datasource_test.go
+++ b/sdks/go/pkg/beam/core/runtime/exec/datasource_test.go
@@ -17,6 +17,7 @@
 
 import (
 	"context"
+	"fmt"
 	"io"
 	"testing"
 
@@ -46,6 +47,7 @@
 				pw.Close()
 			},
 		},
+		// TODO: Test progress.
 	}
 	for _, test := range tests {
 		t.Run(test.name, func(t *testing.T) {
@@ -64,13 +66,7 @@
 				Data: &TestDataManager{R: pr},
 			})
 
-			expected := makeValues(test.expected...)
-			if got, want := len(out.Elements), len(expected); got != want {
-				t.Fatalf("lengths don't match: got %v, want %v", got, want)
-			}
-			if !equalList(out.Elements, expected) {
-				t.Errorf("DataSource => %#v, want %#v", extractValues(out.Elements...), extractValues(expected...))
-			}
+			validateSource(t, out, source, makeValues(test.expected...))
 		})
 	}
 }
@@ -158,7 +154,6 @@
 				dmw.Close()
 			},
 		},
-		// TODO: Test splitting.
 		// TODO: Test progress.
 	}
 	for _, test := range tests {
@@ -211,6 +206,147 @@
 	}
 }
 
+func TestDataSource_Split(t *testing.T) {
+	elements := []interface{}{int64(1), int64(2), int64(3), int64(4), int64(5)}
+	initSourceTest := func(name string) (*DataSource, *CaptureNode, io.ReadCloser) {
+		out := &CaptureNode{UID: 1}
+		c := coder.NewW(coder.NewVarInt(), coder.NewGlobalWindow())
+		source := &DataSource{
+			UID:   2,
+			SID:   StreamID{PtransformID: "myPTransform"},
+			Name:  name,
+			Coder: c,
+			Out:   out,
+		}
+		pr, pw := io.Pipe()
+
+		go func(c *coder.Coder, pw io.WriteCloser, elements []interface{}) {
+			wc := MakeWindowEncoder(c.Window)
+			ec := MakeElementEncoder(coder.SkipW(c))
+			for _, v := range elements {
+				EncodeWindowedValueHeader(wc, window.SingleGlobalWindow, mtime.ZeroTimestamp, pw)
+				ec.Encode(&FullValue{Elm: v}, pw)
+			}
+			pw.Close()
+		}(c, pw, elements)
+		return source, out, pr
+	}
+
+	tests := []struct {
+		name     string
+		expected []interface{}
+		splitIdx int64
+	}{
+		{splitIdx: 1},
+		{splitIdx: 2},
+		{splitIdx: 3},
+		{splitIdx: 4},
+		{splitIdx: 5},
+		{
+			name:     "wellBeyondRange",
+			expected: elements,
+			splitIdx: 1000,
+		},
+	}
+	for _, test := range tests {
+		test := test
+		if len(test.name) == 0 {
+			test.name = fmt.Sprintf("atIndex%d", test.splitIdx)
+		}
+		if test.expected == nil {
+			test.expected = elements[:test.splitIdx]
+		}
+		t.Run(test.name, func(t *testing.T) {
+			source, out, pr := initSourceTest(test.name)
+			p, err := NewPlan("a", []Unit{out, source})
+			if err != nil {
+				t.Fatalf("failed to construct plan: %v", err)
+			}
+			dc := DataContext{Data: &TestDataManager{R: pr}}
+			ctx := context.Background()
+
+			// StartBundle resets the source, so no splits can be actuated before then,
+			// which means we need to actuate the plan manually, and insert the split request
+			// after StartBundle.
+			for i, root := range p.units {
+				if err := root.Up(ctx); err != nil {
+					t.Fatalf("error in root[%d].Up: %v", i, err)
+				}
+			}
+			p.status = Active
+
+			runOnRoots(ctx, t, p, "StartBundle", func(root Root, ctx context.Context) error { return root.StartBundle(ctx, "1", dc) })
+
+			// The SDK never splits on 0, so check that in every test.
+			if splitIdx, err := p.Split(SplitPoints{Splits: []int64{0, test.splitIdx}}); err != nil {
+				t.Fatalf("error in Split: %v", err)
+			} else if got, want := splitIdx, test.splitIdx; got != want {
+				t.Fatalf("error in Split: got splitIdx = %v, want %v ", got, want)
+			}
+			runOnRoots(ctx, t, p, "Process", Root.Process)
+			runOnRoots(ctx, t, p, "FinishBundle", Root.FinishBundle)
+
+			validateSource(t, out, source, makeValues(test.expected...))
+		})
+	}
+
+	// This test expects splitting to fail but processing to succeed.
+	t.Run("errors", func(t *testing.T) {
+		source, out, pr := initSourceTest("noSplitsUntilStarted")
+		p, err := NewPlan("a", []Unit{out, source})
+		if err != nil {
+			t.Fatalf("failed to construct plan: %v", err)
+		}
+		dc := DataContext{Data: &TestDataManager{R: pr}}
+		ctx := context.Background()
+
+		if _, err := p.Split(SplitPoints{Splits: []int64{0, 3}, Frac: -1}); err == nil {
+			t.Fatal("plan uninitialized, expected error when splitting, got nil")
+		}
+		for i, root := range p.units {
+			if err := root.Up(ctx); err != nil {
+				t.Fatalf("error in root[%d].Up: %v", i, err)
+			}
+		}
+		p.status = Active
+		if _, err := p.Split(SplitPoints{Splits: []int64{0, 3}, Frac: -1}); err == nil {
+			t.Fatal("plan not started, expected error when splitting, got nil")
+		}
+		runOnRoots(ctx, t, p, "StartBundle", func(root Root, ctx context.Context) error { return root.StartBundle(ctx, "1", dc) })
+		if _, err := p.Split(SplitPoints{Splits: []int64{0}, Frac: -1}); err == nil {
+			t.Fatal("plan started, expected error when splitting, got nil")
+		}
+		runOnRoots(ctx, t, p, "Process", Root.Process)
+		if _, err := p.Split(SplitPoints{Splits: []int64{0}, Frac: -1}); err == nil {
+			t.Fatal("plan in progress, expected error when unable to get a desired split, got nil")
+		}
+		runOnRoots(ctx, t, p, "FinishBundle", Root.FinishBundle)
+		if _, err := p.Split(SplitPoints{Splits: []int64{0}, Frac: -1}); err == nil {
+			t.Fatal("plan finished, expected error when splitting, got nil")
+		}
+		validateSource(t, out, source, makeValues(elements...))
+	})
+
+	t.Run("sanity_errors", func(t *testing.T) {
+		var source *DataSource
+		if _, err := source.Split([]int64{0}, -1); err == nil {
+			t.Fatal("expected error splitting nil *DataSource")
+		}
+		if _, err := source.Split(nil, -1); err == nil {
+			t.Fatal("expected error splitting nil desired splits")
+		}
+	})
+}
+
+func runOnRoots(ctx context.Context, t *testing.T, p *Plan, name string, mthd func(Root, context.Context) error) {
+	t.Helper()
+	for i, root := range p.roots {
+		if err := mthd(root, ctx); err != nil {
+			t.Fatalf("error in root[%d].%s: %v", i, name, err)
+		}
+	}
+}
+
 type TestDataManager struct {
 	R io.ReadCloser
 }
@@ -246,3 +382,16 @@
 		t.Fatalf("down failed: %v", err)
 	}
 }
+
+func validateSource(t *testing.T, out *CaptureNode, source *DataSource, expected []FullValue) {
+	t.Helper()
+	if got, want := len(out.Elements), len(expected); got != want {
+		t.Fatalf("lengths don't match: got %v, want %v", got, want)
+	}
+	if got, want := source.Progress().Count, int64(len(expected)); got != want {
+		t.Fatalf("progress count didn't match: got %v, want %v", got, want)
+	}
+	if !equalList(out.Elements, expected) {
+		t.Errorf("DataSource => %#v, want %#v", extractValues(out.Elements...), extractValues(expected...))
+	}
+}
diff --git a/sdks/go/pkg/beam/core/runtime/exec/plan.go b/sdks/go/pkg/beam/core/runtime/exec/plan.go
index 46c4e72..d221c7e 100644
--- a/sdks/go/pkg/beam/core/runtime/exec/plan.go
+++ b/sdks/go/pkg/beam/core/runtime/exec/plan.go
@@ -199,12 +199,14 @@
 
 // SplitPoints captures the split requested by the Runner.
 type SplitPoints struct {
+	// Splits is a list of desired split indices.
 	Splits []int64
 	Frac   float32
 }
 
-// Split takes a set of potential split points, selects and actuates split on an
-// appropriate split point, and returns the selected split point if successful.
+// Split takes a set of potential split indices, and if successful returns
+// the split index of the first element of the residual, at which processing
+// will be halted.
 // Returns an error when unable to split.
 func (p *Plan) Split(s SplitPoints) (int64, error) {
 	if p.source != nil {
diff --git a/sdks/go/pkg/beam/core/runtime/exec/translate.go b/sdks/go/pkg/beam/core/runtime/exec/translate.go
index ee2efc4..e6caafc 100644
--- a/sdks/go/pkg/beam/core/runtime/exec/translate.go
+++ b/sdks/go/pkg/beam/core/runtime/exec/translate.go
@@ -492,7 +492,7 @@
 
 	case graphx.URNFlatten:
 		u = &Flatten{UID: b.idgen.New(), N: len(transform.Inputs), Out: out[0]}
-	
+
 		// Use the same flatten instance for all the inputs links to this transform.
 		for i := 0; i < len(transform.Inputs); i++ {
 			b.links[linkID{id.to, i}] = u
diff --git a/sdks/go/pkg/beam/core/runtime/harness/datamgr.go b/sdks/go/pkg/beam/core/runtime/harness/datamgr.go
index 70ba226..453cf9f 100644
--- a/sdks/go/pkg/beam/core/runtime/harness/datamgr.go
+++ b/sdks/go/pkg/beam/core/runtime/harness/datamgr.go
@@ -198,7 +198,7 @@
 		// to reduce lock contention.
 
 		for _, elm := range msg.GetData() {
-			id := clientID{ptransformID: elm.PtransformId, instID: elm.GetInstructionReference()}
+			id := clientID{ptransformID: elm.TransformId, instID: elm.GetInstructionId()}
 
 			// log.Printf("Chan read (%v): %v\n", sid, elm.GetData())
 
@@ -333,8 +333,8 @@
 	msg := &pb.Elements{
 		Data: []*pb.Elements_Data{
 			{
-				InstructionReference: w.id.instID,
-				PtransformId:         w.id.ptransformID,
+				InstructionId: w.id.instID,
+				TransformId:   w.id.ptransformID,
 				// Empty data == sentinel
 			},
 		},
@@ -357,9 +357,9 @@
 	msg := &pb.Elements{
 		Data: []*pb.Elements_Data{
 			{
-				InstructionReference: w.id.instID,
-				PtransformId:         w.id.ptransformID,
-				Data:                 w.buf,
+				InstructionId: w.id.instID,
+				TransformId:   w.id.ptransformID,
+				Data:          w.buf,
 			},
 		},
 	}
diff --git a/sdks/go/pkg/beam/core/runtime/harness/datamgr_test.go b/sdks/go/pkg/beam/core/runtime/harness/datamgr_test.go
index 7d80978..1bbf22e 100644
--- a/sdks/go/pkg/beam/core/runtime/harness/datamgr_test.go
+++ b/sdks/go/pkg/beam/core/runtime/harness/datamgr_test.go
@@ -35,9 +35,9 @@
 	f.calls++
 	data := []byte{1, 2, 3, 4}
 	elemData := pb.Elements_Data{
-		InstructionReference: "inst_ref",
-		Data:                 data,
-		PtransformId:         "ptr",
+		InstructionId: "inst_ref",
+		Data:          data,
+		TransformId:   "ptr",
 	}
 
 	msg := pb.Elements{}
diff --git a/sdks/go/pkg/beam/core/runtime/harness/harness.go b/sdks/go/pkg/beam/core/runtime/harness/harness.go
index 642583d..dcc7922 100644
--- a/sdks/go/pkg/beam/core/runtime/harness/harness.go
+++ b/sdks/go/pkg/beam/core/runtime/harness/harness.go
@@ -180,7 +180,7 @@
 
 		log.Debugf(ctx, "PB: %v", msg)
 
-		ref := msg.GetProcessBundleDescriptorReference()
+		ref := msg.GetProcessBundleDescriptorId()
 		c.mu.Lock()
 		plan, ok := c.plans[ref]
 		// Make the plan active, and remove it from candidates
@@ -224,7 +224,7 @@
 
 		// log.Debugf(ctx, "PB Progress: %v", msg)
 
-		ref := msg.GetInstructionReference()
+		ref := msg.GetInstructionId()
 		c.mu.Lock()
 		plan, ok := c.active[ref]
 		c.mu.Unlock()
@@ -247,7 +247,7 @@
 		msg := req.GetProcessBundleSplit()
 
 		log.Debugf(ctx, "PB Split: %v", msg)
-		ref := msg.GetInstructionReference()
+		ref := msg.GetInstructionId()
 		c.mu.Lock()
 		plan, ok := c.active[ref]
 		c.mu.Unlock()
diff --git a/sdks/go/pkg/beam/core/runtime/harness/logging.go b/sdks/go/pkg/beam/core/runtime/harness/logging.go
index ad24148..63afe7c 100644
--- a/sdks/go/pkg/beam/core/runtime/harness/logging.go
+++ b/sdks/go/pkg/beam/core/runtime/harness/logging.go
@@ -29,7 +29,7 @@
 )
 
 // TODO(herohde) 10/12/2017: make this file a separate package. Then
-// populate InstructionReference and PrimitiveTransformReference properly.
+// populate InstructionId and TransformId properly.
 
 // TODO(herohde) 10/13/2017: add top-level harness.Main panic handler that flushes logs.
 // Also make logger flush on Fatal severity messages.
@@ -65,7 +65,7 @@
 		entry.LogLocation = fmt.Sprintf("%v:%v", file, line)
 	}
 	if id, ok := tryGetInstID(ctx); ok {
-		entry.InstructionReference = id
+		entry.InstructionId = id
 	}
 
 	select {
diff --git a/sdks/go/pkg/beam/core/runtime/harness/statemgr.go b/sdks/go/pkg/beam/core/runtime/harness/statemgr.go
index e030b3e..ff20ff9 100644
--- a/sdks/go/pkg/beam/core/runtime/harness/statemgr.go
+++ b/sdks/go/pkg/beam/core/runtime/harness/statemgr.go
@@ -119,10 +119,10 @@
 	key := &pb.StateKey{
 		Type: &pb.StateKey_MultimapSideInput_{
 			MultimapSideInput: &pb.StateKey_MultimapSideInput{
-				PtransformId: id.PtransformID,
-				SideInputId:  sideInputID,
-				Window:       w,
-				Key:          k,
+				TransformId: id.PtransformID,
+				SideInputId: sideInputID,
+				Window:      w,
+				Key:         k,
 			},
 		},
 	}
@@ -166,8 +166,8 @@
 
 		req := &pb.StateRequest{
 			// Id: set by channel
-			InstructionReference: r.instID,
-			StateKey:             r.key,
+			InstructionId: r.instID,
+			StateKey:      r.key,
 			Request: &pb.StateRequest_Get{
 				Get: &pb.StateGetRequest{
 					ContinuationToken: r.token,
diff --git a/sdks/go/pkg/beam/core/typex/fulltype.go b/sdks/go/pkg/beam/core/typex/fulltype.go
index a96e6f9..011c91d 100644
--- a/sdks/go/pkg/beam/core/typex/fulltype.go
+++ b/sdks/go/pkg/beam/core/typex/fulltype.go
@@ -180,6 +180,14 @@
 	return t
 }
 
+// SkipK skips the key in a KV layer, if present. Otherwise, it returns the input unchanged.
+func SkipK(t FullType) FullType {
+	if t.Type() == KVType {
+		return t.Components()[1]
+	}
+	return t
+}
+
 // IsKV returns true iff the type is a KV.
 func IsKV(t FullType) bool {
 	return t.Type() == KVType
diff --git a/sdks/go/pkg/beam/doc_test.go b/sdks/go/pkg/beam/doc_test.go
index 645926f..92a2b03 100644
--- a/sdks/go/pkg/beam/doc_test.go
+++ b/sdks/go/pkg/beam/doc_test.go
@@ -128,9 +128,9 @@
 	a := textio.Read(s, "...some file path...") // PCollection<string>
 
 	beam.Seq(s, a,
-		strconv.Atoi,                              // string to int
+		strconv.Atoi, // string to int
 		func(i int) float64 { return float64(i) }, // int to float64
-		math.Signbit,                              // float64 to bool
+		math.Signbit, // float64 to bool
 	) // PCollection<bool>
 }
 
diff --git a/sdks/go/pkg/beam/model/fnexecution_v1/beam_fn_api.pb.go b/sdks/go/pkg/beam/model/fnexecution_v1/beam_fn_api.pb.go
index ad8aaa7..55de402 100644
--- a/sdks/go/pkg/beam/model/fnexecution_v1/beam_fn_api.pb.go
+++ b/sdks/go/pkg/beam/model/fnexecution_v1/beam_fn_api.pb.go
@@ -74,7 +74,7 @@
 	return proto.EnumName(LogEntry_Severity_Enum_name, int32(x))
 }
 func (LogEntry_Severity_Enum) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{27, 1, 0}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{27, 1, 0}
 }
 
 // A descriptor for connecting to a remote port using the Beam Fn Data API.
@@ -97,7 +97,7 @@
 func (m *RemoteGrpcPort) String() string { return proto.CompactTextString(m) }
 func (*RemoteGrpcPort) ProtoMessage()    {}
 func (*RemoteGrpcPort) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{0}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{0}
 }
 func (m *RemoteGrpcPort) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_RemoteGrpcPort.Unmarshal(m, b)
@@ -157,7 +157,7 @@
 func (m *InstructionRequest) String() string { return proto.CompactTextString(m) }
 func (*InstructionRequest) ProtoMessage()    {}
 func (*InstructionRequest) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{1}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{1}
 }
 func (m *InstructionRequest) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_InstructionRequest.Unmarshal(m, b)
@@ -413,7 +413,7 @@
 func (m *InstructionResponse) String() string { return proto.CompactTextString(m) }
 func (*InstructionResponse) ProtoMessage()    {}
 func (*InstructionResponse) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{2}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{2}
 }
 func (m *InstructionResponse) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_InstructionResponse.Unmarshal(m, b)
@@ -661,7 +661,7 @@
 func (m *RegisterRequest) String() string { return proto.CompactTextString(m) }
 func (*RegisterRequest) ProtoMessage()    {}
 func (*RegisterRequest) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{3}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{3}
 }
 func (m *RegisterRequest) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_RegisterRequest.Unmarshal(m, b)
@@ -699,7 +699,7 @@
 func (m *RegisterResponse) String() string { return proto.CompactTextString(m) }
 func (*RegisterResponse) ProtoMessage()    {}
 func (*RegisterResponse) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{4}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{4}
 }
 func (m *RegisterResponse) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_RegisterResponse.Unmarshal(m, b)
@@ -747,7 +747,7 @@
 func (m *ProcessBundleDescriptor) String() string { return proto.CompactTextString(m) }
 func (*ProcessBundleDescriptor) ProtoMessage()    {}
 func (*ProcessBundleDescriptor) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{5}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{5}
 }
 func (m *ProcessBundleDescriptor) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ProcessBundleDescriptor.Unmarshal(m, b)
@@ -821,8 +821,8 @@
 // https://docs.google.com/document/d/1tUDb45sStdR8u7-jBkGdw3OGFK7aa2-V7eo86zYSE_4/edit#heading=h.9g3g5weg2u9
 // for further details.
 type BundleApplication struct {
-	// (Required) The primitive transform to which to pass the element
-	PtransformId string `protobuf:"bytes,1,opt,name=ptransform_id,json=ptransformId,proto3" json:"ptransform_id,omitempty"`
+	// (Required) The transform to which to pass the element
+	TransformId string `protobuf:"bytes,1,opt,name=transform_id,json=transformId,proto3" json:"transform_id,omitempty"`
 	// (Required) Name of the transform's input to which to pass the element.
 	InputId string `protobuf:"bytes,2,opt,name=input_id,json=inputId,proto3" json:"input_id,omitempty"`
 	// (Required) The encoded element to pass to the transform.
@@ -838,14 +838,11 @@
 	// (Required) Whether this application potentially produces an unbounded
 	// amount of data. Note that this should only be set to BOUNDED if and
 	// only if the application is known to produce a finite amount of output.
-	//
-	// Note that this is different from the backlog as the backlog represents
-	// how much work there is currently outstanding.
 	IsBounded pipeline_v1.IsBounded_Enum `protobuf:"varint,5,opt,name=is_bounded,json=isBounded,proto3,enum=org.apache.beam.model.pipeline.v1.IsBounded_Enum" json:"is_bounded,omitempty"`
 	// Contains additional monitoring information related to this application.
 	//
 	// Each application is able to report information that some runners
-	// will use consume when providing a UI or for making scaling and performance
+	// will use when providing a UI or for making scaling and performance
 	// decisions. See https://s.apache.org/beam-bundles-backlog-splitting for
 	// details about what types of signals may be useful to report.
 	MonitoringInfos      []*pipeline_v1.MonitoringInfo `protobuf:"bytes,6,rep,name=monitoring_infos,json=monitoringInfos,proto3" json:"monitoring_infos,omitempty"`
@@ -858,7 +855,7 @@
 func (m *BundleApplication) String() string { return proto.CompactTextString(m) }
 func (*BundleApplication) ProtoMessage()    {}
 func (*BundleApplication) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{6}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{6}
 }
 func (m *BundleApplication) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_BundleApplication.Unmarshal(m, b)
@@ -878,9 +875,9 @@
 
 var xxx_messageInfo_BundleApplication proto.InternalMessageInfo
 
-func (m *BundleApplication) GetPtransformId() string {
+func (m *BundleApplication) GetTransformId() string {
 	if m != nil {
-		return m.PtransformId
+		return m.TransformId
 	}
 	return ""
 }
@@ -936,7 +933,7 @@
 func (m *DelayedBundleApplication) String() string { return proto.CompactTextString(m) }
 func (*DelayedBundleApplication) ProtoMessage()    {}
 func (*DelayedBundleApplication) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{7}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{7}
 }
 func (m *DelayedBundleApplication) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_DelayedBundleApplication.Unmarshal(m, b)
@@ -975,7 +972,7 @@
 type ProcessBundleRequest struct {
 	// (Required) A reference to the process bundle descriptor that must be
 	// instantiated and executed by the SDK harness.
-	ProcessBundleDescriptorReference string `protobuf:"bytes,1,opt,name=process_bundle_descriptor_reference,json=processBundleDescriptorReference,proto3" json:"process_bundle_descriptor_reference,omitempty"`
+	ProcessBundleDescriptorId string `protobuf:"bytes,1,opt,name=process_bundle_descriptor_id,json=processBundleDescriptorId,proto3" json:"process_bundle_descriptor_id,omitempty"`
 	// (Optional) A list of cache tokens that can be used by an SDK to reuse
 	// cached data returned by the State API across multiple bundles.
 	CacheTokens          []*ProcessBundleRequest_CacheToken `protobuf:"bytes,2,rep,name=cache_tokens,json=cacheTokens,proto3" json:"cache_tokens,omitempty"`
@@ -988,7 +985,7 @@
 func (m *ProcessBundleRequest) String() string { return proto.CompactTextString(m) }
 func (*ProcessBundleRequest) ProtoMessage()    {}
 func (*ProcessBundleRequest) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{8}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{8}
 }
 func (m *ProcessBundleRequest) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ProcessBundleRequest.Unmarshal(m, b)
@@ -1008,9 +1005,9 @@
 
 var xxx_messageInfo_ProcessBundleRequest proto.InternalMessageInfo
 
-func (m *ProcessBundleRequest) GetProcessBundleDescriptorReference() string {
+func (m *ProcessBundleRequest) GetProcessBundleDescriptorId() string {
 	if m != nil {
-		return m.ProcessBundleDescriptorReference
+		return m.ProcessBundleDescriptorId
 	}
 	return ""
 }
@@ -1042,7 +1039,7 @@
 func (m *ProcessBundleRequest_CacheToken) String() string { return proto.CompactTextString(m) }
 func (*ProcessBundleRequest_CacheToken) ProtoMessage()    {}
 func (*ProcessBundleRequest_CacheToken) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{8, 0}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{8, 0}
 }
 func (m *ProcessBundleRequest_CacheToken) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ProcessBundleRequest_CacheToken.Unmarshal(m, b)
@@ -1191,7 +1188,7 @@
 func (m *ProcessBundleRequest_CacheToken_UserState) String() string { return proto.CompactTextString(m) }
 func (*ProcessBundleRequest_CacheToken_UserState) ProtoMessage()    {}
 func (*ProcessBundleRequest_CacheToken_UserState) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{8, 0, 0}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{8, 0, 0}
 }
 func (m *ProcessBundleRequest_CacheToken_UserState) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ProcessBundleRequest_CacheToken_UserState.Unmarshal(m, b)
@@ -1226,7 +1223,7 @@
 func (m *ProcessBundleRequest_CacheToken_SideInput) String() string { return proto.CompactTextString(m) }
 func (*ProcessBundleRequest_CacheToken_SideInput) ProtoMessage()    {}
 func (*ProcessBundleRequest_CacheToken_SideInput) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{8, 0, 1}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{8, 0, 1}
 }
 func (m *ProcessBundleRequest_CacheToken_SideInput) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ProcessBundleRequest_CacheToken_SideInput.Unmarshal(m, b)
@@ -1280,7 +1277,7 @@
 func (m *ProcessBundleResponse) String() string { return proto.CompactTextString(m) }
 func (*ProcessBundleResponse) ProtoMessage()    {}
 func (*ProcessBundleResponse) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{9}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{9}
 }
 func (m *ProcessBundleResponse) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ProcessBundleResponse.Unmarshal(m, b)
@@ -1334,7 +1331,7 @@
 type ProcessBundleProgressRequest struct {
 	// (Required) A reference to an active process bundle request with the given
 	// instruction id.
-	InstructionReference string   `protobuf:"bytes,1,opt,name=instruction_reference,json=instructionReference,proto3" json:"instruction_reference,omitempty"`
+	InstructionId        string   `protobuf:"bytes,1,opt,name=instruction_id,json=instructionId,proto3" json:"instruction_id,omitempty"`
 	XXX_NoUnkeyedLiteral struct{} `json:"-"`
 	XXX_unrecognized     []byte   `json:"-"`
 	XXX_sizecache        int32    `json:"-"`
@@ -1344,7 +1341,7 @@
 func (m *ProcessBundleProgressRequest) String() string { return proto.CompactTextString(m) }
 func (*ProcessBundleProgressRequest) ProtoMessage()    {}
 func (*ProcessBundleProgressRequest) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{10}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{10}
 }
 func (m *ProcessBundleProgressRequest) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ProcessBundleProgressRequest.Unmarshal(m, b)
@@ -1364,9 +1361,9 @@
 
 var xxx_messageInfo_ProcessBundleProgressRequest proto.InternalMessageInfo
 
-func (m *ProcessBundleProgressRequest) GetInstructionReference() string {
+func (m *ProcessBundleProgressRequest) GetInstructionId() string {
 	if m != nil {
-		return m.InstructionReference
+		return m.InstructionId
 	}
 	return ""
 }
@@ -1383,7 +1380,7 @@
 func (m *Metrics) String() string { return proto.CompactTextString(m) }
 func (*Metrics) ProtoMessage()    {}
 func (*Metrics) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{11}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{11}
 }
 func (m *Metrics) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Metrics.Unmarshal(m, b)
@@ -1414,7 +1411,7 @@
 // These metrics are split into processed and active element groups for
 // progress reporting purposes. This allows a Runner to see what is measured,
 // what is estimated and what can be extrapolated to be able to accurately
-// estimate the backlog of remaining work.
+// estimate the amount of remaining work.
 type Metrics_PTransform struct {
 	// (Required): Metrics for processed elements.
 	ProcessedElements *Metrics_PTransform_ProcessedElements `protobuf:"bytes,1,opt,name=processed_elements,json=processedElements,proto3" json:"processed_elements,omitempty"`
@@ -1436,7 +1433,7 @@
 func (m *Metrics_PTransform) String() string { return proto.CompactTextString(m) }
 func (*Metrics_PTransform) ProtoMessage()    {}
 func (*Metrics_PTransform) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{11, 0}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{11, 0}
 }
 func (m *Metrics_PTransform) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Metrics_PTransform.Unmarshal(m, b)
@@ -1506,7 +1503,7 @@
 func (m *Metrics_PTransform_Measured) String() string { return proto.CompactTextString(m) }
 func (*Metrics_PTransform_Measured) ProtoMessage()    {}
 func (*Metrics_PTransform_Measured) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{11, 0, 0}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{11, 0, 0}
 }
 func (m *Metrics_PTransform_Measured) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Metrics_PTransform_Measured.Unmarshal(m, b)
@@ -1560,7 +1557,7 @@
 func (m *Metrics_PTransform_ProcessedElements) String() string { return proto.CompactTextString(m) }
 func (*Metrics_PTransform_ProcessedElements) ProtoMessage()    {}
 func (*Metrics_PTransform_ProcessedElements) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{11, 0, 1}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{11, 0, 1}
 }
 func (m *Metrics_PTransform_ProcessedElements) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Metrics_PTransform_ProcessedElements.Unmarshal(m, b)
@@ -1614,7 +1611,7 @@
 func (m *Metrics_PTransform_ActiveElements) String() string { return proto.CompactTextString(m) }
 func (*Metrics_PTransform_ActiveElements) ProtoMessage()    {}
 func (*Metrics_PTransform_ActiveElements) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{11, 0, 2}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{11, 0, 2}
 }
 func (m *Metrics_PTransform_ActiveElements) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Metrics_PTransform_ActiveElements.Unmarshal(m, b)
@@ -1675,7 +1672,7 @@
 func (m *Metrics_User) String() string { return proto.CompactTextString(m) }
 func (*Metrics_User) ProtoMessage()    {}
 func (*Metrics_User) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{11, 1}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{11, 1}
 }
 func (m *Metrics_User) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Metrics_User.Unmarshal(m, b)
@@ -1856,7 +1853,7 @@
 func (m *Metrics_User_MetricName) String() string { return proto.CompactTextString(m) }
 func (*Metrics_User_MetricName) ProtoMessage()    {}
 func (*Metrics_User_MetricName) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{11, 1, 0}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{11, 1, 0}
 }
 func (m *Metrics_User_MetricName) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Metrics_User_MetricName.Unmarshal(m, b)
@@ -1902,7 +1899,7 @@
 func (m *Metrics_User_CounterData) String() string { return proto.CompactTextString(m) }
 func (*Metrics_User_CounterData) ProtoMessage()    {}
 func (*Metrics_User_CounterData) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{11, 1, 1}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{11, 1, 1}
 }
 func (m *Metrics_User_CounterData) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Metrics_User_CounterData.Unmarshal(m, b)
@@ -1944,7 +1941,7 @@
 func (m *Metrics_User_DistributionData) String() string { return proto.CompactTextString(m) }
 func (*Metrics_User_DistributionData) ProtoMessage()    {}
 func (*Metrics_User_DistributionData) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{11, 1, 2}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{11, 1, 2}
 }
 func (m *Metrics_User_DistributionData) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Metrics_User_DistributionData.Unmarshal(m, b)
@@ -2005,7 +2002,7 @@
 func (m *Metrics_User_GaugeData) String() string { return proto.CompactTextString(m) }
 func (*Metrics_User_GaugeData) ProtoMessage()    {}
 func (*Metrics_User_GaugeData) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{11, 1, 3}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{11, 1, 3}
 }
 func (m *Metrics_User_GaugeData) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Metrics_User_GaugeData.Unmarshal(m, b)
@@ -2057,7 +2054,7 @@
 func (m *ProcessBundleProgressResponse) String() string { return proto.CompactTextString(m) }
 func (*ProcessBundleProgressResponse) ProtoMessage()    {}
 func (*ProcessBundleProgressResponse) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{12}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{12}
 }
 func (m *ProcessBundleProgressResponse) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ProcessBundleProgressResponse.Unmarshal(m, b)
@@ -2102,19 +2099,7 @@
 type ProcessBundleSplitRequest struct {
 	// (Required) A reference to an active process bundle request with the given
 	// instruction id.
-	InstructionReference string `protobuf:"bytes,1,opt,name=instruction_reference,json=instructionReference,proto3" json:"instruction_reference,omitempty"`
-	// (Required) Specifies that the Runner would like the bundle to split itself
-	// such that it performs no more work than the backlog specified for each
-	// PTransform. The interpretation of how much work should be processed is up
-	// to the PTransform.
-	//
-	// For example, A backlog of "" tells the SDK to perform as little work as
-	// possible, effectively checkpointing when able. The remaining backlog
-	// will be relative to the backlog reported during processing.
-	//
-	// If the backlog is unspecified for a PTransform, the runner would like
-	// the SDK to process all data received for that PTransform.
-	BacklogRemaining map[string][]byte `protobuf:"bytes,2,rep,name=backlog_remaining,json=backlogRemaining,proto3" json:"backlog_remaining,omitempty" protobuf_key:"bytes,1,opt,name=key,proto3" protobuf_val:"bytes,2,opt,name=value,proto3"`
+	InstructionId string `protobuf:"bytes,1,opt,name=instruction_id,json=instructionId,proto3" json:"instruction_id,omitempty"`
 	// (Required) Specifies the desired split for each transform.
 	//
 	// Currently only splits at GRPC read operations are supported.
@@ -2130,7 +2115,7 @@
 func (m *ProcessBundleSplitRequest) String() string { return proto.CompactTextString(m) }
 func (*ProcessBundleSplitRequest) ProtoMessage()    {}
 func (*ProcessBundleSplitRequest) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{13}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{13}
 }
 func (m *ProcessBundleSplitRequest) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ProcessBundleSplitRequest.Unmarshal(m, b)
@@ -2150,20 +2135,13 @@
 
 var xxx_messageInfo_ProcessBundleSplitRequest proto.InternalMessageInfo
 
-func (m *ProcessBundleSplitRequest) GetInstructionReference() string {
+func (m *ProcessBundleSplitRequest) GetInstructionId() string {
 	if m != nil {
-		return m.InstructionReference
+		return m.InstructionId
 	}
 	return ""
 }
 
-func (m *ProcessBundleSplitRequest) GetBacklogRemaining() map[string][]byte {
-	if m != nil {
-		return m.BacklogRemaining
-	}
-	return nil
-}
-
 func (m *ProcessBundleSplitRequest) GetDesiredSplits() map[string]*ProcessBundleSplitRequest_DesiredSplit {
 	if m != nil {
 		return m.DesiredSplits
@@ -2197,7 +2175,7 @@
 func (m *ProcessBundleSplitRequest_DesiredSplit) String() string { return proto.CompactTextString(m) }
 func (*ProcessBundleSplitRequest_DesiredSplit) ProtoMessage()    {}
 func (*ProcessBundleSplitRequest_DesiredSplit) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{13, 1}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{13, 0}
 }
 func (m *ProcessBundleSplitRequest_DesiredSplit) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ProcessBundleSplitRequest_DesiredSplit.Unmarshal(m, b)
@@ -2267,7 +2245,7 @@
 func (m *ProcessBundleSplitResponse) String() string { return proto.CompactTextString(m) }
 func (*ProcessBundleSplitResponse) ProtoMessage()    {}
 func (*ProcessBundleSplitResponse) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{14}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{14}
 }
 func (m *ProcessBundleSplitResponse) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ProcessBundleSplitResponse.Unmarshal(m, b)
@@ -2318,7 +2296,7 @@
 // as some range in an underlying dataset).
 type ProcessBundleSplitResponse_ChannelSplit struct {
 	// (Required) The grpc read transform reading this channel.
-	PtransformId string `protobuf:"bytes,1,opt,name=ptransform_id,json=ptransformId,proto3" json:"ptransform_id,omitempty"`
+	TransformId string `protobuf:"bytes,1,opt,name=transform_id,json=transformId,proto3" json:"transform_id,omitempty"`
 	// The last element of the input channel that should be entirely considered
 	// part of the primary, identified by its absolute index in the (ordered)
 	// channel.
@@ -2338,7 +2316,7 @@
 func (m *ProcessBundleSplitResponse_ChannelSplit) String() string { return proto.CompactTextString(m) }
 func (*ProcessBundleSplitResponse_ChannelSplit) ProtoMessage()    {}
 func (*ProcessBundleSplitResponse_ChannelSplit) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{14, 0}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{14, 0}
 }
 func (m *ProcessBundleSplitResponse_ChannelSplit) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ProcessBundleSplitResponse_ChannelSplit.Unmarshal(m, b)
@@ -2358,9 +2336,9 @@
 
 var xxx_messageInfo_ProcessBundleSplitResponse_ChannelSplit proto.InternalMessageInfo
 
-func (m *ProcessBundleSplitResponse_ChannelSplit) GetPtransformId() string {
+func (m *ProcessBundleSplitResponse_ChannelSplit) GetTransformId() string {
 	if m != nil {
-		return m.PtransformId
+		return m.TransformId
 	}
 	return ""
 }
@@ -2382,7 +2360,7 @@
 type FinalizeBundleRequest struct {
 	// (Required) A reference to a completed process bundle request with the given
 	// instruction id.
-	InstructionReference string   `protobuf:"bytes,1,opt,name=instruction_reference,json=instructionReference,proto3" json:"instruction_reference,omitempty"`
+	InstructionId        string   `protobuf:"bytes,1,opt,name=instruction_id,json=instructionId,proto3" json:"instruction_id,omitempty"`
 	XXX_NoUnkeyedLiteral struct{} `json:"-"`
 	XXX_unrecognized     []byte   `json:"-"`
 	XXX_sizecache        int32    `json:"-"`
@@ -2392,7 +2370,7 @@
 func (m *FinalizeBundleRequest) String() string { return proto.CompactTextString(m) }
 func (*FinalizeBundleRequest) ProtoMessage()    {}
 func (*FinalizeBundleRequest) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{15}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{15}
 }
 func (m *FinalizeBundleRequest) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_FinalizeBundleRequest.Unmarshal(m, b)
@@ -2412,9 +2390,9 @@
 
 var xxx_messageInfo_FinalizeBundleRequest proto.InternalMessageInfo
 
-func (m *FinalizeBundleRequest) GetInstructionReference() string {
+func (m *FinalizeBundleRequest) GetInstructionId() string {
 	if m != nil {
-		return m.InstructionReference
+		return m.InstructionId
 	}
 	return ""
 }
@@ -2429,7 +2407,7 @@
 func (m *FinalizeBundleResponse) String() string { return proto.CompactTextString(m) }
 func (*FinalizeBundleResponse) ProtoMessage()    {}
 func (*FinalizeBundleResponse) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{16}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{16}
 }
 func (m *FinalizeBundleResponse) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_FinalizeBundleResponse.Unmarshal(m, b)
@@ -2463,7 +2441,7 @@
 func (m *Elements) String() string { return proto.CompactTextString(m) }
 func (*Elements) ProtoMessage()    {}
 func (*Elements) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{17}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{17}
 }
 func (m *Elements) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Elements.Unmarshal(m, b)
@@ -2495,7 +2473,7 @@
 type Elements_Data struct {
 	// (Required) A reference to an active instruction request with the given
 	// instruction id.
-	InstructionReference string `protobuf:"bytes,1,opt,name=instruction_reference,json=instructionReference,proto3" json:"instruction_reference,omitempty"`
+	InstructionId string `protobuf:"bytes,1,opt,name=instruction_id,json=instructionId,proto3" json:"instruction_id,omitempty"`
 	// (Required) A definition representing a consumer or producer of this data.
 	// If received by a harness, this represents the consumer within that
 	// harness that should consume these bytes. If sent by a harness, this
@@ -2504,7 +2482,7 @@
 	// Note that a single element may span multiple Data messages.
 	//
 	// Note that a sending/receiving pair should share the same identifier.
-	PtransformId string `protobuf:"bytes,2,opt,name=ptransform_id,json=ptransformId,proto3" json:"ptransform_id,omitempty"`
+	TransformId string `protobuf:"bytes,2,opt,name=transform_id,json=transformId,proto3" json:"transform_id,omitempty"`
 	// (Optional) Represents a part of a logical byte stream. Elements within
 	// the logical byte stream are encoded in the nested context and
 	// concatenated together.
@@ -2521,7 +2499,7 @@
 func (m *Elements_Data) String() string { return proto.CompactTextString(m) }
 func (*Elements_Data) ProtoMessage()    {}
 func (*Elements_Data) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{17, 0}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{17, 0}
 }
 func (m *Elements_Data) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Elements_Data.Unmarshal(m, b)
@@ -2541,16 +2519,16 @@
 
 var xxx_messageInfo_Elements_Data proto.InternalMessageInfo
 
-func (m *Elements_Data) GetInstructionReference() string {
+func (m *Elements_Data) GetInstructionId() string {
 	if m != nil {
-		return m.InstructionReference
+		return m.InstructionId
 	}
 	return ""
 }
 
-func (m *Elements_Data) GetPtransformId() string {
+func (m *Elements_Data) GetTransformId() string {
 	if m != nil {
-		return m.PtransformId
+		return m.TransformId
 	}
 	return ""
 }
@@ -2570,7 +2548,7 @@
 	// (Required) The associated instruction id of the work that is currently
 	// being processed. This allows for the runner to associate any modifications
 	// to state to be committed with the appropriate work execution.
-	InstructionReference string `protobuf:"bytes,2,opt,name=instruction_reference,json=instructionReference,proto3" json:"instruction_reference,omitempty"`
+	InstructionId string `protobuf:"bytes,2,opt,name=instruction_id,json=instructionId,proto3" json:"instruction_id,omitempty"`
 	// (Required) The state key this request is for.
 	StateKey *StateKey `protobuf:"bytes,3,opt,name=state_key,json=stateKey,proto3" json:"state_key,omitempty"`
 	// (Required) The action to take on this request.
@@ -2589,7 +2567,7 @@
 func (m *StateRequest) String() string { return proto.CompactTextString(m) }
 func (*StateRequest) ProtoMessage()    {}
 func (*StateRequest) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{18}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{18}
 }
 func (m *StateRequest) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StateRequest.Unmarshal(m, b)
@@ -2641,9 +2619,9 @@
 	return ""
 }
 
-func (m *StateRequest) GetInstructionReference() string {
+func (m *StateRequest) GetInstructionId() string {
 	if m != nil {
-		return m.InstructionReference
+		return m.InstructionId
 	}
 	return ""
 }
@@ -2794,7 +2772,7 @@
 func (m *StateResponse) String() string { return proto.CompactTextString(m) }
 func (*StateResponse) ProtoMessage()    {}
 func (*StateResponse) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{19}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{19}
 }
 func (m *StateResponse) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StateResponse.Unmarshal(m, b)
@@ -2984,7 +2962,7 @@
 func (m *StateKey) String() string { return proto.CompactTextString(m) }
 func (*StateKey) ProtoMessage()    {}
 func (*StateKey) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{20}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{20}
 }
 func (m *StateKey) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StateKey.Unmarshal(m, b)
@@ -3161,7 +3139,7 @@
 func (m *StateKey_Runner) String() string { return proto.CompactTextString(m) }
 func (*StateKey_Runner) ProtoMessage()    {}
 func (*StateKey_Runner) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{20, 0}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{20, 0}
 }
 func (m *StateKey_Runner) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StateKey_Runner.Unmarshal(m, b)
@@ -3190,7 +3168,7 @@
 
 type StateKey_MultimapSideInput struct {
 	// (Required) The id of the PTransform containing a side input.
-	PtransformId string `protobuf:"bytes,1,opt,name=ptransform_id,json=ptransformId,proto3" json:"ptransform_id,omitempty"`
+	TransformId string `protobuf:"bytes,1,opt,name=transform_id,json=transformId,proto3" json:"transform_id,omitempty"`
 	// (Required) The id of the side input.
 	SideInputId string `protobuf:"bytes,2,opt,name=side_input_id,json=sideInputId,proto3" json:"side_input_id,omitempty"`
 	// (Required) The window (after mapping the currently executing elements
@@ -3207,7 +3185,7 @@
 func (m *StateKey_MultimapSideInput) String() string { return proto.CompactTextString(m) }
 func (*StateKey_MultimapSideInput) ProtoMessage()    {}
 func (*StateKey_MultimapSideInput) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{20, 1}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{20, 1}
 }
 func (m *StateKey_MultimapSideInput) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StateKey_MultimapSideInput.Unmarshal(m, b)
@@ -3227,9 +3205,9 @@
 
 var xxx_messageInfo_StateKey_MultimapSideInput proto.InternalMessageInfo
 
-func (m *StateKey_MultimapSideInput) GetPtransformId() string {
+func (m *StateKey_MultimapSideInput) GetTransformId() string {
 	if m != nil {
-		return m.PtransformId
+		return m.TransformId
 	}
 	return ""
 }
@@ -3257,7 +3235,7 @@
 
 type StateKey_BagUserState struct {
 	// (Required) The id of the PTransform containing user state.
-	PtransformId string `protobuf:"bytes,1,opt,name=ptransform_id,json=ptransformId,proto3" json:"ptransform_id,omitempty"`
+	TransformId string `protobuf:"bytes,1,opt,name=transform_id,json=transformId,proto3" json:"transform_id,omitempty"`
 	// (Required) The id of the user state.
 	UserStateId string `protobuf:"bytes,2,opt,name=user_state_id,json=userStateId,proto3" json:"user_state_id,omitempty"`
 	// (Required) The window encoded in a nested context.
@@ -3274,7 +3252,7 @@
 func (m *StateKey_BagUserState) String() string { return proto.CompactTextString(m) }
 func (*StateKey_BagUserState) ProtoMessage()    {}
 func (*StateKey_BagUserState) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{20, 2}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{20, 2}
 }
 func (m *StateKey_BagUserState) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StateKey_BagUserState.Unmarshal(m, b)
@@ -3294,9 +3272,9 @@
 
 var xxx_messageInfo_StateKey_BagUserState proto.InternalMessageInfo
 
-func (m *StateKey_BagUserState) GetPtransformId() string {
+func (m *StateKey_BagUserState) GetTransformId() string {
 	if m != nil {
-		return m.PtransformId
+		return m.TransformId
 	}
 	return ""
 }
@@ -3339,7 +3317,7 @@
 func (m *StateGetRequest) String() string { return proto.CompactTextString(m) }
 func (*StateGetRequest) ProtoMessage()    {}
 func (*StateGetRequest) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{21}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{21}
 }
 func (m *StateGetRequest) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StateGetRequest.Unmarshal(m, b)
@@ -3386,7 +3364,7 @@
 func (m *StateGetResponse) String() string { return proto.CompactTextString(m) }
 func (*StateGetResponse) ProtoMessage()    {}
 func (*StateGetResponse) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{22}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{22}
 }
 func (m *StateGetResponse) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StateGetResponse.Unmarshal(m, b)
@@ -3435,7 +3413,7 @@
 func (m *StateAppendRequest) String() string { return proto.CompactTextString(m) }
 func (*StateAppendRequest) ProtoMessage()    {}
 func (*StateAppendRequest) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{23}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{23}
 }
 func (m *StateAppendRequest) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StateAppendRequest.Unmarshal(m, b)
@@ -3473,7 +3451,7 @@
 func (m *StateAppendResponse) String() string { return proto.CompactTextString(m) }
 func (*StateAppendResponse) ProtoMessage()    {}
 func (*StateAppendResponse) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{24}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{24}
 }
 func (m *StateAppendResponse) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StateAppendResponse.Unmarshal(m, b)
@@ -3504,7 +3482,7 @@
 func (m *StateClearRequest) String() string { return proto.CompactTextString(m) }
 func (*StateClearRequest) ProtoMessage()    {}
 func (*StateClearRequest) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{25}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{25}
 }
 func (m *StateClearRequest) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StateClearRequest.Unmarshal(m, b)
@@ -3535,7 +3513,7 @@
 func (m *StateClearResponse) String() string { return proto.CompactTextString(m) }
 func (*StateClearResponse) ProtoMessage()    {}
 func (*StateClearResponse) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{26}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{26}
 }
 func (m *StateClearResponse) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StateClearResponse.Unmarshal(m, b)
@@ -3568,10 +3546,10 @@
 	Trace string `protobuf:"bytes,4,opt,name=trace,proto3" json:"trace,omitempty"`
 	// (Optional) A reference to the instruction this log statement is associated
 	// with.
-	InstructionReference string `protobuf:"bytes,5,opt,name=instruction_reference,json=instructionReference,proto3" json:"instruction_reference,omitempty"`
-	// (Optional) A reference to the primitive transform this log statement is
+	InstructionId string `protobuf:"bytes,5,opt,name=instruction_id,json=instructionId,proto3" json:"instruction_id,omitempty"`
+	// (Optional) A reference to the transform this log statement is
 	// associated with.
-	PrimitiveTransformReference string `protobuf:"bytes,6,opt,name=primitive_transform_reference,json=primitiveTransformReference,proto3" json:"primitive_transform_reference,omitempty"`
+	TransformId string `protobuf:"bytes,6,opt,name=transform_id,json=transformId,proto3" json:"transform_id,omitempty"`
 	// (Optional) Human-readable name of the function or method being invoked,
 	// with optional context such as the class or package name. The format can
 	// vary by language. For example:
@@ -3591,7 +3569,7 @@
 func (m *LogEntry) String() string { return proto.CompactTextString(m) }
 func (*LogEntry) ProtoMessage()    {}
 func (*LogEntry) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{27}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{27}
 }
 func (m *LogEntry) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_LogEntry.Unmarshal(m, b)
@@ -3639,16 +3617,16 @@
 	return ""
 }
 
-func (m *LogEntry) GetInstructionReference() string {
+func (m *LogEntry) GetInstructionId() string {
 	if m != nil {
-		return m.InstructionReference
+		return m.InstructionId
 	}
 	return ""
 }
 
-func (m *LogEntry) GetPrimitiveTransformReference() string {
+func (m *LogEntry) GetTransformId() string {
 	if m != nil {
-		return m.PrimitiveTransformReference
+		return m.TransformId
 	}
 	return ""
 }
@@ -3681,7 +3659,7 @@
 func (m *LogEntry_List) String() string { return proto.CompactTextString(m) }
 func (*LogEntry_List) ProtoMessage()    {}
 func (*LogEntry_List) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{27, 0}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{27, 0}
 }
 func (m *LogEntry_List) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_LogEntry_List.Unmarshal(m, b)
@@ -3731,7 +3709,7 @@
 func (m *LogEntry_Severity) String() string { return proto.CompactTextString(m) }
 func (*LogEntry_Severity) ProtoMessage()    {}
 func (*LogEntry_Severity) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{27, 1}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{27, 1}
 }
 func (m *LogEntry_Severity) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_LogEntry_Severity.Unmarshal(m, b)
@@ -3761,7 +3739,7 @@
 func (m *LogControl) String() string { return proto.CompactTextString(m) }
 func (*LogControl) ProtoMessage()    {}
 func (*LogControl) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{28}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{28}
 }
 func (m *LogControl) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_LogControl.Unmarshal(m, b)
@@ -3797,7 +3775,7 @@
 func (m *StartWorkerRequest) String() string { return proto.CompactTextString(m) }
 func (*StartWorkerRequest) ProtoMessage()    {}
 func (*StartWorkerRequest) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{29}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{29}
 }
 func (m *StartWorkerRequest) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StartWorkerRequest.Unmarshal(m, b)
@@ -3870,7 +3848,7 @@
 func (m *StartWorkerResponse) String() string { return proto.CompactTextString(m) }
 func (*StartWorkerResponse) ProtoMessage()    {}
 func (*StartWorkerResponse) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{30}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{30}
 }
 func (m *StartWorkerResponse) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StartWorkerResponse.Unmarshal(m, b)
@@ -3908,7 +3886,7 @@
 func (m *StopWorkerRequest) String() string { return proto.CompactTextString(m) }
 func (*StopWorkerRequest) ProtoMessage()    {}
 func (*StopWorkerRequest) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{31}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{31}
 }
 func (m *StopWorkerRequest) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StopWorkerRequest.Unmarshal(m, b)
@@ -3946,7 +3924,7 @@
 func (m *StopWorkerResponse) String() string { return proto.CompactTextString(m) }
 func (*StopWorkerResponse) ProtoMessage()    {}
 func (*StopWorkerResponse) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_fn_api_fa77b71575f0478b, []int{32}
+	return fileDescriptor_beam_fn_api_d24d1635dfa071c8, []int{32}
 }
 func (m *StopWorkerResponse) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StopWorkerResponse.Unmarshal(m, b)
@@ -4011,7 +3989,6 @@
 	proto.RegisterType((*Metrics_User_GaugeData)(nil), "org.apache.beam.model.fn_execution.v1.Metrics.User.GaugeData")
 	proto.RegisterType((*ProcessBundleProgressResponse)(nil), "org.apache.beam.model.fn_execution.v1.ProcessBundleProgressResponse")
 	proto.RegisterType((*ProcessBundleSplitRequest)(nil), "org.apache.beam.model.fn_execution.v1.ProcessBundleSplitRequest")
-	proto.RegisterMapType((map[string][]byte)(nil), "org.apache.beam.model.fn_execution.v1.ProcessBundleSplitRequest.BacklogRemainingEntry")
 	proto.RegisterMapType((map[string]*ProcessBundleSplitRequest_DesiredSplit)(nil), "org.apache.beam.model.fn_execution.v1.ProcessBundleSplitRequest.DesiredSplitsEntry")
 	proto.RegisterType((*ProcessBundleSplitRequest_DesiredSplit)(nil), "org.apache.beam.model.fn_execution.v1.ProcessBundleSplitRequest.DesiredSplit")
 	proto.RegisterType((*ProcessBundleSplitResponse)(nil), "org.apache.beam.model.fn_execution.v1.ProcessBundleSplitResponse")
@@ -4549,210 +4526,205 @@
 	Metadata: "beam_fn_api.proto",
 }
 
-func init() { proto.RegisterFile("beam_fn_api.proto", fileDescriptor_beam_fn_api_fa77b71575f0478b) }
+func init() { proto.RegisterFile("beam_fn_api.proto", fileDescriptor_beam_fn_api_d24d1635dfa071c8) }
 
-var fileDescriptor_beam_fn_api_fa77b71575f0478b = []byte{
-	// 3219 bytes of a gzipped FileDescriptorProto
-	0x1f, 0x8b, 0x08, 0x00, 0x00, 0x00, 0x00, 0x00, 0x02, 0xff, 0xbc, 0x5a, 0xdb, 0x6f, 0x1b, 0xc7,
-	0xd5, 0xf7, 0xf2, 0x22, 0x91, 0x87, 0x94, 0x44, 0x8e, 0x24, 0x9b, 0xde, 0x38, 0xdf, 0xe7, 0x8f,
-	0xf9, 0x02, 0x08, 0x29, 0x42, 0x5f, 0x91, 0xd8, 0x69, 0xe2, 0x44, 0xa2, 0x68, 0x9b, 0x89, 0x6c,
-	0xb3, 0x2b, 0x39, 0x6e, 0x93, 0x26, 0x8b, 0x15, 0x77, 0x48, 0x0f, 0xbc, 0xdc, 0xdd, 0xcc, 0x2e,
-	0x65, 0xc9, 0x0d, 0x5a, 0xb4, 0x01, 0x52, 0xb4, 0x68, 0x91, 0xd7, 0xa0, 0xed, 0x4b, 0x5b, 0xa0,
-	0x40, 0x5f, 0xfa, 0x07, 0xf4, 0x3f, 0x68, 0x51, 0xa0, 0xe8, 0x6b, 0x91, 0x97, 0x02, 0x2d, 0x90,
-	0x36, 0xfd, 0x03, 0x0a, 0xf4, 0xa5, 0x98, 0xcb, 0x5e, 0xb8, 0x24, 0x65, 0x5e, 0x94, 0xbe, 0xed,
-	0xcc, 0xec, 0xf9, 0xfd, 0xce, 0x9e, 0x39, 0x73, 0xe6, 0x9c, 0x99, 0x85, 0xf2, 0x3e, 0x36, 0x7a,
-	0x7a, 0xc7, 0xd6, 0x0d, 0x97, 0xd4, 0x5c, 0xea, 0xf8, 0x0e, 0x7a, 0xde, 0xa1, 0xdd, 0x9a, 0xe1,
-	0x1a, 0xed, 0x87, 0xb8, 0xc6, 0x46, 0x6b, 0x3d, 0xc7, 0xc4, 0x56, 0xad, 0x63, 0xeb, 0xf8, 0x10,
-	0xb7, 0xfb, 0x3e, 0x71, 0xec, 0xda, 0xc1, 0x25, 0x75, 0x9d, 0x4b, 0xd2, 0xbe, 0x6d, 0x63, 0x1a,
-	0x49, 0xab, 0x2b, 0xd8, 0x36, 0x5d, 0x87, 0xd8, 0xbe, 0x27, 0x3b, 0xce, 0x77, 0x1d, 0xa7, 0x6b,
-	0xe1, 0x0b, 0xbc, 0xb5, 0xdf, 0xef, 0x5c, 0x30, 0xb1, 0xd7, 0xa6, 0xc4, 0xf5, 0x1d, 0x2a, 0xdf,
-	0xf8, 0xdf, 0xe4, 0x1b, 0x3e, 0xe9, 0x61, 0xcf, 0x37, 0x7a, 0xae, 0x7c, 0xe1, 0x7f, 0x92, 0x2f,
-	0x3c, 0xa6, 0x86, 0xeb, 0x62, 0x1a, 0x50, 0x2c, 0xf5, 0xb0, 0x4f, 0x49, 0x5b, 0x36, 0xab, 0x3f,
-	0x51, 0x60, 0x59, 0xc3, 0x3d, 0xc7, 0xc7, 0xb7, 0xa8, 0xdb, 0x6e, 0x39, 0xd4, 0x47, 0x3d, 0x38,
-	0x6d, 0xb8, 0x44, 0xf7, 0x30, 0x3d, 0x20, 0x6d, 0xac, 0x47, 0x2a, 0x54, 0x94, 0xf3, 0xca, 0x46,
-	0xe1, 0xf2, 0xcb, 0xb5, 0xd1, 0x1f, 0xed, 0x12, 0x17, 0x5b, 0xc4, 0xc6, 0xb5, 0x83, 0x4b, 0xb5,
-	0x4d, 0x97, 0xec, 0x0a, 0xf9, 0xed, 0x50, 0x5c, 0x5b, 0x33, 0x46, 0xf4, 0xa2, 0xb3, 0x90, 0x6b,
-	0x3b, 0x26, 0xa6, 0x3a, 0x31, 0x2b, 0xa9, 0xf3, 0xca, 0x46, 0x5e, 0x5b, 0xe4, 0xed, 0xa6, 0x59,
-	0xfd, 0x5b, 0x06, 0x50, 0xd3, 0xf6, 0x7c, 0xda, 0x6f, 0x33, 0x4b, 0x6a, 0xf8, 0x83, 0x3e, 0xf6,
-	0x7c, 0xf4, 0x3c, 0x2c, 0x93, 0xa8, 0x97, 0xc9, 0x29, 0x5c, 0x6e, 0x29, 0xd6, 0xdb, 0x34, 0xd1,
-	0x7d, 0xc8, 0x51, 0xdc, 0x25, 0x9e, 0x8f, 0x69, 0xe5, 0xf3, 0x45, 0xae, 0xfa, 0x4b, 0xb5, 0x89,
-	0xe6, 0xab, 0xa6, 0x49, 0x39, 0xc9, 0x78, 0xfb, 0x94, 0x16, 0x42, 0x21, 0x0c, 0xcb, 0x2e, 0x75,
-	0xda, 0xd8, 0xf3, 0xf4, 0xfd, 0xbe, 0x6d, 0x5a, 0xb8, 0xf2, 0x77, 0x01, 0xfe, 0xd5, 0x09, 0xc1,
-	0x5b, 0x42, 0x7a, 0x8b, 0x0b, 0x47, 0x0c, 0x4b, 0x6e, 0xbc, 0x1f, 0x7d, 0x1b, 0xce, 0x0c, 0xd2,
-	0xe8, 0x2e, 0x75, 0xba, 0x14, 0x7b, 0x5e, 0xe5, 0x1f, 0x82, 0xaf, 0x3e, 0x0b, 0x5f, 0x4b, 0x82,
-	0x44, 0xbc, 0xeb, 0xee, 0xa8, 0x71, 0xd4, 0x87, 0xb5, 0x04, 0xbf, 0xe7, 0x5a, 0xc4, 0xaf, 0x7c,
-	0x21, 0xc8, 0xdf, 0x98, 0x85, 0x7c, 0x97, 0x21, 0x44, 0xcc, 0xc8, 0x1d, 0x1a, 0x44, 0x0f, 0x61,
-	0xa5, 0x43, 0x6c, 0xc3, 0x22, 0x4f, 0x70, 0x60, 0xde, 0x7f, 0x0a, 0xc6, 0x57, 0x27, 0x64, 0xbc,
-	0x29, 0xc5, 0x93, 0xf6, 0x5d, 0xee, 0x0c, 0x0c, 0x6c, 0xe5, 0x61, 0x91, 0x8a, 0xc1, 0xea, 0xf7,
-	0xb2, 0xb0, 0x3a, 0xe0, 0x67, 0x9e, 0xeb, 0xd8, 0x1e, 0x9e, 0xd4, 0xd1, 0xd6, 0x20, 0x8b, 0x29,
-	0x75, 0xa8, 0x74, 0x5f, 0xd1, 0x40, 0x6f, 0x0f, 0xbb, 0xdf, 0xcb, 0x53, 0xbb, 0x9f, 0x50, 0x64,
-	0xc0, 0xff, 0x3a, 0xe3, 0xfc, 0xef, 0xd5, 0xd9, 0xfc, 0x2f, 0xa4, 0x48, 0x38, 0xe0, 0x77, 0x9e,
-	0xea, 0x80, 0xdb, 0xf3, 0x39, 0x60, 0x48, 0x3c, 0xc6, 0x03, 0x0f, 0x8e, 0xf7, 0xc0, 0xcd, 0x39,
-	0x3c, 0x30, 0xa4, 0x1e, 0xe5, 0x82, 0x64, 0xac, 0x0b, 0xbe, 0x36, 0xa3, 0x0b, 0x86, 0x74, 0x49,
-	0x1f, 0x04, 0xe6, 0x23, 0x62, 0xb4, 0xfa, 0x63, 0x05, 0x56, 0x12, 0x71, 0x07, 0x3d, 0x81, 0xb3,
-	0x09, 0x13, 0x0c, 0x44, 0xe3, 0xf4, 0x46, 0xe1, 0xf2, 0x8d, 0x59, 0xcc, 0x10, 0x0b, 0xca, 0x67,
-	0xdc, 0xd1, 0x03, 0x55, 0x04, 0xa5, 0xa4, 0x1f, 0x56, 0x7f, 0x09, 0x70, 0x66, 0x0c, 0x10, 0x5a,
-	0x86, 0x54, 0xb8, 0x40, 0x52, 0xc4, 0x44, 0x36, 0x80, 0x4f, 0x0d, 0xdb, 0xeb, 0x38, 0xb4, 0xe7,
-	0x55, 0x52, 0x5c, 0xd9, 0xbb, 0xf3, 0x29, 0x5b, 0xdb, 0x0b, 0x01, 0x1b, 0xb6, 0x4f, 0x8f, 0xb4,
-	0x18, 0x03, 0xf2, 0xa1, 0xe8, 0xb6, 0x1d, 0xcb, 0xc2, 0x7c, 0x59, 0x7a, 0x95, 0x34, 0x67, 0x6c,
-	0xcd, 0xc9, 0xd8, 0x8a, 0x41, 0x0a, 0xce, 0x01, 0x16, 0xf4, 0x43, 0x05, 0xd6, 0x1e, 0x13, 0xdb,
-	0x74, 0x1e, 0x13, 0xbb, 0xab, 0x7b, 0x3e, 0x35, 0x7c, 0xdc, 0x25, 0xd8, 0xab, 0x64, 0x38, 0xfd,
-	0x83, 0x39, 0xe9, 0x1f, 0x04, 0xd0, 0xbb, 0x21, 0xb2, 0xd0, 0x62, 0xf5, 0xf1, 0xf0, 0x08, 0xda,
-	0x87, 0x05, 0xbe, 0x75, 0x7a, 0x95, 0x2c, 0x67, 0x7f, 0x73, 0x4e, 0xf6, 0x3a, 0x07, 0x13, 0x84,
-	0x12, 0x99, 0x99, 0x19, 0xdb, 0x07, 0x84, 0x3a, 0x76, 0x0f, 0xdb, 0xbe, 0x57, 0x59, 0x38, 0x11,
-	0x33, 0x37, 0x62, 0x90, 0xd2, 0xcc, 0x71, 0x16, 0x74, 0x08, 0xe7, 0x3c, 0xdf, 0xf0, 0xb1, 0x3e,
-	0x26, 0x33, 0x59, 0x9c, 0x2f, 0x33, 0x39, 0xcb, 0xc1, 0x47, 0x0d, 0xa9, 0x16, 0xac, 0x24, 0xbc,
-	0x0e, 0x95, 0x20, 0xfd, 0x08, 0x1f, 0x49, 0x57, 0x67, 0x8f, 0xa8, 0x0e, 0xd9, 0x03, 0xc3, 0xea,
-	0x63, 0xbe, 0x03, 0x14, 0x2e, 0xbf, 0x38, 0x81, 0x1e, 0xad, 0x10, 0x55, 0x13, 0xb2, 0xaf, 0xa4,
-	0xae, 0x29, 0xaa, 0x03, 0xe5, 0x21, 0x8f, 0x1b, 0xc1, 0xb7, 0x3d, 0xc8, 0x57, 0x9b, 0x84, 0xaf,
-	0x1e, 0xc2, 0xc6, 0x09, 0x3f, 0x84, 0xca, 0x38, 0x1f, 0x1b, 0xc1, 0xfb, 0xe6, 0x20, 0xef, 0xd5,
-	0x09, 0x78, 0x93, 0xe8, 0x47, 0x71, 0xf6, 0x36, 0x14, 0x62, 0x3e, 0x36, 0x82, 0xf0, 0xc6, 0x20,
-	0xe1, 0xc6, 0x04, 0x84, 0x1c, 0x30, 0x61, 0xd3, 0x21, 0xf7, 0x3a, 0x19, 0x9b, 0xc6, 0x60, 0x63,
-	0x84, 0xd5, 0x7f, 0xa7, 0xa1, 0x2c, 0x3c, 0x7c, 0xd3, 0x75, 0x2d, 0xd2, 0x36, 0x98, 0xd1, 0xd1,
-	0x73, 0xb0, 0xe4, 0x86, 0xe1, 0x2a, 0xca, 0x25, 0x8a, 0x51, 0x67, 0xd3, 0x64, 0xc9, 0x30, 0xb1,
-	0xdd, 0xbe, 0x1f, 0x4b, 0x86, 0x79, 0xbb, 0x69, 0xa2, 0x0a, 0x2c, 0x62, 0x0b, 0x33, 0xae, 0x4a,
-	0xfa, 0xbc, 0xb2, 0x51, 0xd4, 0x82, 0x26, 0xfa, 0x16, 0x94, 0x9d, 0xbe, 0xcf, 0xa4, 0x1e, 0x1b,
-	0x3e, 0xa6, 0x3d, 0x83, 0x3e, 0x0a, 0xe2, 0xcf, 0xa4, 0x01, 0x77, 0x48, 0xdd, 0xda, 0x3d, 0x8e,
-	0xf8, 0x20, 0x04, 0x14, 0xab, 0xb2, 0xe4, 0x24, 0xba, 0x51, 0x0b, 0x80, 0x78, 0xfa, 0xbe, 0xd3,
-	0xb7, 0x4d, 0x6c, 0x56, 0xb2, 0xe7, 0x95, 0x8d, 0xe5, 0xcb, 0x97, 0x26, 0xb0, 0x5d, 0xd3, 0xdb,
-	0x12, 0x32, 0xb5, 0x86, 0xdd, 0xef, 0x69, 0x79, 0x12, 0xb4, 0xd1, 0x37, 0xa1, 0xd4, 0x73, 0x6c,
-	0xe2, 0x3b, 0x94, 0x85, 0x54, 0x62, 0x77, 0x9c, 0x20, 0xca, 0x4c, 0x82, 0x7b, 0x27, 0x14, 0x6d,
-	0xda, 0x1d, 0x47, 0x5b, 0xe9, 0x0d, 0xb4, 0x3d, 0x55, 0x87, 0xf5, 0x91, 0x9f, 0x36, 0xc2, 0x23,
-	0x2e, 0x0e, 0x7a, 0x84, 0x5a, 0x13, 0xa5, 0x55, 0x2d, 0x28, 0xad, 0x6a, 0x7b, 0x41, 0xed, 0x15,
-	0x9f, 0xfd, 0x3f, 0x28, 0x50, 0xd9, 0xc6, 0x96, 0x71, 0x84, 0xcd, 0x61, 0x27, 0xd8, 0x83, 0x8a,
-	0x4c, 0x3a, 0xb1, 0x19, 0xcd, 0x80, 0xce, 0x8a, 0x38, 0x59, 0x5d, 0x1d, 0xc7, 0x72, 0x3a, 0x94,
-	0x6d, 0x04, 0xa2, 0x6c, 0x10, 0xbd, 0x03, 0x05, 0x23, 0x22, 0x91, 0xea, 0x5e, 0x9b, 0x75, 0xea,
-	0xb5, 0x38, 0x58, 0xf5, 0x67, 0x19, 0x58, 0x1b, 0x55, 0xb1, 0xa0, 0x3b, 0xf0, 0xdc, 0xd8, 0xdc,
-	0x44, 0xa7, 0xb8, 0x83, 0x29, 0xb6, 0xdb, 0x58, 0xda, 0xf3, 0xfc, 0x98, 0x2c, 0x43, 0x0b, 0xde,
-	0x43, 0x04, 0x8a, 0x6d, 0xa6, 0xaa, 0xee, 0x3b, 0x8f, 0xb0, 0x1d, 0x24, 0x0c, 0x37, 0xe7, 0xa8,
-	0xa9, 0x6a, 0x75, 0x26, 0xb5, 0xc7, 0xe0, 0xb4, 0x42, 0x3b, 0x7c, 0xf6, 0xd4, 0xdf, 0xa5, 0x00,
-	0xa2, 0x31, 0xf4, 0x01, 0x40, 0xdf, 0xc3, 0x54, 0xe7, 0x7b, 0x80, 0x9c, 0x85, 0xd6, 0xc9, 0xf0,
-	0xd6, 0xee, 0x7b, 0x98, 0xee, 0x32, 0xdc, 0xdb, 0xa7, 0xb4, 0x7c, 0x3f, 0x68, 0x30, 0x4a, 0x8f,
-	0x98, 0x58, 0xe7, 0x6b, 0x5b, 0xce, 0xd7, 0x49, 0x51, 0xee, 0x12, 0x13, 0x37, 0x19, 0x2e, 0xa3,
-	0xf4, 0x82, 0x06, 0x2b, 0x52, 0xb8, 0x65, 0x2b, 0xc0, 0x83, 0x87, 0x68, 0xa8, 0x05, 0xc8, 0x87,
-	0x2a, 0xaa, 0x2f, 0x40, 0x3e, 0x14, 0x46, 0xcf, 0x0e, 0xa8, 0x28, 0x66, 0x31, 0x82, 0xdb, 0x5a,
-	0x80, 0x8c, 0x7f, 0xe4, 0xe2, 0xea, 0x67, 0x29, 0x58, 0x1f, 0x59, 0x50, 0xa0, 0xdb, 0xb0, 0x28,
-	0x8f, 0x1a, 0xa4, 0x4d, 0x6b, 0x13, 0x7e, 0xe0, 0x1d, 0x21, 0xa5, 0x05, 0xe2, 0xac, 0xe2, 0xa1,
-	0xd8, 0x23, 0x66, 0xdf, 0xb0, 0x74, 0xea, 0x38, 0x7e, 0xe0, 0x1c, 0xaf, 0x4f, 0x08, 0x38, 0x6e,
-	0x35, 0x6a, 0x4b, 0x01, 0xac, 0xc6, 0x50, 0x47, 0x06, 0x9e, 0xf4, 0x49, 0x05, 0x1e, 0x74, 0x05,
-	0xd6, 0xd9, 0xf2, 0x25, 0x14, 0x7b, 0xba, 0x2c, 0x03, 0xc4, 0x72, 0xcd, 0x9c, 0x57, 0x36, 0x72,
-	0xda, 0x5a, 0x30, 0x78, 0x33, 0x36, 0x56, 0xdd, 0x85, 0x73, 0xc7, 0x95, 0xef, 0x0c, 0x34, 0x5e,
-	0xa1, 0x26, 0x97, 0xdd, 0x1a, 0x89, 0x57, 0xb5, 0x72, 0xac, 0xfa, 0xe9, 0x2a, 0x2c, 0x4a, 0x23,
-	0x23, 0x03, 0x0a, 0x6e, 0x2c, 0x4d, 0x57, 0xa6, 0x32, 0xac, 0x04, 0xa9, 0xb5, 0xfc, 0x44, 0x5e,
-	0x1e, 0xc7, 0x54, 0x3f, 0x2b, 0x00, 0x44, 0xd9, 0x0e, 0x7a, 0x02, 0x41, 0xd1, 0xc5, 0x42, 0xa0,
-	0xd8, 0xc2, 0x02, 0x17, 0x79, 0x6b, 0x5a, 0xe2, 0x10, 0x36, 0x58, 0x16, 0xd8, 0x6c, 0x48, 0x48,
-	0xad, 0xec, 0x26, 0xbb, 0xd0, 0x07, 0xb0, 0x62, 0xb4, 0x7d, 0x72, 0x80, 0x23, 0x62, 0xb1, 0xf8,
-	0x6e, 0xcf, 0x4e, 0xbc, 0xc9, 0x01, 0x43, 0xd6, 0x65, 0x63, 0xa0, 0x8d, 0x08, 0x40, 0x6c, 0x57,
-	0x16, 0xee, 0xd4, 0x9c, 0x9d, 0x2d, 0xb9, 0x21, 0xc7, 0xc0, 0xd1, 0x2d, 0xc8, 0xb0, 0x10, 0x23,
-	0xb7, 0xfe, 0x2b, 0x53, 0x92, 0xb0, 0x38, 0xa0, 0x71, 0x00, 0xf5, 0xaf, 0x69, 0xc8, 0xdd, 0xc1,
-	0x86, 0xd7, 0xa7, 0xd8, 0x44, 0x3f, 0x52, 0x60, 0x4d, 0xe4, 0x24, 0xd2, 0x66, 0x7a, 0xdb, 0xe9,
-	0x8b, 0x29, 0x63, 0x34, 0xef, 0xcc, 0xfe, 0x2d, 0x01, 0x45, 0x8d, 0x87, 0x14, 0x69, 0xb1, 0x3a,
-	0x07, 0x17, 0x1f, 0x87, 0xc8, 0xd0, 0x00, 0xfa, 0x44, 0x81, 0x75, 0x99, 0xed, 0x24, 0xf4, 0x11,
-	0x41, 0xe1, 0xdd, 0x13, 0xd0, 0x47, 0x24, 0x08, 0x23, 0x14, 0x5a, 0x75, 0x86, 0x47, 0xd0, 0x06,
-	0x94, 0x7c, 0xc7, 0x37, 0x2c, 0xbe, 0x8b, 0xeb, 0x9e, 0x1b, 0x64, 0x68, 0x8a, 0xb6, 0xcc, 0xfb,
-	0xd9, 0x16, 0xbd, 0xcb, 0x7a, 0xd5, 0x06, 0x9c, 0x19, 0xf3, 0xa9, 0x23, 0xb2, 0x8f, 0xb5, 0x78,
-	0xf6, 0x91, 0x8e, 0x27, 0xb4, 0x37, 0xa1, 0x32, 0x4e, 0xc3, 0xa9, 0x70, 0x3c, 0x28, 0x0f, 0xad,
-	0x1a, 0xf4, 0x3e, 0xe4, 0x7a, 0xd2, 0x0e, 0x72, 0x51, 0x6e, 0xcd, 0x6f, 0x51, 0x2d, 0xc4, 0x54,
-	0x3f, 0x49, 0xc3, 0xf2, 0xe0, 0x92, 0xf9, 0xb2, 0x29, 0xd1, 0x8b, 0x80, 0x3a, 0xd4, 0x08, 0x22,
-	0x64, 0xcf, 0x20, 0x36, 0xb1, 0xbb, 0xdc, 0x1c, 0x8a, 0x56, 0x0e, 0x46, 0xb4, 0x60, 0x00, 0xfd,
-	0x5c, 0x81, 0xb3, 0x83, 0x1e, 0xe6, 0xc5, 0xc4, 0xc4, 0x0a, 0xc6, 0x27, 0x15, 0x2f, 0x06, 0x7d,
-	0xcd, 0x0b, 0xb5, 0x10, 0xfe, 0x76, 0xc6, 0x19, 0x3d, 0xaa, 0xbe, 0x09, 0xe7, 0x8e, 0x13, 0x9c,
-	0xca, 0x0d, 0x5e, 0x83, 0x95, 0xa7, 0xe7, 0xc2, 0xe3, 0xc5, 0xff, 0x98, 0x85, 0x0c, 0x8b, 0x1d,
-	0x48, 0x87, 0x82, 0xd8, 0xb1, 0x75, 0xdb, 0x08, 0xd3, 0xd9, 0x1b, 0x33, 0x44, 0x21, 0xd9, 0xb8,
-	0x6b, 0xf4, 0xb0, 0x06, 0xbd, 0xf0, 0x19, 0x61, 0x28, 0xf2, 0xa5, 0x8e, 0xa9, 0x6e, 0x1a, 0xbe,
-	0x11, 0x9c, 0x7b, 0xbe, 0x3e, 0x0b, 0x45, 0x5d, 0x00, 0x6d, 0x1b, 0xbe, 0x71, 0xfb, 0x94, 0x56,
-	0x68, 0x47, 0x4d, 0xe4, 0x43, 0xd9, 0x24, 0x9e, 0x4f, 0xc9, 0xbe, 0x48, 0xce, 0x39, 0xd7, 0x94,
-	0x47, 0x9e, 0x03, 0x5c, 0xdb, 0x31, 0x34, 0x49, 0x58, 0x32, 0x13, 0x7d, 0x48, 0x07, 0xe8, 0x1a,
-	0xfd, 0x2e, 0x16, 0x74, 0x5f, 0x4c, 0x77, 0xe0, 0x38, 0x40, 0x77, 0x8b, 0xc1, 0x48, 0x9e, 0x7c,
-	0x37, 0x68, 0xa8, 0x37, 0x00, 0x22, 0xbb, 0xa2, 0x73, 0x90, 0x67, 0xb3, 0xe4, 0xb9, 0x46, 0x1b,
-	0xcb, 0x4a, 0x33, 0xea, 0x40, 0x08, 0x32, 0x7c, 0x0e, 0xd3, 0x7c, 0x80, 0x3f, 0xab, 0xcf, 0xb1,
-	0x5a, 0x3d, 0xb2, 0x52, 0xe8, 0x10, 0x4a, 0xcc, 0x21, 0xd4, 0xf7, 0xa1, 0x94, 0xfc, 0x5a, 0xf6,
-	0x26, 0x37, 0x6f, 0xf0, 0x26, 0x6f, 0x30, 0x17, 0xf3, 0xfa, 0x3d, 0xe9, 0x4e, 0xec, 0x91, 0xf5,
-	0xf4, 0x88, 0xcd, 0x39, 0xd3, 0x1a, 0x7b, 0xe4, 0x3d, 0xc6, 0x21, 0x4f, 0x90, 0x58, 0x8f, 0x71,
-	0xa8, 0xbe, 0x0b, 0xf9, 0xf0, 0xf3, 0x46, 0xab, 0x80, 0xae, 0x41, 0x3e, 0xbc, 0x13, 0x9b, 0xa0,
-	0x72, 0x8b, 0x5e, 0x66, 0x39, 0x2d, 0x33, 0xbe, 0x7a, 0x04, 0xa5, 0x64, 0x46, 0x33, 0x62, 0x45,
-	0xdc, 0x1b, 0xac, 0x0e, 0xaf, 0xcf, 0x1c, 0x11, 0xe2, 0xc5, 0xe3, 0xaf, 0x52, 0xf0, 0xec, 0xb1,
-	0xc7, 0xe5, 0x27, 0x98, 0x56, 0x7f, 0xb9, 0xe9, 0xee, 0x7b, 0xb0, 0xe4, 0x52, 0xd2, 0x33, 0xe8,
-	0x91, 0xcc, 0xd9, 0x45, 0x56, 0x32, 0x7b, 0x55, 0x5a, 0x94, 0x70, 0x3c, 0x57, 0xaf, 0xfe, 0x39,
-	0x0b, 0x67, 0xc7, 0xde, 0x2d, 0xcd, 0x94, 0x16, 0xa3, 0x8f, 0x14, 0x28, 0xef, 0x1b, 0xed, 0x47,
-	0x96, 0xd3, 0x1d, 0xd8, 0x26, 0x98, 0xda, 0x6f, 0xcf, 0x7b, 0xdd, 0x55, 0xdb, 0x12, 0xc8, 0x89,
-	0x00, 0x5f, 0xda, 0x4f, 0x74, 0xa3, 0x27, 0xb0, 0x6c, 0x62, 0x8f, 0x50, 0x6c, 0x8a, 0xeb, 0x8e,
-	0x60, 0x4e, 0x76, 0xe7, 0xd6, 0x60, 0x5b, 0xc0, 0xf2, 0x3e, 0x99, 0xcf, 0x2c, 0x99, 0xf1, 0x3e,
-	0xb5, 0x0e, 0xeb, 0x23, 0xd5, 0x7c, 0xda, 0x7e, 0x50, 0x8c, 0xef, 0x07, 0xbf, 0x51, 0xa0, 0x18,
-	0xa7, 0x42, 0x97, 0x61, 0x3d, 0xdc, 0x7e, 0x9d, 0x8e, 0x34, 0xad, 0x89, 0xc5, 0x75, 0x72, 0x4a,
-	0x5b, 0x0d, 0x06, 0xef, 0x75, 0xb4, 0x60, 0x08, 0x5d, 0x84, 0x35, 0xc3, 0xb2, 0x9c, 0xc7, 0x81,
-	0x15, 0x74, 0x71, 0x4d, 0xce, 0x6d, 0x91, 0xd6, 0x90, 0x1c, 0xe3, 0xf8, 0x2d, 0x3e, 0x82, 0xae,
-	0x41, 0x05, 0x7b, 0x3e, 0xe9, 0x19, 0x3e, 0x36, 0xf5, 0x81, 0x7c, 0xd5, 0x93, 0x41, 0xe6, 0x74,
-	0x38, 0x1e, 0x4f, 0xc2, 0x3c, 0xf5, 0x13, 0x05, 0xd0, 0xb0, 0x6d, 0x46, 0x7c, 0x73, 0x7b, 0x70,
-	0xc5, 0xdf, 0x39, 0xd1, 0x19, 0x89, 0x47, 0x81, 0x7f, 0xa5, 0x41, 0x1d, 0x7f, 0x6d, 0x35, 0xbc,
-	0xb4, 0x94, 0x93, 0x5c, 0x5a, 0xff, 0xb5, 0x72, 0xbb, 0x0f, 0xcb, 0xed, 0x87, 0x86, 0x6d, 0x63,
-	0x6b, 0xd0, 0xd3, 0xef, 0xce, 0x7d, 0xb1, 0x57, 0xab, 0x0b, 0x5c, 0xd1, 0xb9, 0xd4, 0x8e, 0xb5,
-	0x3c, 0xf5, 0xa7, 0x0a, 0x14, 0xe3, 0xe3, 0x93, 0x1d, 0xcc, 0x5e, 0x84, 0x35, 0xcb, 0xf0, 0x7c,
-	0x3d, 0x30, 0x7c, 0x70, 0x14, 0xcb, 0x5c, 0x21, 0xab, 0x21, 0x36, 0xd6, 0x12, 0x43, 0xd2, 0xaf,
-	0xd0, 0x55, 0x38, 0xdd, 0x21, 0xd4, 0xf3, 0xf5, 0xd0, 0x98, 0xf1, 0xe3, 0xdb, 0xac, 0xb6, 0xc6,
-	0x47, 0x35, 0x39, 0x28, 0xa5, 0xaa, 0x3b, 0xb0, 0x3e, 0xf2, 0x02, 0x7b, 0xb6, 0x4a, 0xbf, 0x02,
-	0xa7, 0x47, 0xdf, 0x45, 0x56, 0x7f, 0xaf, 0x40, 0x2e, 0x4c, 0xc0, 0x6f, 0x8b, 0x8d, 0x4f, 0xfa,
-	0xd1, 0xd5, 0x09, 0xed, 0x1f, 0xa6, 0xb0, 0x6c, 0x33, 0xd6, 0xc4, 0xd6, 0xe9, 0x43, 0x86, 0x6f,
-	0xcd, 0x33, 0x05, 0xe0, 0xa1, 0x89, 0x48, 0x8d, 0x98, 0x08, 0x24, 0x75, 0x15, 0x67, 0xe0, 0xfc,
-	0xb9, 0xfa, 0x8b, 0x34, 0x14, 0xf9, 0x11, 0x56, 0x60, 0xac, 0xe4, 0x5d, 0xe4, 0x58, 0x75, 0x52,
-	0xc7, 0xa8, 0xb3, 0x03, 0x79, 0x71, 0xe7, 0xc4, 0xc2, 0x40, 0x9a, 0x2f, 0xf9, 0x0b, 0x13, 0x9a,
-	0x86, 0x2b, 0xf3, 0x16, 0x3e, 0xd2, 0x72, 0x9e, 0x7c, 0x42, 0x6f, 0x41, 0xba, 0x8b, 0xfd, 0x69,
-	0x7f, 0x44, 0xe1, 0x40, 0xb7, 0x70, 0xec, 0xa7, 0x09, 0x86, 0x82, 0xf6, 0x60, 0xc1, 0x70, 0x5d,
-	0x6c, 0x9b, 0x41, 0x0e, 0x7c, 0x7d, 0x1a, 0xbc, 0x4d, 0x2e, 0x1a, 0x41, 0x4a, 0x2c, 0xf4, 0x35,
-	0xc8, 0xb6, 0x2d, 0x6c, 0xd0, 0x20, 0xd9, 0xbd, 0x36, 0x0d, 0x68, 0x9d, 0x49, 0x46, 0x98, 0x02,
-	0x29, 0xfe, 0x93, 0xc5, 0x6f, 0x53, 0xb0, 0x24, 0x27, 0x49, 0xc6, 0xb1, 0xe4, 0x2c, 0x8d, 0xfe,
-	0x8f, 0x62, 0x67, 0xc0, 0x70, 0x2f, 0x4f, 0x6d, 0xb8, 0xf0, 0xf2, 0x9d, 0x5b, 0xee, 0x7e, 0xd2,
-	0x72, 0xaf, 0xcc, 0x62, 0xb9, 0x10, 0x33, 0x30, 0x9d, 0x96, 0x30, 0xdd, 0xf5, 0x19, 0x4c, 0x17,
-	0x82, 0x4a, 0xdb, 0xc5, 0x7f, 0x0e, 0xf8, 0x3c, 0x03, 0xb9, 0xc0, 0xa9, 0x50, 0x0b, 0x16, 0xc4,
-	0xaf, 0x64, 0x32, 0x03, 0x7c, 0x69, 0x4a, 0xaf, 0xac, 0x69, 0x5c, 0x9a, 0xa9, 0x2f, 0x70, 0x90,
-	0x07, 0xab, 0xbd, 0xbe, 0xc5, 0x76, 0x47, 0x57, 0x1f, 0x3a, 0x98, 0xde, 0x9c, 0x16, 0xfe, 0x8e,
-	0x84, 0x8a, 0x9f, 0x44, 0x97, 0x7b, 0xc9, 0x4e, 0x64, 0xc2, 0xf2, 0xbe, 0xd1, 0xd5, 0x63, 0x67,
-	0xef, 0xe9, 0xa9, 0xfe, 0x63, 0x09, 0xf9, 0xb6, 0x8c, 0x6e, 0xfc, 0x9c, 0xbd, 0xb8, 0x1f, 0x6b,
-	0xab, 0x2a, 0x2c, 0x88, 0xcf, 0x8d, 0x6f, 0xe8, 0x45, 0xbe, 0xa1, 0xab, 0x1f, 0x2b, 0x50, 0x1e,
-	0x52, 0x76, 0xb2, 0xfd, 0xa0, 0x0a, 0x4b, 0x91, 0xa1, 0xa2, 0x58, 0x55, 0x08, 0x4f, 0xc8, 0x9b,
-	0x26, 0x3a, 0x0d, 0x0b, 0xe2, 0x96, 0x5e, 0x06, 0x2b, 0xd9, 0x0a, 0x14, 0xc9, 0x44, 0x8a, 0x7c,
-	0x57, 0x81, 0x62, 0xfc, 0x2b, 0x26, 0xd6, 0x21, 0x32, 0x5e, 0x4c, 0x87, 0xf0, 0x9e, 0x61, 0x1a,
-	0x1d, 0xc2, 0x13, 0xfd, 0x37, 0x60, 0x25, 0x11, 0x75, 0xd0, 0x8b, 0x80, 0xda, 0x8e, 0xed, 0x13,
-	0xbb, 0x6f, 0x88, 0xeb, 0x2a, 0x7e, 0x91, 0x20, 0x0c, 0x59, 0x8e, 0x8f, 0xf0, 0x1b, 0x88, 0xea,
-	0x7d, 0x28, 0x25, 0x97, 0xdf, 0x94, 0x10, 0x61, 0x94, 0x4f, 0xc5, 0xa2, 0xfc, 0x06, 0xa0, 0xe1,
-	0xf0, 0x15, 0xbe, 0xa9, 0xc4, 0xde, 0x5c, 0x87, 0xd5, 0x11, 0xcb, 0xb5, 0xba, 0x0a, 0xe5, 0xa1,
-	0x50, 0x55, 0x5d, 0x93, 0xa8, 0x03, 0x8b, 0xb0, 0xfa, 0xa7, 0x0c, 0xe4, 0x76, 0x1c, 0x99, 0xfd,
-	0x7e, 0x03, 0x72, 0x1e, 0x3e, 0xc0, 0x94, 0xf8, 0xc2, 0x7b, 0x96, 0x27, 0xae, 0xcb, 0x03, 0x88,
-	0xda, 0xae, 0x94, 0x17, 0x97, 0x9d, 0x21, 0xdc, 0xec, 0xc5, 0x2a, 0xaa, 0xb0, 0x3a, 0xd0, 0xf3,
-	0x8c, 0x6e, 0x50, 0xa5, 0x07, 0x4d, 0x7e, 0xd3, 0x43, 0x59, 0x59, 0x9f, 0x11, 0x61, 0x94, 0x37,
-	0xc6, 0x6f, 0x81, 0xd9, 0x63, 0xb6, 0xc0, 0x2d, 0x78, 0x96, 0x25, 0x3c, 0x84, 0x1f, 0x99, 0x47,
-	0xfe, 0x18, 0x09, 0x2f, 0x70, 0xe1, 0x67, 0xc2, 0x97, 0xa2, 0xa2, 0x36, 0xc4, 0xf8, 0x3f, 0x28,
-	0xb2, 0x8a, 0xca, 0x72, 0xe4, 0xed, 0xe4, 0xa2, 0x70, 0x52, 0xcb, 0xe9, 0xee, 0xc8, 0x2e, 0xe6,
-	0xa4, 0xfe, 0x43, 0x8a, 0x0d, 0xb3, 0x92, 0xe3, 0x83, 0xb2, 0xa5, 0x7e, 0x1d, 0x32, 0x3b, 0xc4,
-	0xf3, 0x51, 0x0b, 0xd8, 0xeb, 0x3a, 0xb6, 0x7d, 0x4a, 0x70, 0x90, 0xee, 0x5e, 0x98, 0x72, 0x0e,
-	0x34, 0xb0, 0xc4, 0x13, 0xc1, 0x9e, 0x4a, 0x21, 0x17, 0x4c, 0x49, 0xb5, 0x03, 0x19, 0x36, 0x2b,
-	0x68, 0x05, 0x0a, 0xf7, 0xef, 0xee, 0xb6, 0x1a, 0xf5, 0xe6, 0xcd, 0x66, 0x63, 0xbb, 0x74, 0x0a,
-	0xe5, 0x21, 0xbb, 0xa7, 0x6d, 0xd6, 0x1b, 0x25, 0x85, 0x3d, 0x6e, 0x37, 0xb6, 0xee, 0xdf, 0x2a,
-	0xa5, 0x50, 0x0e, 0x32, 0xcd, 0xbb, 0x37, 0xef, 0x95, 0xd2, 0x08, 0x60, 0xe1, 0xee, 0xbd, 0xbd,
-	0x66, 0xbd, 0x51, 0xca, 0xb0, 0xde, 0x07, 0x9b, 0xda, 0xdd, 0x52, 0x96, 0xbd, 0xda, 0xd0, 0xb4,
-	0x7b, 0x5a, 0x69, 0x01, 0x15, 0x21, 0x57, 0xd7, 0x9a, 0x7b, 0xcd, 0xfa, 0xe6, 0x4e, 0x69, 0xb1,
-	0x5a, 0x04, 0xd8, 0x71, 0xba, 0x75, 0xc7, 0xf6, 0xa9, 0x63, 0x55, 0xff, 0x92, 0xe1, 0x8e, 0x47,
-	0xfd, 0x07, 0x0e, 0x7d, 0x14, 0xfd, 0xf1, 0xf5, 0x0c, 0xe4, 0x1f, 0xf3, 0x8e, 0x68, 0xd1, 0xe7,
-	0x44, 0x47, 0xd3, 0x44, 0xfb, 0x50, 0x6a, 0x0b, 0x71, 0x3d, 0xf8, 0x73, 0x58, 0x3a, 0xcd, 0xcc,
-	0x7f, 0xbe, 0xac, 0x48, 0xc0, 0x86, 0xc4, 0x63, 0x1c, 0x96, 0xd3, 0xed, 0x12, 0xbb, 0x1b, 0x71,
-	0xa4, 0xe7, 0xe4, 0x90, 0x80, 0x21, 0x87, 0x09, 0x65, 0x83, 0xfa, 0xa4, 0x63, 0xb4, 0xfd, 0x88,
-	0x24, 0x33, 0x1f, 0x49, 0x29, 0x40, 0x0c, 0x59, 0x3a, 0xfc, 0xa2, 0xe9, 0x80, 0x78, 0xcc, 0xdf,
-	0x43, 0x9a, 0xec, 0x7c, 0x34, 0xe5, 0x10, 0x32, 0xe4, 0x79, 0x0f, 0x16, 0x5c, 0x83, 0x1a, 0x3d,
-	0xaf, 0x02, 0xdc, 0x31, 0x1b, 0x93, 0xef, 0x5f, 0x89, 0xd9, 0xaf, 0xb5, 0x38, 0x8e, 0xfc, 0xe1,
-	0x4a, 0x80, 0xaa, 0xd7, 0xa1, 0x10, 0xeb, 0x7e, 0x5a, 0x29, 0x9e, 0x8f, 0xd7, 0x91, 0x5f, 0xe1,
-	0x71, 0x30, 0x22, 0x91, 0xb1, 0x38, 0xcc, 0xb3, 0x94, 0x58, 0x9e, 0x55, 0xbd, 0xc8, 0xa2, 0xa3,
-	0xe3, 0x4e, 0xee, 0x8e, 0xd5, 0x17, 0x98, 0x07, 0x47, 0x12, 0xc7, 0xa1, 0x5f, 0xfe, 0x54, 0x81,
-	0xa5, 0x2d, 0x6c, 0xf4, 0x6e, 0xda, 0x72, 0x01, 0xa0, 0x8f, 0x15, 0x58, 0x0c, 0x9e, 0x27, 0x4d,
-	0xc2, 0x46, 0xfc, 0xa4, 0xab, 0x5e, 0x9f, 0x45, 0x56, 0xc4, 0xfe, 0x53, 0x1b, 0xca, 0x45, 0xe5,
-	0xf2, 0x87, 0x00, 0x42, 0x33, 0x5e, 0xb9, 0xd8, 0xb2, 0x82, 0xb9, 0x30, 0x65, 0x15, 0xa4, 0x4e,
-	0x2b, 0x20, 0xd9, 0xbf, 0xaf, 0x40, 0x41, 0xd0, 0x8b, 0x9d, 0xff, 0x10, 0xb2, 0xe2, 0xe1, 0xca,
-	0x34, 0x69, 0x90, 0xfc, 0x22, 0xf5, 0xea, 0x74, 0x42, 0x72, 0xb7, 0x13, 0x9a, 0xfc, 0x20, 0x9c,
-	0xa2, 0x1d, 0xb1, 0x5e, 0xd1, 0x21, 0x2c, 0x06, 0x8f, 0x57, 0xa7, 0xdd, 0xf1, 0x58, 0xe0, 0x56,
-	0x2f, 0x4d, 0x2e, 0x15, 0xc4, 0x45, 0xa1, 0xcb, 0xaf, 0x53, 0x50, 0x11, 0xba, 0x34, 0x0e, 0x7d,
-	0x4c, 0x6d, 0xc3, 0x12, 0x5e, 0xd6, 0x72, 0x84, 0xe7, 0x14, 0x62, 0x7e, 0x8d, 0xae, 0xcf, 0xbc,
-	0xe0, 0xd4, 0x57, 0x66, 0x11, 0x0d, 0xac, 0x86, 0x3e, 0x52, 0x00, 0xa2, 0x15, 0x80, 0x26, 0xaf,
-	0x97, 0x12, 0xcb, 0x4c, 0xbd, 0x3e, 0x83, 0x64, 0xa0, 0xc5, 0xd6, 0x26, 0xfc, 0xff, 0x38, 0xe9,
-	0xb8, 0xf0, 0x56, 0x5e, 0x18, 0x74, 0xd3, 0x25, 0xef, 0x2c, 0xc7, 0x86, 0xf4, 0x83, 0x4b, 0xfb,
-	0x0b, 0x3c, 0xd7, 0xb8, 0xf2, 0x9f, 0x00, 0x00, 0x00, 0xff, 0xff, 0x8f, 0xa1, 0x14, 0x54, 0xe2,
-	0x32, 0x00, 0x00,
+var fileDescriptor_beam_fn_api_d24d1635dfa071c8 = []byte{
+	// 3139 bytes of a gzipped FileDescriptorProto
+	0x1f, 0x8b, 0x08, 0x00, 0x00, 0x00, 0x00, 0x00, 0x02, 0xff, 0xbc, 0x5a, 0x5b, 0x6f, 0x1b, 0xc7,
+	0xf5, 0xf7, 0x92, 0x94, 0x44, 0x1e, 0x52, 0x12, 0x35, 0x92, 0x6c, 0x7a, 0xff, 0xce, 0xbf, 0x0e,
+	0xdb, 0x00, 0x42, 0x8a, 0xac, 0xaf, 0x48, 0xec, 0x34, 0x71, 0x22, 0x51, 0xb4, 0xcd, 0x44, 0xb6,
+	0xd9, 0x95, 0x5c, 0xb7, 0x49, 0x93, 0xc5, 0x8a, 0x3b, 0xa4, 0x17, 0x5e, 0xee, 0x6e, 0x66, 0x96,
+	0xb2, 0xe4, 0x06, 0x0d, 0x7a, 0x41, 0x8b, 0x16, 0x6d, 0xf3, 0xd2, 0x87, 0xb4, 0x6f, 0x6d, 0x81,
+	0x02, 0x7d, 0xe9, 0x07, 0xc8, 0x37, 0x28, 0x50, 0xa0, 0x5f, 0x20, 0x2f, 0x45, 0x5b, 0xa0, 0x6d,
+	0xfa, 0x5c, 0xa0, 0x6f, 0xc5, 0x5c, 0xf6, 0xc2, 0x25, 0xe9, 0x2c, 0x29, 0xa5, 0x6f, 0x3b, 0x73,
+	0xf6, 0xfc, 0x7e, 0x33, 0x67, 0xcf, 0x9c, 0x39, 0x67, 0x66, 0x61, 0x65, 0x1f, 0x9b, 0x7d, 0xa3,
+	0xeb, 0x1a, 0xa6, 0x6f, 0x6b, 0x3e, 0xf1, 0x02, 0x0f, 0x3d, 0xe7, 0x91, 0x9e, 0x66, 0xfa, 0x66,
+	0xe7, 0x21, 0xd6, 0x98, 0x54, 0xeb, 0x7b, 0x16, 0x76, 0xb4, 0xae, 0x6b, 0xe0, 0x43, 0xdc, 0x19,
+	0x04, 0xb6, 0xe7, 0x6a, 0x07, 0x97, 0xd4, 0x75, 0xae, 0x49, 0x06, 0xae, 0x8b, 0x49, 0xac, 0xad,
+	0x2e, 0x63, 0xd7, 0xf2, 0x3d, 0xdb, 0x0d, 0xa8, 0xec, 0x38, 0xdf, 0xf3, 0xbc, 0x9e, 0x83, 0x2f,
+	0xf0, 0xd6, 0xfe, 0xa0, 0x7b, 0xc1, 0xc2, 0xb4, 0x43, 0x6c, 0x3f, 0xf0, 0x88, 0x7c, 0xe3, 0x0b,
+	0xe9, 0x37, 0x02, 0xbb, 0x8f, 0x69, 0x60, 0xf6, 0x7d, 0xf9, 0xc2, 0xff, 0xa7, 0x5f, 0x78, 0x4c,
+	0x4c, 0xdf, 0xc7, 0x24, 0xa4, 0x58, 0xec, 0xe3, 0x80, 0xd8, 0x1d, 0xd9, 0xac, 0xff, 0x52, 0x81,
+	0x25, 0x1d, 0xf7, 0xbd, 0x00, 0xdf, 0x22, 0x7e, 0xa7, 0xed, 0x91, 0x00, 0xf5, 0xe1, 0xb4, 0xe9,
+	0xdb, 0x06, 0xc5, 0xe4, 0xc0, 0xee, 0x60, 0x23, 0x1e, 0x42, 0x4d, 0x39, 0xaf, 0x6c, 0x94, 0x2f,
+	0xbf, 0xa4, 0x8d, 0x9f, 0xb4, 0x6f, 0xfb, 0xd8, 0xb1, 0x5d, 0xac, 0x1d, 0x5c, 0xd2, 0x36, 0x7d,
+	0x7b, 0x57, 0xe8, 0x6f, 0x47, 0xea, 0xfa, 0x9a, 0x39, 0xa6, 0x17, 0x9d, 0x85, 0x62, 0xc7, 0xb3,
+	0x30, 0x31, 0x6c, 0xab, 0x96, 0x3b, 0xaf, 0x6c, 0x94, 0xf4, 0x05, 0xde, 0x6e, 0x59, 0xf5, 0xbf,
+	0x15, 0x00, 0xb5, 0x5c, 0x1a, 0x90, 0x41, 0x87, 0x59, 0x52, 0xc7, 0xef, 0x0d, 0x30, 0x0d, 0xd0,
+	0x73, 0xb0, 0x64, 0xc7, 0xbd, 0x4c, 0x4f, 0xe1, 0x7a, 0x8b, 0x89, 0xde, 0x96, 0x85, 0xee, 0x43,
+	0x91, 0xe0, 0x9e, 0x4d, 0x03, 0x4c, 0x6a, 0x7f, 0x5f, 0xe0, 0x43, 0x7f, 0x51, 0xcb, 0xf4, 0xbd,
+	0x34, 0x5d, 0xea, 0x49, 0xc6, 0xdb, 0xa7, 0xf4, 0x08, 0x0a, 0x61, 0x58, 0xf2, 0x89, 0xd7, 0xc1,
+	0x94, 0x1a, 0xfb, 0x03, 0xd7, 0x72, 0x70, 0xed, 0x1f, 0x02, 0xfc, 0x2b, 0x19, 0xc1, 0xdb, 0x42,
+	0x7b, 0x8b, 0x2b, 0xc7, 0x0c, 0x8b, 0x7e, 0xb2, 0x1f, 0x7d, 0x1b, 0xce, 0x0c, 0xd3, 0x18, 0x3e,
+	0xf1, 0x7a, 0x04, 0x53, 0x5a, 0xfb, 0xa7, 0xe0, 0x6b, 0xcc, 0xc2, 0xd7, 0x96, 0x20, 0x31, 0xef,
+	0xba, 0x3f, 0x4e, 0x8e, 0x06, 0xb0, 0x96, 0xe2, 0xa7, 0xbe, 0x63, 0x07, 0xb5, 0x4f, 0x05, 0xf9,
+	0xeb, 0xb3, 0x90, 0xef, 0x32, 0x84, 0x98, 0x19, 0xf9, 0x23, 0x42, 0xf4, 0x10, 0x96, 0xbb, 0xb6,
+	0x6b, 0x3a, 0xf6, 0x13, 0x1c, 0x9a, 0xf7, 0x5f, 0x82, 0xf1, 0x95, 0x8c, 0x8c, 0x37, 0xa5, 0x7a,
+	0xda, 0xbe, 0x4b, 0xdd, 0x21, 0xc1, 0x56, 0x09, 0x16, 0x88, 0x10, 0xd6, 0xbf, 0x3b, 0x07, 0xab,
+	0x43, 0x7e, 0x46, 0x7d, 0xcf, 0xa5, 0x38, 0xab, 0xa3, 0xad, 0xc1, 0x1c, 0x26, 0xc4, 0x23, 0xd2,
+	0x7d, 0x45, 0x03, 0x7d, 0x6d, 0xd4, 0xfd, 0x5e, 0x9a, 0xda, 0xfd, 0xc4, 0x40, 0x86, 0xfc, 0xaf,
+	0x3b, 0xc9, 0xff, 0x5e, 0x99, 0xcd, 0xff, 0x22, 0x8a, 0x94, 0x03, 0x7e, 0xf0, 0x99, 0x0e, 0xb8,
+	0x7d, 0x3c, 0x07, 0x8c, 0x88, 0x27, 0x78, 0xe0, 0xc1, 0xd3, 0x3d, 0x70, 0xf3, 0x18, 0x1e, 0x18,
+	0x51, 0x8f, 0x73, 0x41, 0x7b, 0xa2, 0x0b, 0xbe, 0x3a, 0xa3, 0x0b, 0x46, 0x74, 0x69, 0x1f, 0x04,
+	0xe6, 0x23, 0x42, 0x5a, 0xff, 0xa9, 0x02, 0xcb, 0xa9, 0xb8, 0x83, 0x9e, 0xc0, 0xd9, 0x94, 0x09,
+	0x86, 0xa2, 0x71, 0x7e, 0xa3, 0x7c, 0xf9, 0xc6, 0x2c, 0x66, 0x48, 0x04, 0xe5, 0x33, 0xfe, 0x78,
+	0x41, 0x1d, 0x41, 0x35, 0xed, 0x87, 0xf5, 0xdf, 0x00, 0x9c, 0x99, 0x00, 0x84, 0x96, 0x20, 0x17,
+	0x2d, 0x90, 0x9c, 0x6d, 0x21, 0x17, 0x20, 0x20, 0xa6, 0x4b, 0xbb, 0x1e, 0xe9, 0xd3, 0x5a, 0x8e,
+	0x0f, 0xf6, 0xee, 0xf1, 0x06, 0xab, 0xed, 0x45, 0x80, 0x4d, 0x37, 0x20, 0x47, 0x7a, 0x82, 0x01,
+	0x05, 0x50, 0xf1, 0x3b, 0x9e, 0xe3, 0x60, 0xbe, 0x2c, 0x69, 0x2d, 0xcf, 0x19, 0xdb, 0xc7, 0x64,
+	0x6c, 0x27, 0x20, 0x05, 0xe7, 0x10, 0x0b, 0xfa, 0xb1, 0x02, 0x6b, 0x8f, 0x6d, 0xd7, 0xf2, 0x1e,
+	0xdb, 0x6e, 0xcf, 0xa0, 0x01, 0x31, 0x03, 0xdc, 0xb3, 0x31, 0xad, 0x15, 0x38, 0xfd, 0x83, 0x63,
+	0xd2, 0x3f, 0x08, 0xa1, 0x77, 0x23, 0x64, 0x31, 0x8a, 0xd5, 0xc7, 0xa3, 0x12, 0xb4, 0x0f, 0xf3,
+	0x7c, 0xeb, 0xa4, 0xb5, 0x39, 0xce, 0xfe, 0xc6, 0x31, 0xd9, 0x1b, 0x1c, 0x4c, 0x10, 0x4a, 0x64,
+	0x66, 0x66, 0xec, 0x1e, 0xd8, 0xc4, 0x73, 0xfb, 0xd8, 0x0d, 0x68, 0x6d, 0xfe, 0x44, 0xcc, 0xdc,
+	0x4c, 0x40, 0x4a, 0x33, 0x27, 0x59, 0xd0, 0x21, 0x9c, 0xa3, 0x81, 0x19, 0x60, 0x63, 0x42, 0x66,
+	0xb2, 0x70, 0xbc, 0xcc, 0xe4, 0x2c, 0x07, 0x1f, 0x27, 0x52, 0x1d, 0x58, 0x4e, 0x79, 0x1d, 0xaa,
+	0x42, 0xfe, 0x11, 0x3e, 0x92, 0xae, 0xce, 0x1e, 0x51, 0x03, 0xe6, 0x0e, 0x4c, 0x67, 0x80, 0xf9,
+	0x0e, 0x50, 0xbe, 0xfc, 0x42, 0x86, 0x71, 0xb4, 0x23, 0x54, 0x5d, 0xe8, 0xbe, 0x9c, 0xbb, 0xa6,
+	0xa8, 0x1e, 0xac, 0x8c, 0x78, 0xdc, 0x18, 0xbe, 0xed, 0x61, 0x3e, 0x2d, 0x0b, 0x5f, 0x23, 0x82,
+	0x4d, 0x12, 0xbe, 0x0f, 0xb5, 0x49, 0x3e, 0x36, 0x86, 0xf7, 0x8d, 0x61, 0xde, 0xab, 0x19, 0x78,
+	0xd3, 0xe8, 0x47, 0x49, 0xf6, 0x0e, 0x94, 0x13, 0x3e, 0x36, 0x86, 0xf0, 0xc6, 0x30, 0xe1, 0x46,
+	0x06, 0x42, 0x0e, 0x98, 0xb2, 0xe9, 0x88, 0x7b, 0x9d, 0x8c, 0x4d, 0x13, 0xb0, 0x09, 0xc2, 0xfa,
+	0x7f, 0xf2, 0xb0, 0x22, 0x3c, 0x7c, 0xd3, 0xf7, 0x1d, 0xbb, 0x63, 0x32, 0xa3, 0xa3, 0x67, 0xa1,
+	0x12, 0x45, 0xab, 0x38, 0x95, 0x28, 0x47, 0x7d, 0x2d, 0x8b, 0xa5, 0xc2, 0xb6, 0xeb, 0x0f, 0x82,
+	0x44, 0x2a, 0xcc, 0xdb, 0x2d, 0x0b, 0xd5, 0x60, 0x01, 0x3b, 0x98, 0x31, 0xd5, 0xf2, 0xe7, 0x95,
+	0x8d, 0x8a, 0x1e, 0x36, 0xd1, 0xb7, 0x60, 0xc5, 0x1b, 0x04, 0x4c, 0xeb, 0xb1, 0x19, 0x60, 0xd2,
+	0x37, 0xc9, 0xa3, 0x30, 0xfa, 0x64, 0x0d, 0xb7, 0x23, 0x83, 0xd5, 0xee, 0x71, 0xc4, 0x07, 0x11,
+	0xa0, 0x58, 0x93, 0x55, 0x2f, 0xd5, 0x8d, 0xda, 0x00, 0x36, 0x35, 0xf6, 0xbd, 0x81, 0x6b, 0x61,
+	0xab, 0x36, 0x77, 0x5e, 0xd9, 0x58, 0xba, 0x7c, 0x29, 0x83, 0xe5, 0x5a, 0x74, 0x4b, 0xe8, 0x68,
+	0x4d, 0x77, 0xd0, 0xd7, 0x4b, 0x76, 0xd8, 0x46, 0xdf, 0x84, 0x6a, 0xdf, 0x73, 0xed, 0xc0, 0x23,
+	0x2c, 0xa0, 0xda, 0x6e, 0xd7, 0x0b, 0x63, 0x4c, 0x16, 0xdc, 0x3b, 0x91, 0x6a, 0xcb, 0xed, 0x7a,
+	0xfa, 0x72, 0x7f, 0xa8, 0x4d, 0x55, 0x03, 0xd6, 0xc7, 0x4e, 0x6d, 0x8c, 0x3f, 0x5c, 0x1c, 0xf6,
+	0x07, 0x55, 0x13, 0x85, 0x95, 0x16, 0x16, 0x56, 0xda, 0x5e, 0x58, 0x79, 0x25, 0xbf, 0xfd, 0x1f,
+	0x15, 0xa8, 0x6d, 0x63, 0xc7, 0x3c, 0xc2, 0xd6, 0xa8, 0x0b, 0xec, 0x41, 0x4d, 0xa6, 0x9c, 0xd8,
+	0x8a, 0xbf, 0x80, 0xc1, 0x4a, 0x38, 0x59, 0x5b, 0x3d, 0x8d, 0xe5, 0x74, 0xa4, 0xdb, 0x0c, 0x55,
+	0x99, 0x10, 0xbd, 0x05, 0x65, 0x33, 0x26, 0x91, 0xc3, 0xbd, 0x36, 0xeb, 0xa7, 0xd7, 0x93, 0x60,
+	0xf5, 0x9f, 0x15, 0x60, 0x6d, 0x5c, 0xbd, 0x82, 0x5e, 0x83, 0x73, 0x13, 0x33, 0x93, 0xd8, 0xbb,
+	0xcf, 0x4e, 0x48, 0x2e, 0x5a, 0x16, 0xb2, 0xa1, 0xd2, 0x61, 0x83, 0x33, 0x02, 0xef, 0x11, 0x76,
+	0xc3, 0x04, 0xe1, 0xe6, 0x31, 0x6a, 0x28, 0xad, 0xc1, 0xb4, 0xf6, 0x18, 0x9c, 0x5e, 0xee, 0x44,
+	0xcf, 0x54, 0xfd, 0x43, 0x0e, 0x20, 0x96, 0xa1, 0xf7, 0x00, 0x06, 0x14, 0x13, 0x83, 0xc7, 0x7c,
+	0x69, 0xf7, 0xf6, 0xc9, 0xf0, 0x6a, 0xf7, 0x29, 0x26, 0xbb, 0x0c, 0xf7, 0xf6, 0x29, 0xbd, 0x34,
+	0x08, 0x1b, 0x8c, 0x92, 0xda, 0x16, 0x36, 0xf8, 0x6a, 0x96, 0x5f, 0xe8, 0xa4, 0x28, 0x77, 0x6d,
+	0x0b, 0xb7, 0x18, 0x2e, 0xa3, 0xa4, 0x61, 0x83, 0x15, 0x25, 0xdc, 0xb2, 0x35, 0xe0, 0xe1, 0x42,
+	0x34, 0xd4, 0x32, 0x94, 0xa2, 0x21, 0xaa, 0xcf, 0x43, 0x29, 0x52, 0x46, 0xcf, 0x0c, 0x0d, 0x51,
+	0x7c, 0xbe, 0x18, 0x6e, 0x6b, 0x1e, 0x0a, 0xc1, 0x91, 0x8f, 0xeb, 0x9f, 0xe4, 0x60, 0x7d, 0x6c,
+	0x01, 0x81, 0x6e, 0xc3, 0x82, 0x3c, 0x5a, 0x90, 0x36, 0xd5, 0x32, 0x4e, 0xf0, 0x8e, 0xd0, 0xd2,
+	0x43, 0x75, 0x56, 0xe1, 0x10, 0x4c, 0x6d, 0x6b, 0x60, 0x3a, 0x06, 0xf1, 0xbc, 0x20, 0x74, 0x8e,
+	0xd7, 0x32, 0x02, 0x4e, 0x5a, 0x7f, 0xfa, 0x62, 0x08, 0xab, 0x33, 0xd4, 0xb1, 0xa1, 0x26, 0x7f,
+	0x52, 0xa1, 0x06, 0x5d, 0x81, 0x75, 0xb6, 0x60, 0x6d, 0x82, 0xa9, 0x21, 0xd3, 0x7e, 0xb1, 0x40,
+	0x0b, 0xe7, 0x95, 0x8d, 0xa2, 0xbe, 0x16, 0x0a, 0x6f, 0x26, 0x64, 0xf5, 0x26, 0x9c, 0x7b, 0x5a,
+	0xb9, 0x9e, 0xb1, 0x22, 0xad, 0x7f, 0xb4, 0x0a, 0x0b, 0xd2, 0xac, 0xc8, 0x84, 0xb2, 0x9f, 0x48,
+	0xc4, 0x95, 0xa9, 0x4c, 0x29, 0x41, 0xb4, 0x76, 0x90, 0xca, 0xbc, 0x93, 0x98, 0xea, 0x27, 0x65,
+	0x80, 0x38, 0x9f, 0x41, 0x4f, 0x20, 0x2c, 0xab, 0x58, 0x98, 0x13, 0xdb, 0x54, 0xe8, 0x14, 0x6f,
+	0x4e, 0x4b, 0x1c, 0xc1, 0x86, 0x0b, 0x01, 0x5b, 0x4d, 0x09, 0xa9, 0xaf, 0xf8, 0xe9, 0x2e, 0xf4,
+	0x1e, 0x2c, 0x9b, 0x9d, 0xc0, 0x3e, 0xc0, 0x31, 0xb1, 0x58, 0x6e, 0xb7, 0x67, 0x27, 0xde, 0xe4,
+	0x80, 0x11, 0xeb, 0x92, 0x39, 0xd4, 0x46, 0x36, 0x40, 0x62, 0xe7, 0x15, 0x0e, 0xd4, 0x9a, 0x9d,
+	0x2d, 0xbd, 0xe9, 0x26, 0xc0, 0xd1, 0x2d, 0x28, 0xb0, 0xa0, 0x22, 0xb7, 0xf7, 0x2b, 0x53, 0x92,
+	0xb0, 0x95, 0xaf, 0x73, 0x00, 0xf5, 0xaf, 0x79, 0x28, 0xde, 0xc1, 0x26, 0x1d, 0x10, 0x6c, 0xa1,
+	0x9f, 0x28, 0xb0, 0x26, 0xf2, 0x0e, 0x69, 0x33, 0xa3, 0xe3, 0x0d, 0xc4, 0x27, 0x63, 0x34, 0x6f,
+	0xcd, 0x3e, 0x97, 0x90, 0x42, 0xe3, 0x41, 0x44, 0x5a, 0xac, 0xc1, 0xc1, 0xc5, 0xe4, 0x90, 0x3d,
+	0x22, 0x40, 0x1f, 0x2a, 0xb0, 0x2e, 0x33, 0x9a, 0xd4, 0x78, 0x44, 0x18, 0x78, 0xfb, 0x04, 0xc6,
+	0x23, 0x92, 0x80, 0x31, 0x03, 0x5a, 0xf5, 0x46, 0x25, 0x68, 0x03, 0xaa, 0x81, 0x17, 0x98, 0x0e,
+	0xdf, 0xa9, 0x0d, 0xea, 0x87, 0x59, 0x98, 0xa2, 0x2f, 0xf1, 0x7e, 0xb6, 0x0d, 0xef, 0xb2, 0x5e,
+	0xb5, 0x09, 0x67, 0x26, 0x4c, 0x75, 0x4c, 0x86, 0xb1, 0x96, 0xcc, 0x30, 0xf2, 0xc9, 0x94, 0xf5,
+	0x26, 0xd4, 0x26, 0x8d, 0x70, 0x2a, 0x1c, 0x0a, 0x2b, 0x23, 0xab, 0x06, 0xbd, 0x0b, 0xc5, 0xbe,
+	0xb4, 0x83, 0x5c, 0x94, 0x5b, 0xc7, 0xb7, 0xa8, 0x1e, 0x61, 0xaa, 0x1f, 0xe6, 0x61, 0x69, 0x78,
+	0xc9, 0x7c, 0xde, 0x94, 0xe8, 0x05, 0x40, 0x5d, 0x62, 0x8a, 0x98, 0x48, 0x70, 0xdf, 0xb4, 0x5d,
+	0xdb, 0xed, 0x71, 0x73, 0x28, 0xfa, 0x4a, 0x28, 0xd1, 0x43, 0x01, 0xfa, 0x95, 0x02, 0x67, 0x87,
+	0x3d, 0x8c, 0x26, 0xd4, 0xc4, 0x0a, 0xc6, 0x27, 0x15, 0x2f, 0x86, 0x7d, 0x8d, 0x46, 0xa3, 0x10,
+	0xfe, 0x76, 0xc6, 0x1b, 0x2f, 0x55, 0xdf, 0x80, 0x73, 0x4f, 0x53, 0x9c, 0xca, 0x0d, 0x5e, 0x85,
+	0xe5, 0xcf, 0xce, 0x77, 0x27, 0xab, 0xff, 0x69, 0x0e, 0x0a, 0x2c, 0x76, 0x20, 0x03, 0xca, 0x62,
+	0x8f, 0x36, 0x5c, 0x33, 0x4a, 0x59, 0x6f, 0xcc, 0x10, 0x85, 0x64, 0xe3, 0xae, 0xd9, 0xc7, 0x3a,
+	0xf4, 0xa3, 0x67, 0x84, 0xa1, 0xc2, 0x97, 0x3a, 0x26, 0x86, 0x65, 0x06, 0x66, 0x78, 0xb2, 0xf9,
+	0xda, 0x2c, 0x14, 0x0d, 0x01, 0xb4, 0x6d, 0x06, 0xe6, 0xed, 0x53, 0x7a, 0xb9, 0x13, 0x37, 0x51,
+	0x00, 0x2b, 0x96, 0x4d, 0x03, 0x62, 0xef, 0x8b, 0x04, 0x9c, 0x73, 0x4d, 0x79, 0xa8, 0x39, 0xc4,
+	0xb5, 0x9d, 0x40, 0x93, 0x84, 0x55, 0x2b, 0xd5, 0x87, 0x0c, 0x80, 0x9e, 0x39, 0xe8, 0x61, 0x41,
+	0xf7, 0xe9, 0x74, 0x47, 0x8a, 0x43, 0x74, 0xb7, 0x18, 0x8c, 0xe4, 0x29, 0xf5, 0xc2, 0x86, 0x7a,
+	0x03, 0x20, 0xb6, 0x2b, 0x3a, 0x07, 0x25, 0xf6, 0x95, 0xa8, 0x6f, 0x76, 0xb0, 0xac, 0x26, 0xe3,
+	0x0e, 0x84, 0xa0, 0xc0, 0xbf, 0x61, 0x9e, 0x0b, 0xf8, 0xb3, 0xfa, 0x45, 0x56, 0x8d, 0xc7, 0x56,
+	0x8a, 0x1c, 0x42, 0x49, 0x38, 0x84, 0xfa, 0x2e, 0x54, 0xd3, 0xb3, 0x65, 0x6f, 0x72, 0xf3, 0x86,
+	0x6f, 0xf2, 0x06, 0x73, 0x31, 0x3a, 0xe8, 0x4b, 0x77, 0x62, 0x8f, 0xac, 0xa7, 0x6f, 0xbb, 0x9c,
+	0x33, 0xaf, 0xb3, 0x47, 0xde, 0x63, 0x1e, 0xf2, 0x94, 0x88, 0xf5, 0x98, 0x87, 0xea, 0xdb, 0x50,
+	0x8a, 0xa6, 0x37, 0x7e, 0x08, 0xe8, 0x1a, 0x94, 0xa2, 0x5b, 0xaf, 0x0c, 0xd5, 0x59, 0xfc, 0x32,
+	0xcb, 0x62, 0x99, 0xf1, 0xd5, 0x23, 0xa8, 0xa6, 0x33, 0x9a, 0x31, 0x2b, 0xe2, 0xde, 0x70, 0x05,
+	0x78, 0x7d, 0xe6, 0x88, 0x90, 0x2c, 0x10, 0x7f, 0x9b, 0x83, 0x67, 0x9e, 0x7a, 0x20, 0x7e, 0x82,
+	0x89, 0xf4, 0xe7, 0x9b, 0xe0, 0xbe, 0x03, 0x8b, 0x3e, 0xb1, 0xfb, 0x26, 0x39, 0x92, 0x59, 0xba,
+	0xc8, 0x4a, 0x66, 0xaf, 0x3c, 0x2b, 0x12, 0x8e, 0x67, 0xe7, 0xf5, 0xef, 0x14, 0xe0, 0xec, 0xc4,
+	0xdb, 0xa3, 0xac, 0x57, 0x33, 0x4f, 0x60, 0xc9, 0xc2, 0xd4, 0x26, 0xd8, 0x12, 0x97, 0x07, 0xe1,
+	0xfc, 0x77, 0x8f, 0x7b, 0x7d, 0xa5, 0x6d, 0x0b, 0x58, 0xde, 0x27, 0x73, 0x87, 0x45, 0x2b, 0xd9,
+	0xa7, 0xfe, 0x5e, 0x81, 0x4a, 0xf2, 0x2d, 0x74, 0x19, 0xd6, 0xa3, 0x5d, 0xca, 0xeb, 0xca, 0x1d,
+	0xc7, 0xc2, 0xe2, 0x5e, 0x35, 0xa7, 0xaf, 0x86, 0xc2, 0x7b, 0x5d, 0x3d, 0x14, 0xa1, 0x8b, 0xb0,
+	0x66, 0x3a, 0x8e, 0xf7, 0x38, 0x9c, 0x80, 0x21, 0xee, 0x8b, 0xf9, 0x34, 0xf2, 0x3a, 0x92, 0x32,
+	0x8e, 0xdf, 0xe6, 0x12, 0x74, 0x0d, 0x6a, 0x98, 0x06, 0x76, 0xdf, 0x0c, 0xb0, 0x65, 0x0c, 0xa5,
+	0x75, 0x54, 0xae, 0xc5, 0xd3, 0x91, 0x3c, 0x99, 0xab, 0x50, 0xf5, 0x43, 0x05, 0xd0, 0xe8, 0xb4,
+	0xc6, 0x2c, 0x8c, 0xce, 0xf0, 0xc2, 0xb8, 0x73, 0xa2, 0xc6, 0x4c, 0x2e, 0x96, 0x7f, 0xe7, 0x41,
+	0x9d, 0x7c, 0x7f, 0x33, 0xea, 0x81, 0xca, 0x49, 0x7a, 0xe0, 0xff, 0xac, 0x0e, 0x1d, 0xc0, 0x52,
+	0xe7, 0xa1, 0xe9, 0xba, 0xd8, 0x19, 0x76, 0xd2, 0xbb, 0xc7, 0xbe, 0xe1, 0xd2, 0x1a, 0x02, 0x57,
+	0x74, 0x2e, 0x76, 0x12, 0x2d, 0xaa, 0xfe, 0x42, 0x81, 0x4a, 0x52, 0x9e, 0xe5, 0x84, 0xf2, 0x22,
+	0xac, 0x39, 0x26, 0x0d, 0x8c, 0xd0, 0xec, 0xe1, 0x99, 0x24, 0x73, 0x84, 0x39, 0x1d, 0x31, 0x59,
+	0x5b, 0x88, 0xa4, 0x57, 0xa1, 0xab, 0x70, 0xba, 0x6b, 0x13, 0x1a, 0x18, 0x91, 0x29, 0x93, 0xe7,
+	0x98, 0x73, 0xfa, 0x1a, 0x97, 0xea, 0x52, 0x28, 0xb5, 0xea, 0x37, 0x60, 0x7d, 0xec, 0x3d, 0x6e,
+	0xd6, 0x02, 0xb8, 0x06, 0xa7, 0xc7, 0x5f, 0xc2, 0xd5, 0x3f, 0x56, 0xa0, 0x18, 0xe5, 0xa5, 0xb7,
+	0xc5, 0x7e, 0x20, 0xfd, 0xe6, 0x6a, 0x46, 0x7b, 0x47, 0x99, 0x1d, 0xdb, 0xa3, 0x74, 0xb1, 0xa3,
+	0x58, 0x50, 0xe0, 0x3b, 0x56, 0xc6, 0xb8, 0x94, 0x36, 0x75, 0x6e, 0xd4, 0xd4, 0x48, 0x8e, 0x4d,
+	0x1c, 0xf7, 0xf2, 0xe7, 0xfa, 0xcf, 0xf3, 0x50, 0xe1, 0x67, 0x37, 0xa1, 0x39, 0xd2, 0x97, 0x6e,
+	0xa3, 0xf4, 0xb9, 0x71, 0xf4, 0x3b, 0x50, 0x12, 0xd7, 0x29, 0x6c, 0x61, 0xe7, 0xf9, 0x22, 0xbe,
+	0x90, 0x71, 0xf2, 0x9c, 0xfe, 0x4d, 0x7c, 0xa4, 0x17, 0xa9, 0x7c, 0x42, 0x6f, 0x42, 0xbe, 0x87,
+	0x83, 0x69, 0xff, 0xb1, 0xe0, 0x40, 0xb7, 0x70, 0xe2, 0x7f, 0x00, 0x86, 0x82, 0xf6, 0x60, 0xde,
+	0xf4, 0x7d, 0xec, 0x5a, 0x61, 0xf2, 0x77, 0x7d, 0x1a, 0xbc, 0x4d, 0xae, 0x1a, 0x43, 0x4a, 0x2c,
+	0xf4, 0x55, 0x98, 0xeb, 0x38, 0xd8, 0x24, 0x61, 0x96, 0x77, 0x6d, 0x1a, 0xd0, 0x06, 0xd3, 0x8c,
+	0x31, 0x05, 0x52, 0xf2, 0xff, 0x81, 0x8f, 0x73, 0xb0, 0x28, 0x3f, 0x8b, 0x8c, 0x4c, 0xe9, 0xef,
+	0x32, 0xfe, 0x17, 0x81, 0x9d, 0x21, 0xc3, 0xbd, 0x34, 0xb5, 0xe1, 0xa2, 0x7b, 0x65, 0x6e, 0xb9,
+	0xfb, 0x69, 0xcb, 0xbd, 0x3c, 0x8b, 0xe5, 0x22, 0xcc, 0xd0, 0x74, 0x7a, 0xca, 0x74, 0xd7, 0x67,
+	0x30, 0x5d, 0x04, 0x2a, 0x6d, 0x97, 0xbc, 0xf7, 0xfe, 0x4b, 0x01, 0x8a, 0xa1, 0x53, 0xa1, 0x36,
+	0xcc, 0x8b, 0xbf, 0xa4, 0x64, 0xea, 0xf3, 0xe2, 0x94, 0x5e, 0xa9, 0xe9, 0x5c, 0x9b, 0x0d, 0x5f,
+	0xe0, 0x20, 0x0a, 0xab, 0xfd, 0x81, 0xc3, 0xf6, 0x3b, 0xdf, 0x18, 0x39, 0x83, 0xdd, 0x9c, 0x16,
+	0xfe, 0x8e, 0x84, 0x4a, 0x1e, 0xba, 0xae, 0xf4, 0xd3, 0x9d, 0xc8, 0x82, 0xa5, 0x7d, 0xb3, 0x67,
+	0x24, 0x8e, 0x99, 0xf3, 0x53, 0xfd, 0xa2, 0x11, 0xf1, 0x6d, 0x99, 0xbd, 0xe4, 0x91, 0x72, 0x65,
+	0x3f, 0xd1, 0x56, 0x55, 0x98, 0x17, 0xd3, 0x4d, 0x6e, 0xd1, 0x15, 0xbe, 0x45, 0xab, 0xdf, 0x57,
+	0x60, 0x65, 0x64, 0xb0, 0x59, 0x22, 0x7c, 0x1d, 0x16, 0x63, 0x33, 0x25, 0x42, 0x53, 0x74, 0x14,
+	0xdc, 0xb2, 0xd0, 0x69, 0x98, 0x17, 0xd7, 0xcf, 0x32, 0x38, 0xc9, 0x56, 0x38, 0x8c, 0x42, 0x3c,
+	0x8c, 0x0f, 0xa0, 0x92, 0x9c, 0x42, 0xc6, 0x01, 0xc4, 0x76, 0x4b, 0x0c, 0x20, 0x3a, 0x4d, 0x9f,
+	0x66, 0x00, 0xd1, 0xb9, 0xf5, 0xeb, 0xb0, 0x9c, 0x0a, 0x38, 0xe8, 0x05, 0x40, 0x1d, 0xcf, 0x0d,
+	0x6c, 0x77, 0x60, 0x8a, 0x6b, 0x18, 0x7e, 0x5c, 0x2e, 0x6c, 0xb8, 0x92, 0x94, 0xf0, 0x73, 0xf6,
+	0xfa, 0x7d, 0xa8, 0xa6, 0x57, 0xde, 0x94, 0x10, 0x51, 0x48, 0xcf, 0x25, 0x42, 0xfa, 0x06, 0xa0,
+	0xd1, 0xc8, 0x15, 0xbd, 0xa9, 0x24, 0xde, 0x5c, 0x87, 0xd5, 0x31, 0x2b, 0xb5, 0xbe, 0x0a, 0x2b,
+	0x23, 0x51, 0xaa, 0xbe, 0x26, 0x51, 0x87, 0xd6, 0x5f, 0xfd, 0xd7, 0x05, 0x28, 0xee, 0x78, 0xf2,
+	0x00, 0xe1, 0x1b, 0x50, 0xa4, 0xf8, 0x00, 0x13, 0x3b, 0x10, 0x8e, 0xb3, 0x94, 0xb9, 0x16, 0x0d,
+	0x21, 0xb4, 0x5d, 0xa9, 0x2f, 0x2e, 0xf1, 0x22, 0xb8, 0xd9, 0x0b, 0x34, 0x54, 0x63, 0xb5, 0x0f,
+	0xa5, 0x66, 0x2f, 0xac, 0x4c, 0xc3, 0x26, 0xbf, 0xcf, 0x20, 0xac, 0x94, 0x2d, 0x88, 0x08, 0xca,
+	0x1b, 0x63, 0xf6, 0xbb, 0xb9, 0x2c, 0xdb, 0xed, 0xfc, 0xa8, 0xdb, 0x3d, 0x0b, 0x15, 0xc7, 0xeb,
+	0x19, 0x8e, 0x27, 0xaf, 0xd1, 0x16, 0xc4, 0x2b, 0x8e, 0xd7, 0xdb, 0x91, 0x5d, 0xcc, 0xeb, 0x82,
+	0x87, 0x04, 0x9b, 0x56, 0xad, 0xc8, 0x85, 0xb2, 0xa5, 0x7e, 0x1d, 0x0a, 0x3b, 0x36, 0x0d, 0x50,
+	0x1b, 0xd8, 0xeb, 0x06, 0x76, 0x03, 0x62, 0xe3, 0x30, 0x19, 0xbd, 0x30, 0xa5, 0x51, 0x75, 0x70,
+	0xc4, 0x93, 0x8d, 0xa9, 0x4a, 0xa0, 0x18, 0xda, 0xb8, 0xde, 0x85, 0x02, 0x33, 0x33, 0x5a, 0x86,
+	0xf2, 0xfd, 0xbb, 0xbb, 0xed, 0x66, 0xa3, 0x75, 0xb3, 0xd5, 0xdc, 0xae, 0x9e, 0x42, 0x25, 0x98,
+	0xdb, 0xd3, 0x37, 0x1b, 0xcd, 0xaa, 0xc2, 0x1e, 0xb7, 0x9b, 0x5b, 0xf7, 0x6f, 0x55, 0x73, 0xa8,
+	0x08, 0x85, 0xd6, 0xdd, 0x9b, 0xf7, 0xaa, 0x79, 0x04, 0x30, 0x7f, 0xf7, 0xde, 0x5e, 0xab, 0xd1,
+	0xac, 0x16, 0x58, 0xef, 0x83, 0x4d, 0xfd, 0x6e, 0x75, 0x8e, 0xbd, 0xda, 0xd4, 0xf5, 0x7b, 0x7a,
+	0x75, 0x1e, 0x55, 0xa0, 0xd8, 0xd0, 0x5b, 0x7b, 0xad, 0xc6, 0xe6, 0x4e, 0x75, 0xa1, 0x5e, 0x01,
+	0xd8, 0xf1, 0x7a, 0x0d, 0xcf, 0x0d, 0x88, 0xe7, 0xd4, 0xff, 0x5c, 0xe0, 0x9e, 0x44, 0x82, 0x07,
+	0x1e, 0x79, 0x14, 0xff, 0x98, 0xf4, 0x7f, 0x50, 0x7a, 0xcc, 0x3b, 0xe2, 0x45, 0x5c, 0x14, 0x1d,
+	0x2d, 0x0b, 0xed, 0x43, 0xb5, 0x23, 0xd4, 0x8d, 0xf0, 0x07, 0x57, 0xe9, 0x05, 0x33, 0xff, 0xa0,
+	0xb1, 0x2c, 0x01, 0x9b, 0x12, 0x8f, 0x71, 0x38, 0x5e, 0xaf, 0xc7, 0xea, 0xda, 0x88, 0x23, 0x7f,
+	0x4c, 0x0e, 0x09, 0x18, 0x71, 0x58, 0xb0, 0x62, 0x92, 0xc0, 0xee, 0x9a, 0x9d, 0x20, 0x26, 0x29,
+	0x1c, 0x8f, 0xa4, 0x1a, 0x22, 0x46, 0x2c, 0x5d, 0x7e, 0x5b, 0x72, 0x60, 0x53, 0xe6, 0xc0, 0x11,
+	0xcd, 0xdc, 0xf1, 0x68, 0x56, 0x22, 0xc8, 0x88, 0xe7, 0x1d, 0x98, 0xf7, 0x4d, 0x62, 0xf6, 0x69,
+	0x0d, 0xb8, 0x63, 0x36, 0xb3, 0xef, 0x45, 0xa9, 0xaf, 0xaf, 0xb5, 0x39, 0x8e, 0xfc, 0x2f, 0x48,
+	0x80, 0xaa, 0xd7, 0xa1, 0x9c, 0xe8, 0xfe, 0xac, 0xf3, 0xc5, 0x52, 0xb2, 0xca, 0xfb, 0x32, 0x0f,
+	0x6c, 0x31, 0x89, 0x0c, 0xae, 0x51, 0xce, 0xa4, 0x24, 0x72, 0xa6, 0xfa, 0x45, 0x16, 0xee, 0x3c,
+	0x3f, 0xbb, 0x3b, 0xd6, 0x9f, 0x67, 0x1e, 0x1c, 0x6b, 0x3c, 0x0d, 0xfd, 0xf2, 0x47, 0x0a, 0x2c,
+	0x6e, 0x61, 0xb3, 0x7f, 0xd3, 0x95, 0x0b, 0x00, 0xfd, 0x40, 0x81, 0x85, 0xf0, 0x39, 0x6b, 0x42,
+	0x35, 0xe6, 0x5f, 0x52, 0xf5, 0xfa, 0x2c, 0xba, 0x22, 0x98, 0x9f, 0xda, 0x50, 0x2e, 0x2a, 0x97,
+	0xdf, 0x07, 0x10, 0x23, 0xe3, 0x75, 0x86, 0x2b, 0xeb, 0x8d, 0x0b, 0x53, 0xd6, 0x2c, 0xea, 0xb4,
+	0x0a, 0x92, 0xfd, 0x87, 0x0a, 0x94, 0x05, 0xbd, 0xd8, 0xc8, 0x0f, 0x61, 0x4e, 0x3c, 0x5c, 0x99,
+	0x26, 0xa5, 0x91, 0x33, 0x52, 0xaf, 0x4e, 0xa7, 0x24, 0xb7, 0x2f, 0x31, 0x92, 0x1f, 0x45, 0x9f,
+	0x68, 0x47, 0xac, 0x57, 0x74, 0x08, 0x0b, 0xe1, 0xe3, 0xd5, 0x69, 0xb7, 0x30, 0x16, 0xb8, 0xd5,
+	0x4b, 0xd9, 0xb5, 0xc2, 0xb8, 0x28, 0xc6, 0xf2, 0xbb, 0x1c, 0xd4, 0xc4, 0x58, 0x9a, 0x87, 0x01,
+	0x26, 0xae, 0xe9, 0x08, 0x2f, 0x6b, 0x7b, 0xc2, 0x73, 0xca, 0x09, 0xbf, 0x46, 0xd7, 0x67, 0x5e,
+	0x70, 0xea, 0xcb, 0xb3, 0xa8, 0x86, 0x56, 0x43, 0xdf, 0x53, 0x00, 0xe2, 0x15, 0x80, 0xb2, 0xd7,
+	0x3e, 0xa9, 0x65, 0xa6, 0x5e, 0x9f, 0x41, 0x33, 0x1c, 0xc5, 0xd6, 0x26, 0x7c, 0x69, 0x92, 0x76,
+	0x52, 0x79, 0xab, 0x24, 0x0c, 0xba, 0xe9, 0xdb, 0x6f, 0x2d, 0x25, 0x44, 0xc6, 0xc1, 0xa5, 0xfd,
+	0x79, 0x9e, 0x3c, 0x5c, 0xf9, 0x6f, 0x00, 0x00, 0x00, 0xff, 0xff, 0x0f, 0x46, 0x65, 0x7e, 0x89,
+	0x31, 0x00, 0x00,
 }
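
For reviewers tracking the field renames in the hunks above (`PtransformId` → `TransformId` on the state keys, and `InstructionReference`/`PrimitiveTransformReference` → `InstructionId`/`TransformId` on `LogEntry`), the following is a minimal, hypothetical caller-side sketch of the migration. The struct fields and getters match the regenerated code in this diff; the import path, alias `fnpb`, and the literal id values are illustrative only, not taken from this PR.

```go
package main

import (
	"fmt"

	// Illustrative import path; point this at wherever the regenerated
	// fn_execution_v1 package lives in your checkout.
	fnpb "github.com/apache/beam/sdks/go/pkg/beam/model/fnexecution_v1"
)

func main() {
	// Construct a LogEntry with the renamed fields. Code that previously set
	// InstructionReference / PrimitiveTransformReference now sets
	// InstructionId / TransformId.
	entry := &fnpb.LogEntry{
		InstructionId: "instr-1", // hypothetical instruction id
		TransformId:   "pardo-2", // hypothetical transform id
	}

	// Callers that read the old GetInstructionReference() and
	// GetPrimitiveTransformReference() getters switch to the getters
	// generated in this diff.
	fmt.Println(entry.GetInstructionId(), entry.GetTransformId())

	// The same rename applies to the state keys, e.g. StateKey_BagUserState.
	key := &fnpb.StateKey_BagUserState{
		TransformId: "pardo-2",
		UserStateId: "counter", // hypothetical user state id
	}
	fmt.Println(key.GetTransformId(), key.GetUserStateId())
}
```
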
diff --git a/sdks/go/pkg/beam/model/jobmanagement_v1/beam_artifact_api.pb.go b/sdks/go/pkg/beam/model/jobmanagement_v1/beam_artifact_api.pb.go
index 9988c16..d2c8261 100644
--- a/sdks/go/pkg/beam/model/jobmanagement_v1/beam_artifact_api.pb.go
+++ b/sdks/go/pkg/beam/model/jobmanagement_v1/beam_artifact_api.pb.go
@@ -29,9 +29,6 @@
 	Name string `protobuf:"bytes,1,opt,name=name,proto3" json:"name,omitempty"`
 	// (Optional) The Unix-like permissions of the artifact
 	Permissions uint32 `protobuf:"varint,2,opt,name=permissions,proto3" json:"permissions,omitempty"`
-	// (Optional) The base64-encoded md5 checksum of the artifact. Used, among other things, by
-	// harness boot code to validate the integrity of the artifact.
-	Md5X string `protobuf:"bytes,3,opt,name=md5X,proto3" json:"md5X,omitempty"`
 	// (Optional) The hex-encoded sha256 checksum of the artifact. Used, among other things, by
 	// harness boot code to validate the integrity of the artifact.
 	Sha256               string   `protobuf:"bytes,4,opt,name=sha256,proto3" json:"sha256,omitempty"`
@@ -44,7 +41,7 @@
 func (m *ArtifactMetadata) String() string { return proto.CompactTextString(m) }
 func (*ArtifactMetadata) ProtoMessage()    {}
 func (*ArtifactMetadata) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_artifact_api_d51e52a8ee148278, []int{0}
+	return fileDescriptor_beam_artifact_api_09b5b695a8be46db, []int{0}
 }
 func (m *ArtifactMetadata) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ArtifactMetadata.Unmarshal(m, b)
@@ -78,13 +75,6 @@
 	return 0
 }
 
-func (m *ArtifactMetadata) GetMd5X() string {
-	if m != nil {
-		return m.Md5X
-	}
-	return ""
-}
-
 func (m *ArtifactMetadata) GetSha256() string {
 	if m != nil {
 		return m.Sha256
@@ -104,7 +94,7 @@
 func (m *Manifest) String() string { return proto.CompactTextString(m) }
 func (*Manifest) ProtoMessage()    {}
 func (*Manifest) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_artifact_api_d51e52a8ee148278, []int{1}
+	return fileDescriptor_beam_artifact_api_09b5b695a8be46db, []int{1}
 }
 func (m *Manifest) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Manifest.Unmarshal(m, b)
@@ -144,7 +134,7 @@
 func (m *ProxyManifest) String() string { return proto.CompactTextString(m) }
 func (*ProxyManifest) ProtoMessage()    {}
 func (*ProxyManifest) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_artifact_api_d51e52a8ee148278, []int{2}
+	return fileDescriptor_beam_artifact_api_09b5b695a8be46db, []int{2}
 }
 func (m *ProxyManifest) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ProxyManifest.Unmarshal(m, b)
@@ -190,7 +180,7 @@
 func (m *ProxyManifest_Location) String() string { return proto.CompactTextString(m) }
 func (*ProxyManifest_Location) ProtoMessage()    {}
 func (*ProxyManifest_Location) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_artifact_api_d51e52a8ee148278, []int{2, 0}
+	return fileDescriptor_beam_artifact_api_09b5b695a8be46db, []int{2, 0}
 }
 func (m *ProxyManifest_Location) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ProxyManifest_Location.Unmarshal(m, b)
@@ -238,7 +228,7 @@
 func (m *GetManifestRequest) String() string { return proto.CompactTextString(m) }
 func (*GetManifestRequest) ProtoMessage()    {}
 func (*GetManifestRequest) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_artifact_api_d51e52a8ee148278, []int{3}
+	return fileDescriptor_beam_artifact_api_09b5b695a8be46db, []int{3}
 }
 func (m *GetManifestRequest) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_GetManifestRequest.Unmarshal(m, b)
@@ -277,7 +267,7 @@
 func (m *GetManifestResponse) String() string { return proto.CompactTextString(m) }
 func (*GetManifestResponse) ProtoMessage()    {}
 func (*GetManifestResponse) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_artifact_api_d51e52a8ee148278, []int{4}
+	return fileDescriptor_beam_artifact_api_09b5b695a8be46db, []int{4}
 }
 func (m *GetManifestResponse) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_GetManifestResponse.Unmarshal(m, b)
@@ -320,7 +310,7 @@
 func (m *GetArtifactRequest) String() string { return proto.CompactTextString(m) }
 func (*GetArtifactRequest) ProtoMessage()    {}
 func (*GetArtifactRequest) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_artifact_api_d51e52a8ee148278, []int{5}
+	return fileDescriptor_beam_artifact_api_09b5b695a8be46db, []int{5}
 }
 func (m *GetArtifactRequest) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_GetArtifactRequest.Unmarshal(m, b)
@@ -366,7 +356,7 @@
 func (m *ArtifactChunk) String() string { return proto.CompactTextString(m) }
 func (*ArtifactChunk) ProtoMessage()    {}
 func (*ArtifactChunk) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_artifact_api_d51e52a8ee148278, []int{6}
+	return fileDescriptor_beam_artifact_api_09b5b695a8be46db, []int{6}
 }
 func (m *ArtifactChunk) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ArtifactChunk.Unmarshal(m, b)
@@ -408,7 +398,7 @@
 func (m *PutArtifactMetadata) String() string { return proto.CompactTextString(m) }
 func (*PutArtifactMetadata) ProtoMessage()    {}
 func (*PutArtifactMetadata) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_artifact_api_d51e52a8ee148278, []int{7}
+	return fileDescriptor_beam_artifact_api_09b5b695a8be46db, []int{7}
 }
 func (m *PutArtifactMetadata) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_PutArtifactMetadata.Unmarshal(m, b)
@@ -459,7 +449,7 @@
 func (m *PutArtifactRequest) String() string { return proto.CompactTextString(m) }
 func (*PutArtifactRequest) ProtoMessage()    {}
 func (*PutArtifactRequest) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_artifact_api_d51e52a8ee148278, []int{8}
+	return fileDescriptor_beam_artifact_api_09b5b695a8be46db, []int{8}
 }
 func (m *PutArtifactRequest) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_PutArtifactRequest.Unmarshal(m, b)
@@ -598,7 +588,7 @@
 func (m *PutArtifactResponse) String() string { return proto.CompactTextString(m) }
 func (*PutArtifactResponse) ProtoMessage()    {}
 func (*PutArtifactResponse) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_artifact_api_d51e52a8ee148278, []int{9}
+	return fileDescriptor_beam_artifact_api_09b5b695a8be46db, []int{9}
 }
 func (m *PutArtifactResponse) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_PutArtifactResponse.Unmarshal(m, b)
@@ -635,7 +625,7 @@
 func (m *CommitManifestRequest) String() string { return proto.CompactTextString(m) }
 func (*CommitManifestRequest) ProtoMessage()    {}
 func (*CommitManifestRequest) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_artifact_api_d51e52a8ee148278, []int{10}
+	return fileDescriptor_beam_artifact_api_09b5b695a8be46db, []int{10}
 }
 func (m *CommitManifestRequest) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_CommitManifestRequest.Unmarshal(m, b)
@@ -684,7 +674,7 @@
 func (m *CommitManifestResponse) String() string { return proto.CompactTextString(m) }
 func (*CommitManifestResponse) ProtoMessage()    {}
 func (*CommitManifestResponse) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_artifact_api_d51e52a8ee148278, []int{11}
+	return fileDescriptor_beam_artifact_api_09b5b695a8be46db, []int{11}
 }
 func (m *CommitManifestResponse) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_CommitManifestResponse.Unmarshal(m, b)
@@ -1011,49 +1001,48 @@
 }
 
 func init() {
-	proto.RegisterFile("beam_artifact_api.proto", fileDescriptor_beam_artifact_api_d51e52a8ee148278)
+	proto.RegisterFile("beam_artifact_api.proto", fileDescriptor_beam_artifact_api_09b5b695a8be46db)
 }
 
-var fileDescriptor_beam_artifact_api_d51e52a8ee148278 = []byte{
-	// 626 bytes of a gzipped FileDescriptorProto
-	0x1f, 0x8b, 0x08, 0x00, 0x00, 0x00, 0x00, 0x00, 0x02, 0xff, 0xac, 0x56, 0xc1, 0x6e, 0xd3, 0x40,
-	0x10, 0xed, 0xba, 0x55, 0x49, 0xc7, 0xb4, 0x54, 0x5b, 0xb5, 0x58, 0x39, 0x45, 0x46, 0xa2, 0x39,
-	0x59, 0xad, 0x51, 0x2b, 0x21, 0x0a, 0x55, 0xdb, 0x43, 0x7b, 0x68, 0xa4, 0xe2, 0x82, 0x84, 0xca,
-	0xc1, 0xda, 0x24, 0xdb, 0x64, 0x69, 0xbc, 0x6b, 0xec, 0x4d, 0x04, 0x77, 0x0e, 0x88, 0x1b, 0x57,
-	0x4e, 0x7c, 0x00, 0x3f, 0xc0, 0x27, 0xf0, 0x31, 0xfc, 0x03, 0xf2, 0xda, 0xeb, 0xc6, 0x8d, 0x23,
-	0x39, 0xa1, 0xb7, 0xc9, 0x6c, 0xde, 0x9b, 0x37, 0x6f, 0x66, 0x57, 0x86, 0xc7, 0x6d, 0x4a, 0x02,
-	0x9f, 0x44, 0x92, 0x5d, 0x93, 0x8e, 0xf4, 0x49, 0xc8, 0x9c, 0x30, 0x12, 0x52, 0xe0, 0x6d, 0x11,
-	0xf5, 0x1c, 0x12, 0x92, 0x4e, 0x9f, 0x3a, 0xc9, 0x7f, 0x9c, 0x40, 0x74, 0xe9, 0xc0, 0xf9, 0x20,
-	0xda, 0x7e, 0x40, 0x38, 0xe9, 0xd1, 0x80, 0x72, 0xe9, 0x8c, 0x76, 0x6d, 0x09, 0xeb, 0x47, 0x19,
-	0xbc, 0x45, 0x25, 0xe9, 0x12, 0x49, 0x30, 0x86, 0x25, 0x4e, 0x02, 0x6a, 0xa1, 0x06, 0x6a, 0xae,
-	0x78, 0x2a, 0xc6, 0x0d, 0x30, 0x43, 0x1a, 0x05, 0x2c, 0x8e, 0x99, 0xe0, 0xb1, 0x65, 0x34, 0x50,
-	0x73, 0xd5, 0x1b, 0x4f, 0x25, 0xa8, 0xa0, 0xbb, 0xf7, 0xce, 0x5a, 0x4c, 0x51, 0x49, 0x8c, 0xb7,
-	0x60, 0x39, 0xee, 0x13, 0x77, 0x6f, 0xdf, 0x5a, 0x52, 0xd9, 0xec, 0x97, 0x4d, 0xa0, 0xd6, 0x22,
-	0x9c, 0x5d, 0xd3, 0x58, 0xe2, 0xb7, 0x50, 0xd3, 0x0d, 0x58, 0xa8, 0xb1, 0xd8, 0x34, 0xdd, 0xe7,
-	0x4e, 0x45, 0xf5, 0xce, 0x5d, 0xe9, 0x5e, 0x4e, 0x65, 0xff, 0x45, 0xb0, 0x7a, 0x11, 0x89, 0x4f,
-	0x9f, 0xf3, 0x42, 0x2d, 0xa8, 0x05, 0x59, 0xac, 0x5a, 0x33, 0xdd, 0xdd, 0xca, 0x85, 0x34, 0x89,
-	0x97, 0x53, 0xe0, 0xf7, 0x50, 0x1b, 0x88, 0x0e, 0x91, 0x4c, 0x70, 0xcb, 0x50, 0xba, 0x0f, 0x2b,
-	0xd3, 0x15, 0x84, 0x39, 0xe7, 0x19, 0x8d, 0x97, 0x13, 0xd6, 0x77, 0xa0, 0xa6, 0xb3, 0xa5, 0xe3,
-	0x58, 0x87, 0xc5, 0x61, 0xc4, 0xd4, 0x18, 0x56, 0xbc, 0x24, 0xb4, 0x5f, 0x02, 0x3e, 0xa5, 0x32,
-	0xd7, 0x49, 0x3f, 0x0e, 0x13, 0x91, 0xdb, 0xf0, 0x28, 0xa2, 0x32, 0x62, 0x74, 0x44, 0x06, 0xbe,
-	0x14, 0x37, 0x94, 0x67, 0x34, 0x6b, 0x79, 0xfa, 0x4d, 0x92, 0xb5, 0xbb, 0xb0, 0x51, 0x80, 0xc7,
-	0xa1, 0xe0, 0x31, 0xbd, 0x67, 0xcf, 0xec, 0xd7, 0x4a, 0xa4, 0x9e, 0x9a, 0x16, 0x59, 0xd6, 0x60,
-	0x89, 0x70, 0xa3, 0x54, 0xf8, 0x13, 0x58, 0xd5, 0x7c, 0x27, 0xfd, 0x21, 0xbf, 0x49, 0xd8, 0x92,
-	0x55, 0x50, 0x6c, 0x0f, 0x3d, 0x15, 0xdb, 0x3f, 0x11, 0x6c, 0x5c, 0x0c, 0xe5, 0xc4, 0xa6, 0xbb,
-	0xb0, 0x19, 0x4b, 0xd2, 0x63, 0xbc, 0xe7, 0xc7, 0x54, 0xed, 0x71, 0xc1, 0xa4, 0x8d, 0xec, 0xf0,
-	0x32, 0x3d, 0x53, 0x05, 0x93, 0x7d, 0x0d, 0x32, 0xbc, 0x92, 0xf4, 0x7f, 0xfb, 0xaa, 0xa9, 0xec,
-	0x3f, 0x08, 0xf0, 0x98, 0x44, 0xed, 0xcd, 0xd5, 0x58, 0xb5, 0x74, 0x00, 0x07, 0xd5, 0xb7, 0x6c,
-	0xb2, 0xe3, 0xb3, 0x85, 0xdb, 0x92, 0xf8, 0x3c, 0x73, 0x2a, 0xed, 0x62, 0x7f, 0xe6, 0x2e, 0x94,
-	0xdf, 0x67, 0x0b, 0xa9, 0xc7, 0xc7, 0x2b, 0xf0, 0xa0, 0x23, 0xb8, 0xa4, 0x5c, 0xda, 0x9b, 0x05,
-	0xb7, 0xf5, 0x32, 0xd9, 0x3f, 0x10, 0x6c, 0x9e, 0x88, 0x20, 0x60, 0x13, 0x6b, 0x7a, 0xcf, 0x57,
-	0x73, 0xea, 0x58, 0x8d, 0xa9, 0x63, 0xb5, 0x8f, 0x60, 0xeb, 0xae, 0xb6, 0xec, 0x0e, 0x54, 0xbd,
-	0x43, 0xee, 0x6f, 0x03, 0xb6, 0x74, 0xd3, 0x97, 0xba, 0x44, 0x34, 0x62, 0x1d, 0x8a, 0xbf, 0x21,
-	0x30, 0xc7, 0x2c, 0xc1, 0x2f, 0xe6, 0x19, 0x62, 0xe6, 0x56, 0xfd, 0x60, 0x3e, 0x70, 0xda, 0x4e,
-	0x13, 0xe1, 0xef, 0x08, 0xd6, 0x8a, 0xbd, 0xe2, 0x57, 0x95, 0x29, 0x4b, 0x07, 0x58, 0x3f, 0x9c,
-	0x1b, 0x9f, 0xaa, 0x72, 0x7f, 0x19, 0x60, 0xdd, 0x4a, 0xcd, 0x6c, 0xd5, 0xee, 0x7d, 0x45, 0x60,
-	0x8e, 0xbd, 0x4e, 0x33, 0xb8, 0x37, 0xf9, 0x24, 0xce, 0xe0, 0x5e, 0xd9, 0x83, 0xf8, 0x25, 0x95,
-	0x32, 0xc7, 0x20, 0x27, 0x1f, 0xbe, 0xfa, 0x9c, 0x57, 0x6e, 0x07, 0x1d, 0x9f, 0xc2, 0xd3, 0xa9,
-	0xd0, 0x02, 0xf2, 0xd8, 0xd4, 0xd0, 0xa3, 0x90, 0x5d, 0xad, 0x17, 0x8e, 0xfd, 0xd1, 0x6e, 0x7b,
-	0x59, 0x7d, 0x2f, 0x3c, 0xfb, 0x17, 0x00, 0x00, 0xff, 0xff, 0x03, 0x0a, 0x3b, 0x02, 0x4a, 0x08,
-	0x00, 0x00,
+var fileDescriptor_beam_artifact_api_09b5b695a8be46db = []byte{
+	// 618 bytes of a gzipped FileDescriptorProto
+	0x1f, 0x8b, 0x08, 0x00, 0x00, 0x00, 0x00, 0x00, 0x02, 0xff, 0xac, 0x96, 0xc1, 0x6e, 0xd3, 0x4c,
+	0x10, 0xc7, 0xbb, 0x6e, 0xd5, 0x2f, 0x1d, 0x7f, 0x2d, 0xd5, 0x56, 0x2d, 0x56, 0x4e, 0x91, 0x91,
+	0x68, 0x4e, 0x56, 0x6b, 0x44, 0x25, 0x44, 0xa1, 0x6a, 0x7a, 0x68, 0x0f, 0x8d, 0x54, 0x5c, 0xb8,
+	0x94, 0x83, 0xd9, 0x24, 0xdb, 0x64, 0x69, 0xbc, 0x6b, 0xec, 0x4d, 0x04, 0x77, 0x0e, 0x88, 0x1b,
+	0x57, 0x4e, 0x3c, 0x00, 0x2f, 0xc0, 0x23, 0xf0, 0x30, 0xbc, 0x03, 0xf2, 0xda, 0xeb, 0xc6, 0x8d,
+	0x23, 0x39, 0xa1, 0xb7, 0xcd, 0x6e, 0xe6, 0x3f, 0xbf, 0xf9, 0xcf, 0xec, 0xca, 0xf0, 0xb0, 0x43,
+	0x49, 0xe0, 0x93, 0x48, 0xb2, 0x6b, 0xd2, 0x95, 0x3e, 0x09, 0x99, 0x13, 0x46, 0x42, 0x0a, 0xbc,
+	0x2b, 0xa2, 0xbe, 0x43, 0x42, 0xd2, 0x1d, 0x50, 0x27, 0xf9, 0x8f, 0x13, 0x88, 0x1e, 0x1d, 0x3a,
+	0xef, 0x45, 0xc7, 0x0f, 0x08, 0x27, 0x7d, 0x1a, 0x50, 0x2e, 0x9d, 0xf1, 0xbe, 0xfd, 0x0e, 0x36,
+	0x8f, 0xb3, 0xf0, 0x36, 0x95, 0xa4, 0x47, 0x24, 0xc1, 0x18, 0x56, 0x38, 0x09, 0xa8, 0x85, 0x1a,
+	0xa8, 0xb9, 0xe6, 0xa9, 0x35, 0x6e, 0x80, 0x19, 0xd2, 0x28, 0x60, 0x71, 0xcc, 0x04, 0x8f, 0x2d,
+	0xa3, 0x81, 0x9a, 0xeb, 0xde, 0xe4, 0x16, 0xde, 0x81, 0xd5, 0x78, 0x40, 0xdc, 0xa7, 0x07, 0xd6,
+	0x8a, 0x8a, 0xcb, 0x7e, 0xd9, 0x04, 0x6a, 0x6d, 0xc2, 0xd9, 0x35, 0x8d, 0x25, 0x7e, 0x03, 0x35,
+	0x0d, 0x6b, 0xa1, 0xc6, 0x72, 0xd3, 0x74, 0x9f, 0x39, 0x15, 0x49, 0x9d, 0xbb, 0x98, 0x5e, 0x2e,
+	0x65, 0xff, 0x41, 0xb0, 0x7e, 0x11, 0x89, 0x8f, 0x9f, 0xf2, 0x44, 0x6d, 0xa8, 0x05, 0xd9, 0x5a,
+	0x95, 0x61, 0xba, 0xfb, 0x95, 0x13, 0x69, 0x11, 0x2f, 0x97, 0xc0, 0x6f, 0xa1, 0x36, 0x14, 0x5d,
+	0x22, 0x99, 0xe0, 0x96, 0xa1, 0xb8, 0x8f, 0x2a, 0xcb, 0x15, 0xc0, 0x9c, 0xf3, 0x4c, 0xc6, 0xcb,
+	0x05, 0xeb, 0x7b, 0x50, 0xd3, 0xbb, 0xa5, 0xd6, 0x6f, 0xc2, 0xf2, 0x28, 0x62, 0xca, 0xf2, 0x35,
+	0x2f, 0x59, 0xda, 0x2f, 0x00, 0x9f, 0x52, 0x99, 0x73, 0xd2, 0x0f, 0xa3, 0x04, 0x72, 0x17, 0x1e,
+	0x44, 0x54, 0x46, 0x8c, 0x8e, 0xc9, 0xd0, 0x97, 0xe2, 0x86, 0xf2, 0x4c, 0x66, 0x23, 0xdf, 0x7e,
+	0x9d, 0xec, 0xda, 0x3d, 0xd8, 0x2a, 0x84, 0xc7, 0xa1, 0xe0, 0x31, 0xbd, 0x67, 0xcf, 0xec, 0x57,
+	0x0a, 0x52, 0x77, 0x4d, 0x43, 0x96, 0x15, 0x58, 0x02, 0x6e, 0x94, 0x82, 0x3f, 0x82, 0x75, 0xad,
+	0x77, 0x32, 0x18, 0xf1, 0x9b, 0x44, 0x2d, 0x19, 0x05, 0xa5, 0xf6, 0xbf, 0xa7, 0xd6, 0xf6, 0x0f,
+	0x04, 0x5b, 0x17, 0x23, 0x39, 0x35, 0xd5, 0x2e, 0x6c, 0xc7, 0x92, 0xf4, 0x19, 0xef, 0xfb, 0x31,
+	0x55, 0x33, 0x5b, 0x30, 0x69, 0x2b, 0x3b, 0xbc, 0x4c, 0xcf, 0x54, 0xc2, 0x64, 0x5e, 0x83, 0x2c,
+	0x5e, 0x21, 0xfd, 0xdb, 0xbc, 0x6a, 0x29, 0xfb, 0x37, 0x02, 0x3c, 0x81, 0xa8, 0xbd, 0xb9, 0x9a,
+	0xc8, 0x96, 0x36, 0xe0, 0xb0, 0xfa, 0x94, 0x4d, 0x57, 0x7c, 0xb6, 0x74, 0x9b, 0x12, 0x9f, 0x67,
+	0x4e, 0xa5, 0x55, 0x1c, 0xcc, 0x5d, 0x85, 0xf2, 0xfb, 0x6c, 0x29, 0xf5, 0xb8, 0xb5, 0x06, 0xff,
+	0x75, 0x05, 0x97, 0x94, 0x4b, 0x7b, 0xbb, 0xe0, 0xb6, 0x1e, 0x26, 0xfb, 0x3b, 0x82, 0xed, 0x13,
+	0x11, 0x04, 0x6c, 0x6a, 0x4c, 0xef, 0xf9, 0x6a, 0xce, 0x6c, 0xab, 0x31, 0xb3, 0xad, 0xf6, 0x31,
+	0xec, 0xdc, 0x65, 0xcb, 0xee, 0x40, 0xd5, 0x3b, 0xe4, 0xfe, 0x32, 0x60, 0x47, 0x17, 0x7d, 0xa9,
+	0x53, 0x44, 0x63, 0xd6, 0xa5, 0xf8, 0x2b, 0x02, 0x73, 0xc2, 0x12, 0xfc, 0x7c, 0x91, 0x26, 0x66,
+	0x6e, 0xd5, 0x0f, 0x17, 0x0b, 0x4e, 0xcb, 0x69, 0x22, 0xfc, 0x0d, 0xc1, 0x46, 0xb1, 0x56, 0xfc,
+	0xb2, 0xb2, 0x64, 0x69, 0x03, 0xeb, 0x47, 0x0b, 0xc7, 0xa7, 0x54, 0xee, 0x4f, 0x03, 0xac, 0x5b,
+	0xd4, 0xcc, 0x56, 0xed, 0xde, 0x17, 0x04, 0xe6, 0xc4, 0xeb, 0x34, 0x87, 0x7b, 0xd3, 0x4f, 0xe2,
+	0x1c, 0xee, 0x95, 0x3d, 0x88, 0x9f, 0x53, 0x94, 0x05, 0x1a, 0x39, 0xfd, 0xf0, 0xd5, 0x17, 0xbc,
+	0x72, 0x7b, 0xa8, 0x75, 0x0a, 0x8f, 0x67, 0x86, 0x16, 0x22, 0x5b, 0xa6, 0x0e, 0x3d, 0x0e, 0xd9,
+	0xd5, 0x66, 0xe1, 0xd8, 0x1f, 0xef, 0x77, 0x56, 0xd5, 0xb7, 0xc1, 0x93, 0xbf, 0x01, 0x00, 0x00,
+	0xff, 0xff, 0xb2, 0x30, 0x58, 0x4f, 0x36, 0x08, 0x00, 0x00,
 }
diff --git a/sdks/go/pkg/beam/model/pipeline_v1/beam_runner_api.pb.go b/sdks/go/pkg/beam/model/pipeline_v1/beam_runner_api.pb.go
index 0a307eb..8d8face 100644
--- a/sdks/go/pkg/beam/model/pipeline_v1/beam_runner_api.pb.go
+++ b/sdks/go/pkg/beam/model/pipeline_v1/beam_runner_api.pb.go
@@ -51,7 +51,7 @@
 	return proto.EnumName(BeamConstants_Constants_name, int32(x))
 }
 func (BeamConstants_Constants) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{0, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{0, 0}
 }
 
 type StandardPTransforms_Primitives int32
@@ -130,7 +130,7 @@
 	return proto.EnumName(StandardPTransforms_Primitives_name, int32(x))
 }
 func (StandardPTransforms_Primitives) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{4, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{4, 0}
 }
 
 type StandardPTransforms_DeprecatedPrimitives int32
@@ -157,7 +157,7 @@
 	return proto.EnumName(StandardPTransforms_DeprecatedPrimitives_name, int32(x))
 }
 func (StandardPTransforms_DeprecatedPrimitives) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{4, 1}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{4, 1}
 }
 
 type StandardPTransforms_Composites int32
@@ -196,7 +196,7 @@
 	return proto.EnumName(StandardPTransforms_Composites_name, int32(x))
 }
 func (StandardPTransforms_Composites) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{4, 2}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{4, 2}
 }
 
 // Payload for all of these: CombinePayload
@@ -242,7 +242,7 @@
 	return proto.EnumName(StandardPTransforms_CombineComponents_name, int32(x))
 }
 func (StandardPTransforms_CombineComponents) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{4, 3}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{4, 3}
 }
 
 // Payload for all of these: ParDoPayload containing the user's SDF
@@ -302,7 +302,7 @@
 	return proto.EnumName(StandardPTransforms_SplittableParDoComponents_name, int32(x))
 }
 func (StandardPTransforms_SplittableParDoComponents) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{4, 4}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{4, 4}
 }
 
 type StandardSideInputTypes_Enum int32
@@ -325,7 +325,7 @@
 	return proto.EnumName(StandardSideInputTypes_Enum_name, int32(x))
 }
 func (StandardSideInputTypes_Enum) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{5, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{5, 0}
 }
 
 type Parameter_Type_Enum int32
@@ -354,7 +354,7 @@
 	return proto.EnumName(Parameter_Type_Enum_name, int32(x))
 }
 func (Parameter_Type_Enum) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{8, 0, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{8, 0, 0}
 }
 
 type IsBounded_Enum int32
@@ -380,7 +380,7 @@
 	return proto.EnumName(IsBounded_Enum_name, int32(x))
 }
 func (IsBounded_Enum) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{16, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{16, 0}
 }
 
 type StandardCoders_Enum int32
@@ -497,69 +497,7 @@
 	return proto.EnumName(StandardCoders_Enum_name, int32(x))
 }
 func (StandardCoders_Enum) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{23, 0}
-}
-
-type Schema_TypeName int32
-
-const (
-	Schema_BYTE         Schema_TypeName = 0
-	Schema_INT16        Schema_TypeName = 1
-	Schema_INT32        Schema_TypeName = 2
-	Schema_INT64        Schema_TypeName = 3
-	Schema_DECIMAL      Schema_TypeName = 4
-	Schema_FLOAT        Schema_TypeName = 5
-	Schema_DOUBLE       Schema_TypeName = 6
-	Schema_STRING       Schema_TypeName = 7
-	Schema_DATETIME     Schema_TypeName = 8
-	Schema_BOOLEAN      Schema_TypeName = 9
-	Schema_BYTES        Schema_TypeName = 10
-	Schema_ARRAY        Schema_TypeName = 11
-	Schema_MAP          Schema_TypeName = 13
-	Schema_ROW          Schema_TypeName = 14
-	Schema_LOGICAL_TYPE Schema_TypeName = 15
-)
-
-var Schema_TypeName_name = map[int32]string{
-	0:  "BYTE",
-	1:  "INT16",
-	2:  "INT32",
-	3:  "INT64",
-	4:  "DECIMAL",
-	5:  "FLOAT",
-	6:  "DOUBLE",
-	7:  "STRING",
-	8:  "DATETIME",
-	9:  "BOOLEAN",
-	10: "BYTES",
-	11: "ARRAY",
-	13: "MAP",
-	14: "ROW",
-	15: "LOGICAL_TYPE",
-}
-var Schema_TypeName_value = map[string]int32{
-	"BYTE":         0,
-	"INT16":        1,
-	"INT32":        2,
-	"INT64":        3,
-	"DECIMAL":      4,
-	"FLOAT":        5,
-	"DOUBLE":       6,
-	"STRING":       7,
-	"DATETIME":     8,
-	"BOOLEAN":      9,
-	"BYTES":        10,
-	"ARRAY":        11,
-	"MAP":          13,
-	"ROW":          14,
-	"LOGICAL_TYPE": 15,
-}
-
-func (x Schema_TypeName) String() string {
-	return proto.EnumName(Schema_TypeName_name, int32(x))
-}
-func (Schema_TypeName) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{24, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{23, 0}
 }
 
 type MergeStatus_Enum int32
@@ -596,7 +534,7 @@
 	return proto.EnumName(MergeStatus_Enum_name, int32(x))
 }
 func (MergeStatus_Enum) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{26, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{25, 0}
 }
 
 type AccumulationMode_Enum int32
@@ -628,7 +566,7 @@
 	return proto.EnumName(AccumulationMode_Enum_name, int32(x))
 }
 func (AccumulationMode_Enum) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{27, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{26, 0}
 }
 
 type ClosingBehavior_Enum int32
@@ -657,7 +595,7 @@
 	return proto.EnumName(ClosingBehavior_Enum_name, int32(x))
 }
 func (ClosingBehavior_Enum) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{28, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{27, 0}
 }
 
 type OnTimeBehavior_Enum int32
@@ -686,7 +624,7 @@
 	return proto.EnumName(OnTimeBehavior_Enum_name, int32(x))
 }
 func (OnTimeBehavior_Enum) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{29, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{28, 0}
 }
 
 type OutputTime_Enum int32
@@ -720,7 +658,7 @@
 	return proto.EnumName(OutputTime_Enum_name, int32(x))
 }
 func (OutputTime_Enum) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{30, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{29, 0}
 }
 
 type TimeDomain_Enum int32
@@ -757,7 +695,7 @@
 	return proto.EnumName(TimeDomain_Enum_name, int32(x))
 }
 func (TimeDomain_Enum) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{31, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{30, 0}
 }
 
 type StandardEnvironments_Environments int32
@@ -783,7 +721,7 @@
 	return proto.EnumName(StandardEnvironments_Environments_name, int32(x))
 }
 func (StandardEnvironments_Environments) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{36, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{35, 0}
 }
 
 type DisplayData_Type_Enum int32
@@ -824,7 +762,7 @@
 	return proto.EnumName(DisplayData_Type_Enum_name, int32(x))
 }
 func (DisplayData_Type_Enum) EnumDescriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{42, 2, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{41, 2, 0}
 }
 
 type BeamConstants struct {
@@ -837,7 +775,7 @@
 func (m *BeamConstants) String() string { return proto.CompactTextString(m) }
 func (*BeamConstants) ProtoMessage()    {}
 func (*BeamConstants) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{0}
 }
 func (m *BeamConstants) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_BeamConstants.Unmarshal(m, b)
@@ -879,7 +817,7 @@
 func (m *Components) String() string { return proto.CompactTextString(m) }
 func (*Components) ProtoMessage()    {}
 func (*Components) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{1}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{1}
 }
 func (m *Components) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Components.Unmarshal(m, b)
@@ -963,7 +901,7 @@
 func (m *Pipeline) String() string { return proto.CompactTextString(m) }
 func (*Pipeline) ProtoMessage()    {}
 func (*Pipeline) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{2}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{2}
 }
 func (m *Pipeline) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Pipeline.Unmarshal(m, b)
@@ -1077,7 +1015,7 @@
 func (m *PTransform) String() string { return proto.CompactTextString(m) }
 func (*PTransform) ProtoMessage()    {}
 func (*PTransform) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{3}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{3}
 }
 func (m *PTransform) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_PTransform.Unmarshal(m, b)
@@ -1149,7 +1087,7 @@
 func (m *StandardPTransforms) String() string { return proto.CompactTextString(m) }
 func (*StandardPTransforms) ProtoMessage()    {}
 func (*StandardPTransforms) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{4}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{4}
 }
 func (m *StandardPTransforms) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StandardPTransforms.Unmarshal(m, b)
@@ -1179,7 +1117,7 @@
 func (m *StandardSideInputTypes) String() string { return proto.CompactTextString(m) }
 func (*StandardSideInputTypes) ProtoMessage()    {}
 func (*StandardSideInputTypes) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{5}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{5}
 }
 func (m *StandardSideInputTypes) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StandardSideInputTypes.Unmarshal(m, b)
@@ -1229,7 +1167,7 @@
 func (m *PCollection) String() string { return proto.CompactTextString(m) }
 func (*PCollection) ProtoMessage()    {}
 func (*PCollection) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{6}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{6}
 }
 func (m *PCollection) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_PCollection.Unmarshal(m, b)
@@ -1314,7 +1252,7 @@
 func (m *ParDoPayload) String() string { return proto.CompactTextString(m) }
 func (*ParDoPayload) ProtoMessage()    {}
 func (*ParDoPayload) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{7}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{7}
 }
 func (m *ParDoPayload) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ParDoPayload.Unmarshal(m, b)
@@ -1415,7 +1353,7 @@
 func (m *Parameter) String() string { return proto.CompactTextString(m) }
 func (*Parameter) ProtoMessage()    {}
 func (*Parameter) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{8}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{8}
 }
 func (m *Parameter) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Parameter.Unmarshal(m, b)
@@ -1452,7 +1390,7 @@
 func (m *Parameter_Type) String() string { return proto.CompactTextString(m) }
 func (*Parameter_Type) ProtoMessage()    {}
 func (*Parameter_Type) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{8, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{8, 0}
 }
 func (m *Parameter_Type) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Parameter_Type.Unmarshal(m, b)
@@ -1489,7 +1427,7 @@
 func (m *StateSpec) String() string { return proto.CompactTextString(m) }
 func (*StateSpec) ProtoMessage()    {}
 func (*StateSpec) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{9}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{9}
 }
 func (m *StateSpec) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StateSpec.Unmarshal(m, b)
@@ -1719,7 +1657,7 @@
 func (m *ValueStateSpec) String() string { return proto.CompactTextString(m) }
 func (*ValueStateSpec) ProtoMessage()    {}
 func (*ValueStateSpec) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{10}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{10}
 }
 func (m *ValueStateSpec) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ValueStateSpec.Unmarshal(m, b)
@@ -1757,7 +1695,7 @@
 func (m *BagStateSpec) String() string { return proto.CompactTextString(m) }
 func (*BagStateSpec) ProtoMessage()    {}
 func (*BagStateSpec) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{11}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{11}
 }
 func (m *BagStateSpec) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_BagStateSpec.Unmarshal(m, b)
@@ -1796,7 +1734,7 @@
 func (m *CombiningStateSpec) String() string { return proto.CompactTextString(m) }
 func (*CombiningStateSpec) ProtoMessage()    {}
 func (*CombiningStateSpec) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{12}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{12}
 }
 func (m *CombiningStateSpec) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_CombiningStateSpec.Unmarshal(m, b)
@@ -1842,7 +1780,7 @@
 func (m *MapStateSpec) String() string { return proto.CompactTextString(m) }
 func (*MapStateSpec) ProtoMessage()    {}
 func (*MapStateSpec) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{13}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{13}
 }
 func (m *MapStateSpec) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_MapStateSpec.Unmarshal(m, b)
@@ -1887,7 +1825,7 @@
 func (m *SetStateSpec) String() string { return proto.CompactTextString(m) }
 func (*SetStateSpec) ProtoMessage()    {}
 func (*SetStateSpec) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{14}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{14}
 }
 func (m *SetStateSpec) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_SetStateSpec.Unmarshal(m, b)
@@ -1926,7 +1864,7 @@
 func (m *TimerSpec) String() string { return proto.CompactTextString(m) }
 func (*TimerSpec) ProtoMessage()    {}
 func (*TimerSpec) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{15}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{15}
 }
 func (m *TimerSpec) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_TimerSpec.Unmarshal(m, b)
@@ -1970,7 +1908,7 @@
 func (m *IsBounded) String() string { return proto.CompactTextString(m) }
 func (*IsBounded) ProtoMessage()    {}
 func (*IsBounded) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{16}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{16}
 }
 func (m *IsBounded) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_IsBounded.Unmarshal(m, b)
@@ -2005,7 +1943,7 @@
 func (m *ReadPayload) String() string { return proto.CompactTextString(m) }
 func (*ReadPayload) ProtoMessage()    {}
 func (*ReadPayload) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{17}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{17}
 }
 func (m *ReadPayload) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ReadPayload.Unmarshal(m, b)
@@ -2052,7 +1990,7 @@
 func (m *WindowIntoPayload) String() string { return proto.CompactTextString(m) }
 func (*WindowIntoPayload) ProtoMessage()    {}
 func (*WindowIntoPayload) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{18}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{18}
 }
 func (m *WindowIntoPayload) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_WindowIntoPayload.Unmarshal(m, b)
@@ -2094,7 +2032,7 @@
 func (m *CombinePayload) String() string { return proto.CompactTextString(m) }
 func (*CombinePayload) ProtoMessage()    {}
 func (*CombinePayload) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{19}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{19}
 }
 func (m *CombinePayload) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_CombinePayload.Unmarshal(m, b)
@@ -2142,7 +2080,7 @@
 func (m *TestStreamPayload) String() string { return proto.CompactTextString(m) }
 func (*TestStreamPayload) ProtoMessage()    {}
 func (*TestStreamPayload) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{20}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{20}
 }
 func (m *TestStreamPayload) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_TestStreamPayload.Unmarshal(m, b)
@@ -2191,7 +2129,7 @@
 func (m *TestStreamPayload_Event) String() string { return proto.CompactTextString(m) }
 func (*TestStreamPayload_Event) ProtoMessage()    {}
 func (*TestStreamPayload_Event) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{20, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{20, 0}
 }
 func (m *TestStreamPayload_Event) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_TestStreamPayload_Event.Unmarshal(m, b)
@@ -2363,7 +2301,7 @@
 func (m *TestStreamPayload_Event_AdvanceWatermark) String() string { return proto.CompactTextString(m) }
 func (*TestStreamPayload_Event_AdvanceWatermark) ProtoMessage()    {}
 func (*TestStreamPayload_Event_AdvanceWatermark) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{20, 0, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{20, 0, 0}
 }
 func (m *TestStreamPayload_Event_AdvanceWatermark) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_TestStreamPayload_Event_AdvanceWatermark.Unmarshal(m, b)
@@ -2405,7 +2343,7 @@
 }
 func (*TestStreamPayload_Event_AdvanceProcessingTime) ProtoMessage() {}
 func (*TestStreamPayload_Event_AdvanceProcessingTime) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{20, 0, 1}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{20, 0, 1}
 }
 func (m *TestStreamPayload_Event_AdvanceProcessingTime) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_TestStreamPayload_Event_AdvanceProcessingTime.Unmarshal(m, b)
@@ -2443,7 +2381,7 @@
 func (m *TestStreamPayload_Event_AddElements) String() string { return proto.CompactTextString(m) }
 func (*TestStreamPayload_Event_AddElements) ProtoMessage()    {}
 func (*TestStreamPayload_Event_AddElements) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{20, 0, 2}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{20, 0, 2}
 }
 func (m *TestStreamPayload_Event_AddElements) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_TestStreamPayload_Event_AddElements.Unmarshal(m, b)
@@ -2482,7 +2420,7 @@
 func (m *TestStreamPayload_TimestampedElement) String() string { return proto.CompactTextString(m) }
 func (*TestStreamPayload_TimestampedElement) ProtoMessage()    {}
 func (*TestStreamPayload_TimestampedElement) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{20, 1}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{20, 1}
 }
 func (m *TestStreamPayload_TimestampedElement) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_TestStreamPayload_TimestampedElement.Unmarshal(m, b)
@@ -2534,7 +2472,7 @@
 func (m *WriteFilesPayload) String() string { return proto.CompactTextString(m) }
 func (*WriteFilesPayload) ProtoMessage()    {}
 func (*WriteFilesPayload) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{21}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{21}
 }
 func (m *WriteFilesPayload) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_WriteFilesPayload.Unmarshal(m, b)
@@ -2611,7 +2549,7 @@
 func (m *Coder) String() string { return proto.CompactTextString(m) }
 func (*Coder) ProtoMessage()    {}
 func (*Coder) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{22}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{22}
 }
 func (m *Coder) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Coder.Unmarshal(m, b)
@@ -2655,7 +2593,7 @@
 func (m *StandardCoders) String() string { return proto.CompactTextString(m) }
 func (*StandardCoders) ProtoMessage()    {}
 func (*StandardCoders) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{23}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{23}
 }
 func (m *StandardCoders) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StandardCoders.Unmarshal(m, b)
@@ -2675,452 +2613,6 @@
 
 var xxx_messageInfo_StandardCoders proto.InternalMessageInfo
 
-// Experimental: A representation of a Beam Schema.
-type Schema struct {
-	Fields               []*Schema_Field `protobuf:"bytes,1,rep,name=fields,proto3" json:"fields,omitempty"`
-	Id                   string          `protobuf:"bytes,2,opt,name=id,proto3" json:"id,omitempty"`
-	XXX_NoUnkeyedLiteral struct{}        `json:"-"`
-	XXX_unrecognized     []byte          `json:"-"`
-	XXX_sizecache        int32           `json:"-"`
-}
-
-func (m *Schema) Reset()         { *m = Schema{} }
-func (m *Schema) String() string { return proto.CompactTextString(m) }
-func (*Schema) ProtoMessage()    {}
-func (*Schema) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{24}
-}
-func (m *Schema) XXX_Unmarshal(b []byte) error {
-	return xxx_messageInfo_Schema.Unmarshal(m, b)
-}
-func (m *Schema) XXX_Marshal(b []byte, deterministic bool) ([]byte, error) {
-	return xxx_messageInfo_Schema.Marshal(b, m, deterministic)
-}
-func (dst *Schema) XXX_Merge(src proto.Message) {
-	xxx_messageInfo_Schema.Merge(dst, src)
-}
-func (m *Schema) XXX_Size() int {
-	return xxx_messageInfo_Schema.Size(m)
-}
-func (m *Schema) XXX_DiscardUnknown() {
-	xxx_messageInfo_Schema.DiscardUnknown(m)
-}
-
-var xxx_messageInfo_Schema proto.InternalMessageInfo
-
-func (m *Schema) GetFields() []*Schema_Field {
-	if m != nil {
-		return m.Fields
-	}
-	return nil
-}
-
-func (m *Schema) GetId() string {
-	if m != nil {
-		return m.Id
-	}
-	return ""
-}
-
-type Schema_LogicalType struct {
-	Id                   string            `protobuf:"bytes,1,opt,name=id,proto3" json:"id,omitempty"`
-	Args                 string            `protobuf:"bytes,2,opt,name=args,proto3" json:"args,omitempty"`
-	BaseType             *Schema_FieldType `protobuf:"bytes,3,opt,name=base_type,json=baseType,proto3" json:"base_type,omitempty"`
-	SerializedClass      []byte            `protobuf:"bytes,4,opt,name=serialized_class,json=serializedClass,proto3" json:"serialized_class,omitempty"`
-	XXX_NoUnkeyedLiteral struct{}          `json:"-"`
-	XXX_unrecognized     []byte            `json:"-"`
-	XXX_sizecache        int32             `json:"-"`
-}
-
-func (m *Schema_LogicalType) Reset()         { *m = Schema_LogicalType{} }
-func (m *Schema_LogicalType) String() string { return proto.CompactTextString(m) }
-func (*Schema_LogicalType) ProtoMessage()    {}
-func (*Schema_LogicalType) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{24, 0}
-}
-func (m *Schema_LogicalType) XXX_Unmarshal(b []byte) error {
-	return xxx_messageInfo_Schema_LogicalType.Unmarshal(m, b)
-}
-func (m *Schema_LogicalType) XXX_Marshal(b []byte, deterministic bool) ([]byte, error) {
-	return xxx_messageInfo_Schema_LogicalType.Marshal(b, m, deterministic)
-}
-func (dst *Schema_LogicalType) XXX_Merge(src proto.Message) {
-	xxx_messageInfo_Schema_LogicalType.Merge(dst, src)
-}
-func (m *Schema_LogicalType) XXX_Size() int {
-	return xxx_messageInfo_Schema_LogicalType.Size(m)
-}
-func (m *Schema_LogicalType) XXX_DiscardUnknown() {
-	xxx_messageInfo_Schema_LogicalType.DiscardUnknown(m)
-}
-
-var xxx_messageInfo_Schema_LogicalType proto.InternalMessageInfo
-
-func (m *Schema_LogicalType) GetId() string {
-	if m != nil {
-		return m.Id
-	}
-	return ""
-}
-
-func (m *Schema_LogicalType) GetArgs() string {
-	if m != nil {
-		return m.Args
-	}
-	return ""
-}
-
-func (m *Schema_LogicalType) GetBaseType() *Schema_FieldType {
-	if m != nil {
-		return m.BaseType
-	}
-	return nil
-}
-
-func (m *Schema_LogicalType) GetSerializedClass() []byte {
-	if m != nil {
-		return m.SerializedClass
-	}
-	return nil
-}
-
-type Schema_MapType struct {
-	KeyType              *Schema_FieldType `protobuf:"bytes,1,opt,name=key_type,json=keyType,proto3" json:"key_type,omitempty"`
-	ValueType            *Schema_FieldType `protobuf:"bytes,2,opt,name=value_type,json=valueType,proto3" json:"value_type,omitempty"`
-	XXX_NoUnkeyedLiteral struct{}          `json:"-"`
-	XXX_unrecognized     []byte            `json:"-"`
-	XXX_sizecache        int32             `json:"-"`
-}
-
-func (m *Schema_MapType) Reset()         { *m = Schema_MapType{} }
-func (m *Schema_MapType) String() string { return proto.CompactTextString(m) }
-func (*Schema_MapType) ProtoMessage()    {}
-func (*Schema_MapType) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{24, 1}
-}
-func (m *Schema_MapType) XXX_Unmarshal(b []byte) error {
-	return xxx_messageInfo_Schema_MapType.Unmarshal(m, b)
-}
-func (m *Schema_MapType) XXX_Marshal(b []byte, deterministic bool) ([]byte, error) {
-	return xxx_messageInfo_Schema_MapType.Marshal(b, m, deterministic)
-}
-func (dst *Schema_MapType) XXX_Merge(src proto.Message) {
-	xxx_messageInfo_Schema_MapType.Merge(dst, src)
-}
-func (m *Schema_MapType) XXX_Size() int {
-	return xxx_messageInfo_Schema_MapType.Size(m)
-}
-func (m *Schema_MapType) XXX_DiscardUnknown() {
-	xxx_messageInfo_Schema_MapType.DiscardUnknown(m)
-}
-
-var xxx_messageInfo_Schema_MapType proto.InternalMessageInfo
-
-func (m *Schema_MapType) GetKeyType() *Schema_FieldType {
-	if m != nil {
-		return m.KeyType
-	}
-	return nil
-}
-
-func (m *Schema_MapType) GetValueType() *Schema_FieldType {
-	if m != nil {
-		return m.ValueType
-	}
-	return nil
-}
-
-type Schema_FieldType struct {
-	TypeName Schema_TypeName `protobuf:"varint,1,opt,name=type_name,json=typeName,proto3,enum=org.apache.beam.model.pipeline.v1.Schema_TypeName" json:"type_name,omitempty"`
-	Nullable bool            `protobuf:"varint,2,opt,name=nullable,proto3" json:"nullable,omitempty"`
-	// Types that are valid to be assigned to TypeInfo:
-	//	*Schema_FieldType_CollectionElementType
-	//	*Schema_FieldType_MapType
-	//	*Schema_FieldType_RowSchema
-	//	*Schema_FieldType_LogicalType
-	TypeInfo             isSchema_FieldType_TypeInfo `protobuf_oneof:"type_info"`
-	XXX_NoUnkeyedLiteral struct{}                    `json:"-"`
-	XXX_unrecognized     []byte                      `json:"-"`
-	XXX_sizecache        int32                       `json:"-"`
-}
-
-func (m *Schema_FieldType) Reset()         { *m = Schema_FieldType{} }
-func (m *Schema_FieldType) String() string { return proto.CompactTextString(m) }
-func (*Schema_FieldType) ProtoMessage()    {}
-func (*Schema_FieldType) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{24, 2}
-}
-func (m *Schema_FieldType) XXX_Unmarshal(b []byte) error {
-	return xxx_messageInfo_Schema_FieldType.Unmarshal(m, b)
-}
-func (m *Schema_FieldType) XXX_Marshal(b []byte, deterministic bool) ([]byte, error) {
-	return xxx_messageInfo_Schema_FieldType.Marshal(b, m, deterministic)
-}
-func (dst *Schema_FieldType) XXX_Merge(src proto.Message) {
-	xxx_messageInfo_Schema_FieldType.Merge(dst, src)
-}
-func (m *Schema_FieldType) XXX_Size() int {
-	return xxx_messageInfo_Schema_FieldType.Size(m)
-}
-func (m *Schema_FieldType) XXX_DiscardUnknown() {
-	xxx_messageInfo_Schema_FieldType.DiscardUnknown(m)
-}
-
-var xxx_messageInfo_Schema_FieldType proto.InternalMessageInfo
-
-type isSchema_FieldType_TypeInfo interface {
-	isSchema_FieldType_TypeInfo()
-}
-
-type Schema_FieldType_CollectionElementType struct {
-	CollectionElementType *Schema_FieldType `protobuf:"bytes,3,opt,name=collection_element_type,json=collectionElementType,proto3,oneof"`
-}
-type Schema_FieldType_MapType struct {
-	MapType *Schema_MapType `protobuf:"bytes,4,opt,name=map_type,json=mapType,proto3,oneof"`
-}
-type Schema_FieldType_RowSchema struct {
-	RowSchema *Schema `protobuf:"bytes,5,opt,name=row_schema,json=rowSchema,proto3,oneof"`
-}
-type Schema_FieldType_LogicalType struct {
-	LogicalType *Schema_LogicalType `protobuf:"bytes,6,opt,name=logical_type,json=logicalType,proto3,oneof"`
-}
-
-func (*Schema_FieldType_CollectionElementType) isSchema_FieldType_TypeInfo() {}
-func (*Schema_FieldType_MapType) isSchema_FieldType_TypeInfo()               {}
-func (*Schema_FieldType_RowSchema) isSchema_FieldType_TypeInfo()             {}
-func (*Schema_FieldType_LogicalType) isSchema_FieldType_TypeInfo()           {}
-
-func (m *Schema_FieldType) GetTypeInfo() isSchema_FieldType_TypeInfo {
-	if m != nil {
-		return m.TypeInfo
-	}
-	return nil
-}
-
-func (m *Schema_FieldType) GetTypeName() Schema_TypeName {
-	if m != nil {
-		return m.TypeName
-	}
-	return Schema_BYTE
-}
-
-func (m *Schema_FieldType) GetNullable() bool {
-	if m != nil {
-		return m.Nullable
-	}
-	return false
-}
-
-func (m *Schema_FieldType) GetCollectionElementType() *Schema_FieldType {
-	if x, ok := m.GetTypeInfo().(*Schema_FieldType_CollectionElementType); ok {
-		return x.CollectionElementType
-	}
-	return nil
-}
-
-func (m *Schema_FieldType) GetMapType() *Schema_MapType {
-	if x, ok := m.GetTypeInfo().(*Schema_FieldType_MapType); ok {
-		return x.MapType
-	}
-	return nil
-}
-
-func (m *Schema_FieldType) GetRowSchema() *Schema {
-	if x, ok := m.GetTypeInfo().(*Schema_FieldType_RowSchema); ok {
-		return x.RowSchema
-	}
-	return nil
-}
-
-func (m *Schema_FieldType) GetLogicalType() *Schema_LogicalType {
-	if x, ok := m.GetTypeInfo().(*Schema_FieldType_LogicalType); ok {
-		return x.LogicalType
-	}
-	return nil
-}
-
-// XXX_OneofFuncs is for the internal use of the proto package.
-func (*Schema_FieldType) XXX_OneofFuncs() (func(msg proto.Message, b *proto.Buffer) error, func(msg proto.Message, tag, wire int, b *proto.Buffer) (bool, error), func(msg proto.Message) (n int), []interface{}) {
-	return _Schema_FieldType_OneofMarshaler, _Schema_FieldType_OneofUnmarshaler, _Schema_FieldType_OneofSizer, []interface{}{
-		(*Schema_FieldType_CollectionElementType)(nil),
-		(*Schema_FieldType_MapType)(nil),
-		(*Schema_FieldType_RowSchema)(nil),
-		(*Schema_FieldType_LogicalType)(nil),
-	}
-}
-
-func _Schema_FieldType_OneofMarshaler(msg proto.Message, b *proto.Buffer) error {
-	m := msg.(*Schema_FieldType)
-	// type_info
-	switch x := m.TypeInfo.(type) {
-	case *Schema_FieldType_CollectionElementType:
-		b.EncodeVarint(3<<3 | proto.WireBytes)
-		if err := b.EncodeMessage(x.CollectionElementType); err != nil {
-			return err
-		}
-	case *Schema_FieldType_MapType:
-		b.EncodeVarint(4<<3 | proto.WireBytes)
-		if err := b.EncodeMessage(x.MapType); err != nil {
-			return err
-		}
-	case *Schema_FieldType_RowSchema:
-		b.EncodeVarint(5<<3 | proto.WireBytes)
-		if err := b.EncodeMessage(x.RowSchema); err != nil {
-			return err
-		}
-	case *Schema_FieldType_LogicalType:
-		b.EncodeVarint(6<<3 | proto.WireBytes)
-		if err := b.EncodeMessage(x.LogicalType); err != nil {
-			return err
-		}
-	case nil:
-	default:
-		return fmt.Errorf("Schema_FieldType.TypeInfo has unexpected type %T", x)
-	}
-	return nil
-}
-
-func _Schema_FieldType_OneofUnmarshaler(msg proto.Message, tag, wire int, b *proto.Buffer) (bool, error) {
-	m := msg.(*Schema_FieldType)
-	switch tag {
-	case 3: // type_info.collection_element_type
-		if wire != proto.WireBytes {
-			return true, proto.ErrInternalBadWireType
-		}
-		msg := new(Schema_FieldType)
-		err := b.DecodeMessage(msg)
-		m.TypeInfo = &Schema_FieldType_CollectionElementType{msg}
-		return true, err
-	case 4: // type_info.map_type
-		if wire != proto.WireBytes {
-			return true, proto.ErrInternalBadWireType
-		}
-		msg := new(Schema_MapType)
-		err := b.DecodeMessage(msg)
-		m.TypeInfo = &Schema_FieldType_MapType{msg}
-		return true, err
-	case 5: // type_info.row_schema
-		if wire != proto.WireBytes {
-			return true, proto.ErrInternalBadWireType
-		}
-		msg := new(Schema)
-		err := b.DecodeMessage(msg)
-		m.TypeInfo = &Schema_FieldType_RowSchema{msg}
-		return true, err
-	case 6: // type_info.logical_type
-		if wire != proto.WireBytes {
-			return true, proto.ErrInternalBadWireType
-		}
-		msg := new(Schema_LogicalType)
-		err := b.DecodeMessage(msg)
-		m.TypeInfo = &Schema_FieldType_LogicalType{msg}
-		return true, err
-	default:
-		return false, nil
-	}
-}
-
-func _Schema_FieldType_OneofSizer(msg proto.Message) (n int) {
-	m := msg.(*Schema_FieldType)
-	// type_info
-	switch x := m.TypeInfo.(type) {
-	case *Schema_FieldType_CollectionElementType:
-		s := proto.Size(x.CollectionElementType)
-		n += 1 // tag and wire
-		n += proto.SizeVarint(uint64(s))
-		n += s
-	case *Schema_FieldType_MapType:
-		s := proto.Size(x.MapType)
-		n += 1 // tag and wire
-		n += proto.SizeVarint(uint64(s))
-		n += s
-	case *Schema_FieldType_RowSchema:
-		s := proto.Size(x.RowSchema)
-		n += 1 // tag and wire
-		n += proto.SizeVarint(uint64(s))
-		n += s
-	case *Schema_FieldType_LogicalType:
-		s := proto.Size(x.LogicalType)
-		n += 1 // tag and wire
-		n += proto.SizeVarint(uint64(s))
-		n += s
-	case nil:
-	default:
-		panic(fmt.Sprintf("proto: unexpected type %T in oneof", x))
-	}
-	return n
-}
-
-type Schema_Field struct {
-	Name                 string            `protobuf:"bytes,1,opt,name=name,proto3" json:"name,omitempty"`
-	Description          string            `protobuf:"bytes,2,opt,name=description,proto3" json:"description,omitempty"`
-	Type                 *Schema_FieldType `protobuf:"bytes,3,opt,name=type,proto3" json:"type,omitempty"`
-	Id                   int32             `protobuf:"varint,4,opt,name=id,proto3" json:"id,omitempty"`
-	EncodingPosition     int32             `protobuf:"varint,5,opt,name=encoding_position,json=encodingPosition,proto3" json:"encoding_position,omitempty"`
-	XXX_NoUnkeyedLiteral struct{}          `json:"-"`
-	XXX_unrecognized     []byte            `json:"-"`
-	XXX_sizecache        int32             `json:"-"`
-}
-
-func (m *Schema_Field) Reset()         { *m = Schema_Field{} }
-func (m *Schema_Field) String() string { return proto.CompactTextString(m) }
-func (*Schema_Field) ProtoMessage()    {}
-func (*Schema_Field) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{24, 3}
-}
-func (m *Schema_Field) XXX_Unmarshal(b []byte) error {
-	return xxx_messageInfo_Schema_Field.Unmarshal(m, b)
-}
-func (m *Schema_Field) XXX_Marshal(b []byte, deterministic bool) ([]byte, error) {
-	return xxx_messageInfo_Schema_Field.Marshal(b, m, deterministic)
-}
-func (dst *Schema_Field) XXX_Merge(src proto.Message) {
-	xxx_messageInfo_Schema_Field.Merge(dst, src)
-}
-func (m *Schema_Field) XXX_Size() int {
-	return xxx_messageInfo_Schema_Field.Size(m)
-}
-func (m *Schema_Field) XXX_DiscardUnknown() {
-	xxx_messageInfo_Schema_Field.DiscardUnknown(m)
-}
-
-var xxx_messageInfo_Schema_Field proto.InternalMessageInfo
-
-func (m *Schema_Field) GetName() string {
-	if m != nil {
-		return m.Name
-	}
-	return ""
-}
-
-func (m *Schema_Field) GetDescription() string {
-	if m != nil {
-		return m.Description
-	}
-	return ""
-}
-
-func (m *Schema_Field) GetType() *Schema_FieldType {
-	if m != nil {
-		return m.Type
-	}
-	return nil
-}
-
-func (m *Schema_Field) GetId() int32 {
-	if m != nil {
-		return m.Id
-	}
-	return 0
-}
-
-func (m *Schema_Field) GetEncodingPosition() int32 {
-	if m != nil {
-		return m.EncodingPosition
-	}
-	return 0
-}
-
 // A windowing strategy describes the window function, triggering, allowed
 // lateness, and accumulation mode for a PCollection.
 //
@@ -3172,7 +2664,7 @@
 func (m *WindowingStrategy) String() string { return proto.CompactTextString(m) }
 func (*WindowingStrategy) ProtoMessage()    {}
 func (*WindowingStrategy) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{25}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{24}
 }
 func (m *WindowingStrategy) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_WindowingStrategy.Unmarshal(m, b)
@@ -3275,7 +2767,7 @@
 func (m *MergeStatus) String() string { return proto.CompactTextString(m) }
 func (*MergeStatus) ProtoMessage()    {}
 func (*MergeStatus) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{26}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{25}
 }
 func (m *MergeStatus) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_MergeStatus.Unmarshal(m, b)
@@ -3308,7 +2800,7 @@
 func (m *AccumulationMode) String() string { return proto.CompactTextString(m) }
 func (*AccumulationMode) ProtoMessage()    {}
 func (*AccumulationMode) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{27}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{26}
 }
 func (m *AccumulationMode) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_AccumulationMode.Unmarshal(m, b)
@@ -3340,7 +2832,7 @@
 func (m *ClosingBehavior) String() string { return proto.CompactTextString(m) }
 func (*ClosingBehavior) ProtoMessage()    {}
 func (*ClosingBehavior) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{28}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{27}
 }
 func (m *ClosingBehavior) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ClosingBehavior.Unmarshal(m, b)
@@ -3372,7 +2864,7 @@
 func (m *OnTimeBehavior) String() string { return proto.CompactTextString(m) }
 func (*OnTimeBehavior) ProtoMessage()    {}
 func (*OnTimeBehavior) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{29}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{28}
 }
 func (m *OnTimeBehavior) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_OnTimeBehavior.Unmarshal(m, b)
@@ -3404,7 +2896,7 @@
 func (m *OutputTime) String() string { return proto.CompactTextString(m) }
 func (*OutputTime) ProtoMessage()    {}
 func (*OutputTime) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{30}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{29}
 }
 func (m *OutputTime) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_OutputTime.Unmarshal(m, b)
@@ -3435,7 +2927,7 @@
 func (m *TimeDomain) String() string { return proto.CompactTextString(m) }
 func (*TimeDomain) ProtoMessage()    {}
 func (*TimeDomain) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{31}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{30}
 }
 func (m *TimeDomain) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_TimeDomain.Unmarshal(m, b)
@@ -3485,7 +2977,7 @@
 func (m *Trigger) String() string { return proto.CompactTextString(m) }
 func (*Trigger) ProtoMessage()    {}
 func (*Trigger) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{32}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{31}
 }
 func (m *Trigger) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Trigger.Unmarshal(m, b)
@@ -3926,7 +3418,7 @@
 func (m *Trigger_AfterAll) String() string { return proto.CompactTextString(m) }
 func (*Trigger_AfterAll) ProtoMessage()    {}
 func (*Trigger_AfterAll) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{32, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{31, 0}
 }
 func (m *Trigger_AfterAll) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Trigger_AfterAll.Unmarshal(m, b)
@@ -3965,7 +3457,7 @@
 func (m *Trigger_AfterAny) String() string { return proto.CompactTextString(m) }
 func (*Trigger_AfterAny) ProtoMessage()    {}
 func (*Trigger_AfterAny) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{32, 1}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{31, 1}
 }
 func (m *Trigger_AfterAny) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Trigger_AfterAny.Unmarshal(m, b)
@@ -4005,7 +3497,7 @@
 func (m *Trigger_AfterEach) String() string { return proto.CompactTextString(m) }
 func (*Trigger_AfterEach) ProtoMessage()    {}
 func (*Trigger_AfterEach) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{32, 2}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{31, 2}
 }
 func (m *Trigger_AfterEach) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Trigger_AfterEach.Unmarshal(m, b)
@@ -4051,7 +3543,7 @@
 func (m *Trigger_AfterEndOfWindow) String() string { return proto.CompactTextString(m) }
 func (*Trigger_AfterEndOfWindow) ProtoMessage()    {}
 func (*Trigger_AfterEndOfWindow) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{32, 3}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{31, 3}
 }
 func (m *Trigger_AfterEndOfWindow) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Trigger_AfterEndOfWindow.Unmarshal(m, b)
@@ -4099,7 +3591,7 @@
 func (m *Trigger_AfterProcessingTime) String() string { return proto.CompactTextString(m) }
 func (*Trigger_AfterProcessingTime) ProtoMessage()    {}
 func (*Trigger_AfterProcessingTime) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{32, 4}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{31, 4}
 }
 func (m *Trigger_AfterProcessingTime) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Trigger_AfterProcessingTime.Unmarshal(m, b)
@@ -4140,7 +3632,7 @@
 func (m *Trigger_AfterSynchronizedProcessingTime) String() string { return proto.CompactTextString(m) }
 func (*Trigger_AfterSynchronizedProcessingTime) ProtoMessage()    {}
 func (*Trigger_AfterSynchronizedProcessingTime) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{32, 5}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{31, 5}
 }
 func (m *Trigger_AfterSynchronizedProcessingTime) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Trigger_AfterSynchronizedProcessingTime.Unmarshal(m, b)
@@ -4172,7 +3664,7 @@
 func (m *Trigger_Default) String() string { return proto.CompactTextString(m) }
 func (*Trigger_Default) ProtoMessage()    {}
 func (*Trigger_Default) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{32, 6}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{31, 6}
 }
 func (m *Trigger_Default) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Trigger_Default.Unmarshal(m, b)
@@ -4204,7 +3696,7 @@
 func (m *Trigger_ElementCount) String() string { return proto.CompactTextString(m) }
 func (*Trigger_ElementCount) ProtoMessage()    {}
 func (*Trigger_ElementCount) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{32, 7}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{31, 7}
 }
 func (m *Trigger_ElementCount) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Trigger_ElementCount.Unmarshal(m, b)
@@ -4243,7 +3735,7 @@
 func (m *Trigger_Never) String() string { return proto.CompactTextString(m) }
 func (*Trigger_Never) ProtoMessage()    {}
 func (*Trigger_Never) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{32, 8}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{31, 8}
 }
 func (m *Trigger_Never) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Trigger_Never.Unmarshal(m, b)
@@ -4275,7 +3767,7 @@
 func (m *Trigger_Always) String() string { return proto.CompactTextString(m) }
 func (*Trigger_Always) ProtoMessage()    {}
 func (*Trigger_Always) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{32, 9}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{31, 9}
 }
 func (m *Trigger_Always) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Trigger_Always.Unmarshal(m, b)
@@ -4311,7 +3803,7 @@
 func (m *Trigger_OrFinally) String() string { return proto.CompactTextString(m) }
 func (*Trigger_OrFinally) ProtoMessage()    {}
 func (*Trigger_OrFinally) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{32, 10}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{31, 10}
 }
 func (m *Trigger_OrFinally) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Trigger_OrFinally.Unmarshal(m, b)
@@ -4359,7 +3851,7 @@
 func (m *Trigger_Repeat) String() string { return proto.CompactTextString(m) }
 func (*Trigger_Repeat) ProtoMessage()    {}
 func (*Trigger_Repeat) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{32, 11}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{31, 11}
 }
 func (m *Trigger_Repeat) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Trigger_Repeat.Unmarshal(m, b)
@@ -4404,7 +3896,7 @@
 func (m *TimestampTransform) String() string { return proto.CompactTextString(m) }
 func (*TimestampTransform) ProtoMessage()    {}
 func (*TimestampTransform) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{33}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{32}
 }
 func (m *TimestampTransform) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_TimestampTransform.Unmarshal(m, b)
@@ -4545,7 +4037,7 @@
 func (m *TimestampTransform_Delay) String() string { return proto.CompactTextString(m) }
 func (*TimestampTransform_Delay) ProtoMessage()    {}
 func (*TimestampTransform_Delay) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{33, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{32, 0}
 }
 func (m *TimestampTransform_Delay) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_TimestampTransform_Delay.Unmarshal(m, b)
@@ -4588,7 +4080,7 @@
 func (m *TimestampTransform_AlignTo) String() string { return proto.CompactTextString(m) }
 func (*TimestampTransform_AlignTo) ProtoMessage()    {}
 func (*TimestampTransform_AlignTo) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{33, 1}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{32, 1}
 }
 func (m *TimestampTransform_AlignTo) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_TimestampTransform_AlignTo.Unmarshal(m, b)
@@ -4656,7 +4148,7 @@
 func (m *SideInput) String() string { return proto.CompactTextString(m) }
 func (*SideInput) ProtoMessage()    {}
 func (*SideInput) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{34}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{33}
 }
 func (m *SideInput) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_SideInput.Unmarshal(m, b)
@@ -4714,7 +4206,7 @@
 func (m *Environment) String() string { return proto.CompactTextString(m) }
 func (*Environment) ProtoMessage()    {}
 func (*Environment) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{35}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{34}
 }
 func (m *Environment) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_Environment.Unmarshal(m, b)
@@ -4758,7 +4250,7 @@
 func (m *StandardEnvironments) String() string { return proto.CompactTextString(m) }
 func (*StandardEnvironments) ProtoMessage()    {}
 func (*StandardEnvironments) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{36}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{35}
 }
 func (m *StandardEnvironments) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_StandardEnvironments.Unmarshal(m, b)
@@ -4790,7 +4282,7 @@
 func (m *DockerPayload) String() string { return proto.CompactTextString(m) }
 func (*DockerPayload) ProtoMessage()    {}
 func (*DockerPayload) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{37}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{36}
 }
 func (m *DockerPayload) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_DockerPayload.Unmarshal(m, b)
@@ -4831,7 +4323,7 @@
 func (m *ProcessPayload) String() string { return proto.CompactTextString(m) }
 func (*ProcessPayload) ProtoMessage()    {}
 func (*ProcessPayload) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{38}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{37}
 }
 func (m *ProcessPayload) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ProcessPayload.Unmarshal(m, b)
@@ -4891,7 +4383,7 @@
 func (m *ExternalPayload) String() string { return proto.CompactTextString(m) }
 func (*ExternalPayload) ProtoMessage()    {}
 func (*ExternalPayload) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{39}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{38}
 }
 func (m *ExternalPayload) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ExternalPayload.Unmarshal(m, b)
@@ -4942,7 +4434,7 @@
 func (m *SdkFunctionSpec) String() string { return proto.CompactTextString(m) }
 func (*SdkFunctionSpec) ProtoMessage()    {}
 func (*SdkFunctionSpec) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{40}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{39}
 }
 func (m *SdkFunctionSpec) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_SdkFunctionSpec.Unmarshal(m, b)
@@ -5021,7 +4513,7 @@
 func (m *FunctionSpec) String() string { return proto.CompactTextString(m) }
 func (*FunctionSpec) ProtoMessage()    {}
 func (*FunctionSpec) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{41}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{40}
 }
 func (m *FunctionSpec) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_FunctionSpec.Unmarshal(m, b)
@@ -5068,7 +4560,7 @@
 func (m *DisplayData) String() string { return proto.CompactTextString(m) }
 func (*DisplayData) ProtoMessage()    {}
 func (*DisplayData) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{42}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{41}
 }
 func (m *DisplayData) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_DisplayData.Unmarshal(m, b)
@@ -5112,7 +4604,7 @@
 func (m *DisplayData_Identifier) String() string { return proto.CompactTextString(m) }
 func (*DisplayData_Identifier) ProtoMessage()    {}
 func (*DisplayData_Identifier) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{42, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{41, 0}
 }
 func (m *DisplayData_Identifier) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_DisplayData_Identifier.Unmarshal(m, b)
@@ -5176,7 +4668,7 @@
 func (m *DisplayData_Item) String() string { return proto.CompactTextString(m) }
 func (*DisplayData_Item) ProtoMessage()    {}
 func (*DisplayData_Item) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{42, 1}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{41, 1}
 }
 func (m *DisplayData_Item) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_DisplayData_Item.Unmarshal(m, b)
@@ -5248,7 +4740,7 @@
 func (m *DisplayData_Type) String() string { return proto.CompactTextString(m) }
 func (*DisplayData_Type) ProtoMessage()    {}
 func (*DisplayData_Type) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{42, 2}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{41, 2}
 }
 func (m *DisplayData_Type) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_DisplayData_Type.Unmarshal(m, b)
@@ -5302,7 +4794,7 @@
 func (m *MessageWithComponents) String() string { return proto.CompactTextString(m) }
 func (*MessageWithComponents) ProtoMessage()    {}
 func (*MessageWithComponents) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{43}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{42}
 }
 func (m *MessageWithComponents) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_MessageWithComponents.Unmarshal(m, b)
@@ -5746,7 +5238,7 @@
 func (m *ExecutableStagePayload) String() string { return proto.CompactTextString(m) }
 func (*ExecutableStagePayload) ProtoMessage()    {}
 func (*ExecutableStagePayload) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{44}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{43}
 }
 func (m *ExecutableStagePayload) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ExecutableStagePayload.Unmarshal(m, b)
@@ -5838,7 +5330,7 @@
 func (m *ExecutableStagePayload_SideInputId) String() string { return proto.CompactTextString(m) }
 func (*ExecutableStagePayload_SideInputId) ProtoMessage()    {}
 func (*ExecutableStagePayload_SideInputId) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{44, 0}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{43, 0}
 }
 func (m *ExecutableStagePayload_SideInputId) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ExecutableStagePayload_SideInputId.Unmarshal(m, b)
@@ -5888,7 +5380,7 @@
 func (m *ExecutableStagePayload_UserStateId) String() string { return proto.CompactTextString(m) }
 func (*ExecutableStagePayload_UserStateId) ProtoMessage()    {}
 func (*ExecutableStagePayload_UserStateId) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{44, 1}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{43, 1}
 }
 func (m *ExecutableStagePayload_UserStateId) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ExecutableStagePayload_UserStateId.Unmarshal(m, b)
@@ -5938,7 +5430,7 @@
 func (m *ExecutableStagePayload_TimerId) String() string { return proto.CompactTextString(m) }
 func (*ExecutableStagePayload_TimerId) ProtoMessage()    {}
 func (*ExecutableStagePayload_TimerId) Descriptor() ([]byte, []int) {
-	return fileDescriptor_beam_runner_api_38625b9043608c5d, []int{44, 2}
+	return fileDescriptor_beam_runner_api_d5fa30116074ddde, []int{43, 2}
 }
 func (m *ExecutableStagePayload_TimerId) XXX_Unmarshal(b []byte) error {
 	return xxx_messageInfo_ExecutableStagePayload_TimerId.Unmarshal(m, b)
@@ -6032,11 +5524,6 @@
 	proto.RegisterMapType((map[string]*SideInput)(nil), "org.apache.beam.model.pipeline.v1.WriteFilesPayload.SideInputsEntry")
 	proto.RegisterType((*Coder)(nil), "org.apache.beam.model.pipeline.v1.Coder")
 	proto.RegisterType((*StandardCoders)(nil), "org.apache.beam.model.pipeline.v1.StandardCoders")
-	proto.RegisterType((*Schema)(nil), "org.apache.beam.model.pipeline.v1.Schema")
-	proto.RegisterType((*Schema_LogicalType)(nil), "org.apache.beam.model.pipeline.v1.Schema.LogicalType")
-	proto.RegisterType((*Schema_MapType)(nil), "org.apache.beam.model.pipeline.v1.Schema.MapType")
-	proto.RegisterType((*Schema_FieldType)(nil), "org.apache.beam.model.pipeline.v1.Schema.FieldType")
-	proto.RegisterType((*Schema_Field)(nil), "org.apache.beam.model.pipeline.v1.Schema.Field")
 	proto.RegisterType((*WindowingStrategy)(nil), "org.apache.beam.model.pipeline.v1.WindowingStrategy")
 	proto.RegisterType((*MergeStatus)(nil), "org.apache.beam.model.pipeline.v1.MergeStatus")
 	proto.RegisterType((*AccumulationMode)(nil), "org.apache.beam.model.pipeline.v1.AccumulationMode")
@@ -6089,7 +5576,6 @@
 	proto.RegisterEnum("org.apache.beam.model.pipeline.v1.Parameter_Type_Enum", Parameter_Type_Enum_name, Parameter_Type_Enum_value)
 	proto.RegisterEnum("org.apache.beam.model.pipeline.v1.IsBounded_Enum", IsBounded_Enum_name, IsBounded_Enum_value)
 	proto.RegisterEnum("org.apache.beam.model.pipeline.v1.StandardCoders_Enum", StandardCoders_Enum_name, StandardCoders_Enum_value)
-	proto.RegisterEnum("org.apache.beam.model.pipeline.v1.Schema_TypeName", Schema_TypeName_name, Schema_TypeName_value)
 	proto.RegisterEnum("org.apache.beam.model.pipeline.v1.MergeStatus_Enum", MergeStatus_Enum_name, MergeStatus_Enum_value)
 	proto.RegisterEnum("org.apache.beam.model.pipeline.v1.AccumulationMode_Enum", AccumulationMode_Enum_name, AccumulationMode_Enum_value)
 	proto.RegisterEnum("org.apache.beam.model.pipeline.v1.ClosingBehavior_Enum", ClosingBehavior_Enum_name, ClosingBehavior_Enum_value)
@@ -6103,355 +5589,327 @@
 }
 
 func init() {
-	proto.RegisterFile("beam_runner_api.proto", fileDescriptor_beam_runner_api_38625b9043608c5d)
+	proto.RegisterFile("beam_runner_api.proto", fileDescriptor_beam_runner_api_d5fa30116074ddde)
 }
 
-var fileDescriptor_beam_runner_api_38625b9043608c5d = []byte{
-	// 5526 bytes of a gzipped FileDescriptorProto
-	0x1f, 0x8b, 0x08, 0x00, 0x00, 0x00, 0x00, 0x00, 0x02, 0xff, 0xbc, 0x5c, 0x5b, 0x8f, 0x23, 0xc7,
-	0x75, 0xe6, 0xfd, 0x72, 0xc8, 0xe1, 0xf4, 0xd6, 0x5e, 0x34, 0x6a, 0xcb, 0xd2, 0xaa, 0x25, 0x4b,
-	0x2b, 0x59, 0xa6, 0x76, 0x67, 0x57, 0x7b, 0x19, 0xdb, 0x92, 0x39, 0xc3, 0x9e, 0x9d, 0xde, 0xe5,
-	0xcd, 0x4d, 0xce, 0xec, 0xae, 0x6c, 0xab, 0x5d, 0xc3, 0x2e, 0xce, 0x34, 0xa6, 0xd9, 0x4d, 0x77,
-	0x37, 0x67, 0x45, 0xc3, 0x46, 0x80, 0x3c, 0x18, 0x01, 0x02, 0x04, 0xc9, 0x43, 0x1e, 0xfc, 0x14,
-	0xc0, 0x06, 0x02, 0x24, 0x41, 0xae, 0x76, 0x02, 0x24, 0x8f, 0xb6, 0xf3, 0x0b, 0x12, 0x20, 0x40,
-	0x7e, 0x43, 0x5e, 0x92, 0xc0, 0x0f, 0xc9, 0x53, 0x50, 0x97, 0x6e, 0x36, 0x39, 0x33, 0x2b, 0x72,
-	0x66, 0x91, 0x37, 0xf6, 0xa9, 0x3a, 0xdf, 0xa9, 0xdb, 0x39, 0x75, 0xce, 0xa9, 0x2a, 0xc2, 0xd5,
-	0x7d, 0x82, 0x87, 0x86, 0x37, 0x76, 0x1c, 0xe2, 0x19, 0x78, 0x64, 0x55, 0x47, 0x9e, 0x1b, 0xb8,
-	0xe8, 0x4d, 0xd7, 0x3b, 0xa8, 0xe2, 0x11, 0xee, 0x1f, 0x92, 0x2a, 0xad, 0x51, 0x1d, 0xba, 0x26,
-	0xb1, 0xab, 0x23, 0x6b, 0x44, 0x6c, 0xcb, 0x21, 0xd5, 0xe3, 0x5b, 0xf2, 0x2a, 0x71, 0xcc, 0x91,
-	0x6b, 0x39, 0x81, 0xcf, 0x79, 0xe4, 0x57, 0x0f, 0x5c, 0xf7, 0xc0, 0x26, 0x1f, 0xb2, 0xaf, 0xfd,
-	0xf1, 0xe0, 0x43, 0xec, 0x4c, 0x44, 0xd1, 0xf5, 0xf9, 0x22, 0x93, 0xf8, 0x7d, 0xcf, 0x1a, 0x05,
-	0xae, 0xc7, 0x6b, 0x28, 0xbf, 0x4a, 0xc2, 0xca, 0x26, 0xc1, 0xc3, 0x2d, 0xd7, 0xf1, 0x03, 0xec,
-	0x04, 0xbe, 0xf2, 0x37, 0x49, 0x28, 0x46, 0x5f, 0xe8, 0x16, 0x5c, 0x69, 0x6a, 0x2d, 0xa3, 0xa7,
-	0x35, 0xd5, 0x6e, 0xaf, 0xd6, 0xec, 0x18, 0x4d, 0xad, 0xd1, 0xd0, 0xba, 0x52, 0x42, 0x7e, 0xe5,
-	0xcf, 0x7f, 0xf9, 0xbf, 0xbf, 0xca, 0x5e, 0xfa, 0xda, 0x83, 0xf5, 0xf5, 0xdb, 0xb7, 0xef, 0xad,
-	0xdf, 0xbc, 0x7d, 0xf7, 0xfe, 0x47, 0x77, 0xee, 0xdd, 0xfb, 0x08, 0xdd, 0x84, 0x2b, 0xcd, 0xda,
-	0xd3, 0x93, 0x2c, 0x49, 0xf9, 0x1a, 0x63, 0x91, 0x4e, 0x70, 0x7c, 0x0c, 0xca, 0xc3, 0x46, 0x7b,
-	0xb3, 0xd6, 0x30, 0x9e, 0x68, 0xad, 0x7a, 0xfb, 0x89, 0x71, 0x2a, 0x7f, 0x6a, 0x96, 0xff, 0xd6,
-	0x83, 0x8f, 0x6e, 0xde, 0x61, 0xfc, 0xca, 0x3f, 0x14, 0x00, 0xb6, 0xdc, 0xe1, 0xc8, 0x75, 0x08,
-	0x6d, 0xf3, 0xf7, 0x00, 0x02, 0x0f, 0x3b, 0xfe, 0xc0, 0xf5, 0x86, 0xfe, 0x5a, 0xf2, 0x7a, 0xfa,
-	0x46, 0x69, 0xfd, 0x9b, 0xd5, 0x2f, 0x1c, 0xd9, 0xea, 0x14, 0xa2, 0xda, 0x8b, 0xf8, 0x55, 0x27,
-	0xf0, 0x26, 0x7a, 0x0c, 0x10, 0xf5, 0xa1, 0x3c, 0xea, 0xbb, 0xb6, 0x4d, 0xfa, 0x81, 0xe5, 0x3a,
-	0xfe, 0x5a, 0x8a, 0x09, 0xf8, 0x64, 0x39, 0x01, 0x9d, 0x18, 0x02, 0x17, 0x31, 0x03, 0x8a, 0x26,
-	0x70, 0xe5, 0xb9, 0xe5, 0x98, 0xee, 0x73, 0xcb, 0x39, 0x30, 0xfc, 0xc0, 0xc3, 0x01, 0x39, 0xb0,
-	0x88, 0xbf, 0x96, 0x66, 0xc2, 0xb6, 0x97, 0x13, 0xf6, 0x24, 0x44, 0xea, 0x46, 0x40, 0x5c, 0xe6,
-	0xe5, 0xe7, 0x27, 0x4b, 0xd0, 0xb7, 0x21, 0xd7, 0x77, 0x4d, 0xe2, 0xf9, 0x6b, 0x19, 0x26, 0xec,
-	0xc1, 0x72, 0xc2, 0xb6, 0x18, 0x2f, 0xc7, 0x17, 0x40, 0x74, 0xc8, 0x88, 0x73, 0x6c, 0x79, 0xae,
-	0x33, 0xa4, 0x75, 0xd6, 0xb2, 0xe7, 0x19, 0x32, 0x35, 0x86, 0x20, 0x86, 0x2c, 0x0e, 0x2a, 0xdb,
-	0xb0, 0x3a, 0x37, 0x6d, 0x48, 0x82, 0xf4, 0x11, 0x99, 0xac, 0x25, 0xaf, 0x27, 0x6f, 0x14, 0x75,
-	0xfa, 0x13, 0x6d, 0x41, 0xf6, 0x18, 0xdb, 0x63, 0xb2, 0x96, 0xba, 0x9e, 0xbc, 0x51, 0x5a, 0xff,
-	0xda, 0x02, 0x4d, 0xe8, 0x44, 0xa8, 0x3a, 0xe7, 0xdd, 0x48, 0xdd, 0x4f, 0xca, 0x2e, 0x5c, 0x3a,
-	0x31, 0x87, 0xa7, 0xc8, 0xab, 0xcf, 0xca, 0xab, 0x2e, 0x22, 0x6f, 0x2b, 0x82, 0x8d, 0x0b, 0xfc,
-	0x11, 0xac, 0x9d, 0x35, 0x8f, 0xa7, 0xc8, 0x7d, 0x34, 0x2b, 0xf7, 0xce, 0x02, 0x72, 0xe7, 0xd1,
-	0x27, 0x71, 0xe9, 0x7d, 0x28, 0xc5, 0x26, 0xf6, 0x14, 0x81, 0x1f, 0xcf, 0x0a, 0xbc, 0xb1, 0xd0,
-	0xdc, 0x9a, 0xc4, 0x9b, 0x1b, 0xd3, 0x13, 0x93, 0xfc, 0x72, 0xc6, 0x34, 0x06, 0x1b, 0x13, 0xa8,
-	0xfc, 0x7b, 0x12, 0x0a, 0x1d, 0x51, 0x0d, 0x35, 0x01, 0xfa, 0xd1, 0x6a, 0x63, 0xf2, 0x16, 0x5b,
-	0x1f, 0xd3, 0x25, 0xaa, 0xc7, 0x00, 0xd0, 0x07, 0x80, 0x3c, 0xd7, 0x0d, 0x8c, 0xc8, 0x72, 0x18,
-	0x96, 0xc9, 0x8d, 0x45, 0x51, 0x97, 0x68, 0x49, 0xb4, 0xac, 0x34, 0x93, 0x2a, 0x5d, 0xd9, 0xb4,
-	0xfc, 0x91, 0x8d, 0x27, 0x86, 0x89, 0x03, 0xbc, 0x96, 0x5e, 0xb8, 0x6b, 0x75, 0xce, 0x56, 0xc7,
-	0x01, 0xd6, 0x4b, 0xe6, 0xf4, 0x43, 0xf9, 0xfd, 0x0c, 0xc0, 0x74, 0xed, 0xa2, 0x37, 0xa0, 0x34,
-	0x76, 0xac, 0x1f, 0x8c, 0x89, 0xe1, 0xe0, 0x21, 0x59, 0xcb, 0xb2, 0xf1, 0x04, 0x4e, 0x6a, 0xe1,
-	0x21, 0x41, 0x5b, 0x90, 0xf1, 0x47, 0xa4, 0x2f, 0x7a, 0xfe, 0xe1, 0x02, 0xa2, 0xb7, 0xc7, 0x0e,
-	0x5b, 0xa6, 0xdd, 0x11, 0xe9, 0xeb, 0x8c, 0x19, 0xbd, 0x0d, 0x2b, 0xfe, 0x78, 0x3f, 0x66, 0x7e,
-	0x79, 0x87, 0x67, 0x89, 0xd4, 0xc4, 0x58, 0xce, 0x68, 0x1c, 0x84, 0xf6, 0xec, 0xc1, 0x52, 0x6a,
-	0x58, 0xd5, 0x18, 0xaf, 0x30, 0x31, 0x1c, 0x08, 0xf5, 0x20, 0xef, 0x8e, 0x03, 0x86, 0xc9, 0xcd,
-	0xd6, 0xc6, 0x72, 0x98, 0x6d, 0xce, 0xcc, 0x41, 0x43, 0xa8, 0x13, 0xd3, 0x92, 0xbb, 0xf0, 0xb4,
-	0xc8, 0x0f, 0xa0, 0x14, 0x6b, 0xff, 0x29, 0xcb, 0xfb, 0x4a, 0x7c, 0x79, 0x17, 0xe3, 0xfa, 0xb1,
-	0x01, 0xe5, 0x78, 0x33, 0x97, 0xe1, 0x55, 0xfe, 0x7e, 0x05, 0x2e, 0x77, 0x03, 0xec, 0x98, 0xd8,
-	0x33, 0xa7, 0xdd, 0xf6, 0x95, 0x3f, 0x4b, 0x03, 0x74, 0x3c, 0x6b, 0x68, 0x05, 0xd6, 0x31, 0xf1,
-	0xd1, 0x7b, 0x90, 0xeb, 0xd4, 0x74, 0xa3, 0xde, 0x96, 0x12, 0xf2, 0x97, 0x7f, 0x46, 0xb7, 0xdb,
-	0x57, 0x68, 0x07, 0x37, 0xa2, 0xc9, 0xdb, 0x18, 0x61, 0xcf, 0x74, 0x37, 0x8e, 0x6f, 0xa1, 0x0f,
-	0x20, 0xbf, 0xdd, 0xa8, 0xf5, 0x7a, 0x6a, 0x4b, 0x4a, 0xca, 0x6f, 0xb0, 0xba, 0xaf, 0xce, 0xd5,
-	0x1d, 0xd8, 0x38, 0x08, 0x88, 0x43, 0x6b, 0xdf, 0x85, 0xf2, 0x43, 0xbd, 0xbd, 0xdb, 0x31, 0x36,
-	0x9f, 0x19, 0x8f, 0xd5, 0x67, 0x52, 0x4a, 0x7e, 0x9b, 0xb1, 0xbc, 0x3e, 0xc7, 0x72, 0xe0, 0xb9,
-	0xe3, 0x91, 0xb1, 0x3f, 0x31, 0x8e, 0xc8, 0x44, 0x48, 0xd1, 0x9a, 0x9d, 0xdd, 0x46, 0x57, 0x95,
-	0xd2, 0x67, 0x48, 0xb1, 0x86, 0xa3, 0xb1, 0xed, 0x13, 0x5a, 0xfb, 0x1e, 0x54, 0x6a, 0xdd, 0xae,
-	0xf6, 0xb0, 0x25, 0x3c, 0x89, 0xae, 0x94, 0x91, 0xdf, 0x62, 0x4c, 0x5f, 0x9e, 0x63, 0xe2, 0x3b,
-	0x9f, 0x61, 0x39, 0x01, 0xeb, 0xcc, 0x6d, 0x28, 0xf5, 0xd4, 0x6e, 0xcf, 0xe8, 0xf6, 0x74, 0xb5,
-	0xd6, 0x94, 0xb2, 0xb2, 0xc2, 0xb8, 0x5e, 0x9b, 0xe3, 0x0a, 0x88, 0x1f, 0xf8, 0x81, 0x47, 0x89,
-	0xc7, 0xb7, 0xd0, 0x1d, 0x28, 0x35, 0x6b, 0x9d, 0x48, 0x54, 0xee, 0x0c, 0x51, 0x43, 0x3c, 0x32,
-	0xb8, 0x38, 0x9f, 0x72, 0xdd, 0x87, 0x95, 0xa6, 0xaa, 0x3f, 0x54, 0x23, 0xbe, 0xbc, 0xfc, 0x15,
-	0xc6, 0xf7, 0xc6, 0x3c, 0x1f, 0xf1, 0x0e, 0x48, 0x8c, 0x53, 0x09, 0xe0, 0x4a, 0x9d, 0x8c, 0x3c,
-	0xd2, 0xc7, 0x01, 0x31, 0x63, 0x93, 0xf6, 0x0e, 0x64, 0x74, 0xb5, 0x56, 0x97, 0x12, 0xf2, 0x6b,
-	0x0c, 0xe8, 0xda, 0x1c, 0x90, 0x47, 0xb0, 0x29, 0xda, 0xbb, 0xa5, 0xab, 0xb5, 0x9e, 0x6a, 0xec,
-	0x69, 0xea, 0x13, 0x29, 0x79, 0x46, 0x7b, 0xfb, 0x1e, 0xc1, 0x01, 0x31, 0x8e, 0x2d, 0xf2, 0x9c,
-	0x4a, 0xfd, 0xaf, 0xa4, 0xf0, 0xae, 0x7c, 0x2b, 0x20, 0x3e, 0xfa, 0x06, 0xac, 0x6e, 0xb5, 0x9b,
-	0x9b, 0x5a, 0x4b, 0x35, 0x3a, 0xaa, 0xce, 0xe6, 0x32, 0x21, 0xbf, 0xcb, 0x80, 0xde, 0x9c, 0x07,
-	0x72, 0x87, 0xfb, 0x96, 0x43, 0x8c, 0x11, 0xf1, 0xc2, 0xe9, 0xfc, 0x18, 0xa4, 0x90, 0x9b, 0xbb,
-	0x7c, 0x8d, 0x67, 0x52, 0x52, 0xbe, 0xc1, 0xd8, 0x95, 0x33, 0xd8, 0x0f, 0x6c, 0x77, 0x1f, 0xdb,
-	0x36, 0xe3, 0xbf, 0x09, 0x45, 0x5d, 0xed, 0xee, 0xec, 0x6e, 0x6f, 0x37, 0x54, 0x29, 0x25, 0xbf,
-	0xc9, 0x18, 0xbf, 0x74, 0xa2, 0xbf, 0xfe, 0xe1, 0x78, 0x30, 0xb0, 0x89, 0xe8, 0xf4, 0x13, 0x5d,
-	0xeb, 0xa9, 0xc6, 0xb6, 0xd6, 0x50, 0xbb, 0x52, 0xfa, 0xac, 0xf5, 0xe0, 0x59, 0x01, 0x31, 0x06,
-	0x96, 0x4d, 0xd8, 0x50, 0xff, 0x36, 0x05, 0x97, 0xb6, 0xb8, 0xfc, 0x98, 0x67, 0xa9, 0x83, 0x3c,
-	0xd7, 0x77, 0xa3, 0xa3, 0xab, 0x82, 0x24, 0x25, 0xe4, 0x75, 0x06, 0xfd, 0xc1, 0x8b, 0x87, 0xc1,
-	0xa0, 0x33, 0xc8, 0x49, 0xb4, 0x7d, 0xfb, 0xa0, 0xcc, 0x63, 0xf2, 0xe5, 0x51, 0xdb, 0xda, 0xda,
-	0x6d, 0xee, 0x36, 0x6a, 0xbd, 0xb6, 0x4e, 0x9d, 0xe7, 0x0d, 0x86, 0x7d, 0xe7, 0x0b, 0xb0, 0xf9,
-	0x9a, 0xc1, 0xfd, 0xfe, 0x78, 0x38, 0xb6, 0x71, 0xe0, 0x7a, 0x6c, 0xc9, 0x7d, 0x17, 0xde, 0x98,
-	0x97, 0xa1, 0x3e, 0xed, 0xe9, 0xb5, 0xad, 0x9e, 0xd1, 0xde, 0xed, 0x75, 0x76, 0x7b, 0xd4, 0xbb,
-	0xbe, 0xc7, 0x04, 0xdc, 0xfa, 0x02, 0x01, 0xe4, 0xf3, 0xc0, 0xc3, 0xfd, 0xc0, 0x10, 0x16, 0x92,
-	0xa2, 0x3f, 0x82, 0x6b, 0xd1, 0x9c, 0x52, 0x15, 0x57, 0xeb, 0xc6, 0x5e, 0xad, 0xb1, 0xcb, 0x06,
-	0xbb, 0xca, 0x40, 0x6f, 0x9c, 0x35, 0xb3, 0x54, 0xd9, 0x89, 0x69, 0x30, 0x33, 0xc5, 0xc6, 0xfd,
-	0x0f, 0x32, 0xf0, 0x6a, 0x77, 0x64, 0x5b, 0x41, 0x80, 0xf7, 0x6d, 0xd2, 0xc1, 0x5e, 0xdd, 0x8d,
-	0x8d, 0x7f, 0x03, 0xae, 0x76, 0x6a, 0x9a, 0x6e, 0x3c, 0xd1, 0x7a, 0x3b, 0x86, 0xae, 0x76, 0x7b,
-	0xba, 0xb6, 0xd5, 0xd3, 0xda, 0x2d, 0x29, 0x21, 0xdf, 0x62, 0x82, 0xbe, 0x3a, 0x27, 0xc8, 0x37,
-	0x07, 0xc6, 0x08, 0x5b, 0x9e, 0xf1, 0xdc, 0x0a, 0x0e, 0x0d, 0x8f, 0xf8, 0x81, 0x67, 0xb1, 0x2d,
-	0x8b, 0xb6, 0xbb, 0x0e, 0x97, 0xba, 0x9d, 0x86, 0xd6, 0x9b, 0x41, 0x4a, 0xca, 0x5f, 0x63, 0x48,
-	0xef, 0x9e, 0x82, 0xe4, 0xd3, 0x86, 0xcd, 0xa3, 0xb4, 0xe0, 0x5a, 0x47, 0x6f, 0x6f, 0xa9, 0xdd,
-	0x2e, 0x1d, 0x57, 0xb5, 0x6e, 0xa8, 0x0d, 0xb5, 0xa9, 0xb6, 0xd8, 0x90, 0x9e, 0xbe, 0x1e, 0x58,
-	0xa3, 0x3c, 0xb7, 0x4f, 0x7c, 0x9f, 0x0e, 0x29, 0x31, 0x0d, 0x62, 0x13, 0xe6, 0xf1, 0x50, 0xbc,
-	0x4d, 0x90, 0x42, 0xbc, 0x08, 0x29, 0x2d, 0x7f, 0xc0, 0x90, 0xde, 0x79, 0x01, 0x52, 0x1c, 0xe3,
-	0x29, 0x7c, 0x89, 0xf7, 0xac, 0xd6, 0xaa, 0x1b, 0x5d, 0xed, 0x53, 0x35, 0xde, 0x45, 0x6a, 0x13,
-	0x4f, 0x9f, 0xeb, 0x69, 0x1f, 0xb1, 0x63, 0x1a, 0xbe, 0xf5, 0x43, 0x12, 0xef, 0x2c, 0x43, 0x76,
-	0xe1, 0xdd, 0xb0, 0x75, 0x14, 0x77, 0xda, 0x5b, 0x26, 0x6a, 0x46, 0x4a, 0x56, 0xde, 0x64, 0x52,
-	0xbe, 0xf1, 0x82, 0x46, 0x53, 0x19, 0x51, 0xf7, 0x99, 0xd4, 0x39, 0x81, 0xca, 0xef, 0x26, 0xe1,
-	0x5a, 0xb8, 0x6f, 0x75, 0x2d, 0x93, 0xb0, 0xbd, 0xb3, 0x37, 0x19, 0x11, 0x5f, 0x39, 0x84, 0x8c,
-	0xea, 0x8c, 0x87, 0xe8, 0x43, 0x28, 0x68, 0x3d, 0x55, 0xaf, 0x6d, 0x36, 0xa8, 0x0e, 0xc6, 0x4d,
-	0x82, 0x6f, 0x99, 0xc4, 0x60, 0x0e, 0xc2, 0x86, 0x15, 0x10, 0x8f, 0x2e, 0x29, 0xda, 0x89, 0x0f,
-	0xa1, 0xd0, 0xdc, 0x6d, 0xf4, 0xb4, 0x66, 0xad, 0x23, 0x25, 0xcf, 0x62, 0x18, 0x8e, 0xed, 0xc0,
-	0x1a, 0xe2, 0x11, 0x6d, 0xc4, 0xcf, 0x52, 0x50, 0x8a, 0xb9, 0xe5, 0xf3, 0xbe, 0x54, 0xf2, 0x84,
-	0x2f, 0xf5, 0x2a, 0x14, 0x58, 0xe8, 0x63, 0x58, 0xa6, 0xd8, 0x8a, 0xf3, 0xec, 0x5b, 0x33, 0x51,
-	0x07, 0xc0, 0xf2, 0x8d, 0x7d, 0x77, 0xec, 0x98, 0xc4, 0x64, 0x7e, 0x5e, 0x65, 0xfd, 0xd6, 0x02,
-	0x0e, 0x85, 0xe6, 0x6f, 0x72, 0x9e, 0x2a, 0xed, 0xb4, 0x5e, 0xb4, 0xc2, 0x6f, 0xb4, 0x0e, 0x57,
-	0x4f, 0xc4, 0x8a, 0x13, 0x2a, 0x39, 0xc3, 0x24, 0x9f, 0x08, 0xf2, 0x26, 0x9a, 0x79, 0xc2, 0xb1,
-	0xc9, 0x5e, 0xdc, 0xdf, 0xfc, 0x69, 0x1e, 0xca, 0x4c, 0x61, 0x3b, 0x78, 0x62, 0xbb, 0xd8, 0x44,
-	0x0f, 0x21, 0x6b, 0xba, 0xc6, 0xc0, 0x11, 0x1e, 0xe5, 0xfa, 0x02, 0xe0, 0x5d, 0xf3, 0x68, 0xd6,
-	0xa9, 0x34, 0xdd, 0x6d, 0x07, 0x35, 0x00, 0x46, 0xd8, 0xc3, 0x43, 0x12, 0xd0, 0xa8, 0x94, 0xc7,
-	0xdb, 0x1f, 0x2c, 0xe2, 0xde, 0x85, 0x4c, 0x7a, 0x8c, 0x1f, 0x7d, 0x1f, 0x4a, 0xd3, 0x69, 0x0e,
-	0x3d, 0xd0, 0x4f, 0x16, 0x83, 0x8b, 0x3a, 0x57, 0x8d, 0xd6, 0x62, 0x98, 0x21, 0xf0, 0x23, 0x02,
-	0x93, 0x10, 0xd0, 0x2d, 0x94, 0xba, 0xc4, 0xa1, 0x3f, 0xba, 0xbc, 0x04, 0x0a, 0x41, 0x47, 0x21,
-	0x92, 0x10, 0x11, 0xa8, 0x84, 0xc0, 0x1a, 0x12, 0x4f, 0x48, 0xc8, 0x9e, 0x4f, 0x42, 0x8f, 0x42,
-	0xc4, 0x25, 0x04, 0x11, 0x01, 0xbd, 0x0e, 0xe0, 0x47, 0x76, 0x98, 0xf9, 0xbd, 0x05, 0x3d, 0x46,
-	0x41, 0x37, 0xe1, 0x4a, 0x4c, 0x55, 0x8d, 0x68, 0xb5, 0xe7, 0xd9, 0x9a, 0x43, 0xb1, 0xb2, 0x2d,
-	0xb1, 0xf0, 0x6f, 0xc3, 0x55, 0x8f, 0xfc, 0x60, 0x4c, 0x3d, 0x28, 0x63, 0x60, 0x39, 0xd8, 0xb6,
-	0x7e, 0x88, 0x69, 0xf9, 0x5a, 0x81, 0x81, 0x5f, 0x09, 0x0b, 0xb7, 0x63, 0x65, 0xf2, 0x11, 0xac,
-	0xce, 0x8d, 0xf4, 0x29, 0x5e, 0xef, 0xe6, 0x6c, 0x40, 0xb8, 0xc8, 0xd2, 0x88, 0x40, 0xe3, 0xfe,
-	0x35, 0x15, 0x36, 0x3b, 0xe8, 0x2f, 0x49, 0x58, 0x08, 0x3a, 0x27, 0x6c, 0x6e, 0xfc, 0x5f, 0x8e,
-	0xb0, 0x08, 0x34, 0xee, 0xfd, 0xff, 0x22, 0x09, 0xc5, 0x48, 0x1b, 0xd0, 0x23, 0xc8, 0x04, 0x93,
-	0x11, 0xb7, 0x5b, 0x95, 0xf5, 0xbb, 0xcb, 0x68, 0x52, 0x95, 0x9a, 0x5e, 0x6e, 0x81, 0x18, 0x86,
-	0xfc, 0x29, 0x64, 0x28, 0x49, 0xd1, 0x85, 0x31, 0x5e, 0x85, 0xd2, 0x6e, 0xab, 0xdb, 0x51, 0xb7,
-	0xb4, 0x6d, 0x4d, 0xad, 0x4b, 0x09, 0x04, 0x90, 0xe3, 0x8e, 0xae, 0x94, 0x44, 0x57, 0x40, 0xea,
-	0x68, 0x1d, 0xb5, 0x41, 0x5d, 0x85, 0x76, 0x87, 0x6f, 0x13, 0x29, 0xf4, 0x0a, 0x5c, 0x8e, 0x6d,
-	0x1c, 0x06, 0xf5, 0x4b, 0x1e, 0xab, 0xba, 0x94, 0x56, 0xfe, 0x36, 0x0d, 0xc5, 0x68, 0xec, 0x90,
-	0x0e, 0xc0, 0x3a, 0x64, 0xc4, 0xa2, 0xd4, 0x45, 0x0c, 0xe7, 0x1e, 0x65, 0x8a, 0x60, 0x76, 0x12,
-	0x7a, 0x91, 0xc1, 0x30, 0xcc, 0x06, 0x14, 0xf6, 0xf1, 0x01, 0x47, 0x4c, 0x2d, 0x1c, 0xf7, 0x6e,
-	0xe2, 0x83, 0x38, 0x5e, 0x7e, 0x1f, 0x1f, 0x30, 0xb4, 0xcf, 0xa0, 0xc2, 0x3d, 0x1b, 0x66, 0x88,
-	0x29, 0x26, 0x0f, 0xe3, 0x3f, 0x5a, 0x2c, 0x8b, 0xc0, 0x19, 0xe3, 0xc8, 0x2b, 0x11, 0x5c, 0xd8,
-	0x5a, 0x1a, 0x4b, 0x30, 0xe4, 0xcc, 0xc2, 0xad, 0x6d, 0xe2, 0xd1, 0x4c, 0x6b, 0x87, 0x78, 0x14,
-	0xa2, 0xf9, 0x24, 0xe0, 0x68, 0xd9, 0x85, 0xd1, 0xba, 0x24, 0x98, 0x41, 0xf3, 0x49, 0x40, 0x7f,
-	0x6e, 0xe6, 0x78, 0xf6, 0x40, 0xf9, 0x2a, 0x54, 0x66, 0x07, 0x7c, 0x66, 0x2f, 0x4c, 0xce, 0xec,
-	0x85, 0xca, 0x7d, 0x28, 0xc7, 0xc7, 0x12, 0xdd, 0x00, 0x29, 0xf4, 0x05, 0xe6, 0x58, 0x2a, 0x82,
-	0x2e, 0x8c, 0x89, 0xf2, 0xd3, 0x24, 0xa0, 0x93, 0x43, 0x46, 0xad, 0x52, 0xcc, 0xf7, 0x9d, 0x07,
-	0x41, 0xb1, 0xb2, 0xd0, 0x2a, 0x7d, 0x9b, 0x65, 0x7d, 0x98, 0x37, 0x3a, 0x70, 0xc4, 0x1a, 0x38,
-	0xcf, 0x4e, 0x55, 0x14, 0x28, 0xdb, 0x8e, 0xb2, 0x07, 0xe5, 0xf8, 0x98, 0xa3, 0xeb, 0x50, 0xa6,
-	0x9e, 0xf3, 0x5c, 0x63, 0xe0, 0x88, 0x4c, 0xc2, 0x46, 0xbc, 0x0d, 0x15, 0xbe, 0xb4, 0xe7, 0x9c,
-	0x86, 0x32, 0xa3, 0x6e, 0x4d, 0x47, 0x2b, 0x3e, 0xfa, 0x4b, 0x8c, 0xd6, 0x4f, 0x92, 0x50, 0x8c,
-	0xec, 0x02, 0xea, 0xf2, 0xcd, 0xc3, 0x30, 0xdd, 0x21, 0xb6, 0x1c, 0x61, 0x05, 0xd6, 0x17, 0x34,
-	0x2d, 0x75, 0xc6, 0xc4, 0x2d, 0x00, 0xdb, 0x2f, 0x38, 0x81, 0x76, 0x81, 0xef, 0x48, 0xf3, 0x5d,
-	0x60, 0xd4, 0xb0, 0x21, 0xdf, 0x82, 0x62, 0xe4, 0xc7, 0x28, 0xb7, 0xcf, 0x32, 0x19, 0x2b, 0x50,
-	0xdc, 0x6d, 0x6d, 0xb6, 0x77, 0x5b, 0x75, 0xb5, 0x2e, 0x25, 0x51, 0x09, 0xf2, 0xe1, 0x47, 0x4a,
-	0xf9, 0x8b, 0x24, 0x94, 0x74, 0x82, 0xcd, 0xd0, 0xc9, 0x78, 0x04, 0x39, 0xdf, 0x1d, 0x7b, 0x7d,
-	0x72, 0x01, 0x2f, 0x43, 0x20, 0xcc, 0xb9, 0x66, 0xa9, 0x8b, 0xbb, 0x66, 0x8a, 0x09, 0x97, 0x78,
-	0x5a, 0x55, 0x73, 0x82, 0xc8, 0x2f, 0x6a, 0x43, 0x51, 0x64, 0x1f, 0x2e, 0xe4, 0x1b, 0x15, 0x38,
-	0xc8, 0xb6, 0xa3, 0xfc, 0x71, 0x12, 0x2a, 0x22, 0x58, 0x0d, 0x65, 0xcc, 0x2e, 0xeb, 0xe4, 0x4b,
-	0x58, 0xd6, 0x67, 0xea, 0x56, 0xea, 0x2c, 0xdd, 0x52, 0xfe, 0x25, 0x07, 0x97, 0x7a, 0xc4, 0x0f,
-	0xba, 0x2c, 0x63, 0x12, 0x36, 0xed, 0x6c, 0x7b, 0x80, 0x74, 0xc8, 0x91, 0x63, 0x96, 0x7e, 0x4d,
-	0x2d, 0x9c, 0xc3, 0x3b, 0x21, 0xa0, 0xaa, 0x52, 0x08, 0x5d, 0x20, 0xc9, 0xff, 0x99, 0x81, 0x2c,
-	0xa3, 0xa0, 0x63, 0x58, 0x7d, 0x8e, 0x03, 0xe2, 0x0d, 0xb1, 0x77, 0x64, 0xb0, 0x52, 0x31, 0x30,
-	0x8f, 0xcf, 0x2f, 0xa6, 0x5a, 0x33, 0x8f, 0xb1, 0xd3, 0x27, 0x4f, 0x42, 0xe0, 0x9d, 0x84, 0x5e,
-	0x89, 0xa4, 0x70, 0xb9, 0x3f, 0x49, 0xc2, 0x55, 0x11, 0xf0, 0xd0, 0x8d, 0x81, 0xe9, 0x1e, 0x17,
-	0xcf, 0xcd, 0x4d, 0xe7, 0xe2, 0xe2, 0x3b, 0x11, 0x3c, 0xd5, 0xd1, 0x9d, 0x84, 0x7e, 0x79, 0x34,
-	0x43, 0xe1, 0x0d, 0x19, 0xc2, 0x4a, 0x68, 0x30, 0xb8, 0x7c, 0xbe, 0x3d, 0x6d, 0x5f, 0x48, 0xbe,
-	0xa9, 0x8a, 0xc0, 0x73, 0x27, 0xa1, 0x97, 0x05, 0x3c, 0x2b, 0x93, 0xef, 0x81, 0x34, 0x3f, 0x3a,
-	0xe8, 0x2d, 0x58, 0x71, 0xc8, 0x73, 0x23, 0x1a, 0x21, 0x36, 0x03, 0x69, 0xbd, 0xec, 0x90, 0xe7,
-	0x51, 0x25, 0x79, 0x13, 0xae, 0x9e, 0xda, 0x2f, 0xf4, 0x1e, 0x48, 0x98, 0x17, 0x18, 0xe6, 0xd8,
-	0xe3, 0xde, 0x23, 0x07, 0x58, 0x15, 0xf4, 0xba, 0x20, 0xcb, 0x1e, 0x94, 0x62, 0x6d, 0x43, 0x7d,
-	0x28, 0x84, 0x01, 0xb2, 0x38, 0x11, 0x7c, 0x78, 0xae, 0x5e, 0xd3, 0x66, 0xf8, 0x01, 0x1e, 0x8e,
-	0x48, 0x88, 0xad, 0x47, 0xc0, 0x9b, 0x79, 0xc8, 0xb2, 0x71, 0x95, 0xbf, 0x03, 0xe8, 0x64, 0x45,
-	0xf4, 0x2e, 0xac, 0x12, 0x87, 0x2e, 0xf5, 0x28, 0xe2, 0x65, 0x8d, 0x2f, 0xeb, 0x15, 0x41, 0x0e,
-	0x2b, 0xbe, 0x06, 0xc5, 0x20, 0x64, 0x67, 0x6b, 0x24, 0xad, 0x4f, 0x09, 0xca, 0x7f, 0xa7, 0xe1,
-	0xd2, 0x13, 0xcf, 0x0a, 0xc8, 0xb6, 0x65, 0x13, 0x3f, 0xd4, 0xaa, 0x6d, 0xc8, 0xf8, 0x96, 0x73,
-	0x74, 0x91, 0x58, 0x8b, 0xf2, 0xa3, 0xef, 0xc0, 0x2a, 0x8d, 0xd2, 0x71, 0x60, 0x0c, 0x44, 0xe1,
-	0x05, 0x36, 0xc5, 0x0a, 0x87, 0x0a, 0x69, 0x74, 0x04, 0xb8, 0xd1, 0x22, 0xa6, 0xc1, 0x12, 0x6e,
-	0x3e, 0x5b, 0x82, 0x05, 0xbd, 0x12, 0x92, 0x59, 0xc7, 0x7c, 0xf4, 0x0d, 0x90, 0xc5, 0xd9, 0xb8,
-	0x49, 0xbd, 0xce, 0xa1, 0xe5, 0x10, 0xd3, 0xf0, 0x0f, 0xb1, 0x67, 0x5a, 0xce, 0x01, 0xf3, 0x7d,
-	0x0a, 0xfa, 0x1a, 0xaf, 0x51, 0x8f, 0x2a, 0x74, 0x45, 0x39, 0x22, 0xb3, 0x11, 0x1e, 0x8f, 0x8e,
-	0xea, 0x8b, 0x1c, 0x81, 0xcd, 0x0f, 0xeb, 0x8b, 0xc2, 0xbc, 0xff, 0xd7, 0xd8, 0x44, 0xf9, 0x11,
-	0x64, 0x99, 0x59, 0x7d, 0x39, 0xc7, 0x34, 0x55, 0xb8, 0x1c, 0x1d, 0x55, 0x45, 0x96, 0x3c, 0x3c,
-	0xac, 0xb9, 0x14, 0x15, 0x09, 0x43, 0xee, 0x2b, 0xff, 0x96, 0x81, 0x4a, 0x98, 0x85, 0xe1, 0xe7,
-	0x80, 0xca, 0x6f, 0x32, 0x62, 0xfb, 0x7e, 0x1b, 0xb2, 0x9b, 0xcf, 0x7a, 0x6a, 0x57, 0x4a, 0xc8,
-	0xaf, 0xb2, 0x54, 0xca, 0x65, 0x96, 0x4a, 0x61, 0xa8, 0x1b, 0xfb, 0x93, 0x80, 0x25, 0xf6, 0xd0,
-	0x4d, 0x28, 0x51, 0x17, 0xbf, 0xf5, 0xd0, 0xd8, 0xed, 0x6d, 0xdf, 0x97, 0x60, 0x26, 0x97, 0xcf,
-	0xeb, 0xd2, 0x88, 0xd1, 0x39, 0x30, 0xc6, 0xc1, 0xe0, 0x3e, 0xe5, 0x78, 0x1d, 0x52, 0x8f, 0xf7,
-	0xa4, 0xa4, 0x7c, 0x8d, 0x55, 0x94, 0x62, 0x15, 0x8f, 0x8e, 0x69, 0xf9, 0x3b, 0x90, 0xdb, 0xab,
-	0xe9, 0x5a, 0xab, 0x27, 0xa5, 0x64, 0x99, 0xd5, 0xb9, 0x12, 0xab, 0x73, 0x8c, 0x3d, 0xcb, 0x09,
-	0x44, 0xbd, 0x7a, 0x7b, 0x77, 0xb3, 0xa1, 0x4a, 0xa5, 0x53, 0xea, 0x99, 0xee, 0x58, 0x64, 0x85,
-	0xde, 0x8f, 0xa5, 0x91, 0xd2, 0x33, 0x99, 0x74, 0x5e, 0x33, 0x9e, 0x41, 0x7a, 0x1b, 0xb2, 0x3d,
-	0xad, 0xa9, 0xea, 0x52, 0xe6, 0x94, 0x3e, 0x33, 0x8f, 0x87, 0x67, 0xfa, 0x57, 0xb5, 0x56, 0x4f,
-	0xd5, 0xf7, 0xa2, 0x9b, 0x0d, 0x52, 0x76, 0x26, 0xfd, 0x2c, 0x80, 0x9d, 0x80, 0x78, 0xc7, 0xd8,
-	0x16, 0xa9, 0x7e, 0x9e, 0xb4, 0x5e, 0x69, 0xa8, 0xad, 0x87, 0xbd, 0x1d, 0xa3, 0xa3, 0xab, 0xdb,
-	0xda, 0x53, 0x29, 0x37, 0x93, 0xa6, 0xe2, 0x7c, 0x36, 0x71, 0x0e, 0x82, 0x43, 0x63, 0xe4, 0x91,
-	0x81, 0xf5, 0xb9, 0xe0, 0x9a, 0xb9, 0x47, 0x21, 0xe5, 0x4f, 0xe1, 0xe2, 0xd9, 0xf4, 0x98, 0xac,
-	0xbb, 0x50, 0xe1, 0xd5, 0xc3, 0xbc, 0xad, 0x54, 0x98, 0x39, 0xfd, 0xe0, 0x6c, 0x91, 0xde, 0xf2,
-	0x25, 0xc9, 0xd2, 0xa7, 0x57, 0xbb, 0xbd, 0x5a, 0x4f, 0x35, 0x36, 0x69, 0xbc, 0x56, 0x37, 0xa2,
-	0xc1, 0x2b, 0xca, 0xef, 0x31, 0xf6, 0xb7, 0x66, 0xe6, 0x16, 0x07, 0xc4, 0xd8, 0xc7, 0xfd, 0x23,
-	0x62, 0x1a, 0xb1, 0x91, 0x54, 0x7e, 0x09, 0x90, 0xeb, 0xf6, 0x0f, 0xc9, 0x10, 0xa3, 0x87, 0x90,
-	0x1b, 0x58, 0xc4, 0x36, 0x43, 0x0b, 0xbd, 0x50, 0x38, 0xc2, 0x58, 0xab, 0xdb, 0x94, 0x4f, 0x17,
-	0xec, 0xa8, 0x02, 0xa9, 0xc8, 0x2f, 0x49, 0x59, 0xa6, 0xfc, 0x57, 0x49, 0x28, 0x35, 0xdc, 0x03,
-	0xab, 0x8f, 0x6d, 0x1a, 0xab, 0x8a, 0xf2, 0x64, 0x58, 0x8e, 0x10, 0x64, 0xb0, 0x77, 0xe0, 0x0b,
-	0x0e, 0xf6, 0x1b, 0x75, 0xa0, 0xb8, 0x8f, 0x7d, 0x62, 0xb0, 0x40, 0x99, 0xef, 0x93, 0xb7, 0x97,
-	0x6c, 0x0f, 0x95, 0xa5, 0x17, 0x28, 0x0a, 0x93, 0xfa, 0x1e, 0x48, 0x3e, 0xf1, 0x2c, 0x6c, 0xb3,
-	0x9c, 0x67, 0xdf, 0xc6, 0xbe, 0xcf, 0x2c, 0x59, 0x59, 0x5f, 0x9d, 0xd2, 0xb7, 0x28, 0x59, 0xfe,
-	0xcb, 0x24, 0xe4, 0x9b, 0x78, 0xc4, 0xd8, 0x5a, 0x50, 0xa0, 0xd1, 0x43, 0x14, 0xb0, 0x9f, 0xb3,
-	0x1d, 0xf9, 0x23, 0x32, 0x61, 0x78, 0x51, 0x18, 0xcd, 0x10, 0x53, 0xe7, 0x47, 0xe4, 0x61, 0x34,
-	0xfd, 0x29, 0xff, 0x47, 0x1a, 0x8a, 0x51, 0x01, 0xf5, 0x6f, 0x29, 0xf6, 0x34, 0x37, 0xba, 0x58,
-	0x74, 0x21, 0x04, 0x50, 0x88, 0x16, 0x1e, 0x12, 0xbd, 0x10, 0x88, 0x5f, 0x48, 0x86, 0x82, 0x33,
-	0xb6, 0x6d, 0x96, 0x89, 0x4a, 0x31, 0xdb, 0x1f, 0x7d, 0xa3, 0x21, 0xbc, 0x32, 0xbd, 0x86, 0x11,
-	0x65, 0x92, 0x2f, 0x38, 0x6b, 0x3b, 0x09, 0xfd, 0xea, 0x14, 0x55, 0x6c, 0xcb, 0xe1, 0x6c, 0xd0,
-	0x10, 0x9c, 0xe1, 0x67, 0x16, 0x4e, 0x41, 0x08, 0x7c, 0x31, 0xa5, 0x22, 0x08, 0x67, 0x78, 0x8f,
-	0x00, 0x3c, 0xf7, 0xb9, 0xe1, 0xb3, 0x0a, 0x22, 0x0c, 0x7f, 0x6f, 0x61, 0xc4, 0x9d, 0x84, 0x5e,
-	0xf4, 0xdc, 0xe7, 0x42, 0x7f, 0x3e, 0x85, 0xb2, 0xcd, 0x57, 0x39, 0x6f, 0x5f, 0x6e, 0xe1, 0xe4,
-	0x83, 0x68, 0x5f, 0x4c, 0x47, 0x76, 0x12, 0x7a, 0xc9, 0x9e, 0x7e, 0x6e, 0x96, 0xc4, 0x9c, 0x5a,
-	0xce, 0xc0, 0x95, 0x7f, 0x9d, 0x84, 0x2c, 0x1b, 0x2b, 0xaa, 0x39, 0xb1, 0x0c, 0x38, 0xfb, 0x8d,
-	0xae, 0x43, 0x29, 0xbc, 0x66, 0x16, 0x7a, 0x0f, 0x45, 0x3d, 0x4e, 0x42, 0x0f, 0x45, 0xfe, 0xe9,
-	0x02, 0x6a, 0xc5, 0x00, 0x84, 0x22, 0xd3, 0x79, 0xc8, 0x32, 0x45, 0xfe, 0x2a, 0x5c, 0x62, 0xae,
-	0x14, 0xdd, 0x46, 0xd8, 0x79, 0x25, 0x6d, 0x40, 0x96, 0x15, 0x4b, 0x61, 0x41, 0x47, 0xd0, 0x95,
-	0x7f, 0x4a, 0x42, 0x21, 0x5c, 0x6c, 0xa8, 0x00, 0x19, 0xba, 0x89, 0x49, 0x09, 0x54, 0x84, 0xac,
-	0xd6, 0xea, 0xdd, 0xba, 0x2b, 0x25, 0xc5, 0xcf, 0xdb, 0xeb, 0x52, 0x4a, 0xfc, 0xbc, 0x7b, 0x47,
-	0x4a, 0xd3, 0x70, 0xb4, 0xae, 0x6e, 0x69, 0xcd, 0x5a, 0x43, 0xca, 0x50, 0xfa, 0x76, 0xa3, 0x5d,
-	0xeb, 0x49, 0x59, 0x04, 0xd1, 0x3e, 0x93, 0xa3, 0xbf, 0xf9, 0x6e, 0x27, 0xe5, 0x51, 0x19, 0x0a,
-	0xf5, 0x5a, 0x4f, 0xa5, 0xfb, 0x85, 0x54, 0xe0, 0xc1, 0x6c, 0xbb, 0xa1, 0xd6, 0x5a, 0x52, 0x91,
-	0x72, 0xf3, 0xad, 0x13, 0xe8, 0xcf, 0x9a, 0xae, 0xd7, 0x9e, 0x49, 0x25, 0x94, 0x87, 0x74, 0xb3,
-	0xd6, 0x91, 0x56, 0xe8, 0x0f, 0xbd, 0xfd, 0x44, 0xaa, 0x20, 0x09, 0xca, 0x8d, 0xf6, 0x43, 0x6d,
-	0xab, 0xd6, 0x30, 0x7a, 0xcf, 0x3a, 0xaa, 0xb4, 0xaa, 0xfc, 0x5e, 0x2e, 0x8c, 0x2c, 0x63, 0x79,
-	0xfd, 0x97, 0x1e, 0x59, 0xa2, 0x3d, 0x28, 0xf3, 0x13, 0x45, 0x6a, 0xbf, 0xc7, 0xbe, 0x88, 0x89,
-	0x17, 0x99, 0xb1, 0x26, 0x65, 0xeb, 0x32, 0x2e, 0x1e, 0x15, 0x97, 0x86, 0x53, 0x0a, 0x7a, 0x27,
-	0x74, 0x04, 0xa7, 0x61, 0x64, 0x9a, 0xad, 0x93, 0x15, 0x4e, 0x0e, 0x13, 0x23, 0x75, 0xc8, 0x07,
-	0x9e, 0x75, 0x70, 0x40, 0x3c, 0xa1, 0x6d, 0xef, 0x2f, 0xe2, 0xb5, 0x73, 0x0e, 0x3d, 0x64, 0x45,
-	0x04, 0x2e, 0x45, 0xd1, 0x29, 0xb5, 0x12, 0x94, 0x85, 0x2d, 0x8b, 0xca, 0xfa, 0xfd, 0x05, 0xf0,
-	0x6a, 0x31, 0xde, 0xa6, 0x6b, 0x8a, 0xf4, 0xa7, 0x84, 0xe7, 0xc8, 0xa8, 0x0b, 0x25, 0x7e, 0x2a,
-	0xca, 0x42, 0x3c, 0xa6, 0x7e, 0x8b, 0x59, 0x3e, 0x7e, 0xa9, 0x83, 0x46, 0x0c, 0x22, 0xaf, 0xe2,
-	0x46, 0x04, 0xb4, 0x0f, 0x52, 0xdf, 0x76, 0x59, 0xe0, 0xb8, 0x4f, 0x0e, 0xf1, 0xb1, 0xe5, 0x7a,
-	0x2c, 0xc7, 0x5e, 0x59, 0xbf, 0xb7, 0x48, 0x56, 0x91, 0xb3, 0x6e, 0x0a, 0x4e, 0x0e, 0xbf, 0xda,
-	0x9f, 0xa5, 0xb2, 0xb0, 0xca, 0xb6, 0xd9, 0xee, 0x6e, 0xe3, 0x80, 0x38, 0xc4, 0xf7, 0x59, 0x52,
-	0x9e, 0x86, 0x55, 0x9c, 0xde, 0x10, 0x64, 0xf4, 0x19, 0x54, 0xda, 0x0e, 0x6d, 0x58, 0xc8, 0xbc,
-	0x56, 0x5c, 0x38, 0x89, 0x3c, 0xcb, 0xc8, 0xdb, 0x32, 0x87, 0x86, 0x6e, 0xc1, 0x55, 0xec, 0xfb,
-	0xd6, 0x81, 0xe3, 0x1b, 0x81, 0x6b, 0xb8, 0x4e, 0x78, 0xff, 0x61, 0x0d, 0x98, 0xdd, 0x47, 0xa2,
-	0xb0, 0xe7, 0xb6, 0x1d, 0xc2, 0xd7, 0xbf, 0xf2, 0x5d, 0x28, 0xc5, 0x16, 0x9b, 0xd2, 0x3c, 0x2b,
-	0xab, 0xb4, 0x0a, 0xa5, 0x56, 0xbb, 0xc5, 0x0e, 0xd7, 0xa9, 0x62, 0x26, 0x19, 0x41, 0x55, 0xeb,
-	0x5d, 0x7e, 0xde, 0x2e, 0xa5, 0x10, 0x82, 0x4a, 0xad, 0xa1, 0xab, 0xb5, 0xba, 0x38, 0x82, 0xaf,
-	0x4b, 0x69, 0xe5, 0x7b, 0x20, 0xcd, 0xcf, 0xbf, 0xa2, 0x9d, 0x25, 0xa2, 0x02, 0x50, 0xd7, 0xba,
-	0x5b, 0x35, 0xbd, 0xce, 0x25, 0x48, 0x50, 0x8e, 0x4e, 0xf1, 0x29, 0x25, 0x45, 0x6b, 0xe8, 0x2a,
-	0x3b, 0x79, 0xa7, 0xdf, 0x69, 0xe5, 0xdb, 0xb0, 0x3a, 0x37, 0x47, 0xca, 0xc7, 0x2f, 0xe8, 0x80,
-	0xda, 0xd4, 0x7a, 0x46, 0xad, 0xf1, 0xa4, 0xf6, 0xac, 0xcb, 0xd3, 0xe9, 0x8c, 0xa0, 0x6d, 0x1b,
-	0xad, 0x76, 0x4b, 0x6d, 0x76, 0x7a, 0xcf, 0xa4, 0x94, 0xd2, 0x99, 0x9f, 0xa2, 0x17, 0x22, 0x6e,
-	0x6b, 0xba, 0x3a, 0x83, 0xc8, 0x08, 0xb3, 0x88, 0xfb, 0x00, 0xd3, 0x25, 0xaa, 0xf4, 0xce, 0x42,
-	0xbb, 0x04, 0x2b, 0x6a, 0xab, 0x6e, 0xb4, 0xb7, 0x8d, 0x28, 0xe1, 0x8f, 0xa0, 0xd2, 0xa8, 0xb1,
-	0x8b, 0x35, 0x5a, 0xcb, 0xe8, 0xd4, 0x5a, 0x74, 0x94, 0x69, 0xab, 0x6b, 0x7a, 0x43, 0x8b, 0x53,
-	0xd3, 0x8a, 0x0d, 0x30, 0x4d, 0x2f, 0x2a, 0x9f, 0xbd, 0x60, 0x84, 0xd5, 0x3d, 0xb5, 0xd5, 0x63,
-	0xd7, 0x83, 0xa5, 0x24, 0xba, 0x0c, 0xab, 0xe2, 0x3c, 0x9a, 0x86, 0x16, 0x8c, 0x98, 0x42, 0xd7,
-	0xe1, 0xb5, 0xee, 0xb3, 0xd6, 0xd6, 0x8e, 0xde, 0x6e, 0xb1, 0x33, 0xea, 0xf9, 0x1a, 0x69, 0xe5,
-	0xe7, 0x12, 0xe4, 0x85, 0x99, 0x40, 0x3a, 0x14, 0xf1, 0x20, 0x20, 0x9e, 0x81, 0x6d, 0x7b, 0x09,
-	0x0f, 0x4b, 0xb0, 0x57, 0x6b, 0x94, 0xb7, 0x66, 0xdb, 0x3b, 0x09, 0xbd, 0x80, 0xc5, 0xef, 0x18,
-	0xa6, 0x33, 0x59, 0xc2, 0xc7, 0x9a, 0xc5, 0x74, 0x26, 0x53, 0x4c, 0x67, 0x82, 0x76, 0x01, 0x38,
-	0x26, 0xc1, 0xfd, 0x43, 0xb1, 0x77, 0xde, 0x59, 0x16, 0x54, 0xc5, 0xfd, 0x43, 0xea, 0x35, 0xe0,
-	0xf0, 0x03, 0xd9, 0x70, 0x59, 0xc0, 0x3a, 0xa6, 0xe1, 0x0e, 0x42, 0x7d, 0xe3, 0xe6, 0xf6, 0xeb,
-	0x4b, 0xe3, 0x3b, 0x66, 0x7b, 0xc0, 0x15, 0x73, 0x27, 0xa1, 0x4b, 0x78, 0x8e, 0x86, 0x02, 0xb8,
-	0xca, 0xa5, 0xcd, 0x25, 0xc4, 0x84, 0xeb, 0xf3, 0xf1, 0xb2, 0xf2, 0x4e, 0x26, 0xbe, 0xf0, 0x49,
-	0x32, 0xfa, 0x69, 0x12, 0x14, 0x2e, 0xd6, 0x9f, 0x38, 0xfd, 0x43, 0xcf, 0x75, 0x98, 0x0f, 0x3e,
-	0xdf, 0x06, 0xee, 0x30, 0x3d, 0x5a, 0xb6, 0x0d, 0xdd, 0x18, 0xe6, 0x89, 0xf6, 0xbc, 0x81, 0x5f,
-	0x5c, 0x05, 0x3d, 0x86, 0x1c, 0xb6, 0x9f, 0xe3, 0x89, 0xbf, 0x56, 0x5e, 0xd8, 0x9f, 0x8c, 0xc4,
-	0x33, 0xc6, 0x9d, 0x84, 0x2e, 0x20, 0x50, 0x0b, 0xf2, 0x26, 0x19, 0xe0, 0xb1, 0x1d, 0xb0, 0x4d,
-	0x62, 0xb1, 0xed, 0x3f, 0x44, 0xab, 0x73, 0x4e, 0xea, 0x9e, 0x0a, 0x10, 0xf4, 0xd9, 0x34, 0x63,
-	0xd8, 0x77, 0xc7, 0x4e, 0xc0, 0xb6, 0x85, 0xd2, 0x42, 0x5b, 0x4f, 0x88, 0xaa, 0x86, 0x47, 0x11,
-	0x63, 0x27, 0x88, 0xa5, 0x08, 0xd9, 0x37, 0xda, 0x81, 0xac, 0x43, 0x8e, 0x09, 0xdf, 0x45, 0x4a,
-	0xeb, 0x37, 0x97, 0xc0, 0x6d, 0x51, 0xbe, 0x9d, 0x84, 0xce, 0x01, 0xa8, 0x76, 0xb8, 0x1e, 0x3f,
-	0x57, 0xb6, 0x27, 0x6c, 0xb7, 0x58, 0x4e, 0x3b, 0xda, 0xde, 0x36, 0xe7, 0xa5, 0xda, 0xe1, 0x86,
-	0x1f, 0x74, 0x76, 0x3c, 0x32, 0x22, 0x38, 0x58, 0x2b, 0x2d, 0x3d, 0x3b, 0x3a, 0x63, 0xa4, 0xb3,
-	0xc3, 0x21, 0xe4, 0xa7, 0x50, 0x08, 0xad, 0x05, 0x6a, 0x40, 0x89, 0xdd, 0x89, 0x65, 0x55, 0xc3,
-	0x88, 0x77, 0x19, 0xef, 0x26, 0xce, 0x3e, 0x45, 0x76, 0x26, 0x2f, 0x19, 0xf9, 0x19, 0x14, 0x23,
-	0xc3, 0xf1, 0x92, 0xa1, 0xff, 0x2e, 0x09, 0xd2, 0xbc, 0xd1, 0x40, 0x6d, 0x58, 0x21, 0xd8, 0xb3,
-	0x27, 0xc6, 0xc0, 0xf2, 0x2c, 0xe7, 0x20, 0xbc, 0x88, 0xbd, 0x8c, 0x90, 0x32, 0x03, 0xd8, 0xe6,
-	0xfc, 0xa8, 0x09, 0x65, 0xea, 0xd4, 0x44, 0x78, 0xa9, 0xa5, 0xf1, 0x4a, 0x94, 0x5f, 0xc0, 0xc9,
-	0xbf, 0x03, 0x97, 0x4f, 0x31, 0x3c, 0xe8, 0x10, 0xae, 0x44, 0x19, 0x5a, 0xe3, 0xc4, 0xeb, 0x93,
-	0x8f, 0x16, 0x3c, 0x5c, 0x63, 0xec, 0xd3, 0xe7, 0x06, 0x97, 0x83, 0x13, 0x34, 0x5f, 0x7e, 0x13,
-	0xde, 0xf8, 0x02, 0xab, 0x23, 0x17, 0x21, 0x2f, 0x74, 0x59, 0xbe, 0x0d, 0xe5, 0xb8, 0x02, 0xa2,
-	0xb7, 0xe6, 0x15, 0x3a, 0xc9, 0xa2, 0xa3, 0x19, 0xad, 0x94, 0xf3, 0x90, 0x65, 0xda, 0x25, 0x17,
-	0x20, 0xc7, 0x4d, 0x8c, 0xfc, 0x47, 0x49, 0x28, 0x46, 0x2a, 0x82, 0x3e, 0x86, 0x4c, 0x74, 0x74,
-	0xb8, 0xdc, 0x58, 0x32, 0x3e, 0xea, 0xd6, 0x87, 0x9a, 0xba, 0xfc, 0x74, 0x84, 0xac, 0x72, 0x0f,
-	0x72, 0x5c, 0xc5, 0x68, 0x14, 0x3d, 0x5d, 0x58, 0xe7, 0x68, 0x55, 0x8c, 0x7b, 0xb3, 0x18, 0x85,
-	0x1c, 0xca, 0xaf, 0x53, 0xb1, 0x3c, 0xfe, 0xf4, 0x26, 0x7d, 0x17, 0xb2, 0x26, 0xb1, 0xf1, 0x44,
-	0x08, 0xfa, 0xfa, 0xb9, 0x26, 0xb7, 0x5a, 0xa7, 0x10, 0xd4, 0x7e, 0x31, 0x2c, 0xf4, 0x29, 0x14,
-	0xb0, 0x6d, 0x1d, 0x38, 0x46, 0xe0, 0x8a, 0x31, 0xf9, 0xe6, 0xf9, 0x70, 0x6b, 0x14, 0xa5, 0xe7,
-	0x52, 0x2b, 0x8e, 0xf9, 0x4f, 0xf9, 0x7d, 0xc8, 0x32, 0x69, 0xe8, 0x4d, 0x28, 0x33, 0x69, 0xc6,
-	0xd0, 0xb2, 0x6d, 0xcb, 0x17, 0x67, 0x27, 0x25, 0x46, 0x6b, 0x32, 0x92, 0xfc, 0x00, 0xf2, 0x02,
-	0x01, 0x5d, 0x83, 0xdc, 0x88, 0x78, 0x96, 0xcb, 0x63, 0xb3, 0xb4, 0x2e, 0xbe, 0x28, 0xdd, 0x1d,
-	0x0c, 0x7c, 0x12, 0x30, 0x27, 0x21, 0xad, 0x8b, 0xaf, 0xcd, 0xab, 0x70, 0xf9, 0x14, 0x1d, 0x50,
-	0xfe, 0x30, 0x05, 0xc5, 0x28, 0xa5, 0x8d, 0xf6, 0xa0, 0x82, 0xfb, 0xec, 0xee, 0xdf, 0x08, 0x07,
-	0x01, 0xf1, 0x9c, 0xf3, 0x26, 0xb2, 0x57, 0x38, 0x4c, 0x87, 0xa3, 0xa0, 0xc7, 0x90, 0x3f, 0xb6,
-	0xc8, 0xf3, 0x8b, 0x1d, 0xe2, 0xe7, 0x28, 0xc4, 0xb6, 0x83, 0x3e, 0x83, 0x4b, 0x22, 0x3c, 0x1d,
-	0xe2, 0xd1, 0x88, 0xfa, 0x07, 0x03, 0x47, 0x78, 0x5c, 0xe7, 0x81, 0x15, 0xb1, 0x6e, 0x93, 0x63,
-	0x6d, 0x3b, 0xca, 0x27, 0x50, 0x8a, 0xbd, 0x48, 0x41, 0x12, 0xa4, 0xc7, 0x5e, 0x98, 0x29, 0xa1,
-	0x3f, 0xd1, 0x1a, 0xe4, 0x47, 0xfc, 0x04, 0x82, 0x89, 0x2d, 0xeb, 0xe1, 0xe7, 0xa3, 0x4c, 0x21,
-	0x29, 0xa5, 0x94, 0x3f, 0x49, 0xc2, 0x95, 0x30, 0x1f, 0x1f, 0x7f, 0x32, 0xa3, 0xfc, 0x24, 0x09,
-	0xe5, 0x38, 0x01, 0xbd, 0x0d, 0xb9, 0x7a, 0x9b, 0xdd, 0xa7, 0x49, 0xc8, 0x6b, 0x2c, 0x2d, 0x8b,
-	0x58, 0x5a, 0x96, 0x38, 0xc7, 0x1b, 0xa6, 0xdb, 0x3f, 0xe2, 0x99, 0xea, 0x77, 0x20, 0x2f, 0x9c,
-	0x64, 0x29, 0x39, 0x93, 0xd1, 0xa6, 0xd5, 0x84, 0x9b, 0x44, 0xeb, 0xdd, 0x80, 0x82, 0xfa, 0xb4,
-	0xa7, 0xea, 0xad, 0x5a, 0x63, 0x2e, 0xeb, 0x4e, 0x2b, 0x92, 0xcf, 0xe9, 0x54, 0x60, 0x7b, 0xe3,
-	0xf8, 0x96, 0x72, 0x1f, 0x56, 0xea, 0x0c, 0x3e, 0x3c, 0xa0, 0x7a, 0x17, 0x56, 0xfb, 0xae, 0x13,
-	0x60, 0xcb, 0xa1, 0xf1, 0xfe, 0x10, 0x1f, 0x84, 0x59, 0xa3, 0x4a, 0x44, 0xd6, 0x28, 0x55, 0xf9,
-	0xd7, 0x24, 0x54, 0x84, 0x41, 0x0b, 0x79, 0x2b, 0x90, 0x72, 0xfd, 0x30, 0x61, 0xeb, 0xfa, 0x3c,
-	0x61, 0xdb, 0x3f, 0x9c, 0x26, 0x6c, 0xfb, 0x87, 0x74, 0xc8, 0xfa, 0xee, 0x70, 0x88, 0x9d, 0x30,
-	0x95, 0x10, 0x7e, 0xa2, 0x06, 0xa4, 0x89, 0x73, 0xbc, 0xcc, 0xb3, 0x90, 0x19, 0xe9, 0x55, 0xd5,
-	0x39, 0xe6, 0x87, 0x3f, 0x14, 0x46, 0xbe, 0x0b, 0x85, 0x90, 0xb0, 0xd4, 0x03, 0x8c, 0xff, 0x49,
-	0xc2, 0xaa, 0x2a, 0x06, 0x28, 0xec, 0x57, 0x17, 0x0a, 0xe1, 0x6b, 0x4e, 0xa1, 0x06, 0x8b, 0x78,
-	0x56, 0xb5, 0x91, 0xd5, 0x25, 0xde, 0xb1, 0xd5, 0x27, 0xf5, 0xe8, 0x39, 0xa7, 0x1e, 0x01, 0xa1,
-	0x3d, 0xc8, 0xb1, 0xdb, 0x8e, 0xe1, 0x21, 0xfa, 0x22, 0x3e, 0xf5, 0x5c, 0xc3, 0xf8, 0x7d, 0xaf,
-	0xf0, 0x85, 0x0d, 0x47, 0x93, 0x1f, 0x40, 0x29, 0x46, 0x5e, 0xaa, 0xef, 0x3f, 0x86, 0xd5, 0x39,
-	0x9d, 0x78, 0x39, 0xc7, 0x58, 0x5f, 0x81, 0x4a, 0xec, 0x09, 0xe0, 0xf4, 0x32, 0xc2, 0x4a, 0x8c,
-	0xaa, 0x99, 0xca, 0x06, 0x94, 0x67, 0x64, 0x0b, 0x7d, 0x4b, 0x2e, 0xa0, 0x6f, 0xca, 0x6f, 0x33,
-	0x50, 0x8a, 0x5d, 0x79, 0x45, 0x1a, 0x64, 0xad, 0x80, 0x44, 0x3b, 0xfb, 0xed, 0xe5, 0x6e, 0xcc,
-	0x56, 0xb5, 0x80, 0x0c, 0x75, 0x8e, 0x20, 0x0f, 0x00, 0x34, 0x93, 0x38, 0x81, 0x35, 0xb0, 0x88,
-	0x47, 0x6d, 0x73, 0xfc, 0xa9, 0x98, 0x68, 0x5d, 0x29, 0x98, 0xbe, 0x12, 0xa3, 0x9b, 0xf7, 0xb4,
-	0xca, 0xd4, 0x62, 0x4c, 0xf9, 0x76, 0x3d, 0x27, 0x9c, 0x97, 0x74, 0x34, 0x2f, 0xf2, 0x2f, 0x52,
-	0x90, 0xa1, 0x72, 0x91, 0x16, 0x9d, 0x7b, 0x2c, 0xf6, 0xe4, 0x6a, 0xa6, 0xe1, 0x51, 0x4b, 0x59,
-	0xa6, 0xb5, 0x21, 0x52, 0xb8, 0xa9, 0x85, 0xb3, 0x68, 0x71, 0xb0, 0xb9, 0x4b, 0x84, 0xe8, 0xfd,
-	0x70, 0xe5, 0x70, 0x1b, 0x7b, 0xa5, 0xca, 0xdf, 0x2d, 0x57, 0xc3, 0x77, 0xcb, 0xd5, 0x9a, 0x13,
-	0xbe, 0x46, 0x44, 0x1f, 0x41, 0xc9, 0x3f, 0x74, 0xbd, 0x80, 0x1f, 0x44, 0x89, 0x38, 0xf5, 0x74,
-	0x0e, 0x60, 0x15, 0xd9, 0x75, 0x34, 0xba, 0x38, 0x6d, 0xbc, 0x4f, 0x6c, 0xf1, 0xf0, 0x8d, 0x7f,
-	0xa0, 0x57, 0xa1, 0x60, 0x5b, 0xce, 0x91, 0x31, 0xf6, 0x6c, 0x16, 0xfd, 0x15, 0xf5, 0x3c, 0xfd,
-	0xde, 0xf5, 0x6c, 0xf9, 0xc7, 0xe2, 0x62, 0xe3, 0xf8, 0x05, 0x17, 0x1b, 0x45, 0x8e, 0x97, 0x5d,
-	0x51, 0xd2, 0x5a, 0x3d, 0xf5, 0xa1, 0xaa, 0xf3, 0x5c, 0x31, 0xcf, 0x09, 0xa7, 0xe3, 0xd9, 0xde,
-	0x0c, 0x5a, 0x81, 0x62, 0xf4, 0xa8, 0x59, 0xca, 0xb2, 0xbc, 0xf0, 0xae, 0x5e, 0x63, 0xaf, 0x0e,
-	0x72, 0xa8, 0x02, 0xf0, 0xa8, 0xb6, 0x57, 0x33, 0xb6, 0x1a, 0xb5, 0x6e, 0x57, 0xca, 0x2b, 0xff,
-	0x58, 0x80, 0xab, 0x4d, 0xe2, 0xfb, 0xf8, 0x80, 0x3c, 0xb1, 0x82, 0xc3, 0xd8, 0x23, 0x88, 0x97,
-	0xfc, 0x4e, 0xf1, 0x5b, 0x90, 0x65, 0x39, 0xd8, 0x65, 0x1f, 0x6e, 0x52, 0xd7, 0x85, 0x31, 0xa2,
-	0xef, 0x52, 0xcb, 0x2e, 0x5e, 0x89, 0xc4, 0x94, 0x68, 0xb1, 0x60, 0x69, 0xf6, 0xde, 0xd2, 0x4e,
-	0x42, 0x17, 0x57, 0x28, 0xa3, 0x9b, 0x4c, 0xdf, 0x87, 0x4b, 0xbe, 0x79, 0x14, 0xdd, 0x46, 0x88,
-	0xdf, 0x7e, 0x3c, 0xc7, 0x5e, 0xbc, 0x93, 0xd0, 0x57, 0xfd, 0x39, 0x53, 0xf4, 0x04, 0x2a, 0x23,
-	0xec, 0x19, 0xa6, 0x1b, 0x35, 0x3f, 0xb7, 0xb0, 0x51, 0x8a, 0xdf, 0xa7, 0xa6, 0xd1, 0xed, 0x28,
-	0x7e, 0x01, 0xbe, 0x0d, 0x30, 0x8a, 0x74, 0x53, 0x04, 0xe4, 0xcb, 0xbd, 0x38, 0xde, 0x49, 0xe8,
-	0x31, 0x08, 0xa4, 0x43, 0x29, 0xf6, 0x4a, 0x5c, 0x04, 0xe3, 0x4b, 0xbe, 0x29, 0xde, 0x49, 0xe8,
-	0x71, 0x10, 0xd4, 0x85, 0xb2, 0x47, 0xb0, 0x19, 0xf5, 0xbd, 0xb8, 0x30, 0x68, 0xec, 0x1a, 0x1e,
-	0x05, 0xf5, 0x62, 0xb7, 0xf2, 0x9a, 0x00, 0xd3, 0x1b, 0x18, 0x22, 0x74, 0x5e, 0xea, 0xea, 0x03,
-	0x8d, 0xc2, 0xa3, 0xab, 0x16, 0x68, 0x00, 0x97, 0x63, 0xef, 0xf5, 0xa2, 0xa6, 0x96, 0x97, 0x7c,
-	0xdb, 0x1c, 0xbb, 0x84, 0xb7, 0x93, 0xd0, 0x85, 0x8b, 0x17, 0xbf, 0x99, 0x47, 0x00, 0x9d, 0x7c,
-	0x49, 0xb1, 0xb6, 0x72, 0xfe, 0x27, 0xd4, 0x53, 0x31, 0xf1, 0x63, 0x9a, 0x3d, 0x58, 0x99, 0x5d,
-	0xce, 0x95, 0x73, 0x6d, 0x82, 0x74, 0xbd, 0x0d, 0x62, 0xdf, 0x9b, 0x39, 0xc8, 0x78, 0xae, 0x1b,
-	0x28, 0x3f, 0xcf, 0xc1, 0x35, 0xf5, 0x73, 0xd2, 0x1f, 0xb3, 0xab, 0xfa, 0xdd, 0x00, 0x1f, 0x44,
-	0xda, 0xd4, 0x81, 0x52, 0x6c, 0x6f, 0x14, 0xd6, 0x63, 0xd9, 0x17, 0xd4, 0x71, 0x08, 0x6a, 0x58,
-	0xf9, 0x2c, 0x8b, 0x5d, 0xdf, 0x12, 0x33, 0x76, 0xca, 0x23, 0x0b, 0x75, 0x21, 0x4f, 0xe4, 0xb4,
-	0x76, 0x4f, 0x17, 0x86, 0x66, 0xce, 0x3c, 0xb5, 0x78, 0x7d, 0xe6, 0xbf, 0x1e, 0x32, 0xec, 0xfe,
-	0x4a, 0xfc, 0xcf, 0x1a, 0xd6, 0xa6, 0xcf, 0x82, 0xb3, 0xac, 0x30, 0x7a, 0xda, 0x3b, 0x6b, 0x46,
-	0x73, 0x17, 0x35, 0xa3, 0x03, 0x28, 0x8d, 0x7d, 0xe2, 0xb1, 0x83, 0x32, 0xe2, 0xaf, 0xe5, 0x2f,
-	0xda, 0xe1, 0x5d, 0x9f, 0x78, 0xec, 0xaa, 0x2f, 0xed, 0xf0, 0x38, 0xfc, 0xf0, 0xd1, 0x33, 0xc8,
-	0xb1, 0xfb, 0x25, 0xfe, 0x5a, 0x81, 0x89, 0xa8, 0x9d, 0x5f, 0x04, 0xbb, 0x11, 0xac, 0x99, 0xba,
-	0x00, 0x94, 0xdb, 0x50, 0x8a, 0x0d, 0xf3, 0x22, 0x0e, 0xc9, 0x97, 0x01, 0x6c, 0xb7, 0x8f, 0x6d,
-	0x7e, 0xd4, 0xcf, 0x17, 0x40, 0x91, 0x51, 0x5a, 0x78, 0x48, 0x28, 0x60, 0xac, 0x1b, 0x2f, 0x01,
-	0xf0, 0x31, 0xe4, 0x45, 0xa3, 0x2f, 0x0e, 0xb6, 0xf1, 0x09, 0x14, 0xd8, 0x9f, 0xb0, 0x50, 0xff,
-	0xef, 0xcd, 0x13, 0xfe, 0x03, 0xdd, 0xf3, 0x99, 0xe7, 0xd0, 0x1e, 0xf1, 0xbf, 0xf9, 0xf8, 0xcd,
-	0x9f, 0xfe, 0xf5, 0x53, 0xee, 0x21, 0x50, 0xae, 0x5d, 0xcf, 0xd9, 0xd0, 0x60, 0x85, 0x01, 0xf4,
-	0xc5, 0xbf, 0xa5, 0x2c, 0x82, 0xf2, 0xcf, 0x21, 0x4a, 0x79, 0x3f, 0xf6, 0xaf, 0x2b, 0x9b, 0x5f,
-	0x87, 0x2f, 0xfe, 0xe7, 0x97, 0xcd, 0xa2, 0xce, 0x2e, 0xbc, 0xd5, 0x46, 0xd6, 0xa7, 0xa5, 0x90,
-	0x6e, 0x1c, 0xdf, 0xda, 0xcf, 0x31, 0x71, 0xb7, 0xff, 0x2f, 0x00, 0x00, 0xff, 0xff, 0xeb, 0x81,
-	0x4f, 0x6b, 0x54, 0x46, 0x00, 0x00,
+var fileDescriptor_beam_runner_api_d5fa30116074ddde = []byte{
+	// 5086 bytes of a gzipped FileDescriptorProto
+	0x1f, 0x8b, 0x08, 0x00, 0x00, 0x00, 0x00, 0x00, 0x02, 0xff, 0xbc, 0x5c, 0xdd, 0x6f, 0xdb, 0x58,
+	0x76, 0xd7, 0xb7, 0xa5, 0x23, 0x59, 0xa6, 0xaf, 0x9d, 0xac, 0xc3, 0x9d, 0x9d, 0x24, 0x9c, 0xec,
+	0x4c, 0x76, 0x76, 0x46, 0x93, 0x38, 0xc9, 0x24, 0xf1, 0xcc, 0x66, 0x56, 0xb2, 0xa8, 0x98, 0x89,
+	0xbe, 0x86, 0x92, 0x9d, 0x64, 0x76, 0x76, 0xb8, 0xb4, 0x78, 0x65, 0x13, 0xa6, 0x48, 0x2d, 0x49,
+	0x39, 0xa3, 0xc5, 0x2e, 0x0a, 0xf4, 0x61, 0x50, 0xa0, 0x40, 0xd1, 0x3e, 0xf4, 0x61, 0x9e, 0x0a,
+	0xec, 0x02, 0x05, 0xda, 0x3e, 0xf4, 0x63, 0xdb, 0x02, 0x7d, 0xdd, 0x6e, 0xff, 0x82, 0x16, 0x28,
+	0xd0, 0xff, 0xa2, 0x2d, 0xf6, 0xa1, 0x7d, 0x2a, 0xee, 0x07, 0x29, 0x4a, 0xb6, 0x33, 0x92, 0x1d,
+	0xf4, 0x4d, 0x3c, 0xbc, 0xe7, 0x77, 0xee, 0x3d, 0xf7, 0xde, 0x73, 0xcf, 0x39, 0xf7, 0x50, 0x70,
+	0x69, 0x1f, 0xeb, 0x03, 0xcd, 0x1d, 0xd9, 0x36, 0x76, 0x35, 0x7d, 0x68, 0x96, 0x86, 0xae, 0xe3,
+	0x3b, 0xe8, 0xba, 0xe3, 0x1e, 0x94, 0xf4, 0xa1, 0xde, 0x3b, 0xc4, 0x25, 0xd2, 0xa2, 0x34, 0x70,
+	0x0c, 0x6c, 0x95, 0x86, 0xe6, 0x10, 0x5b, 0xa6, 0x8d, 0x4b, 0xc7, 0xb7, 0xc5, 0x15, 0x6c, 0x1b,
+	0x43, 0xc7, 0xb4, 0x7d, 0x8f, 0xf1, 0x88, 0x57, 0x0e, 0x1c, 0xe7, 0xc0, 0xc2, 0x1f, 0xd0, 0xa7,
+	0xfd, 0x51, 0xff, 0x03, 0xdd, 0x1e, 0xf3, 0x57, 0xd7, 0x66, 0x5f, 0x19, 0xd8, 0xeb, 0xb9, 0xe6,
+	0xd0, 0x77, 0x5c, 0xd6, 0x42, 0xfa, 0x4d, 0x1c, 0x96, 0x2b, 0x58, 0x1f, 0x6c, 0x3b, 0xb6, 0xe7,
+	0xeb, 0xb6, 0xef, 0x49, 0x7f, 0x13, 0x87, 0x5c, 0xf8, 0x84, 0x6e, 0xc3, 0x7a, 0x43, 0x69, 0x6a,
+	0x5d, 0xa5, 0x21, 0x77, 0xba, 0xe5, 0x46, 0x5b, 0x6b, 0x28, 0xf5, 0xba, 0xd2, 0x11, 0x62, 0xe2,
+	0xb7, 0xfe, 0xf2, 0xef, 0xff, 0xf7, 0x37, 0xe9, 0xd5, 0xf7, 0x1f, 0x6e, 0x6e, 0xde, 0xb9, 0x73,
+	0x7f, 0xf3, 0xd6, 0x9d, 0x0f, 0x1f, 0xdc, 0xbb, 0x7b, 0xff, 0xfe, 0x3d, 0x74, 0x0b, 0xd6, 0x1b,
+	0xe5, 0xe7, 0x27, 0x59, 0xe2, 0xe2, 0x65, 0xca, 0x22, 0x9c, 0xe0, 0x78, 0x04, 0xd2, 0xe3, 0x7a,
+	0xab, 0x52, 0xae, 0x6b, 0xcf, 0x94, 0x66, 0xb5, 0xf5, 0x4c, 0x3b, 0x95, 0x3f, 0x31, 0xcd, 0x7f,
+	0xfb, 0xe1, 0xbd, 0x5b, 0x77, 0x29, 0xbf, 0xf4, 0x8f, 0x59, 0x80, 0x6d, 0x67, 0x30, 0x74, 0x6c,
+	0x4c, 0xfa, 0xfc, 0x63, 0x00, 0xdf, 0xd5, 0x6d, 0xaf, 0xef, 0xb8, 0x03, 0x6f, 0x23, 0x7e, 0x2d,
+	0x79, 0x33, 0xbf, 0xf9, 0x83, 0xd2, 0x37, 0x6a, 0xb6, 0x34, 0x81, 0x28, 0x75, 0x43, 0x7e, 0xd9,
+	0xf6, 0xdd, 0xb1, 0x1a, 0x01, 0x44, 0x3d, 0x28, 0x0c, 0x7b, 0x8e, 0x65, 0xe1, 0x9e, 0x6f, 0x3a,
+	0xb6, 0xb7, 0x91, 0xa0, 0x02, 0x3e, 0x59, 0x4c, 0x40, 0x3b, 0x82, 0xc0, 0x44, 0x4c, 0x81, 0xa2,
+	0x31, 0xac, 0xbf, 0x34, 0x6d, 0xc3, 0x79, 0x69, 0xda, 0x07, 0x9a, 0xe7, 0xbb, 0xba, 0x8f, 0x0f,
+	0x4c, 0xec, 0x6d, 0x24, 0xa9, 0xb0, 0xda, 0x62, 0xc2, 0x9e, 0x05, 0x48, 0x9d, 0x10, 0x88, 0xc9,
+	0x5c, 0x7b, 0x79, 0xf2, 0x0d, 0xfa, 0x14, 0x32, 0x3d, 0xc7, 0xc0, 0xae, 0xb7, 0x91, 0xa2, 0xc2,
+	0x1e, 0x2e, 0x26, 0x6c, 0x9b, 0xf2, 0x32, 0x7c, 0x0e, 0x44, 0x54, 0x86, 0xed, 0x63, 0xd3, 0x75,
+	0xec, 0x01, 0x69, 0xb3, 0x91, 0x3e, 0x8f, 0xca, 0xe4, 0x08, 0x02, 0x57, 0x59, 0x14, 0x54, 0xb4,
+	0x60, 0x65, 0x66, 0xda, 0x90, 0x00, 0xc9, 0x23, 0x3c, 0xde, 0x88, 0x5f, 0x8b, 0xdf, 0xcc, 0xa9,
+	0xe4, 0x27, 0xda, 0x86, 0xf4, 0xb1, 0x6e, 0x8d, 0xf0, 0x46, 0xe2, 0x5a, 0xfc, 0x66, 0x7e, 0xf3,
+	0xfd, 0x39, 0xba, 0xd0, 0x0e, 0x51, 0x55, 0xc6, 0xbb, 0x95, 0x78, 0x10, 0x17, 0x1d, 0x58, 0x3d,
+	0x31, 0x87, 0xa7, 0xc8, 0xab, 0x4e, 0xcb, 0x2b, 0xcd, 0x23, 0x6f, 0x3b, 0x84, 0x8d, 0x0a, 0xfc,
+	0x39, 0x6c, 0x9c, 0x35, 0x8f, 0xa7, 0xc8, 0x7d, 0x32, 0x2d, 0xf7, 0xee, 0x1c, 0x72, 0x67, 0xd1,
+	0xc7, 0x51, 0xe9, 0x3d, 0xc8, 0x47, 0x26, 0xf6, 0x14, 0x81, 0x8f, 0xa6, 0x05, 0xde, 0x9c, 0x6b,
+	0x6e, 0x0d, 0xec, 0xce, 0xe8, 0xf4, 0xc4, 0x24, 0xbf, 0x1e, 0x9d, 0x46, 0x60, 0x23, 0x02, 0xa5,
+	0xff, 0x88, 0x43, 0xb6, 0xcd, 0x9b, 0xa1, 0x06, 0x40, 0x2f, 0x5c, 0x6d, 0x54, 0xde, 0x7c, 0xeb,
+	0x63, 0xb2, 0x44, 0xd5, 0x08, 0x00, 0x7a, 0x0f, 0x90, 0xeb, 0x38, 0xbe, 0x16, 0x5a, 0x0e, 0xcd,
+	0x34, 0x98, 0xb1, 0xc8, 0xa9, 0x02, 0x79, 0x13, 0x2e, 0x2b, 0xc5, 0x20, 0x9b, 0xae, 0x60, 0x98,
+	0xde, 0xd0, 0xd2, 0xc7, 0x9a, 0xa1, 0xfb, 0xfa, 0x46, 0x72, 0xee, 0xa1, 0x55, 0x19, 0x5b, 0x55,
+	0xf7, 0x75, 0x35, 0x6f, 0x4c, 0x1e, 0xa4, 0x3f, 0x4c, 0x01, 0x4c, 0xd6, 0x2e, 0xba, 0x0a, 0xf9,
+	0x91, 0x6d, 0xfe, 0x74, 0x84, 0x35, 0x5b, 0x1f, 0xe0, 0x8d, 0x34, 0xd5, 0x27, 0x30, 0x52, 0x53,
+	0x1f, 0x60, 0xb4, 0x0d, 0x29, 0x6f, 0x88, 0x7b, 0x7c, 0xe4, 0x1f, 0xcc, 0x21, 0xba, 0x36, 0xb2,
+	0xe9, 0x32, 0xed, 0x0c, 0x71, 0x4f, 0xa5, 0xcc, 0xe8, 0x06, 0x2c, 0x7b, 0xa3, 0xfd, 0x88, 0xf9,
+	0x65, 0x03, 0x9e, 0x26, 0x12, 0x13, 0x63, 0xda, 0xc3, 0x91, 0x1f, 0xd8, 0xb3, 0x87, 0x0b, 0x6d,
+	0xc3, 0x92, 0x42, 0x79, 0xb9, 0x89, 0x61, 0x40, 0xa8, 0x0b, 0x4b, 0xce, 0xc8, 0xa7, 0x98, 0xcc,
+	0x6c, 0x6d, 0x2d, 0x86, 0xd9, 0x62, 0xcc, 0x0c, 0x34, 0x80, 0x3a, 0x31, 0x2d, 0x99, 0x0b, 0x4f,
+	0x8b, 0xf8, 0x10, 0xf2, 0x91, 0xfe, 0x9f, 0xb2, 0xbc, 0xd7, 0xa3, 0xcb, 0x3b, 0x17, 0xdd, 0x1f,
+	0x5b, 0x50, 0x88, 0x76, 0x73, 0x11, 0x5e, 0xe9, 0x1f, 0x96, 0x61, 0xad, 0xe3, 0xeb, 0xb6, 0xa1,
+	0xbb, 0xc6, 0x64, 0xd8, 0x9e, 0xf4, 0x17, 0x49, 0x80, 0xb6, 0x6b, 0x0e, 0x4c, 0xdf, 0x3c, 0xc6,
+	0x1e, 0xfa, 0x1e, 0x64, 0xda, 0x65, 0x55, 0xab, 0xb6, 0x84, 0x98, 0xf8, 0x9d, 0x5f, 0x92, 0xe3,
+	0xf6, 0x5b, 0x64, 0x80, 0x5b, 0xe1, 0xe4, 0x6d, 0x0d, 0x75, 0xd7, 0x70, 0xb6, 0x8e, 0x6f, 0xa3,
+	0xf7, 0x60, 0xa9, 0x56, 0x2f, 0x77, 0xbb, 0x72, 0x53, 0x88, 0x8b, 0x57, 0x69, 0xdb, 0x2b, 0x33,
+	0x6d, 0xfb, 0x96, 0xee, 0xfb, 0xd8, 0x26, 0xad, 0x3f, 0x84, 0xc2, 0x63, 0xb5, 0xb5, 0xdb, 0xd6,
+	0x2a, 0x2f, 0xb4, 0xa7, 0xf2, 0x0b, 0x21, 0x21, 0xde, 0xa0, 0x2c, 0x6f, 0xce, 0xb0, 0x1c, 0xb8,
+	0xce, 0x68, 0xa8, 0xed, 0x8f, 0xb5, 0x23, 0x3c, 0xe6, 0x52, 0x94, 0x46, 0x7b, 0xb7, 0xde, 0x91,
+	0x85, 0xe4, 0x19, 0x52, 0xcc, 0xc1, 0x70, 0x64, 0x79, 0x98, 0xb4, 0xbe, 0x0f, 0xc5, 0x72, 0xa7,
+	0xa3, 0x3c, 0x6e, 0x72, 0x4f, 0xa2, 0x23, 0xa4, 0xc4, 0xb7, 0x28, 0xd3, 0x77, 0x66, 0x98, 0xd8,
+	0xc9, 0xa7, 0x99, 0xb6, 0x4f, 0x07, 0x73, 0x07, 0xf2, 0x5d, 0xb9, 0xd3, 0xd5, 0x3a, 0x5d, 0x55,
+	0x2e, 0x37, 0x84, 0xb4, 0x28, 0x51, 0xae, 0x37, 0x66, 0xb8, 0x7c, 0xec, 0xf9, 0x9e, 0xef, 0x12,
+	0xe2, 0xf1, 0x6d, 0x74, 0x17, 0xf2, 0x8d, 0x72, 0x3b, 0x14, 0x95, 0x39, 0x43, 0xd4, 0x40, 0x1f,
+	0x6a, 0x4c, 0x9c, 0x47, 0xb8, 0x1e, 0xc0, 0x72, 0x43, 0x56, 0x1f, 0xcb, 0x21, 0xdf, 0x92, 0xf8,
+	0x5d, 0xca, 0x77, 0x75, 0x96, 0x0f, 0xbb, 0x07, 0x38, 0xc2, 0x29, 0xf9, 0xb0, 0x5e, 0xc5, 0x43,
+	0x17, 0xf7, 0x74, 0x1f, 0x1b, 0x91, 0x49, 0x7b, 0x1b, 0x52, 0xaa, 0x5c, 0xae, 0x0a, 0x31, 0xf1,
+	0x0d, 0x0a, 0x74, 0x79, 0x06, 0xc8, 0xc5, 0xba, 0xc1, 0xfb, 0xbb, 0xad, 0xca, 0xe5, 0xae, 0xac,
+	0xed, 0x29, 0xf2, 0x33, 0x21, 0x7e, 0x46, 0x7f, 0x7b, 0x2e, 0xd6, 0x7d, 0xac, 0x1d, 0x9b, 0xf8,
+	0x25, 0x91, 0xfa, 0x5f, 0x71, 0xee, 0x5d, 0x79, 0xa6, 0x8f, 0x3d, 0xf4, 0x31, 0xac, 0x6c, 0xb7,
+	0x1a, 0x15, 0xa5, 0x29, 0x6b, 0x6d, 0x59, 0xa5, 0x73, 0x19, 0x13, 0xdf, 0xa1, 0x40, 0xd7, 0x67,
+	0x81, 0x9c, 0xc1, 0xbe, 0x69, 0x63, 0x6d, 0x88, 0xdd, 0x60, 0x3a, 0x1f, 0x81, 0x10, 0x70, 0x33,
+	0x97, 0xaf, 0xfe, 0x42, 0x88, 0x8b, 0x37, 0x29, 0xbb, 0x74, 0x06, 0xfb, 0x81, 0xe5, 0xec, 0xeb,
+	0x96, 0x45, 0xf9, 0x6f, 0x41, 0x4e, 0x95, 0x3b, 0x3b, 0xbb, 0xb5, 0x5a, 0x5d, 0x16, 0x12, 0xe2,
+	0x75, 0xca, 0xf8, 0xed, 0x13, 0xe3, 0xf5, 0x0e, 0x47, 0xfd, 0xbe, 0x85, 0xf9, 0xa0, 0x9f, 0xa9,
+	0x4a, 0x57, 0xd6, 0x6a, 0x4a, 0x5d, 0xee, 0x08, 0xc9, 0xb3, 0xd6, 0x83, 0x6b, 0xfa, 0x58, 0xeb,
+	0x9b, 0x16, 0xa6, 0xaa, 0xfe, 0x5d, 0x02, 0x56, 0xb7, 0x99, 0xfc, 0x88, 0x67, 0xa9, 0x82, 0x38,
+	0x33, 0x76, 0xad, 0xad, 0xca, 0x9c, 0x24, 0xc4, 0xc4, 0x4d, 0x0a, 0xfd, 0xde, 0xab, 0xd5, 0xa0,
+	0x91, 0x19, 0x64, 0x24, 0xd2, 0xbf, 0x7d, 0x90, 0x66, 0x31, 0xd9, 0xf2, 0x28, 0x6f, 0x6f, 0xef,
+	0x36, 0x76, 0xeb, 0xe5, 0x6e, 0x4b, 0x25, 0xce, 0xf3, 0x16, 0xc5, 0xbe, 0xfb, 0x0d, 0xd8, 0x6c,
+	0xcd, 0xe8, 0xbd, 0xde, 0x68, 0x30, 0xb2, 0x74, 0xdf, 0x71, 0xe9, 0x92, 0xfb, 0x1c, 0xae, 0xce,
+	0xca, 0x90, 0x9f, 0x77, 0xd5, 0xf2, 0x76, 0x57, 0x6b, 0xed, 0x76, 0xdb, 0xbb, 0x5d, 0xe2, 0x5d,
+	0xdf, 0xa7, 0x02, 0x6e, 0x7f, 0x83, 0x00, 0xfc, 0xa5, 0xef, 0xea, 0x3d, 0x5f, 0xe3, 0x16, 0x92,
+	0xa0, 0x3f, 0x81, 0xcb, 0xe1, 0x9c, 0x92, 0x2d, 0x2e, 0x57, 0xb5, 0xbd, 0x72, 0x7d, 0x97, 0x2a,
+	0xbb, 0x44, 0x41, 0x6f, 0x9e, 0x35, 0xb3, 0x64, 0xb3, 0x63, 0x43, 0xa3, 0x66, 0x8a, 0xea, 0xfd,
+	0x8f, 0x52, 0x70, 0xa5, 0x33, 0xb4, 0x4c, 0xdf, 0xd7, 0xf7, 0x2d, 0xdc, 0xd6, 0xdd, 0xaa, 0x13,
+	0xd1, 0x7f, 0x1d, 0x2e, 0xb5, 0xcb, 0x8a, 0xaa, 0x3d, 0x53, 0xba, 0x3b, 0x9a, 0x2a, 0x77, 0xba,
+	0xaa, 0xb2, 0xdd, 0x55, 0x5a, 0x4d, 0x21, 0x26, 0xde, 0xa6, 0x82, 0xbe, 0x3f, 0x23, 0xc8, 0x33,
+	0xfa, 0xda, 0x50, 0x37, 0x5d, 0xed, 0xa5, 0xe9, 0x1f, 0x6a, 0x2e, 0xf6, 0x7c, 0xd7, 0xa4, 0x47,
+	0x16, 0xe9, 0x77, 0x15, 0x56, 0x3b, 0xed, 0xba, 0xd2, 0x9d, 0x42, 0x8a, 0x8b, 0xef, 0x53, 0xa4,
+	0x77, 0x4e, 0x41, 0xf2, 0x48, 0xc7, 0x66, 0x51, 0x9a, 0x70, 0xb9, 0xad, 0xb6, 0xb6, 0xe5, 0x4e,
+	0x87, 0xe8, 0x55, 0xae, 0x6a, 0x72, 0x5d, 0x6e, 0xc8, 0x4d, 0xaa, 0xd2, 0xd3, 0xd7, 0x03, 0xed,
+	0x94, 0xeb, 0xf4, 0xb0, 0xe7, 0x11, 0x95, 0x62, 0x43, 0xc3, 0x16, 0xa6, 0x1e, 0x0f, 0xc1, 0xab,
+	0x80, 0x10, 0xe0, 0x85, 0x48, 0x49, 0xf1, 0x3d, 0x8a, 0xf4, 0xf6, 0x2b, 0x90, 0xa2, 0x18, 0xcf,
+	0xe1, 0xdb, 0x6c, 0x64, 0xe5, 0x66, 0x55, 0xeb, 0x28, 0x9f, 0xc9, 0xd1, 0x21, 0x12, 0x9b, 0x78,
+	0xfa, 0x5c, 0x4f, 0xc6, 0xa8, 0xdb, 0x86, 0xe6, 0x99, 0x3f, 0xc3, 0xd1, 0xc1, 0x52, 0x64, 0x07,
+	0xde, 0x09, 0x7a, 0x47, 0x70, 0x27, 0xa3, 0xa5, 0xa2, 0xa6, 0xa4, 0xa4, 0xc5, 0x0a, 0x95, 0xf2,
+	0xf1, 0x2b, 0x3a, 0x4d, 0x64, 0x84, 0xc3, 0xa7, 0x52, 0x67, 0x04, 0x4a, 0xbf, 0x1f, 0x87, 0xcb,
+	0xc1, 0xb9, 0xd5, 0x31, 0x0d, 0x4c, 0xcf, 0xce, 0xee, 0x78, 0x88, 0x3d, 0xe9, 0x10, 0x52, 0xb2,
+	0x3d, 0x1a, 0xa0, 0x0f, 0x20, 0xab, 0x74, 0x65, 0xb5, 0x5c, 0xa9, 0x93, 0x3d, 0x18, 0x35, 0x09,
+	0x9e, 0x69, 0x60, 0x8d, 0x3a, 0x08, 0x5b, 0xa6, 0x8f, 0x5d, 0xb2, 0xa4, 0xc8, 0x20, 0x3e, 0x80,
+	0x6c, 0x63, 0xb7, 0xde, 0x55, 0x1a, 0xe5, 0xb6, 0x10, 0x3f, 0x8b, 0x61, 0x30, 0xb2, 0x7c, 0x73,
+	0xa0, 0x0f, 0x49, 0x27, 0x7e, 0x99, 0x80, 0x7c, 0xc4, 0x2d, 0x9f, 0xf5, 0xa5, 0xe2, 0x27, 0x7c,
+	0xa9, 0x2b, 0x90, 0xa5, 0xa1, 0x8f, 0x66, 0x1a, 0xfc, 0x28, 0x5e, 0xa2, 0xcf, 0x8a, 0x81, 0xda,
+	0x00, 0xa6, 0xa7, 0xed, 0x3b, 0x23, 0xdb, 0xc0, 0x06, 0xf5, 0xf3, 0x8a, 0x9b, 0xb7, 0xe7, 0x70,
+	0x28, 0x14, 0xaf, 0xc2, 0x78, 0x4a, 0x64, 0xd0, 0x6a, 0xce, 0x0c, 0x9e, 0xd1, 0x26, 0x5c, 0x3a,
+	0x11, 0x2b, 0x8e, 0x89, 0xe4, 0x14, 0x95, 0x7c, 0x22, 0xc8, 0x1b, 0x2b, 0xc6, 0x09, 0xc7, 0x26,
+	0x7d, 0x71, 0x7f, 0xf3, 0xeb, 0x25, 0x28, 0xd0, 0x0d, 0xdb, 0xd6, 0xc7, 0x96, 0xa3, 0x1b, 0xe8,
+	0x31, 0xa4, 0x0d, 0x47, 0xeb, 0xdb, 0xdc, 0xa3, 0xdc, 0x9c, 0x03, 0xbc, 0x63, 0x1c, 0x4d, 0x3b,
+	0x95, 0x86, 0x53, 0xb3, 0x51, 0x1d, 0x60, 0xa8, 0xbb, 0xfa, 0x00, 0xfb, 0x24, 0x2a, 0x65, 0xf1,
+	0xf6, 0x7b, 0xf3, 0xb8, 0x77, 0x01, 0x93, 0x1a, 0xe1, 0x47, 0x3f, 0x81, 0xfc, 0x64, 0x9a, 0x03,
+	0x0f, 0xf4, 0x93, 0xf9, 0xe0, 0xc2, 0xc1, 0x95, 0xc2, 0xb5, 0x18, 0x64, 0x08, 0xbc, 0x90, 0x40,
+	0x25, 0xf8, 0xe4, 0x08, 0x25, 0x2e, 0x71, 0xe0, 0x8f, 0x2e, 0x2e, 0x81, 0x40, 0x10, 0x2d, 0x84,
+	0x12, 0x42, 0x02, 0x91, 0xe0, 0x9b, 0x03, 0xec, 0x72, 0x09, 0xe9, 0xf3, 0x49, 0xe8, 0x12, 0x88,
+	0xa8, 0x04, 0x3f, 0x24, 0xa0, 0x37, 0x01, 0xbc, 0xd0, 0x0e, 0x53, 0xbf, 0x37, 0xab, 0x46, 0x28,
+	0xe8, 0x16, 0xac, 0x47, 0xb6, 0xaa, 0x16, 0xae, 0xf6, 0x25, 0xba, 0xe6, 0x50, 0xe4, 0xdd, 0x36,
+	0x5f, 0xf8, 0x77, 0xe0, 0x92, 0x8b, 0x7f, 0x3a, 0x22, 0x1e, 0x94, 0xd6, 0x37, 0x6d, 0xdd, 0x32,
+	0x7f, 0xa6, 0x93, 0xf7, 0x1b, 0x59, 0x0a, 0xbe, 0x1e, 0xbc, 0xac, 0x45, 0xde, 0x89, 0x47, 0xb0,
+	0x32, 0xa3, 0xe9, 0x53, 0xbc, 0xde, 0xca, 0x74, 0x40, 0x38, 0xcf, 0xd2, 0x08, 0x41, 0xa3, 0xfe,
+	0x35, 0x11, 0x36, 0xad, 0xf4, 0xd7, 0x24, 0x2c, 0x00, 0x9d, 0x11, 0x36, 0xa3, 0xff, 0xd7, 0x23,
+	0x2c, 0x04, 0x8d, 0x7a, 0xff, 0xbf, 0x8e, 0x43, 0x2e, 0xdc, 0x0d, 0xe8, 0x09, 0xa4, 0xfc, 0xf1,
+	0x90, 0xd9, 0xad, 0xe2, 0xe6, 0x87, 0x8b, 0xec, 0xa4, 0x12, 0x31, 0xbd, 0xcc, 0x02, 0x51, 0x0c,
+	0xf1, 0x33, 0x48, 0x11, 0x92, 0xa4, 0x72, 0x63, 0xbc, 0x02, 0xf9, 0xdd, 0x66, 0xa7, 0x2d, 0x6f,
+	0x2b, 0x35, 0x45, 0xae, 0x0a, 0x31, 0x04, 0x90, 0x61, 0x8e, 0xae, 0x10, 0x47, 0xeb, 0x20, 0xb4,
+	0x95, 0xb6, 0x5c, 0x27, 0xae, 0x42, 0xab, 0xcd, 0x8e, 0x89, 0x04, 0xfa, 0x16, 0xac, 0x45, 0x0e,
+	0x0e, 0x8d, 0xf8, 0x25, 0x4f, 0x65, 0x55, 0x48, 0x4a, 0x7f, 0x9b, 0x84, 0x5c, 0xa8, 0x3b, 0xa4,
+	0x02, 0xd0, 0x01, 0x69, 0x91, 0x28, 0x75, 0x1e, 0xc3, 0xb9, 0x47, 0x98, 0x42, 0x98, 0x9d, 0x98,
+	0x9a, 0xa3, 0x30, 0x14, 0xb3, 0x0e, 0xd9, 0x7d, 0xfd, 0x80, 0x21, 0x26, 0xe6, 0x8e, 0x7b, 0x2b,
+	0xfa, 0x41, 0x14, 0x6f, 0x69, 0x5f, 0x3f, 0xa0, 0x68, 0x5f, 0x40, 0x91, 0x79, 0x36, 0xd4, 0x10,
+	0x13, 0x4c, 0x16, 0xc6, 0xdf, 0x9b, 0x2f, 0x8b, 0xc0, 0x18, 0xa3, 0xc8, 0xcb, 0x21, 0x5c, 0xd0,
+	0x5b, 0x12, 0x4b, 0x50, 0xe4, 0xd4, 0xdc, 0xbd, 0x6d, 0xe8, 0xc3, 0xa9, 0xde, 0x0e, 0xf4, 0x61,
+	0x80, 0xe6, 0x61, 0x9f, 0xa1, 0xa5, 0xe7, 0x46, 0xeb, 0x60, 0x7f, 0x0a, 0xcd, 0xc3, 0x3e, 0xf9,
+	0x59, 0xc9, 0xb0, 0xec, 0x81, 0xf4, 0x7d, 0x28, 0x4e, 0x2b, 0x7c, 0xea, 0x2c, 0x8c, 0x4f, 0x9d,
+	0x85, 0xd2, 0x03, 0x28, 0x44, 0x75, 0x89, 0x6e, 0x82, 0x10, 0xf8, 0x02, 0x33, 0x2c, 0x45, 0x4e,
+	0xe7, 0xc6, 0x44, 0xfa, 0x3a, 0x0e, 0xe8, 0xa4, 0xca, 0x88, 0x55, 0x8a, 0xf8, 0xbe, 0xb3, 0x20,
+	0x28, 0xf2, 0x2e, 0xb0, 0x4a, 0x9f, 0xd2, 0xac, 0x0f, 0xf5, 0x46, 0xfb, 0x36, 0x5f, 0x03, 0xe7,
+	0x39, 0xa9, 0x72, 0x1c, 0xa5, 0x66, 0x4b, 0x7b, 0x50, 0x88, 0xea, 0x1c, 0x5d, 0x83, 0x02, 0xf1,
+	0x9c, 0x67, 0x3a, 0x03, 0x47, 0x78, 0x1c, 0x74, 0xe2, 0x06, 0x14, 0xd9, 0xd2, 0x9e, 0x71, 0x1a,
+	0x0a, 0x94, 0xba, 0x3d, 0xd1, 0x56, 0x54, 0xfb, 0x0b, 0x68, 0xeb, 0xab, 0x38, 0xe4, 0x42, 0xbb,
+	0x80, 0x3a, 0xec, 0xf0, 0xd0, 0x0c, 0x67, 0xa0, 0x9b, 0x36, 0xb7, 0x02, 0x9b, 0x73, 0x9a, 0x96,
+	0x2a, 0x65, 0x62, 0x16, 0x80, 0x9e, 0x17, 0x8c, 0x40, 0x86, 0xc0, 0x4e, 0xa4, 0xd9, 0x21, 0x50,
+	0x6a, 0xd0, 0x91, 0x1f, 0x42, 0x2e, 0xf4, 0x63, 0xa4, 0x3b, 0x67, 0x99, 0x8c, 0x65, 0xc8, 0xed,
+	0x36, 0x2b, 0xad, 0xdd, 0x66, 0x55, 0xae, 0x0a, 0x71, 0x94, 0x87, 0xa5, 0xe0, 0x21, 0x21, 0xfd,
+	0x55, 0x1c, 0xf2, 0x2a, 0xd6, 0x8d, 0xc0, 0xc9, 0x78, 0x02, 0x19, 0xcf, 0x19, 0xb9, 0x3d, 0x7c,
+	0x01, 0x2f, 0x83, 0x23, 0xcc, 0xb8, 0x66, 0x89, 0x8b, 0xbb, 0x66, 0x92, 0x01, 0xab, 0x2c, 0xad,
+	0xaa, 0xd8, 0x7e, 0xe8, 0x17, 0xb5, 0x20, 0xc7, 0xb3, 0x0f, 0x17, 0xf2, 0x8d, 0xb2, 0x0c, 0xa4,
+	0x66, 0x4b, 0x7f, 0x1a, 0x87, 0x22, 0x0f, 0x56, 0x03, 0x19, 0xd3, 0xcb, 0x3a, 0xfe, 0x1a, 0x96,
+	0xf5, 0x99, 0x7b, 0x2b, 0x71, 0xd6, 0xde, 0x92, 0xfe, 0x35, 0x03, 0xab, 0x5d, 0xec, 0xf9, 0x1d,
+	0x9a, 0x31, 0x09, 0xba, 0x76, 0xb6, 0x3d, 0x40, 0x2a, 0x64, 0xf0, 0x31, 0x4d, 0xbf, 0x26, 0xe6,
+	0xce, 0xe1, 0x9d, 0x10, 0x50, 0x92, 0x09, 0x84, 0xca, 0x91, 0xc4, 0xff, 0x4c, 0x41, 0x9a, 0x52,
+	0xd0, 0x31, 0xac, 0xbc, 0xd4, 0x7d, 0xec, 0x0e, 0x74, 0xf7, 0x48, 0xa3, 0x6f, 0xb9, 0x62, 0x9e,
+	0x9e, 0x5f, 0x4c, 0xa9, 0x6c, 0x1c, 0xeb, 0x76, 0x0f, 0x3f, 0x0b, 0x80, 0x77, 0x62, 0x6a, 0x31,
+	0x94, 0xc2, 0xe4, 0x7e, 0x15, 0x87, 0x4b, 0x3c, 0xe0, 0x21, 0x07, 0x03, 0xdd, 0x7b, 0x4c, 0x3c,
+	0x33, 0x37, 0xed, 0x8b, 0x8b, 0x6f, 0x87, 0xf0, 0x64, 0x8f, 0xee, 0xc4, 0xd4, 0xb5, 0xe1, 0x14,
+	0x85, 0x75, 0x64, 0x00, 0xcb, 0x81, 0xc1, 0x60, 0xf2, 0xd9, 0xf1, 0x54, 0xbb, 0x90, 0x7c, 0x43,
+	0xe6, 0x81, 0xe7, 0x4e, 0x4c, 0x2d, 0x70, 0x78, 0xfa, 0x4e, 0xbc, 0x0f, 0xc2, 0xac, 0x76, 0xd0,
+	0x5b, 0xb0, 0x6c, 0xe3, 0x97, 0x5a, 0xa8, 0x21, 0x3a, 0x03, 0x49, 0xb5, 0x60, 0xe3, 0x97, 0x61,
+	0x23, 0xb1, 0x02, 0x97, 0x4e, 0x1d, 0x17, 0xfa, 0x1e, 0x08, 0x3a, 0x7b, 0xa1, 0x19, 0x23, 0x97,
+	0x79, 0x8f, 0x0c, 0x60, 0x85, 0xd3, 0xab, 0x9c, 0x2c, 0xba, 0x90, 0x8f, 0xf4, 0x0d, 0xf5, 0x20,
+	0x1b, 0x04, 0xc8, 0xfc, 0x46, 0xf0, 0xf1, 0xb9, 0x46, 0x4d, 0xba, 0xe1, 0xf9, 0xfa, 0x60, 0x88,
+	0x03, 0x6c, 0x35, 0x04, 0xae, 0x2c, 0x41, 0x9a, 0xea, 0x55, 0xfc, 0x11, 0xa0, 0x93, 0x0d, 0xd1,
+	0x3b, 0xb0, 0x82, 0x6d, 0xb2, 0xd4, 0xc3, 0x88, 0x97, 0x76, 0xbe, 0xa0, 0x16, 0x39, 0x39, 0x68,
+	0xf8, 0x06, 0xe4, 0xfc, 0x80, 0x9d, 0xae, 0x91, 0xa4, 0x3a, 0x21, 0x48, 0xff, 0x9d, 0x84, 0xd5,
+	0x67, 0xae, 0xe9, 0xe3, 0x9a, 0x69, 0x61, 0x2f, 0xd8, 0x55, 0x35, 0x48, 0x79, 0xa6, 0x7d, 0x74,
+	0x91, 0x58, 0x8b, 0xf0, 0xa3, 0x1f, 0xc1, 0x0a, 0x89, 0xd2, 0x75, 0x5f, 0xeb, 0xf3, 0x97, 0x17,
+	0x38, 0x14, 0x8b, 0x0c, 0x2a, 0xa0, 0x11, 0x0d, 0x30, 0xa3, 0x85, 0x0d, 0x8d, 0x26, 0xdc, 0x3c,
+	0xba, 0x04, 0xb3, 0x6a, 0x31, 0x20, 0xd3, 0x81, 0x79, 0xe8, 0x63, 0x10, 0xf9, 0xdd, 0xb8, 0x41,
+	0xbc, 0xce, 0x81, 0x69, 0x63, 0x43, 0xf3, 0x0e, 0x75, 0xd7, 0x30, 0xed, 0x03, 0xea, 0xfb, 0x64,
+	0xd5, 0x0d, 0xd6, 0xa2, 0x1a, 0x36, 0xe8, 0xf0, 0xf7, 0x08, 0x4f, 0x47, 0x78, 0x2c, 0x3a, 0xaa,
+	0xce, 0x73, 0x05, 0x36, 0xab, 0xd6, 0x57, 0x85, 0x79, 0xff, 0xaf, 0xb1, 0x89, 0xf4, 0x73, 0x48,
+	0x53, 0xb3, 0xfa, 0x7a, 0xae, 0x69, 0x4a, 0xb0, 0x16, 0x5e, 0x55, 0x85, 0x96, 0x3c, 0xb8, 0xac,
+	0x59, 0x0d, 0x5f, 0x71, 0x43, 0xee, 0x49, 0xff, 0x9e, 0x82, 0x62, 0x90, 0x85, 0x61, 0xf7, 0x80,
+	0xd2, 0x6f, 0x53, 0xfc, 0xf8, 0xbe, 0x01, 0xe9, 0xca, 0x8b, 0xae, 0xdc, 0x11, 0x62, 0xe2, 0x15,
+	0x9a, 0x4a, 0x59, 0xa3, 0xa9, 0x14, 0x8a, 0xba, 0xb5, 0x3f, 0xf6, 0x69, 0x62, 0x0f, 0xdd, 0x82,
+	0x3c, 0x71, 0xf1, 0x9b, 0x8f, 0xb5, 0xdd, 0x6e, 0xed, 0x81, 0x00, 0x53, 0xb9, 0x7c, 0xd6, 0x96,
+	0x44, 0x8c, 0xf6, 0x81, 0x36, 0xf2, 0xfb, 0x0f, 0x08, 0xc7, 0x9b, 0x90, 0x78, 0xba, 0x27, 0xc4,
+	0xc5, 0xcb, 0xb4, 0xa1, 0x10, 0x69, 0x78, 0x74, 0x4c, 0xde, 0xbf, 0x0d, 0x99, 0xbd, 0xb2, 0xaa,
+	0x34, 0xbb, 0x42, 0x42, 0x14, 0x69, 0x9b, 0xf5, 0x48, 0x9b, 0x63, 0xdd, 0x35, 0x6d, 0x9f, 0xb7,
+	0xab, 0xb6, 0x76, 0x2b, 0x75, 0x59, 0xc8, 0x9f, 0xd2, 0xce, 0x70, 0x46, 0x3c, 0x2b, 0xf4, 0x6e,
+	0x24, 0x8d, 0x94, 0x9c, 0xca, 0xa4, 0xb3, 0x96, 0xd1, 0x0c, 0xd2, 0x0d, 0x48, 0x77, 0x95, 0x86,
+	0xac, 0x0a, 0xa9, 0x53, 0xc6, 0x4c, 0x3d, 0x1e, 0x96, 0xe9, 0x5f, 0x51, 0x9a, 0x5d, 0x59, 0xdd,
+	0x0b, 0x2b, 0x1b, 0x84, 0xf4, 0x54, 0xfa, 0x99, 0x03, 0xdb, 0x3e, 0x76, 0x8f, 0x75, 0x8b, 0xa7,
+	0xfa, 0x59, 0xd2, 0x7a, 0xb9, 0x2e, 0x37, 0x1f, 0x77, 0x77, 0xb4, 0xb6, 0x2a, 0xd7, 0x94, 0xe7,
+	0x42, 0x66, 0x2a, 0x4d, 0xc5, 0xf8, 0x2c, 0x6c, 0x1f, 0xf8, 0x87, 0xda, 0xd0, 0xc5, 0x7d, 0xf3,
+	0x4b, 0xce, 0x35, 0x55, 0x47, 0x21, 0x2c, 0x9d, 0xc2, 0xc5, 0xb2, 0xe9, 0x11, 0x59, 0x1f, 0x42,
+	0x91, 0x35, 0x0f, 0xf2, 0xb6, 0x42, 0x76, 0xea, 0xf6, 0x83, 0xb1, 0x85, 0xfb, 0x96, 0x2d, 0x49,
+	0x9a, 0x3e, 0xbd, 0xd4, 0xe9, 0x96, 0xbb, 0xb2, 0x56, 0x21, 0xf1, 0x5a, 0x55, 0x0b, 0x95, 0x97,
+	0x13, 0xbf, 0x47, 0xd9, 0xdf, 0x9a, 0x9a, 0x5b, 0xdd, 0xc7, 0xda, 0xbe, 0xde, 0x3b, 0xc2, 0x86,
+	0x16, 0xd1, 0xa4, 0xf4, 0x07, 0x99, 0xc0, 0x45, 0x8a, 0x24, 0xa8, 0x5e, 0xbb, 0x8b, 0x84, 0xf6,
+	0xa0, 0xc0, 0x52, 0xe3, 0xa4, 0x23, 0x23, 0x8f, 0x3b, 0x77, 0x77, 0xe6, 0x09, 0x9f, 0x08, 0x5b,
+	0x87, 0x72, 0x31, 0xf7, 0x2e, 0x3f, 0x98, 0x50, 0xd0, 0xdb, 0x81, 0x45, 0x9b, 0xf8, 0x43, 0x49,
+	0xba, 0xf9, 0x97, 0x19, 0x39, 0xf0, 0xf0, 0xab, 0xb0, 0xe4, 0xbb, 0xe6, 0xc1, 0x01, 0x76, 0x79,
+	0xe4, 0xf6, 0xee, 0x3c, 0xc7, 0x0f, 0xe3, 0x50, 0x03, 0x56, 0x84, 0x61, 0x35, 0x74, 0xb3, 0x4c,
+	0xc7, 0xd6, 0x08, 0x0b, 0x8d, 0xdd, 0x8a, 0x9b, 0x0f, 0xe6, 0xc0, 0x2b, 0x47, 0x78, 0x1b, 0x8e,
+	0xc1, 0xe3, 0x78, 0x41, 0x9f, 0x21, 0x93, 0x00, 0x81, 0xa5, 0xf7, 0xa9, 0xaf, 0x42, 0x93, 0x3f,
+	0xf3, 0x05, 0x08, 0xec, 0x76, 0x92, 0x1c, 0x7d, 0x3c, 0x40, 0x70, 0x42, 0x02, 0xda, 0x07, 0xa1,
+	0x67, 0x39, 0xd4, 0x03, 0xda, 0xc7, 0x87, 0xfa, 0xb1, 0xe9, 0xb8, 0x34, 0x59, 0x54, 0xdc, 0xbc,
+	0x3f, 0x4f, 0x78, 0xcc, 0x58, 0x2b, 0x9c, 0x93, 0xc1, 0xaf, 0xf4, 0xa6, 0xa9, 0xd4, 0x3f, 0xb0,
+	0x2c, 0xba, 0x4c, 0x2d, 0xdd, 0xc7, 0x36, 0xf6, 0x3c, 0x9a, 0x5d, 0x22, 0xfe, 0x01, 0xa3, 0xd7,
+	0x39, 0x99, 0xc4, 0xea, 0x2d, 0x9b, 0x74, 0x2c, 0x60, 0xde, 0xc8, 0xcd, 0x9d, 0x0d, 0x99, 0x66,
+	0x64, 0x7d, 0x99, 0x41, 0x43, 0xb7, 0xe1, 0x92, 0xee, 0x79, 0xe6, 0x81, 0xed, 0x69, 0xbe, 0xa3,
+	0x39, 0x76, 0x70, 0x91, 0xb7, 0x01, 0xf4, 0xf0, 0x42, 0xfc, 0x65, 0xd7, 0x69, 0xd9, 0x98, 0xad,
+	0x7f, 0xe9, 0x73, 0xc8, 0x47, 0x16, 0x9b, 0xd4, 0x38, 0x2b, 0x3c, 0x5a, 0x81, 0x7c, 0xb3, 0xd5,
+	0xa4, 0xb7, 0x44, 0x4a, 0xf3, 0xb1, 0x10, 0xa7, 0x04, 0x59, 0xae, 0x76, 0xd8, 0xc5, 0x91, 0x90,
+	0x40, 0x08, 0x8a, 0xe5, 0xba, 0x2a, 0x97, 0xab, 0xfc, 0x2e, 0xa9, 0x2a, 0x24, 0xa5, 0x1f, 0x83,
+	0x30, 0x3b, 0xff, 0x92, 0x72, 0x96, 0x88, 0x22, 0x40, 0x55, 0xe9, 0x6c, 0x97, 0xd5, 0x2a, 0x93,
+	0x20, 0x40, 0x21, 0xbc, 0x8e, 0x22, 0x94, 0x04, 0x69, 0xa1, 0xca, 0xf4, 0x0a, 0x89, 0x3c, 0x27,
+	0xa5, 0x4f, 0x61, 0x65, 0x66, 0x8e, 0xa4, 0x47, 0xaf, 0x18, 0x80, 0xdc, 0x50, 0xba, 0x5a, 0xb9,
+	0xfe, 0xac, 0xfc, 0xa2, 0xc3, 0xf2, 0x42, 0x94, 0xa0, 0xd4, 0xb4, 0x66, 0xab, 0x29, 0x37, 0xda,
+	0xdd, 0x17, 0x42, 0x42, 0x6a, 0xcf, 0x4e, 0xd1, 0x2b, 0x11, 0x6b, 0x8a, 0x2a, 0x4f, 0x21, 0x52,
+	0xc2, 0x34, 0xe2, 0x3e, 0xc0, 0x64, 0x89, 0x4a, 0xdd, 0xb3, 0xd0, 0x56, 0x61, 0x59, 0x6e, 0x56,
+	0xb5, 0x56, 0x4d, 0x0b, 0x33, 0x57, 0x08, 0x8a, 0xf5, 0x32, 0xbd, 0x21, 0x56, 0x9a, 0x5a, 0xbb,
+	0xdc, 0x24, 0x5a, 0x26, 0xbd, 0x2e, 0xab, 0x75, 0x25, 0x4a, 0x4d, 0x4a, 0x16, 0xc0, 0x24, 0x4e,
+	0x96, 0xbe, 0x78, 0x85, 0x86, 0xe5, 0x3d, 0xb9, 0xd9, 0xa5, 0x75, 0x6e, 0x42, 0x1c, 0xad, 0xc1,
+	0x0a, 0xbf, 0x58, 0x21, 0x67, 0x24, 0x25, 0x26, 0xd0, 0x35, 0x78, 0xa3, 0xf3, 0xa2, 0xb9, 0xbd,
+	0xa3, 0xb6, 0x9a, 0xf4, 0xb2, 0x65, 0xb6, 0x45, 0x52, 0xfa, 0x95, 0x00, 0x4b, 0xdc, 0x4c, 0x20,
+	0x15, 0x72, 0x7a, 0xdf, 0xc7, 0xae, 0xa6, 0x5b, 0x16, 0x37, 0x9a, 0x77, 0xe6, 0xb7, 0x32, 0xa5,
+	0x32, 0xe1, 0x2d, 0x5b, 0xd6, 0x4e, 0x4c, 0xcd, 0xea, 0xfc, 0x77, 0x04, 0xd3, 0x1e, 0x73, 0x17,
+	0x66, 0x71, 0x4c, 0x7b, 0x3c, 0xc1, 0xb4, 0xc7, 0x68, 0x17, 0x80, 0x61, 0x62, 0xbd, 0x77, 0xc8,
+	0x63, 0x90, 0xbb, 0x8b, 0x82, 0xca, 0x7a, 0xef, 0x70, 0x27, 0xa6, 0xb2, 0xde, 0x91, 0x07, 0x64,
+	0xc1, 0x1a, 0x87, 0xb5, 0x0d, 0xcd, 0xe9, 0x07, 0xfb, 0x8d, 0x99, 0xdb, 0x8f, 0x16, 0xc6, 0xb7,
+	0x8d, 0x56, 0x9f, 0x6d, 0xcc, 0x9d, 0x98, 0x2a, 0xe8, 0x33, 0x34, 0xe4, 0xc3, 0x25, 0x26, 0x6d,
+	0x26, 0xb2, 0xe3, 0xa9, 0xb4, 0x47, 0x8b, 0xca, 0x3b, 0x19, 0xc1, 0xe9, 0x27, 0xc9, 0xe8, 0xeb,
+	0x38, 0x48, 0x4c, 0xac, 0x37, 0xb6, 0x7b, 0x87, 0xae, 0x63, 0xd3, 0x0b, 0xb4, 0xd9, 0x3e, 0xb0,
+	0x32, 0x95, 0x27, 0x8b, 0xf6, 0xa1, 0x13, 0xc1, 0x3c, 0xd1, 0x9f, 0xab, 0xfa, 0xab, 0x9b, 0xa0,
+	0xa7, 0x90, 0xd1, 0xad, 0x97, 0xfa, 0xd8, 0xdb, 0x28, 0xcc, 0x9d, 0x9b, 0x0d, 0xc5, 0x53, 0xc6,
+	0x9d, 0x98, 0xca, 0x21, 0x50, 0x13, 0x96, 0x0c, 0xdc, 0xd7, 0x47, 0x96, 0x4f, 0x0f, 0x89, 0xf9,
+	0x8e, 0xff, 0x00, 0xad, 0xca, 0x38, 0x77, 0x62, 0x6a, 0x00, 0x82, 0xbe, 0x98, 0x84, 0xbe, 0x3d,
+	0x67, 0x64, 0xfb, 0xf4, 0x58, 0xc8, 0xcf, 0x75, 0xf4, 0x04, 0xa8, 0x72, 0x90, 0x53, 0x1b, 0xd9,
+	0x7e, 0x24, 0xd6, 0xa5, 0xcf, 0x68, 0x07, 0xd2, 0x36, 0x3e, 0xc6, 0xec, 0x14, 0xc9, 0x6f, 0xde,
+	0x5a, 0x00, 0xb7, 0x49, 0xf8, 0x76, 0x62, 0x2a, 0x03, 0x20, 0xbb, 0xc3, 0x71, 0xd9, 0x05, 0x89,
+	0x35, 0xa6, 0xa7, 0xc5, 0x62, 0xbb, 0xa3, 0xe5, 0xd6, 0x18, 0x2f, 0xd9, 0x1d, 0x4e, 0xf0, 0x40,
+	0x66, 0xc7, 0xc5, 0x43, 0xac, 0xfb, 0x1b, 0xf9, 0x85, 0x67, 0x47, 0xa5, 0x8c, 0x64, 0x76, 0x18,
+	0x84, 0xf8, 0x1c, 0xb2, 0x81, 0xb5, 0x40, 0x75, 0xc8, 0xd3, 0xe2, 0x2e, 0xda, 0x34, 0x08, 0xae,
+	0x17, 0xf1, 0x6e, 0xa2, 0xec, 0x13, 0x64, 0x7b, 0xfc, 0x9a, 0x91, 0x5f, 0x40, 0x2e, 0x34, 0x1c,
+	0xaf, 0x19, 0xfa, 0xef, 0xe2, 0x20, 0xcc, 0x1a, 0x0d, 0xd4, 0x82, 0x65, 0xac, 0xbb, 0xd6, 0x58,
+	0xeb, 0x9b, 0x24, 0xac, 0x09, 0x2a, 0x0a, 0x17, 0x11, 0x52, 0xa0, 0x00, 0x35, 0xc6, 0x8f, 0x1a,
+	0x50, 0x20, 0x4e, 0x4d, 0x88, 0x97, 0x58, 0x18, 0x2f, 0x4f, 0xf8, 0x39, 0x9c, 0xf8, 0x7b, 0xb0,
+	0x76, 0x8a, 0xe1, 0x41, 0x87, 0xb0, 0x1e, 0xa6, 0x1a, 0xb4, 0x13, 0x65, 0xd4, 0xf7, 0xe6, 0xcc,
+	0x12, 0x53, 0xf6, 0x49, 0xdd, 0xec, 0x9a, 0x7f, 0x82, 0xe6, 0x89, 0xd7, 0xe1, 0xea, 0x37, 0x58,
+	0x1d, 0x31, 0x07, 0x4b, 0x7c, 0x2f, 0x8b, 0x77, 0xa0, 0x10, 0xdd, 0x80, 0xe8, 0xad, 0xd9, 0x0d,
+	0x4d, 0xd4, 0x9b, 0x9e, 0xde, 0x95, 0xe2, 0x12, 0xa4, 0xe9, 0xee, 0x12, 0xb3, 0x90, 0x61, 0x26,
+	0x46, 0xfc, 0x93, 0x38, 0xe4, 0xc2, 0x2d, 0x82, 0x1e, 0x41, 0x2a, 0xcc, 0x81, 0x2f, 0xa6, 0x4b,
+	0xca, 0x47, 0xdc, 0xfa, 0x60, 0xa7, 0x2e, 0x3e, 0x1d, 0x01, 0xab, 0xd8, 0x85, 0x0c, 0xdb, 0x62,
+	0xe8, 0x09, 0xc0, 0x64, 0x61, 0x9d, 0xa3, 0x57, 0x11, 0xee, 0x4a, 0x2e, 0x0c, 0x39, 0xa4, 0x7f,
+	0x4e, 0x44, 0x12, 0x52, 0x93, 0x92, 0xd0, 0x0e, 0xa4, 0x0d, 0x6c, 0xe9, 0x63, 0x2e, 0xe8, 0xa3,
+	0x73, 0x4d, 0x6e, 0xa9, 0x4a, 0x20, 0x88, 0xfd, 0xa2, 0x58, 0xe8, 0x33, 0xc8, 0xea, 0x96, 0x79,
+	0x60, 0x6b, 0xbe, 0xc3, 0x75, 0xf2, 0x83, 0xf3, 0xe1, 0x96, 0x09, 0x4a, 0xd7, 0x21, 0x56, 0x5c,
+	0x67, 0x3f, 0xc5, 0x77, 0x21, 0x4d, 0xa5, 0xa1, 0xeb, 0x50, 0xa0, 0xd2, 0xb4, 0x81, 0x69, 0x59,
+	0xa6, 0xc7, 0x93, 0x80, 0x79, 0x4a, 0x6b, 0x50, 0x92, 0xf8, 0x10, 0x96, 0x38, 0x02, 0xba, 0x0c,
+	0x99, 0x21, 0x76, 0x4d, 0x87, 0xc5, 0x66, 0x49, 0x95, 0x3f, 0x11, 0xba, 0xd3, 0xef, 0x7b, 0xd8,
+	0xa7, 0x4e, 0x42, 0x52, 0xe5, 0x4f, 0x95, 0x4b, 0xb0, 0x76, 0xca, 0x1e, 0x90, 0xfe, 0x38, 0x01,
+	0xb9, 0x30, 0x37, 0x83, 0xf6, 0xa0, 0xa8, 0xf7, 0x68, 0x11, 0xcb, 0x50, 0xf7, 0x7d, 0xec, 0xda,
+	0xe7, 0xcd, 0xc8, 0x2c, 0x33, 0x98, 0x36, 0x43, 0x41, 0x4f, 0x61, 0xe9, 0xd8, 0xc4, 0x2f, 0x2f,
+	0x76, 0x1b, 0x95, 0x21, 0x10, 0x35, 0x1b, 0x7d, 0x01, 0xab, 0x3c, 0x3c, 0x1d, 0xe8, 0xc3, 0x21,
+	0xf1, 0x0f, 0xfa, 0x36, 0xf7, 0xb8, 0xce, 0x03, 0xcb, 0x63, 0xdd, 0x06, 0xc3, 0xaa, 0xd9, 0xd2,
+	0x27, 0x90, 0x8f, 0x94, 0x56, 0x23, 0x01, 0x92, 0x23, 0xd7, 0xe6, 0x37, 0x02, 0xe4, 0x27, 0xda,
+	0x80, 0xa5, 0x21, 0x4b, 0xa5, 0x51, 0xb1, 0x05, 0x35, 0x78, 0x7c, 0x92, 0xca, 0xc6, 0x85, 0x84,
+	0xf4, 0x67, 0x71, 0x58, 0x0f, 0x12, 0x4b, 0xd1, 0xda, 0x6f, 0xe9, 0xab, 0x38, 0x14, 0xa2, 0x04,
+	0x74, 0x03, 0x32, 0xd5, 0x16, 0xbd, 0x18, 0x8e, 0x89, 0x1b, 0x34, 0xbf, 0x80, 0x68, 0x7e, 0x01,
+	0xdb, 0xc7, 0x5b, 0x86, 0xd3, 0x3b, 0x62, 0x29, 0x97, 0xb7, 0x61, 0x89, 0x3b, 0xc9, 0x42, 0x7c,
+	0x2a, 0x35, 0x43, 0x9a, 0x71, 0x37, 0x89, 0xb4, 0xbb, 0x09, 0x59, 0xf9, 0x79, 0x57, 0x56, 0x9b,
+	0xe5, 0xfa, 0x4c, 0xfa, 0x88, 0x34, 0xc4, 0x5f, 0x92, 0xa9, 0xd0, 0xad, 0xad, 0xe3, 0xdb, 0xd2,
+	0x03, 0x58, 0xae, 0x52, 0xf8, 0x20, 0xd3, 0xfa, 0x0e, 0xac, 0xf4, 0x1c, 0xdb, 0xd7, 0x4d, 0x9b,
+	0xc4, 0xfb, 0x03, 0xfd, 0x20, 0x28, 0x00, 0x2a, 0x86, 0x64, 0x85, 0x50, 0xa5, 0x7f, 0x8b, 0x43,
+	0x91, 0x1b, 0xb4, 0x80, 0xb7, 0x08, 0x09, 0xc7, 0xe3, 0xcd, 0x13, 0x8e, 0x87, 0x10, 0xa4, 0x74,
+	0xb7, 0x77, 0xc8, 0x35, 0x46, 0x7f, 0x13, 0x95, 0xf5, 0x9c, 0xc1, 0x40, 0xb7, 0x83, 0x54, 0x42,
+	0xf0, 0x88, 0xea, 0x90, 0xc4, 0xf6, 0xf1, 0x22, 0xf5, 0xcd, 0x53, 0xd2, 0x4b, 0xb2, 0x7d, 0xcc,
+	0xb2, 0x98, 0x04, 0x46, 0xfc, 0x10, 0xb2, 0x01, 0x61, 0xa1, 0x4a, 0xe2, 0xff, 0x89, 0xc3, 0x8a,
+	0xcc, 0x15, 0x14, 0x8c, 0xab, 0x03, 0xd9, 0xe0, 0xb3, 0x24, 0xbe, 0x0d, 0xe6, 0xf1, 0xac, 0xca,
+	0x43, 0xb3, 0x83, 0xdd, 0x63, 0xb3, 0x87, 0xab, 0xe1, 0x77, 0x49, 0x6a, 0x08, 0x84, 0xf6, 0x20,
+	0x43, 0xcb, 0x76, 0x82, 0xdb, 0xa0, 0x79, 0x7c, 0xea, 0x99, 0x8e, 0xb1, 0xc2, 0x85, 0xa0, 0x54,
+	0x9c, 0xa1, 0x89, 0x0f, 0x21, 0x1f, 0x21, 0x2f, 0x34, 0xf6, 0x5f, 0xc0, 0xca, 0xcc, 0x9e, 0x78,
+	0x3d, 0xf9, 0xd8, 0xef, 0x42, 0x31, 0xf2, 0x2d, 0xcb, 0xe4, 0x56, 0x6d, 0x39, 0x42, 0x55, 0x0c,
+	0x69, 0x0b, 0x0a, 0x53, 0xb2, 0xf9, 0x7e, 0x8b, 0xcf, 0xb1, 0xdf, 0xa4, 0xdf, 0xa5, 0x20, 0x1f,
+	0xa9, 0xdd, 0x42, 0x0a, 0xa4, 0x4d, 0x1f, 0x87, 0x27, 0xfb, 0x9d, 0xc5, 0x4a, 0xbf, 0x4a, 0x8a,
+	0x8f, 0x07, 0x2a, 0x43, 0x10, 0xfb, 0x00, 0x8a, 0x81, 0x6d, 0xdf, 0xec, 0x9b, 0xd8, 0x25, 0xb6,
+	0x39, 0xfa, 0xcd, 0x03, 0xef, 0x5d, 0xde, 0x9f, 0x7c, 0xee, 0x40, 0x0e, 0xef, 0x49, 0x93, 0x89,
+	0xc5, 0x98, 0xf0, 0xed, 0xba, 0x76, 0x30, 0x2f, 0xc9, 0x70, 0x5e, 0xc4, 0x5f, 0x27, 0x20, 0x45,
+	0xe4, 0x22, 0x05, 0x12, 0x1c, 0x78, 0xbe, 0x6f, 0x07, 0xa6, 0x3a, 0x1e, 0xf6, 0x54, 0x4d, 0x98,
+	0x64, 0x4f, 0xb1, 0x5a, 0x98, 0xc4, 0xdc, 0x59, 0xb4, 0x28, 0xd8, 0x4c, 0x35, 0x0c, 0x7a, 0x37,
+	0x58, 0x39, 0xcc, 0xc6, 0xae, 0x97, 0xd8, 0x07, 0x78, 0xa5, 0xe0, 0x03, 0xbc, 0x52, 0xd9, 0x0e,
+	0x3e, 0xab, 0x41, 0xf7, 0x20, 0xef, 0x1d, 0x3a, 0xae, 0xcf, 0x32, 0xaa, 0x3c, 0x4e, 0x3d, 0x9d,
+	0x03, 0x68, 0x43, 0x5a, 0x57, 0x41, 0x16, 0xa7, 0xa5, 0xef, 0x63, 0x8b, 0x7f, 0xc1, 0xc1, 0x1e,
+	0xd0, 0x15, 0xc8, 0x5a, 0xa6, 0x7d, 0xa4, 0x8d, 0x5c, 0x8b, 0x46, 0x7f, 0x39, 0x75, 0x89, 0x3c,
+	0xef, 0xba, 0x96, 0xf8, 0x0b, 0x5e, 0xa1, 0x33, 0x7a, 0x45, 0x85, 0x0e, 0x4b, 0xcd, 0xb3, 0xbb,
+	0x76, 0xa5, 0xd9, 0x95, 0x1f, 0xcb, 0xaa, 0x90, 0x40, 0x39, 0x48, 0xd7, 0xea, 0xad, 0x72, 0x57,
+	0x48, 0xb2, 0x3b, 0xf8, 0x56, 0x5d, 0x2e, 0x37, 0x85, 0x14, 0x5a, 0x86, 0x5c, 0xf8, 0x75, 0x9e,
+	0x90, 0x46, 0x05, 0xc8, 0x56, 0x77, 0xd5, 0x32, 0x2d, 0x9f, 0xcd, 0xa0, 0x22, 0xc0, 0x93, 0xf2,
+	0x5e, 0x59, 0xdb, 0xae, 0x97, 0x3b, 0x1d, 0x61, 0x49, 0xfa, 0xa7, 0x2c, 0x5c, 0x6a, 0x60, 0xcf,
+	0xd3, 0x0f, 0xf0, 0x33, 0xd3, 0x3f, 0x8c, 0x54, 0xf3, 0xbe, 0xe6, 0x0f, 0x6e, 0x7e, 0x08, 0x69,
+	0x9a, 0x83, 0x5d, 0xf4, 0x0b, 0x24, 0xe2, 0xba, 0x50, 0x46, 0xf4, 0x39, 0xb1, 0xec, 0xbc, 0xdc,
+	0x39, 0xb2, 0x89, 0xe6, 0x0b, 0x96, 0xa6, 0x2f, 0xe0, 0x77, 0x62, 0x2a, 0xaf, 0x05, 0x0a, 0xaf,
+	0xe4, 0x7f, 0x02, 0xab, 0x9e, 0x71, 0x14, 0x5e, 0xab, 0x45, 0xcb, 0x78, 0xce, 0x71, 0x16, 0xef,
+	0xc4, 0xd4, 0x15, 0x6f, 0xc6, 0x14, 0x3d, 0x83, 0xe2, 0x50, 0x77, 0x35, 0xc3, 0x09, 0xbb, 0x9f,
+	0x99, 0xdb, 0x28, 0x45, 0x0b, 0x03, 0x49, 0x74, 0x3b, 0x8c, 0x56, 0x72, 0xb6, 0x00, 0x86, 0xe1,
+	0xde, 0xe4, 0x01, 0xf9, 0x62, 0x9f, 0xce, 0xed, 0xc4, 0xd4, 0x08, 0x04, 0x52, 0x21, 0x1f, 0xf9,
+	0xdc, 0x91, 0x07, 0xe3, 0x0b, 0x7e, 0x1c, 0xb7, 0x13, 0x53, 0xa3, 0x20, 0xa8, 0x03, 0x05, 0x17,
+	0xeb, 0x46, 0x38, 0xf6, 0xdc, 0xdc, 0xa0, 0x91, 0x7a, 0x12, 0x02, 0xea, 0x46, 0xca, 0x4b, 0x1a,
+	0x00, 0x93, 0xab, 0x44, 0x1e, 0x3a, 0x2f, 0x74, 0x87, 0x47, 0xa2, 0xf0, 0xf0, 0xce, 0x10, 0xf5,
+	0x61, 0x2d, 0xf2, 0xe1, 0x49, 0xd8, 0xd5, 0xc2, 0x82, 0x1f, 0xe9, 0x45, 0xaa, 0x49, 0x76, 0x62,
+	0x2a, 0x77, 0xf1, 0xa2, 0x25, 0x26, 0x18, 0xd0, 0xc9, 0x92, 0xe0, 0x8d, 0xe5, 0xf3, 0x7f, 0x0b,
+	0x38, 0x11, 0x13, 0xbd, 0xa6, 0xd9, 0x83, 0xe5, 0xe9, 0xe5, 0x5c, 0x3c, 0xd7, 0x21, 0x48, 0xd6,
+	0x5b, 0x3f, 0xf2, 0x5c, 0xc9, 0x40, 0xca, 0x75, 0x1c, 0x5f, 0xfa, 0x55, 0x06, 0x2e, 0xcb, 0x5f,
+	0xe2, 0xde, 0x88, 0xd6, 0x9c, 0x76, 0x7c, 0xfd, 0x20, 0xdc, 0x4d, 0x6d, 0xc8, 0x47, 0xce, 0x46,
+	0x6e, 0x3d, 0x16, 0xfd, 0x14, 0x30, 0x0a, 0x41, 0x0c, 0x2b, 0x9b, 0x65, 0x7e, 0xea, 0x9b, 0x7c,
+	0xc6, 0x4e, 0xa9, 0x16, 0x96, 0xe7, 0xf2, 0x44, 0x4e, 0xeb, 0xf7, 0x64, 0x61, 0x28, 0xc6, 0x54,
+	0xcd, 0xf0, 0x9b, 0x53, 0x1f, 0x2d, 0xa7, 0xe8, 0x45, 0x6c, 0xf4, 0xab, 0xe3, 0x8d, 0xc9, 0xf7,
+	0x6d, 0x69, 0xfa, 0x32, 0xfc, 0x46, 0x6d, 0xda, 0x8c, 0x66, 0x2e, 0x6a, 0x46, 0xfb, 0x90, 0x1f,
+	0x79, 0xd8, 0xa5, 0x17, 0x65, 0xd8, 0xdb, 0x58, 0xba, 0xe8, 0x80, 0x77, 0x3d, 0xec, 0xd2, 0x9a,
+	0x35, 0x32, 0xe0, 0x51, 0xf0, 0xe0, 0xa1, 0x17, 0x90, 0xa1, 0x17, 0xa5, 0xde, 0x46, 0x96, 0x8a,
+	0x28, 0x9f, 0x5f, 0x04, 0x2d, 0x6d, 0x53, 0x0c, 0x95, 0x03, 0x8a, 0x2d, 0xc8, 0x47, 0xd4, 0x3c,
+	0x8f, 0x43, 0xf2, 0x1d, 0x00, 0xcb, 0xe9, 0xe9, 0x16, 0xab, 0xe7, 0x67, 0x0b, 0x20, 0x47, 0x29,
+	0x4d, 0x7d, 0x80, 0x09, 0x60, 0x64, 0x18, 0xaf, 0x01, 0xf0, 0x29, 0x2c, 0xf1, 0x4e, 0x5f, 0x1c,
+	0x6c, 0xeb, 0x13, 0xc8, 0xd2, 0x7f, 0x13, 0x20, 0xfe, 0xdf, 0xf5, 0x13, 0xfe, 0x03, 0x39, 0xf3,
+	0xa9, 0xe7, 0xd0, 0x1a, 0xb2, 0xef, 0xd5, 0x7f, 0xfb, 0xe7, 0x7f, 0xfd, 0x9c, 0x79, 0x08, 0x84,
+	0x6b, 0xd7, 0xb5, 0xb7, 0x14, 0x58, 0xa6, 0x00, 0x3d, 0xfe, 0xd9, 0xff, 0x3c, 0x28, 0xff, 0x12,
+	0xa0, 0x14, 0xf6, 0x23, 0x7f, 0x1f, 0x50, 0xf9, 0x08, 0xbe, 0xf9, 0x2f, 0x0c, 0x2a, 0x39, 0x95,
+	0x56, 0x6e, 0x94, 0x87, 0xe6, 0x67, 0xf9, 0x80, 0xae, 0x1d, 0xdf, 0xde, 0xcf, 0x50, 0x71, 0x77,
+	0xfe, 0x2f, 0x00, 0x00, 0xff, 0xff, 0xa0, 0xb1, 0x4c, 0x75, 0x1d, 0x41, 0x00, 0x00,
 }
diff --git a/sdks/go/test/build.gradle b/sdks/go/test/build.gradle
index b12d3f6..77fd3be 100644
--- a/sdks/go/test/build.gradle
+++ b/sdks/go/test/build.gradle
@@ -49,12 +49,12 @@
 
 task flinkValidatesRunner {
   dependsOn ":sdks:go:test:goBuild"
-  dependsOn ":runners:flink:1.5:job-server:shadowJar"
+  dependsOn ":runners:flink:1.8:job-server:shadowJar"
   doLast {
     def options = [
             "--runner flink",
             "--parallel 1", // prevent memory overuse
-            "--flink_job_server_jar ${project(":runners:flink:1.5:job-server").shadowJar.archivePath}",
+            "--flink_job_server_jar ${project(":runners:flink:1.8:job-server").shadowJar.archivePath}",
     ]
     exec {
       executable "sh"
diff --git a/sdks/java/build-tools/build.gradle b/sdks/java/build-tools/build.gradle
index 53f88b7..5916470 100644
--- a/sdks/java/build-tools/build.gradle
+++ b/sdks/java/build-tools/build.gradle
@@ -17,6 +17,6 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature(exportJavadoc: false)
+applyJavaNature(exportJavadoc: false, publish: false)
 
 description = "Apache Beam :: SDKs :: Java :: Build Tools"
diff --git a/sdks/java/build-tools/src/main/resources/beam/checkstyle.xml b/sdks/java/build-tools/src/main/resources/beam/checkstyle.xml
index ba130ad..26265e3 100644
--- a/sdks/java/build-tools/src/main/resources/beam/checkstyle.xml
+++ b/sdks/java/build-tools/src/main/resources/beam/checkstyle.xml
@@ -130,6 +130,14 @@
       <property name="message" value="You are using raw byte-buddy, please use vendored byte-buddy classes."/>
     </module>
 
+    <!-- Forbid Non-vendored calcite imports. -->
+    <module name="RegexpSinglelineJava">
+      <property name="id" value="ForbidCalcite"/>
+      <property name="format" value="(\sorg\.apache\.calcite)"/>
+      <property name="severity" value="error"/>
+      <property name="message" value="You are using raw calcite, please use vendored calcite classes."/>
+    </module>
+
     <module name="UnusedImports">
       <property name="severity" value="error"/>
       <property name="processJavadoc" value="true"/>
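
For reference, the new ForbidCalcite check flags any import that pulls in org.apache.calcite directly; only the relocated, vendored copy should be used. A minimal sketch of the two forms follows, assuming Beam's usual vendoring prefix (the exact vendored package path shown here is an assumption, not taken from this change):

    // Flagged by the ForbidCalcite rule: a raw Calcite import.
    // import org.apache.calcite.sql.SqlNode;

    // Allowed: the vendored, relocated form (package prefix assumed for illustration).
    import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlNode;

    class VendoredCalciteImportExample {
      SqlNode node; // referenced only so the import is actually used
    }
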
diff --git a/sdks/java/core/build.gradle b/sdks/java/core/build.gradle
index 2a835c5..a7ed6c2 100644
--- a/sdks/java/core/build.gradle
+++ b/sdks/java/core/build.gradle
@@ -17,17 +17,20 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature(shadowClosure: {
-  dependencies {
-    include(dependency(library.java.protobuf_java))
-    include(dependency("org.apache.commons:.*"))
-    include(dependency(library.java.antlr_runtime))
+applyJavaNature(
+  automaticModuleName: 'org.apache.beam.sdk',
+  shadowClosure: {
+    dependencies {
+      include(dependency(library.java.protobuf_java))
+      include(dependency("org.apache.commons:.*"))
+      include(dependency(library.java.antlr_runtime))
+    }
+    relocate "com.google.thirdparty", getJavaRelocatedPath("com.google.thirdparty")
+    relocate "com.google.protobuf", getJavaRelocatedPath("com.google.protobuf")
+    relocate "org.apache.commons", getJavaRelocatedPath("org.apache.commons")
+    relocate "org.antlr.v4", getJavaRelocatedPath("org.antlr.v4")
   }
-  relocate "com.google.thirdparty", getJavaRelocatedPath("com.google.thirdparty")
-  relocate "com.google.protobuf", getJavaRelocatedPath("com.google.protobuf")
-  relocate "org.apache.commons", getJavaRelocatedPath("org.apache.commons")
-  relocate "org.antlr.v4", getJavaRelocatedPath("org.antlr.v4")
-})
+)
 applyAvroNature()
 applyAntlrNature()
 
diff --git a/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/AvroCoder.java b/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/AvroCoder.java
index 6c63ab8..2d78e48 100644
--- a/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/AvroCoder.java
+++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/AvroCoder.java
@@ -131,8 +131,7 @@
    * Returns an {@code AvroCoder} instance for the provided element type using the provided Avro
    * schema.
    *
-   * <p>If the type argument is GenericRecord, the schema may be arbitrary. Otherwise, the schema
-   * must correspond to the type provided.
+   * <p>The schema must correspond to the type provided.
    *
    * @param <T> the element type
    */
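
The tightened wording above means callers should always hand AvroCoder a schema that matches the element type. A hedged sketch, assuming the existing AvroCoder.of(Class, Schema) factory and a reflect-derived schema (MyRecord is a hypothetical type used only for illustration):

    import org.apache.avro.Schema;
    import org.apache.avro.reflect.ReflectData;
    import org.apache.beam.sdk.coders.AvroCoder;

    public class AvroCoderSchemaExample {
      /** Hypothetical element type, for illustration only. */
      public static class MyRecord {
        public String name;
        public int count;
      }

      public static void main(String[] args) {
        // Per the updated Javadoc, the schema must correspond to the provided type.
        Schema schema = ReflectData.get().getSchema(MyRecord.class);
        AvroCoder<MyRecord> coder = AvroCoder.of(MyRecord.class, schema);
        System.out.println(coder.getSchema());
      }
    }
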
diff --git a/sdks/java/core/src/main/java/org/apache/beam/sdk/schemas/utils/ByteBuddyUtils.java b/sdks/java/core/src/main/java/org/apache/beam/sdk/schemas/utils/ByteBuddyUtils.java
index 2a414bd..e98e532 100644
--- a/sdks/java/core/src/main/java/org/apache/beam/sdk/schemas/utils/ByteBuddyUtils.java
+++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/schemas/utils/ByteBuddyUtils.java
@@ -69,9 +69,11 @@
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Maps;
 import org.apache.commons.lang3.ArrayUtils;
 import org.apache.commons.lang3.ClassUtils;
+import org.joda.time.DateTimeZone;
 import org.joda.time.Instant;
 import org.joda.time.ReadableInstant;
 import org.joda.time.ReadablePartial;
+import org.joda.time.base.BaseLocal;
 
 class ByteBuddyUtils {
   private static final ForLoadedType ARRAYS_TYPE = new ForLoadedType(Arrays.class);
@@ -80,6 +82,7 @@
   private static final ForLoadedType BYTE_BUFFER_TYPE = new ForLoadedType(ByteBuffer.class);
   private static final ForLoadedType CHAR_SEQUENCE_TYPE = new ForLoadedType(CharSequence.class);
   private static final ForLoadedType INSTANT_TYPE = new ForLoadedType(Instant.class);
+  private static final ForLoadedType DATE_TIME_ZONE_TYPE = new ForLoadedType(DateTimeZone.class);
   private static final ForLoadedType LIST_TYPE = new ForLoadedType(List.class);
   private static final ForLoadedType READABLE_INSTANT_TYPE =
       new ForLoadedType(ReadableInstant.class);
@@ -574,31 +577,62 @@
       // that the POJO can accept.
 
       // Generate the following code:
-      // return new T(value.getMillis());
+      //   return new T(value.getMillis());
+      // Unless T is a sub-class of BaseLocal. Then generate:
+      //   return new T(value.getMillis(), DateTimeZone.UTC);
 
       ForLoadedType loadedType = new ForLoadedType(type.getRawType());
-      return new Compound(
-          // Create a new instance of the target type.
-          TypeCreation.of(loadedType),
-          Duplication.SINGLE,
-          // Load the parameter and cast it to a ReadableInstant.
-          readValue,
-          TypeCasting.to(READABLE_INSTANT_TYPE),
-          // Call ReadableInstant.getMillis to extract the millis since the epoch.
+      List<StackManipulation> stackManipulations = new ArrayList<>();
+
+      // Create a new instance of the target type.
+      stackManipulations.add(TypeCreation.of(loadedType));
+      stackManipulations.add(Duplication.SINGLE);
+      // Load the parameter and cast it to a ReadableInstant.
+      stackManipulations.add(readValue);
+      stackManipulations.add(TypeCasting.to(READABLE_INSTANT_TYPE));
+      // Call ReadableInstant.getMillis to extract the millis since the epoch.
+      stackManipulations.add(
           MethodInvocation.invoke(
               READABLE_INSTANT_TYPE
                   .getDeclaredMethods()
                   .filter(ElementMatchers.named("getMillis"))
-                  .getOnly()),
-          // All subclasses of ReadableInstant and ReadablePartial contain a ()(long) constructor
-          // that takes in a millis argument. Call that constructor of the field to initialize it.
-          MethodInvocation.invoke(
-              loadedType
-                  .getDeclaredMethods()
-                  .filter(
-                      ElementMatchers.isConstructor()
-                          .and(ElementMatchers.takesArguments(ForLoadedType.of(long.class))))
                   .getOnly()));
+      if (type.isSubtypeOf(TypeDescriptor.of(BaseLocal.class))) {
+        // Access DateTimeZone.UTC
+        stackManipulations.add(
+            FieldAccess.forField(
+                    DATE_TIME_ZONE_TYPE
+                        .getDeclaredFields()
+                        .filter(ElementMatchers.named("UTC"))
+                        .getOnly())
+                .read());
+        // All subclasses of BaseLocal contain a ()(long, DateTimeZone) constructor
+        // that takes in a millis and time zone argument. Call that constructor of the field to
+        // initialize it.
+        stackManipulations.add(
+            MethodInvocation.invoke(
+                loadedType
+                    .getDeclaredMethods()
+                    .filter(
+                        ElementMatchers.isConstructor()
+                            .and(
+                                ElementMatchers.takesArguments(
+                                    ForLoadedType.of(long.class), DATE_TIME_ZONE_TYPE)))
+                    .getOnly()));
+      } else {
+        // All subclasses of ReadableInstant and ReadablePartial contain a ()(long) constructor
+        // that takes in a millis argument. Call that constructor of the field to initialize it.
+        stackManipulations.add(
+            MethodInvocation.invoke(
+                loadedType
+                    .getDeclaredMethods()
+                    .filter(
+                        ElementMatchers.isConstructor()
+                            .and(ElementMatchers.takesArguments(ForLoadedType.of(long.class))))
+                    .getOnly()));
+      }
+
+      return new Compound(stackManipulations);
     }
 
     @Override
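
In plain Java, the bytecode assembled above is roughly equivalent to the following conversions; this is a hedged sketch relying on the Joda-Time constructors the generated code targets, Instant(long) and LocalDateTime(long, DateTimeZone):

    import org.joda.time.DateTimeZone;
    import org.joda.time.Instant;
    import org.joda.time.LocalDateTime;
    import org.joda.time.ReadableInstant;

    public class JodaConversionSketch {
      public static void main(String[] args) {
        ReadableInstant value = Instant.now();

        // ReadableInstant/ReadablePartial targets: the (long) millis constructor.
        Instant asInstant = new Instant(value.getMillis());

        // BaseLocal targets such as LocalDateTime: the (long, DateTimeZone)
        // constructor, pinned to UTC as in the generated code.
        LocalDateTime asLocal = new LocalDateTime(value.getMillis(), DateTimeZone.UTC);

        System.out.println(asInstant + " / " + asLocal);
      }
    }
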
diff --git a/sdks/java/core/src/main/java/org/apache/beam/sdk/testing/TestStream.java b/sdks/java/core/src/main/java/org/apache/beam/sdk/testing/TestStream.java
index 08b7bb4d..41a46ab 100644
--- a/sdks/java/core/src/main/java/org/apache/beam/sdk/testing/TestStream.java
+++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/testing/TestStream.java
@@ -74,6 +74,10 @@
     return new Builder<>(coder);
   }
 
+  public static Builder<Row> create(Schema schema) {
+    return create(SchemaCoder.of(schema));
+  }
+
   public static <T> Builder<T> create(
       Schema schema,
       SerializableFunction<T, Row> toRowFunction,
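
A hedged usage sketch of the new schema-based factory; the schema fields and values are illustrative only:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.testing.TestStream;
    import org.apache.beam.sdk.values.Row;

    public class TestStreamRowExample {
      public static void main(String[] args) {
        Schema schema =
            Schema.builder().addStringField("name").addInt32Field("count").build();

        // The new overload builds a TestStream of Rows directly from the schema,
        // wrapping it in a SchemaCoder internally.
        TestStream<Row> stream =
            TestStream.create(schema)
                .addElements(Row.withSchema(schema).addValues("a", 1).build())
                .advanceWatermarkToInfinity();

        System.out.println(stream);
      }
    }
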
diff --git a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/FlatMapElementsTest.java b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/FlatMapElementsTest.java
index 81fca1f..9e606b3 100644
--- a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/FlatMapElementsTest.java
+++ b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/FlatMapElementsTest.java
@@ -21,8 +21,8 @@
 import static org.apache.beam.sdk.transforms.Requirements.requiresSideInputs;
 import static org.apache.beam.sdk.transforms.display.DisplayDataMatchers.hasDisplayItem;
 import static org.apache.beam.sdk.values.TypeDescriptors.integers;
+import static org.hamcrest.MatcherAssert.assertThat;
 import static org.hamcrest.Matchers.equalTo;
-import static org.junit.Assert.assertThat;
 
 import java.io.Serializable;
 import java.util.Collections;
diff --git a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/MapElementsTest.java b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/MapElementsTest.java
index 2441255..390a613 100644
--- a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/MapElementsTest.java
+++ b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/MapElementsTest.java
@@ -21,12 +21,12 @@
 import static org.apache.beam.sdk.transforms.Requirements.requiresSideInputs;
 import static org.apache.beam.sdk.transforms.display.DisplayDataMatchers.hasDisplayItem;
 import static org.apache.beam.sdk.values.TypeDescriptors.integers;
+import static org.hamcrest.MatcherAssert.assertThat;
 import static org.hamcrest.Matchers.equalTo;
 import static org.hamcrest.Matchers.hasItem;
 import static org.hamcrest.Matchers.hasKey;
 import static org.hamcrest.Matchers.hasSize;
 import static org.junit.Assert.assertEquals;
-import static org.junit.Assert.assertThat;
 
 import java.io.Serializable;
 import java.util.Map;
@@ -46,7 +46,6 @@
 import org.apache.beam.sdk.values.PCollectionView;
 import org.apache.beam.sdk.values.TypeDescriptor;
 import org.apache.beam.sdk.values.TypeDescriptors;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
 import org.junit.Rule;
 import org.junit.Test;
 import org.junit.experimental.categories.Category;
@@ -538,8 +537,6 @@
 
     PAssert.that(result.output()).containsInAnyOrder(1);
 
-    Map<String, String> expectedFailureInfo =
-        ImmutableMap.of("className", "java.lang.ArithmeticException");
     PAssert.thatSingleton(result.failures())
         .satisfies(
             kv -> {
diff --git a/sdks/java/extensions/euphoria/build.gradle b/sdks/java/extensions/euphoria/build.gradle
index ee7f6ec..93f6ebd 100644
--- a/sdks/java/extensions/euphoria/build.gradle
+++ b/sdks/java/extensions/euphoria/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature(exportJavadoc: false)
+applyJavaNature(exportJavadoc: false, automaticModuleName: 'org.apache.beam.sdk.extensions.euphoria')
 
 description = "Apache Beam :: SDKs :: Java :: Extensions :: Euphoria Java 8 DSL"
 
diff --git a/sdks/java/extensions/google-cloud-platform-core/build.gradle b/sdks/java/extensions/google-cloud-platform-core/build.gradle
index 89479ae..9ffa229 100644
--- a/sdks/java/extensions/google-cloud-platform-core/build.gradle
+++ b/sdks/java/extensions/google-cloud-platform-core/build.gradle
@@ -19,7 +19,7 @@
 import groovy.json.JsonOutput
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.extensions.gcp')
 
 description = "Apache Beam :: SDKs :: Java :: Extensions :: Google Cloud Platform Core"
 ext.summary = """Common components used to support multiple
diff --git a/sdks/java/extensions/google-cloud-platform-core/src/main/java/org/apache/beam/sdk/extensions/gcp/options/GcpOptions.java b/sdks/java/extensions/google-cloud-platform-core/src/main/java/org/apache/beam/sdk/extensions/gcp/options/GcpOptions.java
index 9787a99..37cf6bc 100644
--- a/sdks/java/extensions/google-cloud-platform-core/src/main/java/org/apache/beam/sdk/extensions/gcp/options/GcpOptions.java
+++ b/sdks/java/extensions/google-cloud-platform-core/src/main/java/org/apache/beam/sdk/extensions/gcp/options/GcpOptions.java
@@ -94,6 +94,7 @@
    */
   @Description(
       "GCP availability zone for running GCP operations. "
+          + "Also used as the GCE availability zone for launching workers. "
           + "Default is up to the individual service.")
   String getZone();
 
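
A hedged sketch of how the zone option described above is typically supplied; the option value is illustrative:

    import org.apache.beam.sdk.extensions.gcp.options.GcpOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class ZoneOptionExample {
      public static void main(String[] args) {
        // --zone now covers both GCP operations and the GCE zone used to launch workers.
        GcpOptions options =
            PipelineOptionsFactory.fromArgs("--zone=us-central1-a").as(GcpOptions.class);
        System.out.println(options.getZone());
      }
    }
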
diff --git a/sdks/java/extensions/jackson/build.gradle b/sdks/java/extensions/jackson/build.gradle
index 42336a4..3d1e692 100644
--- a/sdks/java/extensions/jackson/build.gradle
+++ b/sdks/java/extensions/jackson/build.gradle
@@ -18,6 +18,7 @@
 
 plugins { id 'org.apache.beam.module' }
 applyJavaNature(
+    automaticModuleName: 'org.apache.beam.sdk.extensions.jackson',
     archivesBaseName: 'beam-sdks-java-extensions-json-jackson'
 )
 
diff --git a/sdks/java/extensions/jackson/src/main/java/org/apache/beam/sdk/extensions/jackson/AsJsons.java b/sdks/java/extensions/jackson/src/main/java/org/apache/beam/sdk/extensions/jackson/AsJsons.java
index 2be73ff..9f1a199 100644
--- a/sdks/java/extensions/jackson/src/main/java/org/apache/beam/sdk/extensions/jackson/AsJsons.java
+++ b/sdks/java/extensions/jackson/src/main/java/org/apache/beam/sdk/extensions/jackson/AsJsons.java
@@ -17,13 +17,27 @@
  */
 package org.apache.beam.sdk.extensions.jackson;
 
+import com.fasterxml.jackson.core.JsonProcessingException;
 import com.fasterxml.jackson.databind.ObjectMapper;
+import edu.umd.cs.findbugs.annotations.Nullable;
 import java.io.IOException;
+import java.util.Arrays;
+import java.util.Map;
+import java.util.Optional;
+import org.apache.beam.sdk.annotations.Experimental;
+import org.apache.beam.sdk.transforms.Contextful;
+import org.apache.beam.sdk.transforms.InferableFunction;
 import org.apache.beam.sdk.transforms.MapElements;
 import org.apache.beam.sdk.transforms.PTransform;
+import org.apache.beam.sdk.transforms.ProcessFunction;
+import org.apache.beam.sdk.transforms.Requirements;
 import org.apache.beam.sdk.transforms.SimpleFunction;
+import org.apache.beam.sdk.transforms.WithFailures;
+import org.apache.beam.sdk.values.KV;
 import org.apache.beam.sdk.values.PCollection;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Optional;
+import org.apache.beam.sdk.values.TypeDescriptor;
+import org.apache.beam.sdk.values.TypeDescriptors;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
 
 /**
  * {@link PTransform} for serializing objects to JSON {@link String Strings}. Transforms a {@code
@@ -41,12 +55,12 @@
    * into a {@link PCollection} of JSON {@link String Strings} representing those objects using a
    * Jackson {@link ObjectMapper}.
    */
-  public static <OutputT> AsJsons<OutputT> of(Class<? extends OutputT> outputClass) {
-    return new AsJsons<>(outputClass);
+  public static <InputT> AsJsons<InputT> of(Class<? extends InputT> inputClass) {
+    return new AsJsons<>(inputClass);
   }
 
-  private AsJsons(Class<? extends InputT> outputClass) {
-    this.inputClass = outputClass;
+  private AsJsons(Class<? extends InputT> inputClass) {
+    this.inputClass = inputClass;
   }
 
   /** Use custom Jackson {@link ObjectMapper} instead of the default one. */
@@ -56,6 +70,83 @@
     return newTransform;
   }
 
+  /**
+   * Returns a new {@link AsJsonsWithFailures} transform that catches exceptions raised while
+   * writing JSON elements, with the given type descriptor used for the failure collection but the
+   * exception handler yet to be specified using {@link
+   * AsJsonsWithFailures#exceptionsVia(ProcessFunction)}.
+   *
+   * <p>See {@link WithFailures} documentation for usage patterns of the returned {@link
+   * WithFailures.Result}.
+   */
+  @Experimental(Experimental.Kind.WITH_EXCEPTIONS)
+  public <NewFailureT> AsJsonsWithFailures<NewFailureT> exceptionsInto(
+      TypeDescriptor<NewFailureT> failureTypeDescriptor) {
+    return new AsJsonsWithFailures<>(null, failureTypeDescriptor);
+  }
+
+  /**
+   * Returns a new {@link AsJsonsWithFailures} transform that catches exceptions raised while
+   * writing JSON elements, passing the raised exception instance and the input element being
+   * processed through the given {@code exceptionHandler} and emitting the result to a failure
+   * collection.
+   *
+   * <p>See {@link WithFailures} documentation for usage patterns of the returned {@link
+   * WithFailures.Result}.
+   *
+   * <p>Example usage:
+   *
+   * <pre>{@code
+   * WithFailures.Result<PCollection<String>, KV<MyPojo, Map<String, String>>> result =
+   *     pojos.apply(
+   *         AsJsons.of(MyPojo.class)
+   *             .exceptionsVia(new WithFailures.ExceptionAsMapHandler<MyPojo>() {}));
+   *
+   * PCollection<String> output = result.output(); // valid json elements
+   * PCollection<KV<MyPojo, Map<String, String>>> failures = result.failures();
+   * }</pre>
+   */
+  @Experimental(Experimental.Kind.WITH_EXCEPTIONS)
+  public <FailureT> AsJsonsWithFailures<FailureT> exceptionsVia(
+      InferableFunction<WithFailures.ExceptionElement<InputT>, FailureT> exceptionHandler) {
+    return new AsJsonsWithFailures<>(exceptionHandler, exceptionHandler.getOutputTypeDescriptor());
+  }
+
+  /**
+   * Returns a new {@link AsJsonsWithFailures} transform that catches exceptions raised while
+   * writing JSON elements, passing the raised exception instance and the input element being
+   * processed through the default exception handler {@link DefaultExceptionAsMapHandler} and
+   * emitting the result to a failure collection.
+   *
+   * <p>See {@link DefaultExceptionAsMapHandler} for more details about default handler behavior.
+   *
+   * <p>See {@link WithFailures} documentation for usage patterns of the returned {@link
+   * WithFailures.Result}.
+   *
+   * <p>Example usage:
+   *
+   * <pre>{@code
+   * WithFailures.Result<PCollection<String>, KV<MyPojo, Map<String, String>>> result =
+   *     pojos.apply(
+   *         AsJsons.of(MyPojo.class)
+   *             .exceptionsVia());
+   *
+   * PCollection<String> output = result.output(); // valid json elements
+   * PCollection<KV<MyPojo, Map<String, String>>> failures = result.failures();
+   * }</pre>
+   */
+  @Experimental(Experimental.Kind.WITH_EXCEPTIONS)
+  public AsJsonsWithFailures<KV<InputT, Map<String, String>>> exceptionsVia() {
+    DefaultExceptionAsMapHandler<InputT> exceptionHandler =
+        new DefaultExceptionAsMapHandler<InputT>() {};
+    return new AsJsonsWithFailures<>(exceptionHandler, exceptionHandler.getOutputTypeDescriptor());
+  }
+
+  private String writeValue(InputT input) throws JsonProcessingException {
+    ObjectMapper mapper = Optional.ofNullable(customMapper).orElse(DEFAULT_MAPPER);
+    return mapper.writeValueAsString(input);
+  }
+
   @Override
   public PCollection<String> expand(PCollection<InputT> input) {
     return input.apply(
@@ -64,8 +155,7 @@
               @Override
               public String apply(InputT input) {
                 try {
-                  ObjectMapper mapper = Optional.fromNullable(customMapper).or(DEFAULT_MAPPER);
-                  return mapper.writeValueAsString(input);
+                  return writeValue(input);
                 } catch (IOException e) {
                   throw new RuntimeException(
                       "Failed to serialize " + inputClass.getName() + " value: " + input, e);
@@ -73,4 +163,93 @@
               }
             }));
   }
+
+  /** A {@code PTransform} that adds exception handling to {@link AsJsons}. */
+  public class AsJsonsWithFailures<FailureT>
+      extends PTransform<PCollection<InputT>, WithFailures.Result<PCollection<String>, FailureT>> {
+
+    @Nullable
+    private InferableFunction<WithFailures.ExceptionElement<InputT>, FailureT> exceptionHandler;
+
+    @Nullable private final transient TypeDescriptor<FailureT> failureType;
+
+    AsJsonsWithFailures(
+        InferableFunction<WithFailures.ExceptionElement<InputT>, FailureT> exceptionHandler,
+        TypeDescriptor<FailureT> failureType) {
+      this.exceptionHandler = exceptionHandler;
+      this.failureType = failureType;
+    }
+
+    /**
+     * Returns a new {@link AsJsonsWithFailures} transform that catches exceptions raised while
+     * writing JSON elements, passing the raised exception instance and the input element being
+     * processed through the given {@code exceptionHandler} and emitting the result to a failure
+     * collection. It is intended for use with {@link AsJsons#exceptionsInto(TypeDescriptor)},
+     * taking a lambda function as the exception handler.
+     *
+     * <p>See {@link WithFailures} documentation for usage patterns of the returned {@link
+     * WithFailures.Result}.
+     *
+     * <p>Example usage:
+     *
+     * <pre>{@code
+     * WithFailures.Result<PCollection<String>, KV<MyPojo, Map<String, String>>> result =
+     *     pojos.apply(
+     *         AsJsons.of(MyPojo.class)
+     *             .exceptionsInto(
+     *                 TypeDescriptors.kvs(
+     *                     TypeDescriptor.of(MyPojo.class), TypeDescriptors.strings()))
+     *             .exceptionsVia(
+     *                 f -> KV.of(f.element(), f.exception().getClass().getCanonicalName())));
+     *
+     * PCollection<String> output = result.output(); // valid json elements
+     * PCollection<KV<MyPojo, Map<String, String>>> failures = result.failures();
+     * }</pre>
+     */
+    public AsJsonsWithFailures<FailureT> exceptionsVia(
+        ProcessFunction<WithFailures.ExceptionElement<InputT>, FailureT> exceptionHandler) {
+      return new AsJsonsWithFailures<>(
+          new InferableFunction<WithFailures.ExceptionElement<InputT>, FailureT>(
+              exceptionHandler) {},
+          failureType);
+    }
+
+    @Override
+    public WithFailures.Result<PCollection<String>, FailureT> expand(PCollection<InputT> input) {
+      return input.apply(
+          MapElements.into(TypeDescriptors.strings())
+              .via(
+                  Contextful.fn(
+                      (Contextful.Fn<InputT, String>) (input1, c) -> writeValue(input1),
+                      Requirements.empty()))
+              .exceptionsInto(failureType)
+              .exceptionsVia(exceptionHandler));
+    }
+  }
+
+  /**
+   * A default handler that extracts information from an exception to a {@code Map<String, String>}
+   * and returns a {@link KV} where the key is the input element that failed processing, and the
+   * value is the map of exception attributes. It handles only {@code JsonProcessingException};
+   * other types of exceptions are rethrown as {@code RuntimeException}.
+   *
+   * <p>The keys populated in the map are "className", "message", and "stackTrace" of the exception.
+   */
+  private static class DefaultExceptionAsMapHandler<InputT>
+      extends SimpleFunction<
+          WithFailures.ExceptionElement<InputT>, KV<InputT, Map<String, String>>> {
+    @Override
+    public KV<InputT, Map<String, String>> apply(WithFailures.ExceptionElement<InputT> f)
+        throws RuntimeException {
+      if (!(f.exception() instanceof JsonProcessingException)) {
+        throw new RuntimeException(f.exception());
+      }
+      return KV.of(
+          f.element(),
+          ImmutableMap.of(
+              "className", f.exception().getClass().getName(),
+              "message", f.exception().getMessage(),
+              "stackTrace", Arrays.toString(f.exception().getStackTrace())));
+    }
+  }
 }
diff --git a/sdks/java/extensions/jackson/src/main/java/org/apache/beam/sdk/extensions/jackson/ParseJsons.java b/sdks/java/extensions/jackson/src/main/java/org/apache/beam/sdk/extensions/jackson/ParseJsons.java
index 89047eb..a92041b 100644
--- a/sdks/java/extensions/jackson/src/main/java/org/apache/beam/sdk/extensions/jackson/ParseJsons.java
+++ b/sdks/java/extensions/jackson/src/main/java/org/apache/beam/sdk/extensions/jackson/ParseJsons.java
@@ -19,11 +19,23 @@
 
 import com.fasterxml.jackson.databind.ObjectMapper;
 import java.io.IOException;
+import java.util.Arrays;
+import java.util.Map;
+import java.util.Optional;
+import javax.annotation.Nullable;
+import org.apache.beam.sdk.annotations.Experimental;
+import org.apache.beam.sdk.transforms.Contextful;
+import org.apache.beam.sdk.transforms.InferableFunction;
 import org.apache.beam.sdk.transforms.MapElements;
 import org.apache.beam.sdk.transforms.PTransform;
+import org.apache.beam.sdk.transforms.ProcessFunction;
+import org.apache.beam.sdk.transforms.Requirements;
 import org.apache.beam.sdk.transforms.SimpleFunction;
+import org.apache.beam.sdk.transforms.WithFailures;
+import org.apache.beam.sdk.values.KV;
 import org.apache.beam.sdk.values.PCollection;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Optional;
+import org.apache.beam.sdk.values.TypeDescriptor;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
 
 /**
  * {@link PTransform} for parsing JSON {@link String Strings}. Parse {@link PCollection} of {@link
@@ -55,6 +67,84 @@
     return newTransform;
   }
 
+  /**
+   * Returns a new {@link ParseJsonsWithFailures} transform that catches exceptions raised while
+   * parsing elements, with the given type descriptor used for the failure collection but the
+   * exception handler yet to be specified using {@link
+   * ParseJsonsWithFailures#exceptionsVia(ProcessFunction)}.
+   *
+   * <p>See {@link WithFailures} documentation for usage patterns of the returned {@link
+   * WithFailures.Result}.
+   */
+  @Experimental(Experimental.Kind.WITH_EXCEPTIONS)
+  public <NewFailureT> ParseJsonsWithFailures<NewFailureT> exceptionsInto(
+      TypeDescriptor<NewFailureT> failureTypeDescriptor) {
+    return new ParseJsonsWithFailures<>(null, failureTypeDescriptor);
+  }
+
+  /**
+   * Returns a new {@link ParseJsonsWithFailures} transform that catches exceptions raised while
+   * parsing elements, passing the raised exception instance and the input element being processed
+   * through the given {@code exceptionHandler} and emitting the result to a failure collection.
+   *
+   * <p>See {@link WithFailures} documentation for usage patterns of the returned {@link
+   * WithFailures.Result}.
+   *
+   * <p>Example usage:
+   *
+   * <pre>{@code
+   * WithFailures.Result<PCollection<MyPojo>, KV<String, Map<String, String>>> result =
+   *     json.apply(
+   *         ParseJsons.of(MyPojo.class)
+   *             .exceptionsVia(new WithFailures.ExceptionAsMapHandler<String>() {}));
+   *
+   * PCollection<MyPojo> output = result.output(); // valid POJOs
+   * PCollection<KV<String, Map<String, String>>> failures = result.failures();
+   * }</pre>
+   */
+  @Experimental(Experimental.Kind.WITH_EXCEPTIONS)
+  public <FailureT> ParseJsonsWithFailures<FailureT> exceptionsVia(
+      InferableFunction<WithFailures.ExceptionElement<String>, FailureT> exceptionHandler) {
+    return new ParseJsonsWithFailures<>(
+        exceptionHandler, exceptionHandler.getOutputTypeDescriptor());
+  }
+
+  /**
+   * Returns a new {@link ParseJsonsWithFailures} transform that catches exceptions raised while
+   * parsing elements, passing the raised exception instance and the input element being processed
+   * through the default exception handler {@link DefaultExceptionAsMapHandler} and emitting the
+   * result to a failure collection.
+   *
+   * <p>See {@link DefaultExceptionAsMapHandler} for more details about default handler behavior.
+   *
+   * <p>See {@link WithFailures} documentation for usage patterns of the returned {@link
+   * WithFailures.Result}.
+   *
+   * <p>Example usage:
+   *
+   * <pre>{@code
+   * WithFailures.Result<PCollection<MyPojo>, KV<String, Map<String, String>>> result =
+   *     json.apply(
+   *         ParseJsons.of(MyPojo.class)
+   *             .exceptionsVia());
+   *
+   * PCollection<MyPojo> output = result.output(); // valid POJOs
+   * PCollection<KV<String, Map<String, String>>> failures = result.failures();
+   * }</pre>
+   */
+  @Experimental(Experimental.Kind.WITH_EXCEPTIONS)
+  public ParseJsonsWithFailures<KV<String, Map<String, String>>> exceptionsVia() {
+    DefaultExceptionAsMapHandler<String> exceptionHandler =
+        new DefaultExceptionAsMapHandler<String>() {};
+    return new ParseJsonsWithFailures<>(
+        exceptionHandler, exceptionHandler.getOutputTypeDescriptor());
+  }
+
+  private OutputT readValue(String input) throws IOException {
+    ObjectMapper mapper = Optional.ofNullable(customMapper).orElse(DEFAULT_MAPPER);
+    return mapper.readValue(input, outputClass);
+  }
+
   @Override
   public PCollection<OutputT> expand(PCollection<String> input) {
     return input.apply(
@@ -63,8 +153,7 @@
               @Override
               public OutputT apply(String input) {
                 try {
-                  ObjectMapper mapper = Optional.fromNullable(customMapper).or(DEFAULT_MAPPER);
-                  return mapper.readValue(input, outputClass);
+                  return readValue(input);
                 } catch (IOException e) {
                   throw new RuntimeException(
                       "Failed to parse a " + outputClass.getName() + " from JSON value: " + input,
@@ -73,4 +162,91 @@
               }
             }));
   }
+
+  /** A {@code PTransform} that adds exception handling to {@link ParseJsons}. */
+  public class ParseJsonsWithFailures<FailureT>
+      extends PTransform<PCollection<String>, WithFailures.Result<PCollection<OutputT>, FailureT>> {
+    @Nullable
+    private InferableFunction<WithFailures.ExceptionElement<String>, FailureT> exceptionHandler;
+
+    @Nullable private final transient TypeDescriptor<FailureT> failureType;
+
+    ParseJsonsWithFailures(
+        InferableFunction<WithFailures.ExceptionElement<String>, FailureT> exceptionHandler,
+        TypeDescriptor<FailureT> failureType) {
+      this.exceptionHandler = exceptionHandler;
+      this.failureType = failureType;
+    }
+
+    /**
+     * Returns a new {@link ParseJsonsWithFailures} transform that catches exceptions raised while
+     * parsing elements, passing the raised exception instance and the input element being processed
+     * through the given {@code exceptionHandler} and emitting the result to a failure collection.
+     * It is intended for use with {@link ParseJsons#exceptionsInto(TypeDescriptor)}, taking a
+     * lambda function as the exception handler.
+     *
+     * <p>See {@link WithFailures} documentation for usage patterns of the returned {@link
+     * WithFailures.Result}.
+     *
+     * <p>Example usage:
+     *
+     * <pre>{@code
+     * WithFailures.Result<PCollection<MyPojo>, KV<String, Map<String, String>>> result =
+     *     json.apply(
+     *         ParseJsons.of(MyPojo.class)
+     *             .exceptionsInto(
+     *                 TypeDescriptors.kvs(TypeDescriptors.strings(), TypeDescriptors.strings()))
+     *             .exceptionsVia(
+     *                 f -> KV.of(f.element(), f.exception().getClass().getCanonicalName())));
+     *
+     * PCollection<MyPojo> output = result.output(); // valid POJOs
+     * PCollection<KV<String, Map<String, String>>> failures = result.failures();
+     * }</pre>
+     */
+    public ParseJsonsWithFailures<FailureT> exceptionsVia(
+        ProcessFunction<WithFailures.ExceptionElement<String>, FailureT> exceptionHandler) {
+      return new ParseJsonsWithFailures<>(
+          new InferableFunction<WithFailures.ExceptionElement<String>, FailureT>(
+              exceptionHandler) {},
+          failureType);
+    }
+
+    @Override
+    public WithFailures.Result<PCollection<OutputT>, FailureT> expand(PCollection<String> input) {
+      return input.apply(
+          MapElements.into(new TypeDescriptor<OutputT>() {})
+              .via(
+                  Contextful.fn(
+                      (Contextful.Fn<String, OutputT>) (input1, c) -> readValue(input1),
+                      Requirements.empty()))
+              .exceptionsInto(failureType)
+              .exceptionsVia(exceptionHandler));
+    }
+  }
+
+  /**
+   * A default handler that extracts information from an exception to a {@code Map<String, String>}
+   * and returns a {@link KV} where the key is the input element that failed processing, and the
+   * value is the map of exception attributes. It handles only {@code IOException}; other types
+   * of exceptions are rethrown as {@code RuntimeException}.
+   *
+   * <p>The keys populated in the map are "className", "message", and "stackTrace" of the exception.
+   */
+  private static class DefaultExceptionAsMapHandler<OutputT>
+      extends SimpleFunction<
+          WithFailures.ExceptionElement<OutputT>, KV<OutputT, Map<String, String>>> {
+    @Override
+    public KV<OutputT, Map<String, String>> apply(WithFailures.ExceptionElement<OutputT> f)
+        throws RuntimeException {
+      if (!(f.exception() instanceof IOException)) {
+        throw new RuntimeException(f.exception());
+      }
+      return KV.of(
+          f.element(),
+          ImmutableMap.of(
+              "className", f.exception().getClass().getName(),
+              "message", f.exception().getMessage(),
+              "stackTrace", Arrays.toString(f.exception().getStackTrace())));
+    }
+  }
 }
diff --git a/sdks/java/extensions/jackson/src/test/java/org/apache/beam/sdk/extensions/jackson/JacksonTransformsTest.java b/sdks/java/extensions/jackson/src/test/java/org/apache/beam/sdk/extensions/jackson/JacksonTransformsTest.java
index e408d74..4414465 100644
--- a/sdks/java/extensions/jackson/src/test/java/org/apache/beam/sdk/extensions/jackson/JacksonTransformsTest.java
+++ b/sdks/java/extensions/jackson/src/test/java/org/apache/beam/sdk/extensions/jackson/JacksonTransformsTest.java
@@ -17,25 +17,38 @@
  */
 package org.apache.beam.sdk.extensions.jackson;
 
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.Matchers.hasKey;
+import static org.hamcrest.Matchers.hasSize;
+import static org.junit.Assert.assertEquals;
+
 import com.fasterxml.jackson.databind.DeserializationFeature;
 import com.fasterxml.jackson.databind.ObjectMapper;
 import com.fasterxml.jackson.databind.SerializationFeature;
 import java.io.Serializable;
 import java.util.Arrays;
 import java.util.List;
+import java.util.Map;
 import org.apache.beam.sdk.Pipeline;
+import org.apache.beam.sdk.coders.KvCoder;
+import org.apache.beam.sdk.coders.MapCoder;
 import org.apache.beam.sdk.coders.SerializableCoder;
 import org.apache.beam.sdk.coders.StringUtf8Coder;
 import org.apache.beam.sdk.testing.PAssert;
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.transforms.Create;
+import org.apache.beam.sdk.transforms.SimpleFunction;
+import org.apache.beam.sdk.transforms.WithFailures;
+import org.apache.beam.sdk.values.KV;
 import org.apache.beam.sdk.values.PCollection;
+import org.apache.beam.sdk.values.TypeDescriptor;
+import org.apache.beam.sdk.values.TypeDescriptors;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Iterables;
 import org.junit.Rule;
 import org.junit.Test;
 
 /** Test Jackson transforms {@link ParseJsons} and {@link AsJsons}. */
-public class JacksonTransformsTest {
+public class JacksonTransformsTest implements Serializable {
   private static final List<String> VALID_JSONS =
       Arrays.asList("{\"myString\":\"abc\",\"myInt\":3}", "{\"myString\":\"def\",\"myInt\":4}");
 
@@ -51,6 +64,9 @@
   private static final List<MyPojo> POJOS =
       Arrays.asList(new MyPojo("abc", 3), new MyPojo("def", 4));
 
+  private static final List<MyInvalidPojo> INVALID_POJOS =
+      Arrays.asList(new MyInvalidPojo("aaa", 5), new MyInvalidPojo("bbb", 6));
+
   private static final List<MyEmptyBean> EMPTY_BEANS =
       Arrays.asList(new MyEmptyBean("abc", 3), new MyEmptyBean("def", 4));
 
@@ -82,6 +98,83 @@
     pipeline.run();
   }
 
+  @Test
+  public void testParsingInvalidJsonsWithFailuresDefaultHandler() {
+    WithFailures.Result<PCollection<MyPojo>, KV<String, Map<String, String>>> result =
+        pipeline
+            .apply(Create.of(Iterables.concat(VALID_JSONS, INVALID_JSONS)))
+            .apply(ParseJsons.of(MyPojo.class).exceptionsVia());
+
+    result.output().setCoder(SerializableCoder.of(MyPojo.class));
+
+    PAssert.that(result.output()).containsInAnyOrder(POJOS);
+    assertParsingWithErrorMapHandler(result);
+
+    pipeline.run();
+  }
+
+  @Test
+  public void testParsingInvalidJsonsWithFailuresAsMap() {
+    WithFailures.Result<PCollection<MyPojo>, KV<String, Map<String, String>>> result =
+        pipeline
+            .apply(Create.of(Iterables.concat(VALID_JSONS, INVALID_JSONS)))
+            .apply(
+                ParseJsons.of(MyPojo.class)
+                    .exceptionsVia(new WithFailures.ExceptionAsMapHandler<String>() {}));
+
+    result.output().setCoder(SerializableCoder.of(MyPojo.class));
+
+    PAssert.that(result.output()).containsInAnyOrder(POJOS);
+    assertParsingWithErrorMapHandler(result);
+
+    pipeline.run();
+  }
+
+  @Test
+  public void testParsingInvalidJsonsWithFailuresSimpleFunction() {
+    WithFailures.Result<PCollection<MyPojo>, KV<String, String>> result =
+        pipeline
+            .apply(Create.of(Iterables.concat(VALID_JSONS, INVALID_JSONS)))
+            .apply(
+                ParseJsons.of(MyPojo.class)
+                    .exceptionsVia(
+                        new SimpleFunction<
+                            WithFailures.ExceptionElement<String>, KV<String, String>>() {
+                          @Override
+                          public KV<String, String> apply(
+                              WithFailures.ExceptionElement<String> failure) {
+                            return KV.of(
+                                failure.element(),
+                                failure.exception().getClass().getCanonicalName());
+                          }
+                        }));
+    result.output().setCoder(SerializableCoder.of(MyPojo.class));
+
+    PAssert.that(result.output()).containsInAnyOrder(POJOS);
+    assertParsingWithErrorFunctionHandler(result);
+
+    pipeline.run();
+  }
+
+  @Test
+  public void testParsingInvalidJsonsWithFailuresLambda() {
+    WithFailures.Result<PCollection<MyPojo>, KV<String, String>> result =
+        pipeline
+            .apply(Create.of(Iterables.concat(VALID_JSONS, INVALID_JSONS)))
+            .apply(
+                ParseJsons.of(MyPojo.class)
+                    .exceptionsInto(
+                        TypeDescriptors.kvs(TypeDescriptors.strings(), TypeDescriptors.strings()))
+                    .exceptionsVia(
+                        f -> KV.of(f.element(), f.exception().getClass().getCanonicalName())));
+    result.output().setCoder(SerializableCoder.of(MyPojo.class));
+
+    PAssert.that(result.output()).containsInAnyOrder(POJOS);
+    assertParsingWithErrorFunctionHandler(result);
+
+    pipeline.run();
+  }
+
   @Test(expected = Pipeline.PipelineExecutionException.class)
   public void failParsingWithoutCustomMapper() {
     PCollection<MyPojo> output =
@@ -150,6 +243,99 @@
     pipeline.run();
   }
 
+  @Test
+  public void testWritingInvalidJsonsWithFailuresDefaultHandler() {
+    WithFailures.Result<PCollection<String>, KV<MyPojo, Map<String, String>>> result =
+        pipeline
+            .apply(
+                Create.of(Iterables.concat(POJOS, INVALID_POJOS))
+                    .withCoder(SerializableCoder.of(MyPojo.class)))
+            .apply(AsJsons.of(MyPojo.class).exceptionsVia());
+
+    result.output().setCoder(StringUtf8Coder.of());
+
+    result
+        .failures()
+        .setCoder(
+            KvCoder.of(
+                SerializableCoder.of(MyPojo.class),
+                MapCoder.of(StringUtf8Coder.of(), StringUtf8Coder.of())));
+
+    PAssert.that(result.output()).containsInAnyOrder(VALID_JSONS);
+    assertWritingWithErrorMapHandler(result);
+
+    pipeline.run();
+  }
+
+  @Test
+  public void testWritingInvalidJsonsWithFailuresAsMap() {
+    WithFailures.Result<PCollection<String>, KV<MyPojo, Map<String, String>>> result =
+        pipeline
+            .apply(
+                Create.of(Iterables.concat(POJOS, INVALID_POJOS))
+                    .withCoder(SerializableCoder.of(MyPojo.class)))
+            .apply(
+                AsJsons.of(MyPojo.class)
+                    .exceptionsVia(new WithFailures.ExceptionAsMapHandler<MyPojo>() {}));
+
+    result.output().setCoder(StringUtf8Coder.of());
+
+    PAssert.that(result.output()).containsInAnyOrder(VALID_JSONS);
+    assertWritingWithErrorMapHandler(result);
+
+    pipeline.run();
+  }
+
+  @Test
+  public void testWritingInvalidJsonsWithFailuresSimpleFunction() {
+    WithFailures.Result<PCollection<String>, KV<MyPojo, String>> result =
+        pipeline
+            .apply(
+                Create.of(Iterables.concat(POJOS, INVALID_POJOS))
+                    .withCoder(SerializableCoder.of(MyPojo.class)))
+            .apply(
+                AsJsons.of(MyPojo.class)
+                    .exceptionsVia(
+                        new SimpleFunction<
+                            WithFailures.ExceptionElement<MyPojo>, KV<MyPojo, String>>() {
+                          @Override
+                          public KV<MyPojo, String> apply(
+                              WithFailures.ExceptionElement<MyPojo> failure) {
+                            return KV.of(
+                                failure.element(),
+                                failure.exception().getClass().getCanonicalName());
+                          }
+                        }));
+    result.output().setCoder(StringUtf8Coder.of());
+
+    PAssert.that(result.output()).containsInAnyOrder(VALID_JSONS);
+    assertWritingWithErrorFunctionHandler(result);
+
+    pipeline.run();
+  }
+
+  @Test
+  public void testWritingInvalidJsonsWithFailuresLambda() {
+    WithFailures.Result<PCollection<String>, KV<MyPojo, String>> result =
+        pipeline
+            .apply(
+                Create.of(Iterables.concat(POJOS, INVALID_POJOS))
+                    .withCoder(SerializableCoder.of(MyPojo.class)))
+            .apply(
+                AsJsons.of(MyPojo.class)
+                    .exceptionsInto(
+                        TypeDescriptors.kvs(
+                            TypeDescriptor.of(MyPojo.class), TypeDescriptors.strings()))
+                    .exceptionsVia(
+                        f -> KV.of(f.element(), f.exception().getClass().getCanonicalName())));
+    result.output().setCoder(StringUtf8Coder.of());
+
+    PAssert.that(result.output()).containsInAnyOrder(VALID_JSONS);
+    assertWritingWithErrorFunctionHandler(result);
+
+    pipeline.run();
+  }
+
   /** Pojo for tests. */
   @SuppressWarnings({"WeakerAccess", "unused"})
   public static class MyPojo implements Serializable {
@@ -238,4 +424,84 @@
       return result;
     }
   }
+
+  /** Pojo whose getter always throws, used to exercise the serialization failure paths. */
+  @SuppressWarnings({"WeakerAccess", "unused"})
+  public static class MyInvalidPojo extends MyPojo {
+    public MyInvalidPojo(String myString, int myInt) {
+      super(myString, myInt);
+    }
+
+    @Override
+    public String getMyString() {
+      throw new RuntimeException("Unknown error!");
+    }
+  }
+
+  private void assertParsingWithErrorMapHandler(
+      WithFailures.Result<PCollection<MyPojo>, KV<String, Map<String, String>>> result) {
+    PAssert.that(result.failures())
+        .satisfies(
+            kv -> {
+              for (KV<String, Map<String, String>> entry : kv) {
+                if (entry.getKey().equals(INVALID_JSONS.get(0))) {
+                  assertEquals(
+                      "com.fasterxml.jackson.core.JsonParseException",
+                      entry.getValue().get("className"));
+                } else if (entry.getKey().equals(INVALID_JSONS.get(1))) {
+                  assertEquals(
+                      "com.fasterxml.jackson.core.io.JsonEOFException",
+                      entry.getValue().get("className"));
+                } else if (entry.getKey().equals(INVALID_JSONS.get(2))) {
+                  assertEquals(
+                      "com.fasterxml.jackson.databind.exc.MismatchedInputException",
+                      entry.getValue().get("className"));
+                } else {
+                  throw new AssertionError(
+                      "Unexpected key found in failures result: \"" + entry.getKey() + "\"");
+                }
+                assertThat(entry.getValue().entrySet(), hasSize(3));
+                assertThat(entry.getValue(), hasKey("stackTrace"));
+                assertThat(entry.getValue(), hasKey("message"));
+              }
+
+              return null;
+            });
+  }
+
+  private void assertParsingWithErrorFunctionHandler(
+      WithFailures.Result<PCollection<MyPojo>, KV<String, String>> result) {
+    PAssert.that(result.failures())
+        .containsInAnyOrder(
+            KV.of(INVALID_JSONS.get(0), "com.fasterxml.jackson.core.JsonParseException"),
+            KV.of(INVALID_JSONS.get(1), "com.fasterxml.jackson.core.io.JsonEOFException"),
+            KV.of(
+                INVALID_JSONS.get(2),
+                "com.fasterxml.jackson.databind.exc.MismatchedInputException"));
+  }
+
+  private void assertWritingWithErrorMapHandler(
+      WithFailures.Result<PCollection<String>, KV<MyPojo, Map<String, String>>> result) {
+    PAssert.that(result.failures())
+        .satisfies(
+            kv -> {
+              for (KV<MyPojo, Map<String, String>> entry : kv) {
+                assertThat(entry.getValue().entrySet(), hasSize(3));
+                assertThat(entry.getValue(), hasKey("stackTrace"));
+                assertThat(entry.getValue(), hasKey("message"));
+                assertEquals(
+                    "com.fasterxml.jackson.databind.JsonMappingException",
+                    entry.getValue().get("className"));
+              }
+              return null;
+            });
+  }
+
+  private void assertWritingWithErrorFunctionHandler(
+      WithFailures.Result<PCollection<String>, KV<MyPojo, String>> result) {
+    PAssert.that(result.failures())
+        .containsInAnyOrder(
+            KV.of(INVALID_POJOS.get(0), "com.fasterxml.jackson.databind.JsonMappingException"),
+            KV.of(INVALID_POJOS.get(1), "com.fasterxml.jackson.databind.JsonMappingException"));
+  }
 }
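
For orientation, the tests above reduce to the following usage pattern for the new failure-handling hooks on ParseJsons. This is a minimal sketch rather than code from the patch; MyPojo, the inline JSON strings, and the pipeline variable are assumed from the test class, and the imports match the ones added to JacksonTransformsTest.

    PCollection<String> jsons =
        pipeline.apply(Create.of("{\"myString\":\"abc\",\"myInt\":3}", "not json"));

    // Route parse failures into a dead-letter collection instead of failing the bundle.
    WithFailures.Result<PCollection<MyPojo>, KV<String, String>> parsed =
        jsons.apply(
            ParseJsons.of(MyPojo.class)
                .exceptionsInto(
                    TypeDescriptors.kvs(TypeDescriptors.strings(), TypeDescriptors.strings()))
                .exceptionsVia(
                    f -> KV.of(f.element(), f.exception().getClass().getCanonicalName())));

    parsed.output().setCoder(SerializableCoder.of(MyPojo.class)); // successfully parsed POJOs
    PCollection<KV<String, String>> deadLetters = parsed.failures(); // element plus exception class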
diff --git a/sdks/java/extensions/join-library/build.gradle b/sdks/java/extensions/join-library/build.gradle
index 73353a7..c3c79e9 100644
--- a/sdks/java/extensions/join-library/build.gradle
+++ b/sdks/java/extensions/join-library/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.extensions.joinlibrary')
 
 description = "Apache Beam :: SDKs :: Java :: Extensions :: Join library"
 
diff --git a/sdks/java/extensions/kryo/build.gradle b/sdks/java/extensions/kryo/build.gradle
index fe5fed7..24c8e0c 100644
--- a/sdks/java/extensions/kryo/build.gradle
+++ b/sdks/java/extensions/kryo/build.gradle
@@ -23,6 +23,7 @@
 }
 
 applyJavaNature(
+    automaticModuleName: 'org.apache.beam.sdk.extensions.kryo',
     exportJavadoc: false,
     shadowClosure: {
     dependencies {
diff --git a/sdks/java/extensions/protobuf/build.gradle b/sdks/java/extensions/protobuf/build.gradle
index f589161..2f068b2 100644
--- a/sdks/java/extensions/protobuf/build.gradle
+++ b/sdks/java/extensions/protobuf/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.extensions.protobuf')
 applyGrpcNature()
 
 description = "Apache Beam :: SDKs :: Java :: Extensions :: Protobuf"
diff --git a/sdks/java/extensions/sketching/build.gradle b/sdks/java/extensions/sketching/build.gradle
index 17e367c..1f403ee 100644
--- a/sdks/java/extensions/sketching/build.gradle
+++ b/sdks/java/extensions/sketching/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.extensions.sketching')
 
 description = "Apache Beam :: SDKs :: Java :: Extensions :: Sketching"
 
diff --git a/sdks/java/extensions/sorter/build.gradle b/sdks/java/extensions/sorter/build.gradle
index 349de32..94cbba5 100644
--- a/sdks/java/extensions/sorter/build.gradle
+++ b/sdks/java/extensions/sorter/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.extensions.sorter')
 
 description = "Apache Beam :: SDKs :: Java :: Extensions :: Sorter"
 
diff --git a/sdks/java/extensions/sql/build.gradle b/sdks/java/extensions/sql/build.gradle
index 6514a8f..d5e4bd2 100644
--- a/sdks/java/extensions/sql/build.gradle
+++ b/sdks/java/extensions/sql/build.gradle
@@ -23,47 +23,9 @@
   id 'ca.coglinc.javacc'
 }
 applyJavaNature(
+  automaticModuleName: 'org.apache.beam.sdk.extensions.sql',
   // javacc generated code produces lint warnings
-  disableLintWarnings: ['dep-ann'],
-  testShadowJar: true,
-  enableStrictDependencies: true,
-  shadowClosure: {
-    dependencies {
-      include(dependency(library.java.guava))
-      include(dependency(library.java.protobuf_java))
-      include(dependency(library.java.protobuf_java_util))
-      include(dependency("org.apache.calcite:.*"))
-      include(dependency("org.apache.calcite.avatica:.*"))
-      include(dependency("org.codehaus.janino:.*"))
-      include(dependency("com.google.zetasql:.*"))
-    }
-    // guava uses the com.google.common and com.google.thirdparty package namespaces
-    relocate("com.google.common", project.getJavaRelocatedPath("com.google.common")) {
-      // com.google.common is too generic, need to exclude guava-testlib
-      exclude "com.google.common.collect.testing.**"
-      exclude "com.google.common.escape.testing.**"
-      exclude "com.google.common.testing.**"
-      exclude "com.google.common.util.concurrent.testing.**"
-    }
-    relocate "com.google.cloud", getJavaRelocatedPath("com.google.cloud")
-    relocate "com.google.logging", getJavaRelocatedPath("com.google.logging")
-    relocate "com.google.longrunning", getJavaRelocatedPath("com.google.longrunning")
-    relocate "com.google.rpc", getJavaRelocatedPath("com.google.rpc")
-
-    relocate "com.google.thirdparty", project.getJavaRelocatedPath("com.google.thirdparty")
-
-    relocate "com.google.protobuf", getJavaRelocatedPath("com.google.protobuf")
-    relocate "com.google.zetasql", getJavaRelocatedPath("com.google.zetasql")
-    relocate "org.apache.calcite", getJavaRelocatedPath("org.apache.calcite")
-
-  // Looking up the compiler factory in Calcite depends on having a properties
-  // file in the right location. We package one that is shading compatible
-  // in src/main/resources. Note that if this shaded path changes, that
-  // files name and contents need to be updated as well. TODO, swap to use
-  // getJavaRelocatedPath once the Maven build is no longer also shading this
-  // module.
-  relocate "org.codehaus", "org.apache.beam.sdks.java.extensions.sql.repackaged.org.codehaus"
-})
+  disableLintWarnings: ['dep-ann'])
 
 description = "Apache Beam :: SDKs :: Java :: Extensions :: SQL"
 ext.summary = "Beam SQL provides a new interface to generate a Beam pipeline from SQL statement"
@@ -76,70 +38,40 @@
   fmppTemplates
 }
 
-def calcite_version = "1.20.0"
-def avatica_version = "1.15.0"
 def zetasql_version = "2019.09.1"
 
 dependencies {
   javacc "net.java.dev.javacc:javacc:4.0"
   fmppTask "com.googlecode.fmpp-maven-plugin:fmpp-maven-plugin:1.0"
   fmppTask "org.freemarker:freemarker:2.3.28"
-  fmppTemplates "org.apache.calcite:calcite-core:$calcite_version"
-  shadow library.java.vendored_guava_26_0_jre // Internal use
-  compile library.java.guava // Interfaces with Calcite use this
-  compile "org.apache.calcite:calcite-core:$calcite_version"
-  compile "org.apache.calcite:calcite-linq4j:$calcite_version"
-  compile "org.apache.calcite.avatica:avatica-core:$avatica_version"
-  compile "com.google.api.grpc:proto-google-common-protos:1.12.0" // Interfaces with ZetaSQL use this
+  fmppTemplates library.java.vendored_calcite_1_20_0
+  compile project(":sdks:java:core")
+  compile project(":sdks:java:extensions:join-library")
+  compile project(":runners:direct-java")
+  compile library.java.commons_csv
+  compile library.java.vendored_calcite_1_20_0
+  compile "com.alibaba:fastjson:1.2.49"
+  compile "org.codehaus.janino:janino:3.0.11"
+  compile "org.codehaus.janino:commons-compiler:3.0.11"
   compile "com.google.zetasql:zetasql-jni-channel:$zetasql_version"
   compile "com.google.zetasql:zetasql-client:$zetasql_version"
   compile "com.google.zetasql:zetasql-types:$zetasql_version"
-  shadow project(path: ":sdks:java:core", configuration: "shadow")
-  shadow project(":sdks:java:extensions:join-library")
-  shadow library.java.slf4j_api
-  shadow library.java.commons_codec
-  shadow library.java.commons_csv
-  shadow library.java.commons_lang3
-  shadow library.java.jackson_databind
-  shadow library.java.jackson_dataformat_yaml
-  shadow library.java.joda_time
-  shadow library.java.protobuf_java
-  shadow library.java.protobuf_java_util
-  shadow "com.alibaba:fastjson:1.2.49"
-  shadow "com.jayway.jsonpath:json-path:2.4.0"
-  shadow project(path: ":runners:direct-java", configuration: "shadow")
+  compile "com.google.api.grpc:proto-google-common-protos:1.12.0" // Interfaces with ZetaSQL use this
   provided project(":sdks:java:io:kafka")
   provided project(":sdks:java:io:google-cloud-platform")
   provided project(":sdks:java:io:parquet")
   provided library.java.kafka_clients
-  shadowTest library.java.junit
-  shadowTest library.java.hamcrest_core
-  shadowTest library.java.hamcrest_library
-  shadowTest library.java.mockito_core
-  shadowTest library.java.quickcheck_core
-  shadowTestRuntimeClasspath library.java.slf4j_jdk14
-
-  // Dependencies that we don't directly reference
-  permitUnusedDeclared "com.jayway.jsonpath:json-path:2.4.0"
-  permitUnusedDeclared "net.jcip:jcip-annotations:1.0"
-  permitUnusedDeclared library.java.jackson_dataformat_yaml
-
-  permitUnusedDeclared "com.google.api.grpc:proto-google-common-protos:1.12.0"
-  permitUnusedDeclared "com.google.zetasql:zetasql-jni-channel:$zetasql_version"
-  permitUnusedDeclared "com.google.protobuf:protobuf-java-util:3.6.0"
-
-  // Dependencies that are bundled in when we bundle Calcite
-  permitUsedUndeclared "org.codehaus.janino:janino:3.0.11"
-  permitUsedUndeclared "org.codehaus.janino:commons-compiler:3.0.11"
-
-  // Dependencies where one or the other appears "used" depending on classpath,
-  // but it doesn't matter which is used
-  permitUsedUndeclared "com.google.code.findbugs:jsr305:3.0.2"
-  permitUsedUndeclared "org.apache.avro:avro:1.8.2"
-
+  testCompile library.java.vendored_calcite_1_20_0
+  testCompile library.java.vendored_guava_26_0_jre
+  testCompile library.java.junit
+  testCompile library.java.hamcrest_core
+  testCompile library.java.hamcrest_library
+  testCompile library.java.mockito_core
+  testCompile library.java.quickcheck_core
+  testRuntimeClasspath library.java.slf4j_jdk14
 }
 
-// Copy Caclcite templates and our own template into the build directory
+// Copy Calcite templates and our own template into the build directory
 // so we have one location for the FMPP task to parse.
 task copyFmppTemplatesFromSrc(type: Copy) {
   from "src/main/codegen"
@@ -148,11 +80,19 @@
 task copyFmppTemplatesFromCalciteCore(type: Copy) {
   dependsOn configurations.fmppTemplates
   File calciteCoreJar = files(configurations.fmppTemplates.files).filter {
-    it.name.startsWith("calcite-core")
+    it.name.startsWith("beam-vendor-calcite")
   }.singleFile
   from zipTree(calciteCoreJar)
   include "**/Parser.jj"
   into "${project.buildDir}/templates-fmpp"
+  filter { line ->
+    line.replace('import org.apache.calcite.', 'import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.')
+  }
+  filter { line ->
+    line.replace('import static org.apache.calcite.', 'import static org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.')
+  }
 }
 
 // Generate the FMPP sources from the FMPP templates.
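
With calcite-core replaced by the vendored library.java.vendored_calcite_1_20_0 artifact, the copied Parser.jj templates still reference the original Calcite packages, so the two filter closures rewrite those imports into the vendored namespace. For illustration (a sketch, not part of the build), the replacement applied to a single template line is:

    String before = "import org.apache.calcite.sql.SqlNode;";
    String after =
        before.replace(
            "import org.apache.calcite.",
            "import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.");
    // after == "import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlNode;"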
diff --git a/sdks/java/extensions/sql/datacatalog/build.gradle b/sdks/java/extensions/sql/datacatalog/build.gradle
index 530c2ec..4e95f46 100644
--- a/sdks/java/extensions/sql/datacatalog/build.gradle
+++ b/sdks/java/extensions/sql/datacatalog/build.gradle
@@ -20,12 +20,12 @@
 
 plugins { id 'org.apache.beam.module' }
 
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.extensions.sql.datacatalog')
 
 dependencies {
   compile library.java.grpc_google_cloud_datacatalog_v1beta1
   compile library.java.proto_google_cloud_datacatalog_v1beta1
-  provided project(path: ":sdks:java:extensions:sql", configuration: "shadow")
+  provided project(":sdks:java:extensions:sql")
 
   // For Data Catalog GRPC client
   provided library.java.grpc_all
@@ -35,7 +35,6 @@
 
   // Dependencies for the example
   provided project(":sdks:java:io:google-cloud-platform")
-  provided library.java.vendored_guava_26_0_jre
   provided library.java.slf4j_api
   testRuntimeOnly library.java.slf4j_simple
 }
diff --git a/sdks/java/extensions/sql/datacatalog/src/main/java/org/apache/beam/sdk/extensions/sql/example/BeamSqlDataCatalogExample.java b/sdks/java/extensions/sql/datacatalog/src/main/java/org/apache/beam/sdk/extensions/sql/example/BeamSqlDataCatalogExample.java
index a682b2e..81595cb 100644
--- a/sdks/java/extensions/sql/datacatalog/src/main/java/org/apache/beam/sdk/extensions/sql/example/BeamSqlDataCatalogExample.java
+++ b/sdks/java/extensions/sql/datacatalog/src/main/java/org/apache/beam/sdk/extensions/sql/example/BeamSqlDataCatalogExample.java
@@ -31,7 +31,7 @@
 import org.apache.beam.sdk.transforms.MapElements;
 import org.apache.beam.sdk.values.Row;
 import org.apache.beam.sdk.values.TypeDescriptor;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Strings;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Strings;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
diff --git a/sdks/java/extensions/sql/datacatalog/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/datacatalog/DataCatalogTableProvider.java b/sdks/java/extensions/sql/datacatalog/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/datacatalog/DataCatalogTableProvider.java
index c4be689..24770b8 100644
--- a/sdks/java/extensions/sql/datacatalog/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/datacatalog/DataCatalogTableProvider.java
+++ b/sdks/java/extensions/sql/datacatalog/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/datacatalog/DataCatalogTableProvider.java
@@ -39,8 +39,8 @@
 import org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider;
 import org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubJsonTableProvider;
 import org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
 
 /** Uses DataCatalog to get the source type and schema for a table. */
 public class DataCatalogTableProvider extends FullNameTableProvider {
diff --git a/sdks/java/extensions/sql/datacatalog/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/datacatalog/SchemaUtils.java b/sdks/java/extensions/sql/datacatalog/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/datacatalog/SchemaUtils.java
index 048ea7b..31174c7 100644
--- a/sdks/java/extensions/sql/datacatalog/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/datacatalog/SchemaUtils.java
+++ b/sdks/java/extensions/sql/datacatalog/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/datacatalog/SchemaUtils.java
@@ -26,8 +26,8 @@
 import org.apache.beam.sdk.schemas.Schema;
 import org.apache.beam.sdk.schemas.Schema.Field;
 import org.apache.beam.sdk.schemas.Schema.FieldType;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Strings;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Strings;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
 
 class SchemaUtils {
 
diff --git a/sdks/java/extensions/sql/datacatalog/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/datacatalog/TableUtils.java b/sdks/java/extensions/sql/datacatalog/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/datacatalog/TableUtils.java
index c771f59..cd16615 100644
--- a/sdks/java/extensions/sql/datacatalog/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/datacatalog/TableUtils.java
+++ b/sdks/java/extensions/sql/datacatalog/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/datacatalog/TableUtils.java
@@ -22,7 +22,7 @@
 import java.util.Map;
 import org.apache.beam.sdk.extensions.sql.meta.Table;
 import org.apache.beam.sdk.schemas.Schema;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
 
 /** Common utilities to create Beam SQL tables from Data Catalog schemas. */
 class TableUtils {
diff --git a/sdks/java/extensions/sql/hcatalog/build.gradle b/sdks/java/extensions/sql/hcatalog/build.gradle
index 08b2c26..0994ea4 100644
--- a/sdks/java/extensions/sql/hcatalog/build.gradle
+++ b/sdks/java/extensions/sql/hcatalog/build.gradle
@@ -20,13 +20,13 @@
 
 plugins { id 'org.apache.beam.module' }
 
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog')
 
 def hive_version = "2.1.0"
 def netty_version = "4.1.30.Final"
 
 dependencies {
-  provided project(path: ":sdks:java:extensions:sql", configuration: "shadow")
+  provided project(":sdks:java:extensions:sql")
   provided project(":sdks:java:io:hcatalog")
 
   // Needed for HCatalogTableProvider tests,
diff --git a/sdks/java/extensions/sql/jdbc/build.gradle b/sdks/java/extensions/sql/jdbc/build.gradle
index 82a1888..acddedf 100644
--- a/sdks/java/extensions/sql/jdbc/build.gradle
+++ b/sdks/java/extensions/sql/jdbc/build.gradle
@@ -20,6 +20,7 @@
 
 plugins { id 'org.apache.beam.module' }
 applyJavaNature(
+  automaticModuleName: 'org.apache.beam.sdk.extensions.sql.jdbc',
   exportJavadoc: false,
   testShadowJar: true,
   validateShadowJar: false,
@@ -31,7 +32,7 @@
 }
 
 dependencies {
-  compile project(path: ":sdks:java:extensions:sql", configuration: "shadow")
+  compile project(":sdks:java:extensions:sql")
   compile "jline:jline:2.14.6"
   compile "sqlline:sqlline:1.4.0"
   compile library.java.slf4j_jdk14
diff --git a/sdks/java/extensions/sql/shell/build.gradle b/sdks/java/extensions/sql/shell/build.gradle
index 28c8823..7422a94 100644
--- a/sdks/java/extensions/sql/shell/build.gradle
+++ b/sdks/java/extensions/sql/shell/build.gradle
@@ -22,8 +22,8 @@
 }
 
 dependencies {
-  compile project(path: ":sdks:java:extensions:sql:jdbc", configuration: "shadow")
-  permitUnusedDeclared project(path: ":sdks:java:extensions:sql:jdbc", configuration: "shadow")
+  compile project(":sdks:java:extensions:sql:jdbc")
+  permitUnusedDeclared project(":sdks:java:extensions:sql:jdbc")
 
   if (project.hasProperty("beam.sql.shell.bundled")) {
     project.getProperty("beam.sql.shell.bundled").tokenize(",").each {
diff --git a/sdks/java/extensions/sql/src/main/codegen/config.fmpp b/sdks/java/extensions/sql/src/main/codegen/config.fmpp
index ec163d5..8dcb04b 100644
--- a/sdks/java/extensions/sql/src/main/codegen/config.fmpp
+++ b/sdks/java/extensions/sql/src/main/codegen/config.fmpp
@@ -21,15 +21,15 @@
 
       # List of import statements.
       imports: [
-        "org.apache.calcite.schema.ColumnStrategy"
-        "org.apache.calcite.sql.SqlCreate"
-        "org.apache.calcite.sql.SqlDrop"
+        "org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.ColumnStrategy"
+        "org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlCreate"
+        "org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlDrop"
+        "org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.SqlTypeName"
         "org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateExternalTable"
         "org.apache.beam.sdk.extensions.sql.impl.parser.SqlDdlNodes"
         "org.apache.beam.sdk.extensions.sql.impl.parser.SqlSetOptionBeam"
-        "org.apache.beam.sdk.schemas.Schema"
         "org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils"
-        "org.apache.calcite.sql.type.SqlTypeName"
+        "org.apache.beam.sdk.schemas.Schema"
       ]
 
       # List of keywords.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/SqlTransform.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/SqlTransform.java
index 0851365..3e07010 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/SqlTransform.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/SqlTransform.java
@@ -40,8 +40,8 @@
 import org.apache.beam.sdk.values.PValue;
 import org.apache.beam.sdk.values.Row;
 import org.apache.beam.sdk.values.TupleTag;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
 
 /**
  * {@link SqlTransform} is the DSL interface of Beam SQL. It translates a SQL query as a {@link
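
The import swap above does not change SqlTransform's public surface; as the javadoc says, it remains the DSL entry point of Beam SQL. A minimal usage sketch, not taken from the patch: the field names and the inputRows collection are illustrative, and PCOLLECTION is assumed to be the default table name for a single-input apply.

    PCollection<Row> filtered =
        inputRows.apply(
            SqlTransform.query("SELECT c1, c2 FROM PCOLLECTION WHERE c2 > 0"));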
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/TableNameExtractionUtils.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/TableNameExtractionUtils.java
index 556c246..c6b1774 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/TableNameExtractionUtils.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/TableNameExtractionUtils.java
@@ -23,14 +23,14 @@
 import java.util.Collections;
 import java.util.List;
 import org.apache.beam.sdk.extensions.sql.impl.TableName;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlAsOperator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlIdentifier;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlJoin;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlSelect;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlSetOperator;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.sql.SqlAsOperator;
-import org.apache.calcite.sql.SqlCall;
-import org.apache.calcite.sql.SqlIdentifier;
-import org.apache.calcite.sql.SqlJoin;
-import org.apache.calcite.sql.SqlNode;
-import org.apache.calcite.sql.SqlSelect;
-import org.apache.calcite.sql.SqlSetOperator;
 
 /**
  * Helper class to extract table identifiers from the query.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/example/BeamSqlExample.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/example/BeamSqlExample.java
index f092018..8496a71 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/example/BeamSqlExample.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/example/BeamSqlExample.java
@@ -24,7 +24,6 @@
 import org.apache.beam.sdk.schemas.Schema;
 import org.apache.beam.sdk.transforms.Create;
 import org.apache.beam.sdk.transforms.MapElements;
-import org.apache.beam.sdk.transforms.SerializableFunctions;
 import org.apache.beam.sdk.transforms.SimpleFunction;
 import org.apache.beam.sdk.values.PBegin;
 import org.apache.beam.sdk.values.PCollection;
@@ -60,11 +59,7 @@
 
     // create a source PCollection with Create.of();
     PCollection<Row> inputTable =
-        PBegin.in(p)
-            .apply(
-                Create.of(row1, row2, row3)
-                    .withSchema(
-                        type, SerializableFunctions.identity(), SerializableFunctions.identity()));
+        PBegin.in(p).apply(Create.of(row1, row2, row3).withRowSchema(type));
 
     // Case 1. run a simple SQL query over input PCollection with BeamSql.simpleQuery;
     PCollection<Row> outputStream =
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamCalciteSchema.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamCalciteSchema.java
index ae84d36..e8a1f5f 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamCalciteSchema.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamCalciteSchema.java
@@ -24,13 +24,13 @@
 import java.util.Set;
 import org.apache.beam.sdk.extensions.sql.meta.Table;
 import org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider;
-import org.apache.calcite.linq4j.tree.Expression;
-import org.apache.calcite.rel.type.RelProtoDataType;
-import org.apache.calcite.schema.Function;
-import org.apache.calcite.schema.Schema;
-import org.apache.calcite.schema.SchemaPlus;
-import org.apache.calcite.schema.SchemaVersion;
-import org.apache.calcite.schema.Schemas;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.tree.Expression;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelProtoDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Function;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Schema;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.SchemaPlus;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.SchemaVersion;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Schemas;
 
 /** Adapter from {@link TableProvider} to {@link Schema}. */
 public class BeamCalciteSchema implements Schema {
@@ -99,7 +99,8 @@
   }
 
   @Override
-  public org.apache.calcite.schema.Table getTable(String name) {
+  public org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Table getTable(
+      String name) {
     Table table = tableProvider.getTable(name);
     if (table == null) {
       return null;
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamCalciteSchemaFactory.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamCalciteSchemaFactory.java
index e33f4f9..2c6151a 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamCalciteSchemaFactory.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamCalciteSchemaFactory.java
@@ -27,16 +27,16 @@
 import org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider;
 import org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore;
 import org.apache.beam.sdk.extensions.sql.meta.store.MetaStore;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
-import org.apache.calcite.jdbc.CalciteConnection;
-import org.apache.calcite.linq4j.tree.Expression;
-import org.apache.calcite.rel.type.RelProtoDataType;
-import org.apache.calcite.schema.Function;
-import org.apache.calcite.schema.Schema;
-import org.apache.calcite.schema.SchemaFactory;
-import org.apache.calcite.schema.SchemaPlus;
-import org.apache.calcite.schema.SchemaVersion;
-import org.apache.calcite.schema.Table;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteConnection;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.tree.Expression;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelProtoDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Function;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Schema;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.SchemaFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.SchemaPlus;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.SchemaVersion;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Table;
 
 /**
  * Factory classes that Calcite uses to create initial schema for JDBC connection.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamCalciteTable.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamCalciteTable.java
index 9a889a9..3c07b21 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamCalciteTable.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamCalciteTable.java
@@ -26,21 +26,21 @@
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel;
 import org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils;
 import org.apache.beam.sdk.options.PipelineOptions;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
-import org.apache.calcite.adapter.java.AbstractQueryableTable;
-import org.apache.calcite.linq4j.QueryProvider;
-import org.apache.calcite.linq4j.Queryable;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelOptTable;
-import org.apache.calcite.prepare.Prepare;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.TableModify;
-import org.apache.calcite.rel.type.RelDataType;
-import org.apache.calcite.rel.type.RelDataTypeFactory;
-import org.apache.calcite.rex.RexNode;
-import org.apache.calcite.schema.ModifiableTable;
-import org.apache.calcite.schema.SchemaPlus;
-import org.apache.calcite.schema.TranslatableTable;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.java.AbstractQueryableTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.QueryProvider;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.Queryable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.prepare.Prepare;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.TableModify;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.ModifiableTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.SchemaPlus;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.TranslatableTable;
 
 /** Adapter from {@link BeamSqlTable} to a calcite Table. */
 public class BeamCalciteTable extends AbstractQueryableTable
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamSqlEnv.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamSqlEnv.java
index 2da3f52..8b03687 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamSqlEnv.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamSqlEnv.java
@@ -17,7 +17,7 @@
  */
 package org.apache.beam.sdk.extensions.sql.impl;
 
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkNotNull;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Preconditions.checkNotNull;
 
 import java.lang.reflect.Method;
 import java.sql.SQLException;
@@ -43,12 +43,12 @@
 import org.apache.beam.sdk.options.PipelineOptionsFactory;
 import org.apache.beam.sdk.transforms.Combine.CombineFn;
 import org.apache.beam.sdk.transforms.SerializableFunction;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Strings;
-import org.apache.calcite.jdbc.CalcitePrepare;
-import org.apache.calcite.plan.RelOptUtil;
-import org.apache.calcite.schema.Function;
-import org.apache.calcite.sql.SqlExecutableStatement;
-import org.apache.calcite.tools.RuleSet;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Strings;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalcitePrepare;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptUtil;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Function;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlExecutableStatement;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.RuleSet;
 
 /**
  * Contains the metadata of tables/UDF functions, and exposes APIs to
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamSqlPipelineOptionsRegistrar.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamSqlPipelineOptionsRegistrar.java
index 7b4fbe4..5a1d313 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamSqlPipelineOptionsRegistrar.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamSqlPipelineOptionsRegistrar.java
@@ -20,7 +20,7 @@
 import com.google.auto.service.AutoService;
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.options.PipelineOptionsRegistrar;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
 
 /** {@link AutoService} registrar for {@link BeamSqlPipelineOptions}. */
 @AutoService(PipelineOptionsRegistrar.class)
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamTableStatistics.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamTableStatistics.java
index 0571d77..b5d6a2e 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamTableStatistics.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamTableStatistics.java
@@ -21,13 +21,13 @@
 import java.util.List;
 import org.apache.beam.sdk.annotations.Experimental;
 import org.apache.beam.sdk.annotations.Internal;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.rel.RelCollation;
-import org.apache.calcite.rel.RelDistribution;
-import org.apache.calcite.rel.RelDistributionTraitDef;
-import org.apache.calcite.rel.RelReferentialConstraint;
-import org.apache.calcite.schema.Statistic;
-import org.apache.calcite.util.ImmutableBitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelCollation;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelDistribution;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelDistributionTraitDef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelReferentialConstraint;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Statistic;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.ImmutableBitSet;
 
 /** This class stores row count statistics. */
 @Experimental
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/CalciteConnectionWrapper.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/CalciteConnectionWrapper.java
index e376ae5..0bdab10 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/CalciteConnectionWrapper.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/CalciteConnectionWrapper.java
@@ -35,14 +35,14 @@
 import java.util.Map;
 import java.util.Properties;
 import java.util.concurrent.Executor;
-import org.apache.calcite.adapter.java.JavaTypeFactory;
-import org.apache.calcite.config.CalciteConnectionConfig;
-import org.apache.calcite.jdbc.CalciteConnection;
-import org.apache.calcite.jdbc.CalcitePrepare;
-import org.apache.calcite.linq4j.Enumerator;
-import org.apache.calcite.linq4j.Queryable;
-import org.apache.calcite.linq4j.tree.Expression;
-import org.apache.calcite.schema.SchemaPlus;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.java.JavaTypeFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.config.CalciteConnectionConfig;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteConnection;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalcitePrepare;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.Enumerator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.Queryable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.tree.Expression;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.SchemaPlus;
 
 /**
  * Abstract wrapper for {@link CalciteConnection} to simplify extension.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/CalciteFactoryWrapper.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/CalciteFactoryWrapper.java
index a039154..6bd714f 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/CalciteFactoryWrapper.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/CalciteFactoryWrapper.java
@@ -21,18 +21,18 @@
 import java.sql.SQLException;
 import java.util.Properties;
 import java.util.TimeZone;
-import org.apache.calcite.adapter.java.JavaTypeFactory;
-import org.apache.calcite.avatica.AvaticaConnection;
-import org.apache.calcite.avatica.AvaticaFactory;
-import org.apache.calcite.avatica.AvaticaPreparedStatement;
-import org.apache.calcite.avatica.AvaticaResultSet;
-import org.apache.calcite.avatica.AvaticaSpecificDatabaseMetaData;
-import org.apache.calcite.avatica.AvaticaStatement;
-import org.apache.calcite.avatica.Meta;
-import org.apache.calcite.avatica.QueryState;
-import org.apache.calcite.avatica.UnregisteredDriver;
-import org.apache.calcite.jdbc.CalciteFactory;
-import org.apache.calcite.jdbc.CalciteSchema;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.java.JavaTypeFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.AvaticaConnection;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.AvaticaFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.AvaticaPreparedStatement;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.AvaticaResultSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.AvaticaSpecificDatabaseMetaData;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.AvaticaStatement;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.Meta;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.QueryState;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.UnregisteredDriver;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteSchema;
 
 /**
  * Wrapper for {@link CalciteFactory}.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/CalciteQueryPlanner.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/CalciteQueryPlanner.java
index 8215346..93c26fa 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/CalciteQueryPlanner.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/CalciteQueryPlanner.java
@@ -24,42 +24,42 @@
 import org.apache.beam.sdk.extensions.sql.impl.planner.RelMdNodeStats;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.config.CalciteConnectionConfig;
-import org.apache.calcite.jdbc.CalciteSchema;
-import org.apache.calcite.plan.Contexts;
-import org.apache.calcite.plan.ConventionTraitDef;
-import org.apache.calcite.plan.RelOptCost;
-import org.apache.calcite.plan.RelOptPlanner.CannotPlanException;
-import org.apache.calcite.plan.RelOptUtil;
-import org.apache.calcite.plan.RelTraitDef;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.prepare.CalciteCatalogReader;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.RelRoot;
-import org.apache.calcite.rel.metadata.BuiltInMetadata;
-import org.apache.calcite.rel.metadata.ChainedRelMetadataProvider;
-import org.apache.calcite.rel.metadata.JaninoRelMetadataProvider;
-import org.apache.calcite.rel.metadata.MetadataDef;
-import org.apache.calcite.rel.metadata.MetadataHandler;
-import org.apache.calcite.rel.metadata.ReflectiveRelMetadataProvider;
-import org.apache.calcite.rel.metadata.RelMetadataProvider;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
-import org.apache.calcite.schema.SchemaPlus;
-import org.apache.calcite.sql.SqlNode;
-import org.apache.calcite.sql.SqlOperatorTable;
-import org.apache.calcite.sql.fun.SqlStdOperatorTable;
-import org.apache.calcite.sql.parser.SqlParseException;
-import org.apache.calcite.sql.parser.SqlParser;
-import org.apache.calcite.sql.parser.SqlParserImplFactory;
-import org.apache.calcite.sql.util.ChainedSqlOperatorTable;
-import org.apache.calcite.tools.FrameworkConfig;
-import org.apache.calcite.tools.Frameworks;
-import org.apache.calcite.tools.Planner;
-import org.apache.calcite.tools.RelConversionException;
-import org.apache.calcite.tools.RuleSet;
-import org.apache.calcite.tools.ValidationException;
-import org.apache.calcite.util.BuiltInMethod;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.config.CalciteConnectionConfig;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteSchema;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.Contexts;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.ConventionTraitDef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCost;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPlanner.CannotPlanException;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptUtil;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitDef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.prepare.CalciteCatalogReader;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelRoot;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.BuiltInMetadata;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.ChainedRelMetadataProvider;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.JaninoRelMetadataProvider;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.MetadataDef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.MetadataHandler;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.ReflectiveRelMetadataProvider;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataProvider;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.SchemaPlus;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlOperatorTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.fun.SqlStdOperatorTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.parser.SqlParseException;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.parser.SqlParser;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.parser.SqlParserImplFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.util.ChainedSqlOperatorTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.FrameworkConfig;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.Frameworks;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.Planner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.RelConversionException;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.RuleSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.ValidationException;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.BuiltInMethod;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/JdbcConnection.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/JdbcConnection.java
index 8426127..1783f53 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/JdbcConnection.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/JdbcConnection.java
@@ -25,10 +25,10 @@
 import org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider;
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.values.KV;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
-import org.apache.calcite.jdbc.CalciteConnection;
-import org.apache.calcite.jdbc.CalciteSchema;
-import org.apache.calcite.schema.SchemaPlus;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteConnection;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteSchema;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.SchemaPlus;
 
 /**
  * Beam JDBC Connection.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/JdbcDriver.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/JdbcDriver.java
index edea9db..f012fc2 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/JdbcDriver.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/JdbcDriver.java
@@ -17,8 +17,8 @@
  */
 package org.apache.beam.sdk.extensions.sql.impl;
 
-import static org.apache.calcite.config.CalciteConnectionProperty.SCHEMA_FACTORY;
-import static org.codehaus.commons.compiler.CompilerFactoryFactory.getDefaultCompilerFactory;
+import static org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.config.CalciteConnectionProperty.SCHEMA_FACTORY;
+import static org.apache.beam.vendor.calcite.v1_20_0.org.codehaus.commons.compiler.CompilerFactoryFactory.getDefaultCompilerFactory;
 
 import com.fasterxml.jackson.databind.ObjectMapper;
 import com.google.auto.service.AutoService;
@@ -32,19 +32,19 @@
 import org.apache.beam.sdk.extensions.sql.impl.planner.BeamRuleSets;
 import org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider;
 import org.apache.beam.sdk.options.PipelineOptions;
-import org.apache.calcite.avatica.AvaticaFactory;
-import org.apache.calcite.jdbc.CalciteConnection;
-import org.apache.calcite.jdbc.CalciteFactory;
-import org.apache.calcite.jdbc.Driver;
-import org.apache.calcite.plan.RelOptPlanner;
-import org.apache.calcite.plan.RelOptRule;
-import org.apache.calcite.plan.RelTraitDef;
-import org.apache.calcite.prepare.CalcitePrepareImpl;
-import org.apache.calcite.rel.RelCollationTraitDef;
-import org.apache.calcite.rel.rules.CalcRemoveRule;
-import org.apache.calcite.rel.rules.SortRemoveRule;
-import org.apache.calcite.runtime.Hook;
-import org.apache.calcite.tools.RuleSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.AvaticaFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteConnection;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.Driver;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPlanner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitDef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.prepare.CalcitePrepareImpl;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelCollationTraitDef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.CalcRemoveRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.SortRemoveRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.runtime.Hook;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.RuleSet;
 
 /**
  * Calcite JDBC driver with Beam defaults.
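
A minimal usage sketch of the driver this class provides (not part of this PR): it assumes the jdbc:beam: connection-string prefix registered by JdbcDriver and a runner such as the DirectRunner on the classpath; the query itself is illustrative.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

class JdbcDriverSketch {
  public static void main(String[] args) throws Exception {
    // The driver self-registers via @AutoService; Class.forName keeps the sketch explicit.
    Class.forName("org.apache.beam.sdk.extensions.sql.impl.JdbcDriver");
    try (Connection connection = DriverManager.getConnection("jdbc:beam:");
        Statement statement = connection.createStatement();
        ResultSet rows = statement.executeQuery("SELECT 'hello' AS greeting")) {
      while (rows.next()) {
        System.out.println(rows.getString("greeting"));
      }
    }
  }
}
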
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/JdbcFactory.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/JdbcFactory.java
index 70c1229..22a6a52 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/JdbcFactory.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/JdbcFactory.java
@@ -18,35 +18,37 @@
 package org.apache.beam.sdk.extensions.sql.impl;
 
 import static org.apache.beam.sdk.extensions.sql.impl.JdbcDriver.TOP_LEVEL_BEAM_SCHEMA;
-import static org.apache.calcite.avatica.BuiltInConnectionProperty.TIME_ZONE;
-import static org.apache.calcite.config.CalciteConnectionProperty.LEX;
-import static org.apache.calcite.config.CalciteConnectionProperty.PARSER_FACTORY;
-import static org.apache.calcite.config.CalciteConnectionProperty.SCHEMA;
-import static org.apache.calcite.config.CalciteConnectionProperty.SCHEMA_FACTORY;
-import static org.apache.calcite.config.CalciteConnectionProperty.TYPE_SYSTEM;
+import static org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.BuiltInConnectionProperty.TIME_ZONE;
+import static org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.config.CalciteConnectionProperty.LEX;
+import static org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.config.CalciteConnectionProperty.PARSER_FACTORY;
+import static org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.config.CalciteConnectionProperty.SCHEMA;
+import static org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.config.CalciteConnectionProperty.SCHEMA_FACTORY;
+import static org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.config.CalciteConnectionProperty.TYPE_SYSTEM;
 
 import java.util.Properties;
 import org.apache.beam.sdk.extensions.sql.impl.parser.impl.BeamSqlParserImpl;
 import org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem;
 import org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider;
 import org.apache.beam.sdk.util.ReleaseInfo;
-import org.apache.calcite.adapter.java.JavaTypeFactory;
-import org.apache.calcite.avatica.AvaticaConnection;
-import org.apache.calcite.avatica.AvaticaFactory;
-import org.apache.calcite.avatica.ConnectionProperty;
-import org.apache.calcite.avatica.UnregisteredDriver;
-import org.apache.calcite.config.Lex;
-import org.apache.calcite.jdbc.CalciteFactory;
-import org.apache.calcite.jdbc.CalciteSchema;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.java.JavaTypeFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.AvaticaConnection;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.AvaticaFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.ConnectionProperty;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.UnregisteredDriver;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.config.Lex;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteSchema;
 
 /**
 * Implements {@link CalciteFactory} that is used by the Calcite JDBC driver to instantiate different
  * JDBC objects, like connections, result sets, etc.
  *
  * <p>The purpose of this class is to intercept the connection creation and force a cache-less root
- * schema ({@link org.apache.calcite.jdbc.SimpleCalciteSchema}). Otherwise Calcite uses {@link
- * org.apache.calcite.jdbc.CachingCalciteSchema} that eagerly caches table information. This
- * behavior does not work well for dynamic table providers.
+ * schema ({@link
+ * org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.SimpleCalciteSchema}). Otherwise
+ * Calcite uses {@link
+ * org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CachingCalciteSchema} that eagerly
+ * caches table information. This behavior does not work well for dynamic table providers.
  */
 class JdbcFactory extends CalciteFactoryWrapper {
 
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/QueryPlanner.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/QueryPlanner.java
index 0593921..cec0045 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/QueryPlanner.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/QueryPlanner.java
@@ -18,7 +18,7 @@
 package org.apache.beam.sdk.extensions.sql.impl;
 
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode;
-import org.apache.calcite.sql.SqlNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlNode;
 
 /**
  * An interface that planners should implement to convert sql statement to {@link BeamRelNode} or
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/ScalarFunctionImpl.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/ScalarFunctionImpl.java
index d052044..3ef4d9f 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/ScalarFunctionImpl.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/ScalarFunctionImpl.java
@@ -17,7 +17,7 @@
  */
 package org.apache.beam.sdk.extensions.sql.impl;
 
-import static org.apache.calcite.util.Static.RESOURCE;
+import static org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.Static.RESOURCE;
 
 import java.lang.reflect.Constructor;
 import java.lang.reflect.Method;
@@ -27,28 +27,29 @@
 import java.util.Arrays;
 import java.util.List;
 import org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMultimap;
-import org.apache.calcite.adapter.enumerable.CallImplementor;
-import org.apache.calcite.adapter.enumerable.NullPolicy;
-import org.apache.calcite.adapter.enumerable.ReflectiveCallNotNullImplementor;
-import org.apache.calcite.adapter.enumerable.RexImpTable;
-import org.apache.calcite.adapter.enumerable.RexToLixTranslator;
-import org.apache.calcite.avatica.util.ByteString;
-import org.apache.calcite.linq4j.function.SemiStrict;
-import org.apache.calcite.linq4j.function.Strict;
-import org.apache.calcite.linq4j.tree.Expression;
-import org.apache.calcite.linq4j.tree.Expressions;
-import org.apache.calcite.rel.type.RelDataType;
-import org.apache.calcite.rel.type.RelDataTypeFactory;
-import org.apache.calcite.rex.RexCall;
-import org.apache.calcite.schema.Function;
-import org.apache.calcite.schema.ImplementableFunction;
-import org.apache.calcite.schema.ScalarFunction;
-import org.apache.calcite.sql.SqlOperatorBinding;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Preconditions;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMultimap;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.CallImplementor;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.NullPolicy;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.ReflectiveCallNotNullImplementor;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.RexImpTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.RexToLixTranslator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.util.ByteString;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.function.SemiStrict;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.function.Strict;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.tree.Expression;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.tree.Expressions;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Function;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.ImplementableFunction;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.ScalarFunction;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlOperatorBinding;
 
 /**
- * Beam-customized version from {@link org.apache.calcite.schema.impl.ScalarFunctionImpl}, to
+ * Beam-customized version from {@link
+ * org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.impl.ScalarFunctionImpl}, to
  * address BEAM-5921.
  */
 public class ScalarFunctionImpl extends UdfImplReflectiveFunctionBase
@@ -62,7 +63,10 @@
     this.implementor = implementor;
   }
 
-  /** Creates {@link org.apache.calcite.schema.Function} for each method in a given class. */
+  /**
+   * Creates {@link org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Function} for
+   * each method in a given class.
+   */
   public static ImmutableMultimap<String, Function> createAll(Class<?> clazz) {
     final ImmutableMultimap.Builder<String, Function> builder = ImmutableMultimap.builder();
     for (Method method : clazz.getMethods()) {
@@ -79,7 +83,8 @@
   }
 
   /**
-   * Creates {@link org.apache.calcite.schema.Function} from given class.
+   * Creates {@link org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Function} from
+   * given class.
    *
    * <p>If a method of the given name is not found or it does not suit, returns {@code null}.
    *
@@ -96,8 +101,8 @@
   }
 
   /**
-   * Creates {@link org.apache.calcite.schema.Function} from given method. When {@code eval} method
-   * does not suit, {@code null} is returned.
+   * Creates {@link org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Function} from
+   * given method. When {@code eval} method does not suit, {@code null} is returned.
    *
    * @param method method that is used to implement the function
    * @return created {@link Function} or null
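
In user code these reflected eval methods surface as SQL UDFs; a brief sketch under the standard Beam SQL API (class, function, and field names are illustrative):

import org.apache.beam.sdk.extensions.sql.BeamSqlUdf;
import org.apache.beam.sdk.extensions.sql.SqlTransform;

class UdfSketch {
  /** A UDF class with the single static eval method that create()/createAll() reflect over. */
  public static class CubicInteger implements BeamSqlUdf {
    public static Integer eval(Integer input) {
      return input * input * input;
    }
  }

  static SqlTransform cubicQuery() {
    // The field name f_int and the virtual table PCOLLECTION are illustrative.
    return SqlTransform.query("SELECT f_int, cubic(f_int) AS f_cubed FROM PCOLLECTION")
        .registerUdf("cubic", CubicInteger.class);
  }
}
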
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/TableResolutionUtils.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/TableResolutionUtils.java
index 247f1f7..1659e87 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/TableResolutionUtils.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/TableResolutionUtils.java
@@ -28,9 +28,9 @@
 import org.apache.beam.sdk.extensions.sql.TableNameExtractionUtils;
 import org.apache.beam.sdk.extensions.sql.meta.CustomTableResolver;
 import org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteSchema;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlNode;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.jdbc.CalciteSchema;
-import org.apache.calcite.sql.SqlNode;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
@@ -169,7 +169,7 @@
    */
   private static class SchemaWithName {
     String name;
-    org.apache.calcite.schema.Schema schema;
+    org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Schema schema;
 
     static SchemaWithName create(JdbcConnection connection, String name) {
       SchemaWithName schemaWithName = new SchemaWithName();
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/UdafImpl.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/UdafImpl.java
index 70aa89d..532232b 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/UdafImpl.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/UdafImpl.java
@@ -25,12 +25,12 @@
 import org.apache.beam.sdk.annotations.Internal;
 import org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils;
 import org.apache.beam.sdk.transforms.Combine.CombineFn;
-import org.apache.calcite.adapter.enumerable.AggImplementor;
-import org.apache.calcite.rel.type.RelDataType;
-import org.apache.calcite.rel.type.RelDataTypeFactory;
-import org.apache.calcite.schema.AggregateFunction;
-import org.apache.calcite.schema.FunctionParameter;
-import org.apache.calcite.schema.ImplementableAggFunction;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.AggImplementor;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.AggregateFunction;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.FunctionParameter;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.ImplementableAggFunction;
 
 /** Implement {@link AggregateFunction} to take a {@link CombineFn} as UDAF. */
 @Experimental
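
The user-facing counterpart: any CombineFn can be registered as a SQL aggregate, which UdafImpl then adapts to Calcite's AggregateFunction. A small sketch with an invented squaresum aggregate:

import org.apache.beam.sdk.extensions.sql.SqlTransform;
import org.apache.beam.sdk.transforms.Combine.CombineFn;

class UdafSketch {
  /** Illustrative CombineFn: sum of squares of its integer inputs. */
  public static class SquareSum extends CombineFn<Integer, Integer, Integer> {
    @Override public Integer createAccumulator() { return 0; }
    @Override public Integer addInput(Integer accumulator, Integer input) {
      return accumulator + input * input;
    }
    @Override public Integer mergeAccumulators(Iterable<Integer> accumulators) {
      int sum = 0;
      for (Integer value : accumulators) { sum += value; }
      return sum;
    }
    @Override public Integer extractOutput(Integer accumulator) { return accumulator; }
  }

  static SqlTransform squareSumQuery() {
    return SqlTransform.query("SELECT squaresum(f_int) AS f_sum FROM PCOLLECTION")
        .registerUdaf("squaresum", new SquareSum());
  }
}
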
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/UdfImpl.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/UdfImpl.java
index ba5848e..34f5683 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/UdfImpl.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/UdfImpl.java
@@ -18,9 +18,9 @@
 package org.apache.beam.sdk.extensions.sql.impl;
 
 import java.lang.reflect.Method;
-import org.apache.calcite.schema.Function;
-import org.apache.calcite.schema.TranslatableTable;
-import org.apache.calcite.schema.impl.TableMacroImpl;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Function;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.TranslatableTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.impl.TableMacroImpl;
 
 /** Beam-customized facade behind {@link Function} to address BEAM-5921. */
 class UdfImpl {
@@ -28,7 +28,8 @@
   private UdfImpl() {}
 
   /**
-   * Creates {@link org.apache.calcite.schema.Function} from given class.
+   * Creates {@link org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Function} from
+   * given class.
    *
    * <p>If a method of the given name is not found or it does not suit, returns {@code null}.
    *
@@ -45,7 +46,8 @@
   }
 
   /**
-   * Creates {@link org.apache.calcite.schema.Function} from given method.
+   * Creates {@link org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Function} from
+   * given method.
    *
    * @param method method that is used to implement the function
    * @return created {@link Function} or null
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/UdfImplReflectiveFunctionBase.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/UdfImplReflectiveFunctionBase.java
index 244ac51..be13523 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/UdfImplReflectiveFunctionBase.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/UdfImplReflectiveFunctionBase.java
@@ -23,13 +23,13 @@
 import java.util.ArrayList;
 import java.util.List;
 import org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.rel.type.RelDataType;
-import org.apache.calcite.rel.type.RelDataTypeFactory;
-import org.apache.calcite.schema.Function;
-import org.apache.calcite.schema.FunctionParameter;
-import org.apache.calcite.schema.impl.ReflectiveFunctionBase;
-import org.apache.calcite.util.ReflectUtil;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Function;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.FunctionParameter;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.impl.ReflectiveFunctionBase;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.ReflectUtil;
 
 /** Beam-customized version from {@link ReflectiveFunctionBase}, to address BEAM-5921. */
 public abstract class UdfImplReflectiveFunctionBase implements Function {
@@ -95,7 +95,10 @@
     return new ParameterListBuilder();
   }
 
-  /** Helps build lists of {@link org.apache.calcite.schema.FunctionParameter}. */
+  /**
+   * Helps build lists of {@link
+   * org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.FunctionParameter}.
+   */
   public static class ParameterListBuilder {
     final List<FunctionParameter> builder = new ArrayList<>();
 
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlCheckConstraint.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlCheckConstraint.java
index de6a6f3..a6d145d 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlCheckConstraint.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlCheckConstraint.java
@@ -18,15 +18,15 @@
 package org.apache.beam.sdk.extensions.sql.impl.parser;
 
 import java.util.List;
-import org.apache.calcite.sql.SqlCall;
-import org.apache.calcite.sql.SqlIdentifier;
-import org.apache.calcite.sql.SqlKind;
-import org.apache.calcite.sql.SqlNode;
-import org.apache.calcite.sql.SqlOperator;
-import org.apache.calcite.sql.SqlSpecialOperator;
-import org.apache.calcite.sql.SqlWriter;
-import org.apache.calcite.sql.parser.SqlParserPos;
-import org.apache.calcite.util.ImmutableNullableList;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlIdentifier;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlKind;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlOperator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlSpecialOperator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlWriter;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.parser.SqlParserPos;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.ImmutableNullableList;
 
 /**
  * Parse tree for {@code UNIQUE}, {@code PRIMARY KEY} constraints.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlColumnDeclaration.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlColumnDeclaration.java
index bf5110d..1ffe80f 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlColumnDeclaration.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlColumnDeclaration.java
@@ -18,16 +18,16 @@
 package org.apache.beam.sdk.extensions.sql.impl.parser;
 
 import java.util.List;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.sql.SqlCall;
-import org.apache.calcite.sql.SqlDataTypeSpec;
-import org.apache.calcite.sql.SqlIdentifier;
-import org.apache.calcite.sql.SqlKind;
-import org.apache.calcite.sql.SqlNode;
-import org.apache.calcite.sql.SqlOperator;
-import org.apache.calcite.sql.SqlSpecialOperator;
-import org.apache.calcite.sql.SqlWriter;
-import org.apache.calcite.sql.parser.SqlParserPos;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlDataTypeSpec;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlIdentifier;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlKind;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlOperator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlSpecialOperator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlWriter;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.parser.SqlParserPos;
 
 /** Parse tree for column. */
 public class SqlColumnDeclaration extends SqlCall {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlCreateExternalTable.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlCreateExternalTable.java
index d797f77..bd331a9 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlCreateExternalTable.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlCreateExternalTable.java
@@ -19,8 +19,8 @@
 
 import static com.alibaba.fastjson.JSON.parseObject;
 import static org.apache.beam.sdk.schemas.Schema.toSchema;
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkNotNull;
-import static org.apache.calcite.util.Static.RESOURCE;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Preconditions.checkNotNull;
+import static org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.Static.RESOURCE;
 
 import com.alibaba.fastjson.JSONObject;
 import java.util.List;
@@ -28,19 +28,19 @@
 import org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils;
 import org.apache.beam.sdk.extensions.sql.meta.Table;
 import org.apache.beam.sdk.schemas.Schema;
-import org.apache.calcite.jdbc.CalcitePrepare;
-import org.apache.calcite.jdbc.CalciteSchema;
-import org.apache.calcite.sql.SqlCreate;
-import org.apache.calcite.sql.SqlExecutableStatement;
-import org.apache.calcite.sql.SqlIdentifier;
-import org.apache.calcite.sql.SqlKind;
-import org.apache.calcite.sql.SqlNode;
-import org.apache.calcite.sql.SqlOperator;
-import org.apache.calcite.sql.SqlSpecialOperator;
-import org.apache.calcite.sql.SqlUtil;
-import org.apache.calcite.sql.SqlWriter;
-import org.apache.calcite.sql.parser.SqlParserPos;
-import org.apache.calcite.util.Pair;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalcitePrepare;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteSchema;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlCreate;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlExecutableStatement;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlIdentifier;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlKind;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlOperator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlSpecialOperator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlUtil;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlWriter;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.parser.SqlParserPos;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.Pair;
 
 /** Parse tree for {@code CREATE EXTERNAL TABLE} statement. */
 public class SqlCreateExternalTable extends SqlCreate implements SqlExecutableStatement {
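
This parse tree is the kind of statement covered by the new unit test in this PR: a MAP column whose value type is a ROW. The sketch below is illustrative only; table, field, TYPE, and LOCATION values are invented, not copied from the test.

class DdlSketch {
  // Illustrative DDL only; parsing it yields a Beam Schema field of type MAP<VARCHAR, ROW<...>>.
  static final String CREATE_TABLE_WITH_MAP_OF_ROW =
      "CREATE EXTERNAL TABLE events (\n"
          + "  id INTEGER,\n"
          + "  attributes MAP<VARCHAR, ROW<score INTEGER, label VARCHAR>>\n"
          + ") TYPE 'text' LOCATION '/path/to/events'";
}
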
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlDdlNodes.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlDdlNodes.java
index d9ceeb5..dbeb98d 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlDdlNodes.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlDdlNodes.java
@@ -19,16 +19,16 @@
 
 import java.util.List;
 import javax.annotation.Nullable;
-import org.apache.calcite.jdbc.CalcitePrepare;
-import org.apache.calcite.jdbc.CalciteSchema;
-import org.apache.calcite.sql.SqlDataTypeSpec;
-import org.apache.calcite.sql.SqlIdentifier;
-import org.apache.calcite.sql.SqlLiteral;
-import org.apache.calcite.sql.SqlNode;
-import org.apache.calcite.sql.parser.SqlParserPos;
-import org.apache.calcite.util.NlsString;
-import org.apache.calcite.util.Pair;
-import org.apache.calcite.util.Util;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalcitePrepare;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteSchema;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlDataTypeSpec;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlIdentifier;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlLiteral;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.parser.SqlParserPos;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.NlsString;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.Pair;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.Util;
 
 /** Utilities concerning {@link SqlNode} for DDL. */
 public class SqlDdlNodes {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlDropObject.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlDropObject.java
index 1f7b0d1..2801dcd 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlDropObject.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlDropObject.java
@@ -17,21 +17,21 @@
  */
 package org.apache.beam.sdk.extensions.sql.impl.parser;
 
-import static org.apache.calcite.util.Static.RESOURCE;
+import static org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.Static.RESOURCE;
 
 import java.util.List;
 import org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.jdbc.CalcitePrepare;
-import org.apache.calcite.jdbc.CalciteSchema;
-import org.apache.calcite.sql.SqlDrop;
-import org.apache.calcite.sql.SqlExecutableStatement;
-import org.apache.calcite.sql.SqlIdentifier;
-import org.apache.calcite.sql.SqlNode;
-import org.apache.calcite.sql.SqlOperator;
-import org.apache.calcite.sql.SqlUtil;
-import org.apache.calcite.sql.SqlWriter;
-import org.apache.calcite.sql.parser.SqlParserPos;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalcitePrepare;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteSchema;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlDrop;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlExecutableStatement;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlIdentifier;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlOperator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlUtil;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlWriter;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.parser.SqlParserPos;
 
 /**
  * Base class for parse trees of {@code DROP TABLE}, {@code DROP VIEW} and {@code DROP MATERIALIZED
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlDropTable.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlDropTable.java
index 3714cf6..9541242 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlDropTable.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlDropTable.java
@@ -17,11 +17,11 @@
  */
 package org.apache.beam.sdk.extensions.sql.impl.parser;
 
-import org.apache.calcite.sql.SqlIdentifier;
-import org.apache.calcite.sql.SqlKind;
-import org.apache.calcite.sql.SqlOperator;
-import org.apache.calcite.sql.SqlSpecialOperator;
-import org.apache.calcite.sql.parser.SqlParserPos;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlIdentifier;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlKind;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlOperator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlSpecialOperator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.parser.SqlParserPos;
 
 /** Parse tree for {@code DROP TABLE} statement. */
 public class SqlDropTable extends SqlDropObject {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlSetOptionBeam.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlSetOptionBeam.java
index 7314305..c74a1fb 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlSetOptionBeam.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/parser/SqlSetOptionBeam.java
@@ -17,18 +17,18 @@
  */
 package org.apache.beam.sdk.extensions.sql.impl.parser;
 
-import static org.apache.calcite.util.Static.RESOURCE;
+import static org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.Static.RESOURCE;
 
 import org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema;
-import org.apache.calcite.jdbc.CalcitePrepare;
-import org.apache.calcite.jdbc.CalciteSchema;
-import org.apache.calcite.sql.SqlExecutableStatement;
-import org.apache.calcite.sql.SqlIdentifier;
-import org.apache.calcite.sql.SqlNode;
-import org.apache.calcite.sql.SqlSetOption;
-import org.apache.calcite.sql.SqlUtil;
-import org.apache.calcite.sql.parser.SqlParserPos;
-import org.apache.calcite.util.Pair;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalcitePrepare;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteSchema;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlExecutableStatement;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlIdentifier;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlSetOption;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlUtil;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.parser.SqlParserPos;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.Pair;
 
 /** SQL parse tree node to represent {@code SET} and {@code RESET} statements. */
 public class SqlSetOptionBeam extends SqlSetOption implements SqlExecutableStatement {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/BeamCostModel.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/BeamCostModel.java
index 10fc833..2e57cb1 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/BeamCostModel.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/BeamCostModel.java
@@ -18,9 +18,9 @@
 package org.apache.beam.sdk.extensions.sql.impl.planner;
 
 import java.util.Objects;
-import org.apache.calcite.plan.RelOptCost;
-import org.apache.calcite.plan.RelOptCostFactory;
-import org.apache.calcite.plan.RelOptUtil;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCost;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCostFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptUtil;
 
 /**
  * <code>VolcanoCost</code> represents the cost of a plan node.
@@ -216,8 +216,9 @@
   }
 
   /**
-   * Implementation of {@link org.apache.calcite.plan.RelOptCostFactory} that creates {@link
-   * BeamCostModel}s.
+   * Implementation of {@link
+   * org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCostFactory} that creates
+   * {@link BeamCostModel}s.
    */
   public static class Factory implements RelOptCostFactory {
 
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/BeamJavaTypeFactory.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/BeamJavaTypeFactory.java
index 8d6114e..bc67b93 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/BeamJavaTypeFactory.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/BeamJavaTypeFactory.java
@@ -18,12 +18,12 @@
 package org.apache.beam.sdk.extensions.sql.impl.planner;
 
 import java.lang.reflect.Type;
-import org.apache.calcite.adapter.java.JavaTypeFactory;
-import org.apache.calcite.jdbc.JavaTypeFactoryImpl;
-import org.apache.calcite.rel.type.RelDataType;
-import org.apache.calcite.sql.type.BasicSqlType;
-import org.apache.calcite.sql.type.IntervalSqlType;
-import org.apache.calcite.sql.type.SqlTypeName;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.java.JavaTypeFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.JavaTypeFactoryImpl;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.BasicSqlType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.IntervalSqlType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.SqlTypeName;
 
 /** customized data type in Beam. */
 public class BeamJavaTypeFactory extends JavaTypeFactoryImpl {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/BeamRelDataTypeSystem.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/BeamRelDataTypeSystem.java
index 4356e82..b83a1bf 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/BeamRelDataTypeSystem.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/BeamRelDataTypeSystem.java
@@ -17,8 +17,8 @@
  */
 package org.apache.beam.sdk.extensions.sql.impl.planner;
 
-import org.apache.calcite.rel.type.RelDataTypeSystem;
-import org.apache.calcite.rel.type.RelDataTypeSystemImpl;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeSystem;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeSystemImpl;
 
 /** customized data type in Beam. */
 public class BeamRelDataTypeSystem extends RelDataTypeSystemImpl {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/BeamRuleSets.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/BeamRuleSets.java
index b2766d6..33d69dd 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/BeamRuleSets.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/BeamRuleSets.java
@@ -36,34 +36,34 @@
 import org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnionRule;
 import org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnnestRule;
 import org.apache.beam.sdk.extensions.sql.impl.rule.BeamValuesRule;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.plan.RelOptRule;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.rules.AggregateJoinTransposeRule;
-import org.apache.calcite.rel.rules.AggregateProjectMergeRule;
-import org.apache.calcite.rel.rules.AggregateRemoveRule;
-import org.apache.calcite.rel.rules.AggregateUnionAggregateRule;
-import org.apache.calcite.rel.rules.CalcMergeRule;
-import org.apache.calcite.rel.rules.FilterAggregateTransposeRule;
-import org.apache.calcite.rel.rules.FilterCalcMergeRule;
-import org.apache.calcite.rel.rules.FilterJoinRule;
-import org.apache.calcite.rel.rules.FilterProjectTransposeRule;
-import org.apache.calcite.rel.rules.FilterSetOpTransposeRule;
-import org.apache.calcite.rel.rules.FilterToCalcRule;
-import org.apache.calcite.rel.rules.JoinCommuteRule;
-import org.apache.calcite.rel.rules.JoinPushExpressionsRule;
-import org.apache.calcite.rel.rules.ProjectCalcMergeRule;
-import org.apache.calcite.rel.rules.ProjectFilterTransposeRule;
-import org.apache.calcite.rel.rules.ProjectMergeRule;
-import org.apache.calcite.rel.rules.ProjectSetOpTransposeRule;
-import org.apache.calcite.rel.rules.ProjectSortTransposeRule;
-import org.apache.calcite.rel.rules.ProjectToCalcRule;
-import org.apache.calcite.rel.rules.PruneEmptyRules;
-import org.apache.calcite.rel.rules.SortProjectTransposeRule;
-import org.apache.calcite.rel.rules.UnionEliminatorRule;
-import org.apache.calcite.rel.rules.UnionToDistinctRule;
-import org.apache.calcite.tools.RuleSet;
-import org.apache.calcite.tools.RuleSets;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.AggregateJoinTransposeRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.AggregateProjectMergeRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.AggregateRemoveRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.AggregateUnionAggregateRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.CalcMergeRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.FilterAggregateTransposeRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.FilterCalcMergeRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.FilterJoinRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.FilterProjectTransposeRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.FilterSetOpTransposeRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.FilterToCalcRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.JoinCommuteRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.JoinPushExpressionsRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.ProjectCalcMergeRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.ProjectFilterTransposeRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.ProjectMergeRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.ProjectSetOpTransposeRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.ProjectSortTransposeRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.ProjectToCalcRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.PruneEmptyRules;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.SortProjectTransposeRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.UnionEliminatorRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.UnionToDistinctRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.RuleSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.RuleSets;
 
 /**
  * {@link RuleSet} used in {@code BeamQueryPlanner}. It translates a standard Calcite {@link
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/NodeStatsMetadata.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/NodeStatsMetadata.java
index 8bc62ee..f0991af 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/NodeStatsMetadata.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/NodeStatsMetadata.java
@@ -18,12 +18,12 @@
 package org.apache.beam.sdk.extensions.sql.impl.planner;
 
 import java.lang.reflect.Method;
-import org.apache.calcite.linq4j.tree.Types;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.metadata.Metadata;
-import org.apache.calcite.rel.metadata.MetadataDef;
-import org.apache.calcite.rel.metadata.MetadataHandler;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.tree.Types;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.Metadata;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.MetadataDef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.MetadataHandler;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
 
 /**
  * This is a metadata used for row count and rate estimation. It extends Calcite's Metadata
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/RelMdNodeStats.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/RelMdNodeStats.java
index c01fbb5..0619a4b 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/RelMdNodeStats.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/RelMdNodeStats.java
@@ -21,12 +21,12 @@
 import java.util.Map;
 import java.util.stream.Collectors;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.metadata.MetadataDef;
-import org.apache.calcite.rel.metadata.MetadataHandler;
-import org.apache.calcite.rel.metadata.ReflectiveRelMetadataProvider;
-import org.apache.calcite.rel.metadata.RelMetadataProvider;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.MetadataDef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.MetadataHandler;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.ReflectiveRelMetadataProvider;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataProvider;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
 
 /**
  * This is the implementation of NodeStatsMetadata. Methods to estimate rate and row count for
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamAggregationRel.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamAggregationRel.java
index d951ab3..453c648 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamAggregationRel.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamAggregationRel.java
@@ -19,7 +19,7 @@
 
 import static java.util.stream.Collectors.toList;
 import static org.apache.beam.sdk.values.PCollection.IsBounded.BOUNDED;
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Preconditions.checkArgument;
 
 import java.io.Serializable;
 import java.util.List;
@@ -50,16 +50,16 @@
 import org.apache.beam.sdk.values.PCollectionList;
 import org.apache.beam.sdk.values.Row;
 import org.apache.beam.sdk.values.WindowingStrategy;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPlanner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelWriter;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Aggregate;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.AggregateCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.ImmutableBitSet;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Lists;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelOptPlanner;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.RelWriter;
-import org.apache.calcite.rel.core.Aggregate;
-import org.apache.calcite.rel.core.AggregateCall;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
-import org.apache.calcite.util.ImmutableBitSet;
 import org.joda.time.Duration;
 
 /** {@link BeamRelNode} to replace a {@link Aggregate} node. */
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCalcRel.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCalcRel.java
index 8e6229c..3d666aa 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCalcRel.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCalcRel.java
@@ -19,8 +19,8 @@
 
 import static org.apache.beam.sdk.schemas.Schema.FieldType;
 import static org.apache.beam.sdk.schemas.Schema.TypeName;
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
-import static org.apache.calcite.avatica.util.DateTimeUtils.MILLIS_PER_DAY;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.util.DateTimeUtils.MILLIS_PER_DAY;
 
 import java.lang.reflect.InvocationTargetException;
 import java.lang.reflect.Method;
@@ -49,41 +49,41 @@
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PCollectionList;
 import org.apache.beam.sdk.values.Row;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.DataContext;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.JavaRowFormat;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.PhysType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.PhysTypeImpl;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.RexToLixTranslator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.java.JavaTypeFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.util.ByteString;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.QueryProvider;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.tree.BlockBuilder;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.tree.Expression;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.tree.Expressions;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.tree.GotoExpressionKind;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.tree.MemberDeclaration;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.tree.ParameterExpression;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.tree.Types;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPlanner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPredicateList;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Calc;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexBuilder;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexLocalRef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexProgram;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexSimplify;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexUtil;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.SchemaPlus;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.validate.SqlConformance;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.validate.SqlConformanceEnum;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.BuiltInMethod;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Maps;
-import org.apache.calcite.DataContext;
-import org.apache.calcite.adapter.enumerable.JavaRowFormat;
-import org.apache.calcite.adapter.enumerable.PhysType;
-import org.apache.calcite.adapter.enumerable.PhysTypeImpl;
-import org.apache.calcite.adapter.enumerable.RexToLixTranslator;
-import org.apache.calcite.adapter.java.JavaTypeFactory;
-import org.apache.calcite.avatica.util.ByteString;
-import org.apache.calcite.linq4j.QueryProvider;
-import org.apache.calcite.linq4j.tree.BlockBuilder;
-import org.apache.calcite.linq4j.tree.Expression;
-import org.apache.calcite.linq4j.tree.Expressions;
-import org.apache.calcite.linq4j.tree.GotoExpressionKind;
-import org.apache.calcite.linq4j.tree.MemberDeclaration;
-import org.apache.calcite.linq4j.tree.ParameterExpression;
-import org.apache.calcite.linq4j.tree.Types;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelOptPlanner;
-import org.apache.calcite.plan.RelOptPredicateList;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.Calc;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
-import org.apache.calcite.rex.RexBuilder;
-import org.apache.calcite.rex.RexLocalRef;
-import org.apache.calcite.rex.RexNode;
-import org.apache.calcite.rex.RexProgram;
-import org.apache.calcite.rex.RexSimplify;
-import org.apache.calcite.rex.RexUtil;
-import org.apache.calcite.schema.SchemaPlus;
-import org.apache.calcite.sql.validate.SqlConformance;
-import org.apache.calcite.sql.validate.SqlConformanceEnum;
-import org.apache.calcite.util.BuiltInMethod;
 import org.codehaus.commons.compiler.CompileException;
 import org.codehaus.janino.ScriptEvaluator;
 import org.joda.time.DateTime;
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCoGBKJoinRel.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCoGBKJoinRel.java
index d6ac71b..bef3cb7 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCoGBKJoinRel.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCoGBKJoinRel.java
@@ -37,13 +37,13 @@
 import org.apache.beam.sdk.values.PCollectionList;
 import org.apache.beam.sdk.values.Row;
 import org.apache.beam.sdk.values.WindowingStrategy;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.CorrelationId;
-import org.apache.calcite.rel.core.Join;
-import org.apache.calcite.rel.core.JoinRelType;
-import org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.CorrelationId;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Join;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.JoinRelType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
 
 /**
  * A {@code BeamJoinRel} which does CoGBK Join
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamEnumerableConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamEnumerableConverter.java
index f42098f..cdd1444 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamEnumerableConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamEnumerableConverter.java
@@ -17,8 +17,8 @@
  */
 package org.apache.beam.sdk.extensions.sql.impl.rel;
 
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
-import static org.apache.calcite.avatica.util.DateTimeUtils.MILLIS_PER_DAY;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.util.DateTimeUtils.MILLIS_PER_DAY;
 
 import java.io.IOException;
 import java.util.Iterator;
@@ -54,24 +54,24 @@
 import org.apache.beam.sdk.values.PCollection.IsBounded;
 import org.apache.beam.sdk.values.PValue;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.adapter.enumerable.EnumerableRel;
-import org.apache.calcite.adapter.enumerable.EnumerableRelImplementor;
-import org.apache.calcite.adapter.enumerable.PhysType;
-import org.apache.calcite.adapter.enumerable.PhysTypeImpl;
-import org.apache.calcite.linq4j.Enumerable;
-import org.apache.calcite.linq4j.Linq4j;
-import org.apache.calcite.linq4j.tree.BlockBuilder;
-import org.apache.calcite.linq4j.tree.Expression;
-import org.apache.calcite.linq4j.tree.Expressions;
-import org.apache.calcite.plan.ConventionTraitDef;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelOptCost;
-import org.apache.calcite.plan.RelOptPlanner;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.convert.ConverterImpl;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
-import org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.EnumerableRel;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.EnumerableRelImplementor;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.PhysType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.PhysTypeImpl;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.Enumerable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.Linq4j;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.tree.BlockBuilder;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.tree.Expression;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.tree.Expressions;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.ConventionTraitDef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCost;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPlanner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.convert.ConverterImpl;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
 import org.joda.time.Duration;
 import org.joda.time.ReadableInstant;
 import org.slf4j.Logger;
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIOSinkRel.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIOSinkRel.java
index 6bdd3fa..d943aed 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIOSinkRel.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIOSinkRel.java
@@ -17,7 +17,7 @@
  */
 package org.apache.beam.sdk.extensions.sql.impl.rel;
 
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Preconditions.checkArgument;
 
 import java.util.List;
 import java.util.Map;
@@ -29,16 +29,16 @@
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PCollectionList;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelOptPlanner;
-import org.apache.calcite.plan.RelOptTable;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.prepare.Prepare;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.TableModify;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
-import org.apache.calcite.rex.RexNode;
-import org.apache.calcite.sql2rel.RelStructuredTypeFlattener;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPlanner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.prepare.Prepare;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.TableModify;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql2rel.RelStructuredTypeFlattener;
 
 /** BeamRelNode to replace a {@code TableModify} node. */
 public class BeamIOSinkRel extends TableModify
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIOSourceRel.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIOSourceRel.java
index a18f882..69ff227 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIOSourceRel.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIOSourceRel.java
@@ -17,7 +17,7 @@
  */
 package org.apache.beam.sdk.extensions.sql.impl.rel;
 
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Preconditions.checkArgument;
 
 import java.util.Map;
 import org.apache.beam.sdk.extensions.sql.BeamSqlTable;
@@ -29,12 +29,12 @@
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PCollectionList;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelOptCost;
-import org.apache.calcite.plan.RelOptPlanner;
-import org.apache.calcite.plan.RelOptTable;
-import org.apache.calcite.rel.core.TableScan;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCost;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPlanner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.TableScan;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
 
 /** BeamRelNode to replace a {@code TableScan} node. */
 public class BeamIOSourceRel extends TableScan implements BeamRelNode {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIntersectRel.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIntersectRel.java
index e7502f4..80db503 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIntersectRel.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIntersectRel.java
@@ -24,13 +24,13 @@
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PCollectionList;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelOptPlanner;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.Intersect;
-import org.apache.calcite.rel.core.SetOp;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPlanner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Intersect;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.SetOp;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
 
 /**
  * {@code BeamRelNode} to replace a {@code Intersect} node.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamJoinRel.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamJoinRel.java
index 422242f..4201328 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamJoinRel.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamJoinRel.java
@@ -43,23 +43,23 @@
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PCollectionList;
 import org.apache.beam.sdk.values.Row;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Optional;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelOptPlanner;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.plan.volcano.RelSubset;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.CorrelationId;
-import org.apache.calcite.rel.core.Join;
-import org.apache.calcite.rel.core.JoinRelType;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
-import org.apache.calcite.rex.RexCall;
-import org.apache.calcite.rex.RexFieldAccess;
-import org.apache.calcite.rex.RexInputRef;
-import org.apache.calcite.rex.RexLiteral;
-import org.apache.calcite.rex.RexNode;
-import org.apache.calcite.util.Pair;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Optional;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPlanner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.volcano.RelSubset;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.CorrelationId;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Join;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.JoinRelType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexFieldAccess;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexInputRef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexLiteral;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.Pair;
 
 /**
  * An abstract {@code BeamRelNode} to implement Join Rels.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamLogicalConvention.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamLogicalConvention.java
index f134686..4133f0a 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamLogicalConvention.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamLogicalConvention.java
@@ -17,12 +17,12 @@
  */
 package org.apache.beam.sdk.extensions.sql.impl.rel;
 
-import org.apache.calcite.plan.Convention;
-import org.apache.calcite.plan.ConventionTraitDef;
-import org.apache.calcite.plan.RelOptPlanner;
-import org.apache.calcite.plan.RelTrait;
-import org.apache.calcite.plan.RelTraitDef;
-import org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.Convention;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.ConventionTraitDef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPlanner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTrait;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitDef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
 
 /** Conversion for Beam SQL. */
 public enum BeamLogicalConvention implements Convention {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamMinusRel.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamMinusRel.java
index c9f0c4f..5e9e075 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamMinusRel.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamMinusRel.java
@@ -24,13 +24,13 @@
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PCollectionList;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelOptPlanner;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.Minus;
-import org.apache.calcite.rel.core.SetOp;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPlanner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Minus;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.SetOp;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
 
 /**
  * {@code BeamRelNode} to replace a {@code Minus} node.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamRelNode.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamRelNode.java
index b5e80e9..1b549b4 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamRelNode.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamRelNode.java
@@ -25,9 +25,9 @@
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PCollectionList;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.plan.RelOptPlanner;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPlanner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
 
 /** A {@link RelNode} that can also give a {@link PTransform} that implements the expression. */
 public interface BeamRelNode extends RelNode {
@@ -72,8 +72,10 @@
    * SQLTransform Path (and not JDBC path). When a RelNode wants to calculate its BeamCost or
    * estimate its NodeStats, it may need NodeStat of its inputs. However, it should not call this
    * directly (because maybe its inputs are not physical yet). It should call {@link
-   * org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils#getNodeStats(org.apache.calcite.rel.RelNode,
-   * org.apache.calcite.rel.metadata.RelMetadataQuery)} instead.
+   * org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils#getNodeStats(
+   * org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode,
+   * org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery)}
+   * instead.
    */
   NodeStats estimateNodeStats(RelMetadataQuery mq);
 
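Reviewer note (not part of the patch): the Javadoc hunk above points callers at BeamSqlRelUtils#getNodeStats(RelNode, RelMetadataQuery) instead of calling estimateNodeStats on an input directly, since an input may still be a logical RelSubset rather than a physical BeamRelNode. A minimal sketch of the intended call pattern, assuming a BeamRelNode `rel` and a RelMetadataQuery `mq` are already in scope (both names are illustrative placeholders):

    // `rel` and `mq` are illustrative placeholders, not identifiers from this patch.
    // Preferred: let the utility resolve inputs that are not physical BeamRelNodes yet.
    NodeStats inputStats = BeamSqlRelUtils.getNodeStats(rel.getInput(0), mq);
    // Avoid: ((BeamRelNode) rel.getInput(0)).estimateNodeStats(mq) -- the input may not be physical yet.
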
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSetOperatorRelBase.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSetOperatorRelBase.java
index e1672a4..1dfeb0e 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSetOperatorRelBase.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSetOperatorRelBase.java
@@ -17,7 +17,7 @@
  */
 package org.apache.beam.sdk.extensions.sql.impl.rel;
 
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Preconditions.checkArgument;
 
 import java.io.Serializable;
 import org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms;
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSideInputJoinRel.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSideInputJoinRel.java
index 7366dcd..06011a9 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSideInputJoinRel.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSideInputJoinRel.java
@@ -31,13 +31,13 @@
 import org.apache.beam.sdk.values.PCollectionList;
 import org.apache.beam.sdk.values.PCollectionView;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.CorrelationId;
-import org.apache.calcite.rel.core.Join;
-import org.apache.calcite.rel.core.JoinRelType;
-import org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.CorrelationId;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Join;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.JoinRelType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
 
 /**
  * A {@code BeamJoinRel} which does sideinput Join
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSideInputLookupJoinRel.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSideInputLookupJoinRel.java
index 27ecae6..b4dbd56 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSideInputLookupJoinRel.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSideInputLookupJoinRel.java
@@ -26,13 +26,13 @@
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PCollectionList;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.CorrelationId;
-import org.apache.calcite.rel.core.Join;
-import org.apache.calcite.rel.core.JoinRelType;
-import org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.CorrelationId;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Join;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.JoinRelType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
 
 /**
  * A {@code BeamJoinRel} which does Lookup Join
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSortRel.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSortRel.java
index c660328..afe18b0 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSortRel.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSortRel.java
@@ -17,8 +17,8 @@
  */
 package org.apache.beam.sdk.extensions.sql.impl.rel;
 
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.MoreObjects.firstNonNull;
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.MoreObjects.firstNonNull;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Preconditions.checkArgument;
 
 import java.io.Serializable;
 import java.math.BigDecimal;
@@ -41,7 +41,6 @@
 import org.apache.beam.sdk.transforms.Flatten;
 import org.apache.beam.sdk.transforms.PTransform;
 import org.apache.beam.sdk.transforms.ParDo;
-import org.apache.beam.sdk.transforms.SerializableFunctions;
 import org.apache.beam.sdk.transforms.Top;
 import org.apache.beam.sdk.transforms.WithKeys;
 import org.apache.beam.sdk.transforms.windowing.GlobalWindows;
@@ -51,19 +50,19 @@
 import org.apache.beam.sdk.values.PCollectionList;
 import org.apache.beam.sdk.values.Row;
 import org.apache.beam.sdk.values.WindowingStrategy;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelOptPlanner;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.rel.RelCollation;
-import org.apache.calcite.rel.RelCollationImpl;
-import org.apache.calcite.rel.RelFieldCollation;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.Sort;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
-import org.apache.calcite.rex.RexInputRef;
-import org.apache.calcite.rex.RexLiteral;
-import org.apache.calcite.rex.RexNode;
-import org.apache.calcite.sql.type.SqlTypeName;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPlanner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelCollation;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelCollationImpl;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelFieldCollation;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Sort;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexInputRef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexLiteral;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.SqlTypeName;
 
 /**
  * {@code BeamRelNode} to replace a {@code Sort} node.
@@ -224,10 +223,7 @@
 
         return rawStream
             .apply("flatten", Flatten.iterables())
-            .setSchema(
-                CalciteUtils.toSchema(getRowType()),
-                SerializableFunctions.identity(),
-                SerializableFunctions.identity());
+            .setRowSchema(CalciteUtils.toSchema(getRowType()));
       }
     }
   }
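Reviewer note (not part of the patch): the final hunk above also replaces the three-argument setSchema(...) call, which passed SerializableFunctions.identity() for both conversion functions, with the setRowSchema(...) shorthand available on a PCollection of Rows. A minimal sketch of the same pattern, assuming a PCollection<Row> named `rows` and a Beam Schema named `schema` are in scope (both names are illustrative placeholders):

    // `rows` and `schema` are illustrative placeholders, not identifiers from this patch.
    // Shorthand for setSchema(schema, SerializableFunctions.identity(), SerializableFunctions.identity())
    // when the elements are already Rows.
    PCollection<Row> withSchema = rows.setRowSchema(schema);
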
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSqlRelUtils.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSqlRelUtils.java
index fb44f28..9bf45c7 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSqlRelUtils.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSqlRelUtils.java
@@ -28,9 +28,9 @@
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PCollectionList;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.plan.volcano.RelSubset;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.volcano.RelSubset;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
 
 /** Utilities for {@code BeamRelNode}. */
 public class BeamSqlRelUtils {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUncollectRel.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUncollectRel.java
index b031a50..2b2511d 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUncollectRel.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUncollectRel.java
@@ -17,7 +17,7 @@
  */
 package org.apache.beam.sdk.extensions.sql.impl.rel;
 
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Preconditions.checkArgument;
 
 import org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel;
 import org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats;
@@ -29,12 +29,12 @@
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PCollectionList;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelOptPlanner;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.Uncollect;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPlanner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Uncollect;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
 
 /** {@link BeamRelNode} to implement an uncorrelated {@link Uncollect}, aka UNNEST. */
 public class BeamUncollectRel extends Uncollect implements BeamRelNode {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUnionRel.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUnionRel.java
index bab912a..5fc3d07 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUnionRel.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUnionRel.java
@@ -25,13 +25,13 @@
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PCollectionList;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelOptPlanner;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.SetOp;
-import org.apache.calcite.rel.core.Union;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPlanner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.SetOp;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Union;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
 
 /**
  * {@link BeamRelNode} to replace a {@link Union}.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUnnestRel.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUnnestRel.java
index 0af4ee3..1263b3d 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUnnestRel.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUnnestRel.java
@@ -29,18 +29,18 @@
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PCollectionList;
 import org.apache.beam.sdk.values.Row;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelOptPlanner;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.RelWriter;
-import org.apache.calcite.rel.core.Correlate;
-import org.apache.calcite.rel.core.JoinRelType;
-import org.apache.calcite.rel.core.Uncollect;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
-import org.apache.calcite.rel.type.RelDataType;
-import org.apache.calcite.sql.validate.SqlValidatorUtil;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPlanner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelWriter;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Correlate;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.JoinRelType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Uncollect;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.validate.SqlValidatorUtil;
 
 /**
  * {@link BeamRelNode} to implement UNNEST, supporting specifically only {@link Correlate} with
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamValuesRel.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamValuesRel.java
index cc63aa9..9fa5037 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamValuesRel.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamValuesRel.java
@@ -20,9 +20,8 @@
 import static java.util.stream.Collectors.toList;
 import static org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils.autoCastField;
 import static org.apache.beam.sdk.values.Row.toRow;
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Preconditions.checkArgument;
 
-import com.google.common.collect.ImmutableList;
 import java.util.List;
 import java.util.Map;
 import java.util.stream.IntStream;
@@ -35,14 +34,15 @@
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PCollectionList;
 import org.apache.beam.sdk.values.Row;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelOptPlanner;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.rel.core.Values;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
-import org.apache.calcite.rel.type.RelDataType;
-import org.apache.calcite.rex.RexLiteral;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPlanner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Values;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexLiteral;
 
 /**
  * {@code BeamRelNode} to replace a {@code Values} node.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/package-info.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/package-info.java
index d09d802..6c74569 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/package-info.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rel/package-info.java
@@ -16,7 +16,10 @@
  * limitations under the License.
  */
 
-/** BeamSQL specified nodes, to replace {@link org.apache.calcite.rel.RelNode}. */
+/**
+ * BeamSQL specified nodes, to replace {@link
+ * org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode}.
+ */
 @DefaultAnnotation(NonNull.class)
 package org.apache.beam.sdk.extensions.sql.impl.rel;
 
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamAggregationRule.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamAggregationRule.java
index bf52652..2d54ae6 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamAggregationRule.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamAggregationRule.java
@@ -26,18 +26,18 @@
 import org.apache.beam.sdk.transforms.windowing.Sessions;
 import org.apache.beam.sdk.transforms.windowing.SlidingWindows;
 import org.apache.beam.sdk.transforms.windowing.WindowFn;
-import org.apache.calcite.plan.RelOptRule;
-import org.apache.calcite.plan.RelOptRuleCall;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.Aggregate;
-import org.apache.calcite.rel.core.Project;
-import org.apache.calcite.rel.core.RelFactories;
-import org.apache.calcite.rex.RexCall;
-import org.apache.calcite.rex.RexLiteral;
-import org.apache.calcite.rex.RexNode;
-import org.apache.calcite.sql.SqlKind;
-import org.apache.calcite.tools.RelBuilderFactory;
-import org.apache.calcite.util.ImmutableBitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRuleCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Aggregate;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Project;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.RelFactories;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexLiteral;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlKind;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.RelBuilderFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.ImmutableBitSet;
 import org.joda.time.Duration;
 
 /** Rule to detect the window/trigger settings. */
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamBasicAggregationRule.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamBasicAggregationRule.java
index eb93911..3522cef 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamBasicAggregationRule.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamBasicAggregationRule.java
@@ -19,13 +19,13 @@
 
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention;
-import org.apache.calcite.plan.RelOptRule;
-import org.apache.calcite.plan.RelOptRuleCall;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.Aggregate;
-import org.apache.calcite.rel.core.RelFactories;
-import org.apache.calcite.rel.core.TableScan;
-import org.apache.calcite.tools.RelBuilderFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRuleCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Aggregate;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.RelFactories;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.TableScan;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.RelBuilderFactory;
 
 /**
  * Aggregation rule that doesn't include projection.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamCalcRule.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamCalcRule.java
index b5aee81..824e5fc 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamCalcRule.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamCalcRule.java
@@ -19,13 +19,13 @@
 
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention;
-import org.apache.calcite.plan.Convention;
-import org.apache.calcite.plan.RelOptRule;
-import org.apache.calcite.plan.RelOptRuleCall;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.convert.ConverterRule;
-import org.apache.calcite.rel.core.Calc;
-import org.apache.calcite.rel.logical.LogicalCalc;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.Convention;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRuleCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.convert.ConverterRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Calc;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalCalc;
 
 /** A {@code ConverterRule} to replace {@link Calc} with {@link BeamCalcRel}. */
 public class BeamCalcRule extends ConverterRule {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamCoGBKJoinRule.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamCoGBKJoinRule.java
index 88ef48c..516bc09 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamCoGBKJoinRule.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamCoGBKJoinRule.java
@@ -21,12 +21,12 @@
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention;
 import org.apache.beam.sdk.values.PCollection;
-import org.apache.calcite.plan.RelOptRule;
-import org.apache.calcite.plan.RelOptRuleCall;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.Join;
-import org.apache.calcite.rel.core.RelFactories;
-import org.apache.calcite.rel.logical.LogicalJoin;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRuleCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Join;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.RelFactories;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalJoin;
 
 /**
  * Rule to convert {@code LogicalJoin} node to {@code BeamCoGBKJoinRel} node.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamEnumerableConverterRule.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamEnumerableConverterRule.java
index ec64b44..773fef1 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamEnumerableConverterRule.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamEnumerableConverterRule.java
@@ -20,10 +20,10 @@
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode;
-import org.apache.calcite.adapter.enumerable.EnumerableConvention;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.convert.ConverterRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.EnumerableConvention;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.convert.ConverterRule;
 
 /** A {@code ConverterRule} to Convert {@link BeamRelNode} to {@link EnumerableConvention}. */
 public class BeamEnumerableConverterRule extends ConverterRule {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamIOSinkRule.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamIOSinkRule.java
index 9c6bfa6..d67e106 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamIOSinkRule.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamIOSinkRule.java
@@ -20,9 +20,9 @@
 import java.util.Arrays;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.convert.ConverterRule;
-import org.apache.calcite.rel.core.TableModify;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.convert.ConverterRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.TableModify;
 
 /** A {@code ConverterRule} to replace {@link TableModify} with {@link BeamIOSinkRel}. */
 public class BeamIOSinkRule extends ConverterRule {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamIntersectRule.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamIntersectRule.java
index 2ffe983a..1a91e4c 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamIntersectRule.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamIntersectRule.java
@@ -20,11 +20,11 @@
 import java.util.List;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention;
-import org.apache.calcite.plan.Convention;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.convert.ConverterRule;
-import org.apache.calcite.rel.core.Intersect;
-import org.apache.calcite.rel.logical.LogicalIntersect;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.Convention;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.convert.ConverterRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Intersect;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalIntersect;
 
 /** {@code ConverterRule} to replace {@code Intersect} with {@code BeamIntersectRel}. */
 public class BeamIntersectRule extends ConverterRule {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamJoinAssociateRule.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamJoinAssociateRule.java
index c437c45..3eb7ab5 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamJoinAssociateRule.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamJoinAssociateRule.java
@@ -18,15 +18,16 @@
 package org.apache.beam.sdk.extensions.sql.impl.rule;
 
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel;
-import org.apache.calcite.plan.RelOptRuleCall;
-import org.apache.calcite.rel.core.Join;
-import org.apache.calcite.rel.core.RelFactories;
-import org.apache.calcite.rel.rules.JoinAssociateRule;
-import org.apache.calcite.tools.RelBuilderFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRuleCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Join;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.RelFactories;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.JoinAssociateRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.RelBuilderFactory;
 
 /**
- * This is very similar to {@link org.apache.calcite.rel.rules.JoinAssociateRule}. It only checks if
- * the resulting condition is supported before transforming.
+ * This is very similar to {@link
+ * org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.JoinAssociateRule}. It only
+ * checks if the resulting condition is supported before transforming.
  */
 public class BeamJoinAssociateRule extends JoinAssociateRule {
 
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamJoinPushThroughJoinRule.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamJoinPushThroughJoinRule.java
index 830c3ae..f2a10b9 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamJoinPushThroughJoinRule.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamJoinPushThroughJoinRule.java
@@ -18,17 +18,18 @@
 package org.apache.beam.sdk.extensions.sql.impl.rule;
 
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel;
-import org.apache.calcite.plan.RelOptRule;
-import org.apache.calcite.plan.RelOptRuleCall;
-import org.apache.calcite.rel.core.Join;
-import org.apache.calcite.rel.core.RelFactories;
-import org.apache.calcite.rel.logical.LogicalJoin;
-import org.apache.calcite.rel.rules.JoinPushThroughJoinRule;
-import org.apache.calcite.tools.RelBuilderFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRuleCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Join;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.RelFactories;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalJoin;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.JoinPushThroughJoinRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.RelBuilderFactory;
 
 /**
- * This is exactly similar to {@link org.apache.calcite.rel.rules.JoinPushThroughJoinRule}. It only
- * checks if the condition of the new bottom join is supported.
+ * This is exactly similar to {@link
+ * org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.JoinPushThroughJoinRule}. It
+ * only checks if the condition of the new bottom join is supported.
  */
 public class BeamJoinPushThroughJoinRule extends JoinPushThroughJoinRule {
   /** Instance of the rule that works on logical joins only, and pushes to the right. */
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamMinusRule.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamMinusRule.java
index 73ac601..29d4a97 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamMinusRule.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamMinusRule.java
@@ -20,11 +20,11 @@
 import java.util.List;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel;
-import org.apache.calcite.plan.Convention;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.convert.ConverterRule;
-import org.apache.calcite.rel.core.Minus;
-import org.apache.calcite.rel.logical.LogicalMinus;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.Convention;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.convert.ConverterRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Minus;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalMinus;
 
 /** {@code ConverterRule} to replace {@code Minus} with {@code BeamMinusRel}. */
 public class BeamMinusRule extends ConverterRule {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamSideInputJoinRule.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamSideInputJoinRule.java
index 44347c9..98227bb 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamSideInputJoinRule.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamSideInputJoinRule.java
@@ -21,12 +21,12 @@
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputJoinRel;
 import org.apache.beam.sdk.values.PCollection;
-import org.apache.calcite.plan.RelOptRule;
-import org.apache.calcite.plan.RelOptRuleCall;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.Join;
-import org.apache.calcite.rel.core.RelFactories;
-import org.apache.calcite.rel.logical.LogicalJoin;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRuleCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Join;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.RelFactories;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalJoin;
 
 /**
  * Rule to convert {@code LogicalJoin} node to {@code BeamSideInputJoinRel} node.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamSideInputLookupJoinRule.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamSideInputLookupJoinRule.java
index 2e233d5..2c96bd95 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamSideInputLookupJoinRule.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamSideInputLookupJoinRule.java
@@ -20,12 +20,12 @@
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputLookupJoinRel;
-import org.apache.calcite.plan.Convention;
-import org.apache.calcite.plan.RelOptRuleCall;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.convert.ConverterRule;
-import org.apache.calcite.rel.core.Join;
-import org.apache.calcite.rel.logical.LogicalJoin;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.Convention;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRuleCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.convert.ConverterRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Join;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalJoin;
 
 /**
  * Rule to convert {@code LogicalJoin} node to {@code BeamSideInputLookupJoinRel} node.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamSortRule.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamSortRule.java
index 18c24f4..1647bf7 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamSortRule.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamSortRule.java
@@ -19,11 +19,11 @@
 
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel;
-import org.apache.calcite.plan.Convention;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.convert.ConverterRule;
-import org.apache.calcite.rel.core.Sort;
-import org.apache.calcite.rel.logical.LogicalSort;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.Convention;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.convert.ConverterRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Sort;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalSort;
 
 /** {@code ConverterRule} to replace {@code Sort} with {@code BeamSortRel}. */
 public class BeamSortRule extends ConverterRule {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamUncollectRule.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamUncollectRule.java
index 6ce75fc..393882b 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamUncollectRule.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamUncollectRule.java
@@ -19,10 +19,10 @@
 
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel;
-import org.apache.calcite.plan.Convention;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.convert.ConverterRule;
-import org.apache.calcite.rel.core.Uncollect;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.Convention;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.convert.ConverterRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Uncollect;
 
 /** A {@code ConverterRule} to replace {@link Uncollect} with {@link BeamUncollectRule}. */
 public class BeamUncollectRule extends ConverterRule {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamUnionRule.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamUnionRule.java
index 1d4a637..7b84e25 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamUnionRule.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamUnionRule.java
@@ -19,14 +19,15 @@
 
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel;
-import org.apache.calcite.plan.Convention;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.convert.ConverterRule;
-import org.apache.calcite.rel.core.Union;
-import org.apache.calcite.rel.logical.LogicalUnion;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.Convention;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.convert.ConverterRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Union;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalUnion;
 
 /**
- * A {@code ConverterRule} to replace {@link org.apache.calcite.rel.core.Union} with {@link
+ * A {@code ConverterRule} to replace {@link
+ * org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Union} with {@link
  * BeamUnionRule}.
  */
 public class BeamUnionRule extends ConverterRule {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamUnnestRule.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamUnnestRule.java
index 0851c98..cc10225 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamUnnestRule.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamUnnestRule.java
@@ -19,18 +19,18 @@
 
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel;
-import org.apache.calcite.plan.RelOptRule;
-import org.apache.calcite.plan.RelOptRuleCall;
-import org.apache.calcite.plan.volcano.RelSubset;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.SingleRel;
-import org.apache.calcite.rel.core.Correlate;
-import org.apache.calcite.rel.core.JoinRelType;
-import org.apache.calcite.rel.core.Uncollect;
-import org.apache.calcite.rel.logical.LogicalCorrelate;
-import org.apache.calcite.rel.logical.LogicalProject;
-import org.apache.calcite.rex.RexFieldAccess;
-import org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRuleCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.volcano.RelSubset;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.SingleRel;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Correlate;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.JoinRelType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Uncollect;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalCorrelate;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalProject;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexFieldAccess;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
 
 /**
  * A {@code ConverterRule} to replace {@link Correlate} {@link Uncollect} with {@link
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamValuesRule.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamValuesRule.java
index 68c626e..6fbe1e0 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamValuesRule.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/BeamValuesRule.java
@@ -19,11 +19,11 @@
 
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel;
-import org.apache.calcite.plan.Convention;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.convert.ConverterRule;
-import org.apache.calcite.rel.core.Values;
-import org.apache.calcite.rel.logical.LogicalValues;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.Convention;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.convert.ConverterRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Values;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalValues;
 
 /** {@code ConverterRule} to replace {@code Values} with {@code BeamValuesRel}. */
 public class BeamValuesRule extends ConverterRule {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/JoinRelOptRuleCall.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/JoinRelOptRuleCall.java
index 07601bd..27d8168 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/JoinRelOptRuleCall.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/JoinRelOptRuleCall.java
@@ -19,13 +19,13 @@
 
 import java.util.List;
 import java.util.Map;
-import org.apache.calcite.plan.RelOptPlanner;
-import org.apache.calcite.plan.RelOptRule;
-import org.apache.calcite.plan.RelOptRuleCall;
-import org.apache.calcite.plan.RelOptRuleOperand;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
-import org.apache.calcite.tools.RelBuilder;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPlanner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRuleCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRuleOperand;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.RelBuilder;
 
 /**
  * This is a class to catch the built join and check if it is a legal join before passing it to the
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/package-info.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/package-info.java
index 6f82253..7c3d0b2 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/package-info.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/rule/package-info.java
@@ -17,8 +17,8 @@
  */
 
 /**
- * {@link org.apache.calcite.plan.RelOptRule} to generate {@link
- * org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode}.
+ * {@link org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRule} to generate
+ * {@link org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode}.
  */
 @DefaultAnnotation(NonNull.class)
 package org.apache.beam.sdk.extensions.sql.impl.rule;
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/schema/BeamTableUtils.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/schema/BeamTableUtils.java
index a76302c..c3761bb 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/schema/BeamTableUtils.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/schema/BeamTableUtils.java
@@ -31,7 +31,7 @@
 import org.apache.beam.sdk.schemas.Schema.FieldType;
 import org.apache.beam.sdk.schemas.Schema.TypeName;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.util.NlsString;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.NlsString;
 import org.apache.commons.csv.CSVFormat;
 import org.apache.commons.csv.CSVParser;
 import org.apache.commons.csv.CSVPrinter;
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/BeamBuiltinAggregations.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/BeamBuiltinAggregations.java
index 28463da..ad99c28 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/BeamBuiltinAggregations.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/BeamBuiltinAggregations.java
@@ -40,7 +40,7 @@
 import org.apache.beam.sdk.transforms.Min;
 import org.apache.beam.sdk.transforms.Sum;
 import org.apache.beam.sdk.values.KV;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
 
 /** Built-in aggregations functions for COUNT/MAX/MIN/SUM/AVG/VAR_POP/VAR_SAMP. */
 public class BeamBuiltinAggregations {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/BeamJoinTransforms.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/BeamJoinTransforms.java
index 3ea6d4c..47b9912 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/BeamJoinTransforms.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/BeamJoinTransforms.java
@@ -36,11 +36,11 @@
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PCollectionView;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.rel.core.JoinRelType;
-import org.apache.calcite.rex.RexCall;
-import org.apache.calcite.rex.RexInputRef;
-import org.apache.calcite.rex.RexNode;
-import org.apache.calcite.util.Pair;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.JoinRelType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexInputRef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.Pair;
 
 /** Collections of {@code PTransform} and {@code DoFn} used to perform JOIN operation. */
 public class BeamJoinTransforms {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/BeamSetOperatorsTransforms.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/BeamSetOperatorsTransforms.java
index 244fab4..4ea7dcf 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/BeamSetOperatorsTransforms.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/BeamSetOperatorsTransforms.java
@@ -25,7 +25,7 @@
 import org.apache.beam.sdk.values.KV;
 import org.apache.beam.sdk.values.Row;
 import org.apache.beam.sdk.values.TupleTag;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Iterators;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.Iterators;
 
 /** Collections of {@code PTransform} and {@code DoFn} used to perform Set operations. */
 public abstract class BeamSetOperatorsTransforms {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/agg/AggregationCombineFnAdapter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/agg/AggregationCombineFnAdapter.java
index 972438d..3905eb6 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/agg/AggregationCombineFnAdapter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/agg/AggregationCombineFnAdapter.java
@@ -27,8 +27,8 @@
 import org.apache.beam.sdk.schemas.SchemaCoder;
 import org.apache.beam.sdk.transforms.Combine.CombineFn;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.rel.core.AggregateCall;
-import org.apache.calcite.sql.validate.SqlUserDefinedAggFunction;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.AggregateCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.validate.SqlUserDefinedAggFunction;
 
 /** Wrapper {@link CombineFn}s for aggregation function calls. */
 public class AggregationCombineFnAdapter<T> {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/agg/CovarianceFn.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/agg/CovarianceFn.java
index 6c5bcb9..825aad8 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/agg/CovarianceFn.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/agg/CovarianceFn.java
@@ -17,7 +17,7 @@
  */
 package org.apache.beam.sdk.extensions.sql.impl.transform.agg;
 
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Preconditions.checkArgument;
 
 import java.math.BigDecimal;
 import java.math.MathContext;
@@ -32,7 +32,7 @@
 import org.apache.beam.sdk.transforms.Combine;
 import org.apache.beam.sdk.transforms.SerializableFunction;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.runtime.SqlFunctions;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.runtime.SqlFunctions;
 
 /**
  * {@link Combine.CombineFn} for <em>Covariance</em> on {@link Number} types.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/agg/VarianceFn.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/agg/VarianceFn.java
index a0353e5..8114ee4 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/agg/VarianceFn.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/transform/agg/VarianceFn.java
@@ -29,7 +29,7 @@
 import org.apache.beam.sdk.schemas.Schema;
 import org.apache.beam.sdk.transforms.Combine;
 import org.apache.beam.sdk.transforms.SerializableFunction;
-import org.apache.calcite.runtime.SqlFunctions;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.runtime.SqlFunctions;
 
 /**
  * {@link Combine.CombineFn} for <em>Variance</em> on {@link Number} types.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/udf/BuiltinStringFunctions.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/udf/BuiltinStringFunctions.java
index 4b558e5..1a90bf5 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/udf/BuiltinStringFunctions.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/udf/BuiltinStringFunctions.java
@@ -22,7 +22,7 @@
 import com.google.auto.service.AutoService;
 import java.util.Arrays;
 import org.apache.beam.sdk.schemas.Schema.TypeName;
-import org.apache.calcite.linq4j.function.Strict;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.function.Strict;
 import org.apache.commons.codec.DecoderException;
 import org.apache.commons.codec.binary.Hex;
 import org.apache.commons.lang3.ArrayUtils;
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/BigDecimalConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/BigDecimalConverter.java
index 0f2340d..d00e6d6 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/BigDecimalConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/BigDecimalConverter.java
@@ -21,7 +21,7 @@
 import java.util.Map;
 import org.apache.beam.sdk.schemas.Schema.TypeName;
 import org.apache.beam.sdk.transforms.SerializableFunction;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
 
 /**
  * Provides converters from {@link BigDecimal} to other numeric types based on the input {@link
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/CalciteUtils.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/CalciteUtils.java
index 9d9e0cf..dad5647 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/CalciteUtils.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/CalciteUtils.java
@@ -25,14 +25,14 @@
 import org.apache.beam.sdk.schemas.Schema;
 import org.apache.beam.sdk.schemas.Schema.FieldType;
 import org.apache.beam.sdk.schemas.Schema.TypeName;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.BiMap;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableBiMap;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
-import org.apache.calcite.avatica.util.ByteString;
-import org.apache.calcite.rel.type.RelDataType;
-import org.apache.calcite.rel.type.RelDataTypeFactory;
-import org.apache.calcite.rel.type.RelDataTypeField;
-import org.apache.calcite.sql.type.SqlTypeName;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.BiMap;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableBiMap;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.util.ByteString;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeField;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.SqlTypeName;
 import org.joda.time.Instant;
 import org.joda.time.base.AbstractInstant;
 
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/SerializableRexFieldAccess.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/SerializableRexFieldAccess.java
index 6bf3cc2..ce75b92 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/SerializableRexFieldAccess.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/SerializableRexFieldAccess.java
@@ -20,8 +20,8 @@
 import java.util.ArrayList;
 import java.util.Collections;
 import java.util.List;
-import org.apache.calcite.rex.RexFieldAccess;
-import org.apache.calcite.rex.RexInputRef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexFieldAccess;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexInputRef;
 
 /** SerializableRexFieldAccess. */
 public class SerializableRexFieldAccess extends SerializableRexNode {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/SerializableRexInputRef.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/SerializableRexInputRef.java
index 0b40c98..4d4d364 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/SerializableRexInputRef.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/SerializableRexInputRef.java
@@ -17,7 +17,7 @@
  */
 package org.apache.beam.sdk.extensions.sql.impl.utils;
 
-import org.apache.calcite.rex.RexInputRef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexInputRef;
 
 /** SerializableRexInputRef. */
 public class SerializableRexInputRef extends SerializableRexNode {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/SerializableRexNode.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/SerializableRexNode.java
index 31d5ab9..9796bf3 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/SerializableRexNode.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/utils/SerializableRexNode.java
@@ -18,9 +18,9 @@
 package org.apache.beam.sdk.extensions.sql.impl.utils;
 
 import java.io.Serializable;
-import org.apache.calcite.rex.RexFieldAccess;
-import org.apache.calcite.rex.RexInputRef;
-import org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexFieldAccess;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexInputRef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
 
 /** SerializableRexNode. */
 public abstract class SerializableRexNode implements Serializable {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/ReadOnlyTableProvider.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/ReadOnlyTableProvider.java
index 290fa29..9ba8675 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/ReadOnlyTableProvider.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/ReadOnlyTableProvider.java
@@ -21,7 +21,7 @@
 import org.apache.beam.sdk.extensions.sql.BeamSqlTable;
 import org.apache.beam.sdk.extensions.sql.meta.Table;
 import org.apache.beam.sdk.schemas.Schema;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
 
 /**
  * A {@code ReadOnlyTableProvider} provides in-memory read only set of {@code BeamSqlTable
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryTable.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryTable.java
index 4f1b6a9..770fb47 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryTable.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryTable.java
@@ -20,12 +20,16 @@
 import java.io.IOException;
 import java.io.Serializable;
 import java.math.BigInteger;
+import java.util.Arrays;
+import java.util.List;
+import java.util.stream.Collectors;
 import org.apache.beam.sdk.annotations.Experimental;
 import org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics;
 import org.apache.beam.sdk.extensions.sql.impl.schema.BaseBeamTable;
 import org.apache.beam.sdk.extensions.sql.meta.Table;
 import org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers;
 import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
+import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
 import org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions;
 import org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils;
 import org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions;
@@ -35,7 +39,7 @@
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.POutput;
 import org.apache.beam.sdk.values.Row;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.annotations.VisibleForTesting;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.annotations.VisibleForTesting;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
@@ -45,15 +49,44 @@
  */
 @Experimental
 class BigQueryTable extends BaseBeamTable implements Serializable {
+  @VisibleForTesting static final String METHOD_PROPERTY = "method";
   @VisibleForTesting final String bqLocation;
   private final ConversionOptions conversionOptions;
   private BeamTableStatistics rowCountStatistics = null;
   private static final Logger LOGGER = LoggerFactory.getLogger(BigQueryTable.class);
+  @VisibleForTesting final Method method;
 
   BigQueryTable(Table table, BigQueryUtils.ConversionOptions options) {
     super(table.getSchema());
     this.conversionOptions = options;
     this.bqLocation = table.getLocation();
+
+    if (table.getProperties().containsKey(METHOD_PROPERTY)) {
+      List<String> validMethods =
+          Arrays.stream(Method.values()).map(Enum::toString).collect(Collectors.toList());
+      // Upper-case the value so the property match against Method names is case-insensitive
+      String selectedMethod = table.getProperties().getString(METHOD_PROPERTY).toUpperCase();
+
+      if (validMethods.contains(selectedMethod)) {
+        method = Method.valueOf(selectedMethod);
+      } else {
+        InvalidPropertyException e =
+            new InvalidPropertyException(
+                "Invalid method "
+                    + "'"
+                    + selectedMethod
+                    + "'. "
+                    + "Supported methods are: "
+                    + validMethods.toString()
+                    + ".");
+
+        throw e;
+      }
+    } else {
+      method = Method.DEFAULT;
+    }
+
+    LOGGER.info("BigQuery method is set to: " + method.toString());
   }
 
   @Override
@@ -79,6 +112,7 @@
             BigQueryIO.read(
                     record ->
                         BigQueryUtils.toBeamRow(record.getRecord(), getSchema(), conversionOptions))
+                .withMethod(method)
                 .from(bqLocation)
                 .withCoder(SchemaCoder.of(getSchema())))
         .setRowSchema(getSchema());
@@ -111,4 +145,10 @@
 
     return BeamTableStatistics.BOUNDED_UNKNOWN;
   }
+
+  public static class InvalidPropertyException extends UnsupportedOperationException {
+    private InvalidPropertyException(String s) {
+      super(s);
+    }
+  }
 }
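
A minimal standalone Java sketch of the validation flow added in BigQueryTable above (the local Method enum and resolveMethod helper are illustrative assumptions that only mirror BigQueryIO.TypedRead.Method and the constructor logic, not the vendored Beam classes themselves):

import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class MethodPropertySketch {
  // Illustrative copy of the read methods exposed by BigQueryIO.TypedRead.Method.
  enum Method { DEFAULT, EXPORT, DIRECT_READ }

  // Resolves the optional "method" property the same way the constructor above does:
  // absent -> DEFAULT, otherwise a case-insensitive match against the enum names.
  static Method resolveMethod(String property) {
    if (property == null) {
      return Method.DEFAULT;
    }
    String selected = property.toUpperCase();
    List<String> validMethods =
        Arrays.stream(Method.values()).map(Enum::toString).collect(Collectors.toList());
    if (!validMethods.contains(selected)) {
      throw new UnsupportedOperationException(
          "Invalid method '" + selected + "'. Supported methods are: " + validMethods + ".");
    }
    return Method.valueOf(selected);
  }

  public static void main(String[] args) {
    System.out.println(resolveMethod("direct_read")); // DIRECT_READ
    System.out.println(resolveMethod(null)); // DEFAULT
  }
}
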
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryTableProvider.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryTableProvider.java
index b39d57c..65494e4 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryTableProvider.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryTableProvider.java
@@ -17,7 +17,7 @@
  */
 package org.apache.beam.sdk.extensions.sql.meta.provider.bigquery;
 
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.MoreObjects.firstNonNull;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.MoreObjects.firstNonNull;
 
 import com.google.auto.service.AutoService;
 import org.apache.beam.sdk.extensions.sql.BeamSqlTable;
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/kafka/BeamKafkaTable.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/kafka/BeamKafkaTable.java
index 11c12f6..70ba9cd 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/kafka/BeamKafkaTable.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/kafka/BeamKafkaTable.java
@@ -17,7 +17,7 @@
  */
 package org.apache.beam.sdk.extensions.sql.meta.provider.kafka;
 
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Preconditions.checkArgument;
 
 import java.util.Collection;
 import java.util.HashMap;
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/test/TestTableProvider.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/test/TestTableProvider.java
index 9dd1923..9a8a12b 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/test/TestTableProvider.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/test/TestTableProvider.java
@@ -17,7 +17,7 @@
  */
 package org.apache.beam.sdk.extensions.sql.meta.provider.test;
 
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Preconditions.checkArgument;
 
 import com.google.auto.service.AutoService;
 import java.io.Serializable;
@@ -40,7 +40,6 @@
 import org.apache.beam.sdk.transforms.Create;
 import org.apache.beam.sdk.transforms.DoFn;
 import org.apache.beam.sdk.transforms.ParDo;
-import org.apache.beam.sdk.transforms.SerializableFunctions;
 import org.apache.beam.sdk.values.PBegin;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PDone;
@@ -132,10 +131,7 @@
     }
 
     public Coder<Row> rowCoder() {
-      return SchemaCoder.of(
-          tableWithRows.table.getSchema(),
-          SerializableFunctions.identity(),
-          SerializableFunctions.identity());
+      return SchemaCoder.of(tableWithRows.table.getSchema());
     }
 
     @Override
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/test/TestTableUtils.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/test/TestTableUtils.java
index 777209a..7ebbf91 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/test/TestTableUtils.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/test/TestTableUtils.java
@@ -27,7 +27,7 @@
 import org.apache.beam.sdk.schemas.Schema;
 import org.apache.beam.sdk.schemas.Schema.FieldType;
 import org.apache.beam.sdk.values.Row;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Lists;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.Lists;
 
 /** Utility functions for mock classes. */
 @Experimental
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/test/TestUnboundedTable.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/test/TestUnboundedTable.java
index f3b56f4..22c1bd2 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/test/TestUnboundedTable.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/test/TestUnboundedTable.java
@@ -25,12 +25,11 @@
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.schemas.Schema;
 import org.apache.beam.sdk.testing.TestStream;
-import org.apache.beam.sdk.transforms.SerializableFunctions;
 import org.apache.beam.sdk.values.PBegin;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
 import org.apache.beam.sdk.values.TimestampedValue;
-import org.apache.calcite.util.Pair;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.Pair;
 import org.joda.time.Duration;
 import org.joda.time.Instant;
 
@@ -108,9 +107,7 @@
 
   @Override
   public PCollection<Row> buildIOReader(PBegin begin) {
-    TestStream.Builder<Row> values =
-        TestStream.create(
-            schema, SerializableFunctions.identity(), SerializableFunctions.identity());
+    TestStream.Builder<Row> values = TestStream.create(schema);
 
     for (Pair<Duration, List<Row>> pair : timestampedRows) {
       values = values.advanceWatermarkTo(new Instant(0).plus(pair.getKey()));
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/text/TextTableProvider.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/text/TextTableProvider.java
index c3f1eb7..8666e33 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/text/TextTableProvider.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/text/TextTableProvider.java
@@ -19,7 +19,7 @@
 
 import static org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils.beamRow2CsvLine;
 import static org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils.csvLines2BeamRows;
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Preconditions.checkArgument;
 
 import com.alibaba.fastjson.JSONObject;
 import com.google.auto.service.AutoService;
@@ -37,9 +37,9 @@
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
 import org.apache.beam.sdk.values.TypeDescriptors;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.annotations.VisibleForTesting;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.MoreObjects;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableSet;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.annotations.VisibleForTesting;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.MoreObjects;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableSet;
 import org.apache.commons.csv.CSVFormat;
 
 /**
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/store/InMemoryMetaStore.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/store/InMemoryMetaStore.java
index 06bb228..82ef447 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/store/InMemoryMetaStore.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/store/InMemoryMetaStore.java
@@ -22,7 +22,7 @@
 import org.apache.beam.sdk.extensions.sql.BeamSqlTable;
 import org.apache.beam.sdk.extensions.sql.meta.Table;
 import org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
 
 /**
  * A {@link MetaStore} which stores the meta info in memory.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/BeamBuiltinMethods.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/BeamBuiltinMethods.java
index 9d16948..9f6b33b 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/BeamBuiltinMethods.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/BeamBuiltinMethods.java
@@ -18,7 +18,7 @@
 package org.apache.beam.sdk.extensions.sql.zetasql;
 
 import java.lang.reflect.Method;
-import org.apache.calcite.linq4j.tree.Types;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.tree.Types;
 
 /** BeamBuiltinMethods. */
 public class BeamBuiltinMethods {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/DateTimeUtils.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/DateTimeUtils.java
index c992d8f..5f90efb 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/DateTimeUtils.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/DateTimeUtils.java
@@ -23,13 +23,13 @@
 import com.google.zetasql.Value;
 import io.grpc.Status;
 import java.util.List;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.util.TimeUnit;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.DateString;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.TimeString;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Splitter;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Lists;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.math.LongMath;
-import org.apache.calcite.avatica.util.TimeUnit;
-import org.apache.calcite.util.DateString;
-import org.apache.calcite.util.TimeString;
 import org.joda.time.DateTime;
 import org.joda.time.DateTimeZone;
 import org.joda.time.LocalTime;
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlAnalyzer.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlAnalyzer.java
index d3b1ed3..2de00c4 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlAnalyzer.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlAnalyzer.java
@@ -40,14 +40,14 @@
 import java.util.Map;
 import java.util.Optional;
 import org.apache.beam.sdk.extensions.sql.zetasql.TableResolution.SimpleTableWithPath;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.java.JavaTypeFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.Context;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeField;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.SchemaPlus;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableSet;
-import org.apache.calcite.adapter.java.JavaTypeFactory;
-import org.apache.calcite.plan.Context;
-import org.apache.calcite.rel.type.RelDataType;
-import org.apache.calcite.rel.type.RelDataTypeField;
-import org.apache.calcite.schema.SchemaPlus;
 
 /** Adapter for {@link Analyzer} to simplify the API for parsing the query and resolving the AST. */
 class SqlAnalyzer {
@@ -170,7 +170,7 @@
 
     SimpleCatalog leafCatalog = createNestedCatalogs(topLevelCatalog, tablePath);
 
-    org.apache.calcite.schema.Table calciteTable =
+    org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Table calciteTable =
         TableResolution.resolveCalciteTable(
             builder.calciteContext, builder.topLevelSchema, tablePath);
 
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlCaseWithValueOperatorRewriter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlCaseWithValueOperatorRewriter.java
index 8bf7057..79d99b0 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlCaseWithValueOperatorRewriter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlCaseWithValueOperatorRewriter.java
@@ -19,13 +19,13 @@
 
 import java.util.ArrayList;
 import java.util.List;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexBuilder;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlOperator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.fun.SqlStdOperatorTable;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Iterables;
-import org.apache.calcite.rex.RexBuilder;
-import org.apache.calcite.rex.RexNode;
-import org.apache.calcite.sql.SqlOperator;
-import org.apache.calcite.sql.fun.SqlStdOperatorTable;
 
 /**
  * Rewrites $case_with_value calls as $case_no_value calls.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlCoalesceOperatorRewriter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlCoalesceOperatorRewriter.java
index 39198d0..58743f6 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlCoalesceOperatorRewriter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlCoalesceOperatorRewriter.java
@@ -19,13 +19,13 @@
 
 import java.util.ArrayList;
 import java.util.List;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexBuilder;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlOperator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.fun.SqlStdOperatorTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.Util;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.rex.RexBuilder;
-import org.apache.calcite.rex.RexNode;
-import org.apache.calcite.sql.SqlOperator;
-import org.apache.calcite.sql.fun.SqlStdOperatorTable;
-import org.apache.calcite.util.Util;
 
 /**
  * Rewrites COALESCE calls as CASE ($case_no_value) calls.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlExtractTimestampOperatorRewriter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlExtractTimestampOperatorRewriter.java
index a3e9c55..129031e 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlExtractTimestampOperatorRewriter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlExtractTimestampOperatorRewriter.java
@@ -18,11 +18,11 @@
 package org.apache.beam.sdk.extensions.sql.zetasql;
 
 import java.util.List;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexBuilder;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlOperator;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.rex.RexBuilder;
-import org.apache.calcite.rex.RexNode;
-import org.apache.calcite.sql.SqlOperator;
 
 /** Rewrites EXTRACT calls by swapping the first two arguments to fit Calcite's SqlExtractOperator. */
 public class SqlExtractTimestampOperatorRewriter implements SqlOperatorRewriter {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlIfNullOperatorRewriter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlIfNullOperatorRewriter.java
index 69478bb..039797d 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlIfNullOperatorRewriter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlIfNullOperatorRewriter.java
@@ -18,12 +18,12 @@
 package org.apache.beam.sdk.extensions.sql.zetasql;
 
 import java.util.List;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexBuilder;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlOperator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.fun.SqlStdOperatorTable;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.rex.RexBuilder;
-import org.apache.calcite.rex.RexNode;
-import org.apache.calcite.sql.SqlOperator;
-import org.apache.calcite.sql.fun.SqlStdOperatorTable;
 
 /**
  * Rewrites IFNULL calls as CASE ($case_no_value) calls.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlNullIfOperatorRewriter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlNullIfOperatorRewriter.java
index 69184ac..382a5ca 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlNullIfOperatorRewriter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlNullIfOperatorRewriter.java
@@ -18,12 +18,12 @@
 package org.apache.beam.sdk.extensions.sql.zetasql;
 
 import java.util.List;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexBuilder;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlOperator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.fun.SqlStdOperatorTable;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.rex.RexBuilder;
-import org.apache.calcite.rex.RexNode;
-import org.apache.calcite.sql.SqlOperator;
-import org.apache.calcite.sql.fun.SqlStdOperatorTable;
 
 /**
  * Rewrites NULLIF calls as CASE ($case_no_value) calls.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlOperatorRewriter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlOperatorRewriter.java
index f4245bd..949632c 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlOperatorRewriter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlOperatorRewriter.java
@@ -18,8 +18,8 @@
 package org.apache.beam.sdk.extensions.sql.zetasql;
 
 import java.util.List;
-import org.apache.calcite.rex.RexBuilder;
-import org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexBuilder;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
 
 /** Interface for rewriting calls to a specific ZetaSQL operator. */
 public interface SqlOperatorRewriter {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlOperators.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlOperators.java
index 99bfb13..9f0e948 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlOperators.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlOperators.java
@@ -22,27 +22,27 @@
 import java.util.List;
 import org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl;
 import org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.JavaTypeFactoryImpl;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeFactoryImpl;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Function;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.FunctionParameter;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.ScalarFunction;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlIdentifier;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlOperator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.parser.SqlParserPos;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.FamilyOperandTypeChecker;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.InferTypes;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.OperandTypes;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.SqlReturnTypeInference;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.SqlTypeFactoryImpl;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.SqlTypeFamily;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.SqlTypeName;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.validate.SqlUserDefinedFunction;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.Util;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Lists;
-import org.apache.calcite.jdbc.JavaTypeFactoryImpl;
-import org.apache.calcite.rel.type.RelDataType;
-import org.apache.calcite.rel.type.RelDataTypeFactory;
-import org.apache.calcite.rel.type.RelDataTypeFactoryImpl;
-import org.apache.calcite.schema.Function;
-import org.apache.calcite.schema.FunctionParameter;
-import org.apache.calcite.schema.ScalarFunction;
-import org.apache.calcite.sql.SqlIdentifier;
-import org.apache.calcite.sql.SqlOperator;
-import org.apache.calcite.sql.parser.SqlParserPos;
-import org.apache.calcite.sql.type.FamilyOperandTypeChecker;
-import org.apache.calcite.sql.type.InferTypes;
-import org.apache.calcite.sql.type.OperandTypes;
-import org.apache.calcite.sql.type.SqlReturnTypeInference;
-import org.apache.calcite.sql.type.SqlTypeFactoryImpl;
-import org.apache.calcite.sql.type.SqlTypeFamily;
-import org.apache.calcite.sql.type.SqlTypeName;
-import org.apache.calcite.sql.validate.SqlUserDefinedFunction;
-import org.apache.calcite.util.Util;
 
 /**
  * A separate SqlOperators table for those functions that do not exist or not compatible with
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlStdOperatorMappingTable.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlStdOperatorMappingTable.java
index ca9e4c5..3079100 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlStdOperatorMappingTable.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/SqlStdOperatorMappingTable.java
@@ -21,10 +21,10 @@
 import java.util.Arrays;
 import java.util.List;
 import org.apache.beam.sdk.annotations.Internal;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlOperator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.fun.SqlStdOperatorTable;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableSet;
-import org.apache.calcite.sql.SqlOperator;
-import org.apache.calcite.sql.fun.SqlStdOperatorTable;
 
 /** SqlStdOperatorMappingTable. */
 @Internal
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/StringFunctions.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/StringFunctions.java
index 07f8746..c126370 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/StringFunctions.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/StringFunctions.java
@@ -18,8 +18,8 @@
 package org.apache.beam.sdk.extensions.sql.zetasql;
 
 import java.util.regex.Pattern;
-import org.apache.calcite.linq4j.function.Strict;
-import org.apache.calcite.runtime.SqlFunctions;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.function.Strict;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.runtime.SqlFunctions;
 
 /** StringFunctions. */
 public class StringFunctions {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TableResolution.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TableResolution.java
index 54e3ab1..a982d93 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TableResolution.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TableResolution.java
@@ -19,10 +19,10 @@
 
 import com.google.zetasql.SimpleTable;
 import java.util.List;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.Context;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.SchemaPlus;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Table;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Iterables;
-import org.apache.calcite.plan.Context;
-import org.apache.calcite.schema.SchemaPlus;
-import org.apache.calcite.schema.Table;
 
 /** Utility methods to resolve a table, given a top-level Calcite schema and a table path. */
 public class TableResolution {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TableResolutionContext.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TableResolutionContext.java
index e088019..3aed2c1 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TableResolutionContext.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TableResolutionContext.java
@@ -18,8 +18,8 @@
 package org.apache.beam.sdk.extensions.sql.zetasql;
 
 import java.util.Map;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.Context;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
-import org.apache.calcite.plan.Context;
 import org.codehaus.commons.nullanalysis.Nullable;
 
 /**
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TableResolver.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TableResolver.java
index 398eaf7..b7e516e 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TableResolver.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TableResolver.java
@@ -18,8 +18,8 @@
 package org.apache.beam.sdk.extensions.sql.zetasql;
 
 import java.util.List;
-import org.apache.calcite.schema.Schema;
-import org.apache.calcite.schema.Table;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Schema;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Table;
 
 /** An interface to implement a custom resolution strategy. */
 interface TableResolver {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TableResolverImpl.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TableResolverImpl.java
index fad71dc..ad2f7ce 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TableResolverImpl.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TableResolverImpl.java
@@ -18,9 +18,9 @@
 package org.apache.beam.sdk.extensions.sql.zetasql;
 
 import java.util.List;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Schema;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Table;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Iterables;
-import org.apache.calcite.schema.Schema;
-import org.apache.calcite.schema.Table;
 
 /** A couple of implementations of TableResolver. */
 class TableResolverImpl {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TimestampFunctions.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TimestampFunctions.java
index 6e9ef5a..9bbbb9b 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TimestampFunctions.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TimestampFunctions.java
@@ -18,7 +18,7 @@
 package org.apache.beam.sdk.extensions.sql.zetasql;
 
 import java.util.TimeZone;
-import org.apache.calcite.linq4j.function.Strict;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.function.Strict;
 import org.joda.time.DateTime;
 import org.joda.time.DateTimeZone;
 
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TypeUtils.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TypeUtils.java
index fc5b604..0967f2e 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TypeUtils.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/TypeUtils.java
@@ -39,10 +39,10 @@
 import java.util.List;
 import java.util.function.Function;
 import org.apache.beam.sdk.annotations.Internal;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexBuilder;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.SqlTypeName;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
-import org.apache.calcite.rel.type.RelDataType;
-import org.apache.calcite.rex.RexBuilder;
-import org.apache.calcite.sql.type.SqlTypeName;
 
 /** Utility to convert types from Calcite Schema types. */
 @Internal
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/ZetaSQLCastFunctionImpl.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/ZetaSQLCastFunctionImpl.java
index 3b9eec1..b0e8e56 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/ZetaSQLCastFunctionImpl.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/ZetaSQLCastFunctionImpl.java
@@ -17,24 +17,24 @@
  */
 package org.apache.beam.sdk.extensions.sql.zetasql;
 
-import static org.apache.calcite.adapter.enumerable.RexImpTable.createImplementor;
+import static org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.RexImpTable.createImplementor;
 
 import java.util.List;
-import org.apache.calcite.adapter.enumerable.CallImplementor;
-import org.apache.calcite.adapter.enumerable.NotNullImplementor;
-import org.apache.calcite.adapter.enumerable.NullPolicy;
-import org.apache.calcite.adapter.enumerable.RexImpTable;
-import org.apache.calcite.adapter.enumerable.RexToLixTranslator;
-import org.apache.calcite.linq4j.tree.Expression;
-import org.apache.calcite.linq4j.tree.Expressions;
-import org.apache.calcite.rex.RexCall;
-import org.apache.calcite.schema.Function;
-import org.apache.calcite.schema.FunctionParameter;
-import org.apache.calcite.schema.ImplementableFunction;
-import org.apache.calcite.sql.SqlIdentifier;
-import org.apache.calcite.sql.parser.SqlParserPos;
-import org.apache.calcite.sql.type.SqlTypeName;
-import org.apache.calcite.sql.validate.SqlUserDefinedFunction;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.CallImplementor;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.NotNullImplementor;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.NullPolicy;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.RexImpTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.RexToLixTranslator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.tree.Expression;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.tree.Expressions;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Function;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.FunctionParameter;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.ImplementableFunction;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlIdentifier;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.parser.SqlParserPos;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.SqlTypeName;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.validate.SqlUserDefinedFunction;
 
 /** ZetaSQLCastFunctionImpl. */
 public class ZetaSQLCastFunctionImpl implements Function, ImplementableFunction {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/ZetaSQLPlannerImpl.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/ZetaSQLPlannerImpl.java
index 09b93f3..424ea28 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/ZetaSQLPlannerImpl.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/ZetaSQLPlannerImpl.java
@@ -17,7 +17,6 @@
  */
 package org.apache.beam.sdk.extensions.sql.zetasql;
 
-import com.google.common.collect.ImmutableList;
 import com.google.zetasql.Value;
 import com.google.zetasql.resolvedast.ResolvedNodes.ResolvedQueryStmt;
 import com.google.zetasql.resolvedast.ResolvedNodes.ResolvedStatement;
@@ -27,29 +26,30 @@
 import org.apache.beam.sdk.extensions.sql.zetasql.translation.ConversionContext;
 import org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter;
 import org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter;
-import org.apache.calcite.adapter.java.JavaTypeFactory;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelOptPlanner;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.RelRoot;
-import org.apache.calcite.rel.metadata.CachingRelMetadataProvider;
-import org.apache.calcite.rel.type.RelDataType;
-import org.apache.calcite.rel.type.RelDataTypeFactory;
-import org.apache.calcite.rex.RexBuilder;
-import org.apache.calcite.rex.RexExecutor;
-import org.apache.calcite.schema.SchemaPlus;
-import org.apache.calcite.sql.SqlKind;
-import org.apache.calcite.sql.SqlNode;
-import org.apache.calcite.sql.parser.SqlParseException;
-import org.apache.calcite.tools.FrameworkConfig;
-import org.apache.calcite.tools.Frameworks;
-import org.apache.calcite.tools.Planner;
-import org.apache.calcite.tools.Program;
-import org.apache.calcite.tools.RelConversionException;
-import org.apache.calcite.tools.ValidationException;
-import org.apache.calcite.util.Pair;
-import org.apache.calcite.util.Util;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.java.JavaTypeFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptPlanner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelRoot;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.CachingRelMetadataProvider;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexBuilder;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexExecutor;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.SchemaPlus;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlKind;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.parser.SqlParseException;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.FrameworkConfig;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.Frameworks;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.Planner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.Program;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.RelConversionException;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.ValidationException;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.Pair;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.Util;
 
 /** ZetaSQLPlannerImpl. */
 public class ZetaSQLPlannerImpl implements Planner {
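
The hunks above (and those that follow) only relocate the Calcite and Guava imports to Beam's vendored `org.apache.beam.vendor.calcite.v1_20_0` prefix; the Calcite API surface itself is unchanged. A minimal, hypothetical sketch of calling a relocated Calcite class, assuming the vendored jar exposes the standard Calcite type-factory API under the new package prefix:

```java
import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.JavaTypeFactoryImpl;
import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeSystem;
import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.SqlTypeName;

public class VendoredCalciteSketch {
  public static void main(String[] args) {
    // Same Calcite classes as before the relocation; only the package prefix differs.
    JavaTypeFactoryImpl typeFactory = new JavaTypeFactoryImpl(RelDataTypeSystem.DEFAULT);
    RelDataType bigint = typeFactory.createSqlType(SqlTypeName.BIGINT);
    System.out.println(bigint.getSqlTypeName()); // prints BIGINT
  }
}
```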
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/ZetaSQLQueryPlanner.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/ZetaSQLQueryPlanner.java
index f767d09..3730857 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/ZetaSQLQueryPlanner.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/ZetaSQLQueryPlanner.java
@@ -26,26 +26,26 @@
 import org.apache.beam.sdk.extensions.sql.impl.SqlConversionException;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.config.CalciteConnectionConfig;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteSchema;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.Contexts;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.ConventionTraitDef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitDef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.prepare.CalciteCatalogReader;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelRoot;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.SchemaPlus;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlOperatorTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.fun.SqlStdOperatorTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.parser.SqlParser;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.parser.SqlParserImplFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.util.ChainedSqlOperatorTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.FrameworkConfig;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.Frameworks;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.RelConversionException;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.RuleSet;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.config.CalciteConnectionConfig;
-import org.apache.calcite.jdbc.CalciteSchema;
-import org.apache.calcite.plan.Contexts;
-import org.apache.calcite.plan.ConventionTraitDef;
-import org.apache.calcite.plan.RelTraitDef;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.prepare.CalciteCatalogReader;
-import org.apache.calcite.rel.RelRoot;
-import org.apache.calcite.schema.SchemaPlus;
-import org.apache.calcite.sql.SqlNode;
-import org.apache.calcite.sql.SqlOperatorTable;
-import org.apache.calcite.sql.fun.SqlStdOperatorTable;
-import org.apache.calcite.sql.parser.SqlParser;
-import org.apache.calcite.sql.parser.SqlParserImplFactory;
-import org.apache.calcite.sql.util.ChainedSqlOperatorTable;
-import org.apache.calcite.tools.FrameworkConfig;
-import org.apache.calcite.tools.Frameworks;
-import org.apache.calcite.tools.RelConversionException;
-import org.apache.calcite.tools.RuleSet;
 
 /** ZetaSQLQueryPlanner. */
 public class ZetaSQLQueryPlanner implements QueryPlanner {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/AggregateScanConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/AggregateScanConverter.java
index 9f7b472..0339592 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/AggregateScanConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/AggregateScanConverter.java
@@ -35,16 +35,16 @@
 import java.util.stream.Collectors;
 import java.util.stream.IntStream;
 import org.apache.beam.sdk.extensions.sql.zetasql.SqlStdOperatorMappingTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.AggregateCall;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalAggregate;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalProject;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlAggFunction;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.fun.SqlStdOperatorTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.ImmutableBitSet;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.AggregateCall;
-import org.apache.calcite.rel.logical.LogicalAggregate;
-import org.apache.calcite.rel.logical.LogicalProject;
-import org.apache.calcite.rel.type.RelDataType;
-import org.apache.calcite.rex.RexNode;
-import org.apache.calcite.sql.SqlAggFunction;
-import org.apache.calcite.sql.fun.SqlStdOperatorTable;
-import org.apache.calcite.util.ImmutableBitSet;
 
 /** Converts aggregate calls. */
 class AggregateScanConverter extends RelConverter<ResolvedAggregateScan> {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ArrayScanToJoinConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ArrayScanToJoinConverter.java
index 66e6bc3..f445006 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ArrayScanToJoinConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ArrayScanToJoinConverter.java
@@ -23,16 +23,16 @@
 import java.util.ArrayList;
 import java.util.Collections;
 import java.util.List;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.CorrelationId;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.JoinRelType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Uncollect;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalJoin;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalProject;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalValues;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableSet;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.CorrelationId;
-import org.apache.calcite.rel.core.JoinRelType;
-import org.apache.calcite.rel.core.Uncollect;
-import org.apache.calcite.rel.logical.LogicalJoin;
-import org.apache.calcite.rel.logical.LogicalProject;
-import org.apache.calcite.rel.logical.LogicalValues;
-import org.apache.calcite.rex.RexNode;
 
 /** Converts array scan that represents join of an uncollect(array_field) to uncollect. */
 class ArrayScanToJoinConverter extends RelConverter<ResolvedArrayScan> {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ArrayScanToUncollectConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ArrayScanToUncollectConverter.java
index 55a42ce..ef336cd 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ArrayScanToUncollectConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ArrayScanToUncollectConverter.java
@@ -21,12 +21,12 @@
 import com.google.zetasql.resolvedast.ResolvedNodes.ResolvedLiteral;
 import java.util.Collections;
 import java.util.List;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Uncollect;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalProject;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalValues;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.Uncollect;
-import org.apache.calcite.rel.logical.LogicalProject;
-import org.apache.calcite.rel.logical.LogicalValues;
-import org.apache.calcite.rex.RexNode;
 
 /** Converts array scan that represents an array literal to uncollect. */
 class ArrayScanToUncollectConverter extends RelConverter<ResolvedArrayScan> {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ConversionContext.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ConversionContext.java
index be367d0..1133574 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ConversionContext.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ConversionContext.java
@@ -18,8 +18,8 @@
 package org.apache.beam.sdk.extensions.sql.zetasql.translation;
 
 import org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.tools.FrameworkConfig;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.FrameworkConfig;
 
 /** Conversion context, some rules need this data to convert the nodes. */
 public class ConversionContext {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ExpressionConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ExpressionConverter.java
index 8a75a35..652aabd 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ExpressionConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ExpressionConverter.java
@@ -64,22 +64,22 @@
 import org.apache.beam.sdk.extensions.sql.zetasql.SqlOperators;
 import org.apache.beam.sdk.extensions.sql.zetasql.SqlStdOperatorMappingTable;
 import org.apache.beam.sdk.extensions.sql.zetasql.TypeUtils;
-import org.apache.calcite.avatica.util.ByteString;
-import org.apache.calcite.avatica.util.TimeUnit;
-import org.apache.calcite.avatica.util.TimeUnitRange;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.rel.type.RelDataType;
-import org.apache.calcite.rel.type.RelDataTypeFactory;
-import org.apache.calcite.rel.type.RelDataTypeField;
-import org.apache.calcite.rex.RexBuilder;
-import org.apache.calcite.rex.RexLiteral;
-import org.apache.calcite.rex.RexNode;
-import org.apache.calcite.sql.SqlIntervalQualifier;
-import org.apache.calcite.sql.SqlOperator;
-import org.apache.calcite.sql.fun.SqlStdOperatorTable;
-import org.apache.calcite.sql.parser.SqlParserPos;
-import org.apache.calcite.sql.type.SqlTypeName;
-import org.apache.calcite.util.TimestampString;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.util.ByteString;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.util.TimeUnit;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.util.TimeUnitRange;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeField;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexBuilder;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexLiteral;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlIntervalQualifier;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlOperator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.fun.SqlStdOperatorTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.parser.SqlParserPos;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.SqlTypeName;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.TimestampString;
 
 /**
  * Extracts expressions (function calls, field accesses) from the resolve query nodes, converts them
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/FilterScanConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/FilterScanConverter.java
index bcddd9c..1a04ca1 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/FilterScanConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/FilterScanConverter.java
@@ -21,9 +21,9 @@
 import com.google.zetasql.resolvedast.ResolvedNodes.ResolvedFilterScan;
 import java.util.Collections;
 import java.util.List;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.logical.LogicalFilter;
-import org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalFilter;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
 
 /** Converts filter. */
 class FilterScanConverter extends RelConverter<ResolvedFilterScan> {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/JoinScanConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/JoinScanConverter.java
index e5e8cef..d0819f3 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/JoinScanConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/JoinScanConverter.java
@@ -23,14 +23,14 @@
 import com.google.zetasql.resolvedast.ResolvedNodes.ResolvedJoinScan;
 import com.google.zetasql.resolvedast.ResolvedNodes.ResolvedWithRefScan;
 import java.util.List;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.JoinRelType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalJoin;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeField;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableSet;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.core.JoinRelType;
-import org.apache.calcite.rel.logical.LogicalJoin;
-import org.apache.calcite.rel.type.RelDataTypeField;
-import org.apache.calcite.rex.RexNode;
 
 /** Converts joins if neither side of the join is a WithRefScan. */
 class JoinScanConverter extends RelConverter<ResolvedJoinScan> {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/JoinScanWithRefConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/JoinScanWithRefConverter.java
index 55594f9..fee0124 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/JoinScanWithRefConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/JoinScanWithRefConverter.java
@@ -25,11 +25,11 @@
 import com.google.zetasql.resolvedast.ResolvedNodes.ResolvedScan;
 import com.google.zetasql.resolvedast.ResolvedNodes.ResolvedWithRefScan;
 import java.util.List;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalJoin;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableSet;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.logical.LogicalJoin;
-import org.apache.calcite.rex.RexNode;
 
 /** Converts joins where at least one of the inputs is a WITH subquery. */
 class JoinScanWithRefConverter extends RelConverter<ResolvedJoinScan> {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/LimitOffsetScanToLimitConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/LimitOffsetScanToLimitConverter.java
index bbe930b..0be8e2c 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/LimitOffsetScanToLimitConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/LimitOffsetScanToLimitConverter.java
@@ -22,12 +22,12 @@
 import com.google.zetasql.resolvedast.ResolvedNodes.ResolvedOrderByScan;
 import java.util.Collections;
 import java.util.List;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelCollation;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelCollations;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalSort;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.rel.RelCollation;
-import org.apache.calcite.rel.RelCollations;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.logical.LogicalSort;
-import org.apache.calcite.rex.RexNode;
 
 /** Converts LIMIT without ORDER BY. */
 class LimitOffsetScanToLimitConverter extends RelConverter<ResolvedLimitOffsetScan> {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/LimitOffsetScanToOrderByLimitConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/LimitOffsetScanToOrderByLimitConverter.java
index d38abc7..2492088 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/LimitOffsetScanToOrderByLimitConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/LimitOffsetScanToOrderByLimitConverter.java
@@ -18,8 +18,8 @@
 package org.apache.beam.sdk.extensions.sql.zetasql.translation;
 
 import static java.util.stream.Collectors.toList;
-import static org.apache.calcite.rel.RelFieldCollation.Direction.ASCENDING;
-import static org.apache.calcite.rel.RelFieldCollation.Direction.DESCENDING;
+import static org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelFieldCollation.Direction.ASCENDING;
+import static org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelFieldCollation.Direction.DESCENDING;
 
 import com.google.zetasql.resolvedast.ResolvedNode;
 import com.google.zetasql.resolvedast.ResolvedNodes.ResolvedLimitOffsetScan;
@@ -27,14 +27,14 @@
 import com.google.zetasql.resolvedast.ResolvedNodes.ResolvedOrderByScan;
 import java.util.Collections;
 import java.util.List;
-import org.apache.calcite.rel.RelCollation;
-import org.apache.calcite.rel.RelCollationImpl;
-import org.apache.calcite.rel.RelFieldCollation;
-import org.apache.calcite.rel.RelFieldCollation.Direction;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.logical.LogicalProject;
-import org.apache.calcite.rel.logical.LogicalSort;
-import org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelCollation;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelCollationImpl;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelFieldCollation;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelFieldCollation.Direction;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalProject;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalSort;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
 
 /** Converts ORDER BY LIMIT. */
 class LimitOffsetScanToOrderByLimitConverter extends RelConverter<ResolvedLimitOffsetScan> {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/OrderByScanUnsupportedConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/OrderByScanUnsupportedConverter.java
index a496862..878b2b2 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/OrderByScanUnsupportedConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/OrderByScanUnsupportedConverter.java
@@ -19,7 +19,7 @@
 
 import com.google.zetasql.resolvedast.ResolvedNodes.ResolvedOrderByScan;
 import java.util.List;
-import org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
 
 /**
  * Always throws exception, represents the case when order by is used without limit.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ProjectScanConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ProjectScanConverter.java
index 323277f8..d19b765 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ProjectScanConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/ProjectScanConverter.java
@@ -21,9 +21,9 @@
 import com.google.zetasql.resolvedast.ResolvedNodes.ResolvedProjectScan;
 import java.util.Collections;
 import java.util.List;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.logical.LogicalProject;
-import org.apache.calcite.rex.RexNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalProject;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexNode;
 
 /** Converts projection. */
 class ProjectScanConverter extends RelConverter<ResolvedProjectScan> {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/QueryStatementConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/QueryStatementConverter.java
index 29c5cd4..5513482 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/QueryStatementConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/QueryStatementConverter.java
@@ -36,8 +36,8 @@
 import com.google.zetasql.resolvedast.ResolvedNodes.ResolvedQueryStmt;
 import java.util.Collections;
 import java.util.List;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMultimap;
-import org.apache.calcite.rel.RelNode;
 
 /**
  * Converts a resolved Zeta SQL query represented by a tree to corresponding Calcite representation.
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/RelConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/RelConverter.java
index 69f01e4..2b1b722 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/RelConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/RelConverter.java
@@ -20,10 +20,10 @@
 import com.google.zetasql.resolvedast.ResolvedNode;
 import java.util.List;
 import org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.FrameworkConfig;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.tools.FrameworkConfig;
 
 /** A rule that converts Zeta SQL resolved relational node to corresponding Calcite rel node. */
 abstract class RelConverter<T extends ResolvedNode> {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/SetOperationScanConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/SetOperationScanConverter.java
index 7c52d6e..375021b 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/SetOperationScanConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/SetOperationScanConverter.java
@@ -32,12 +32,12 @@
 import java.util.List;
 import java.util.function.BiFunction;
 import java.util.function.Function;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalIntersect;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalMinus;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalUnion;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.logical.LogicalIntersect;
-import org.apache.calcite.rel.logical.LogicalMinus;
-import org.apache.calcite.rel.logical.LogicalUnion;
 
 /** Converts set operations. */
 class SetOperationScanConverter extends RelConverter<ResolvedSetOperationScan> {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/SingleRowScanConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/SingleRowScanConverter.java
index f4553e5..4721b33 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/SingleRowScanConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/SingleRowScanConverter.java
@@ -19,8 +19,8 @@
 
 import com.google.zetasql.resolvedast.ResolvedNodes.ResolvedSingleRowScan;
 import java.util.List;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.logical.LogicalValues;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalValues;
 
 /** Converts a single row value. */
 class SingleRowScanConverter extends RelConverter<ResolvedSingleRowScan> {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/TableScanConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/TableScanConverter.java
index 2ad54db..2f0c1e6 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/TableScanConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/TableScanConverter.java
@@ -17,30 +17,30 @@
  */
 package org.apache.beam.sdk.extensions.sql.zetasql.translation;
 
-import static com.google.common.base.Preconditions.checkNotNull;
 import static com.google.zetasql.ZetaSQLType.TypeKind.TYPE_DATETIME;
 import static com.google.zetasql.ZetaSQLType.TypeKind.TYPE_NUMERIC;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Preconditions.checkNotNull;
 
-import com.google.common.collect.ImmutableList;
-import com.google.common.collect.ImmutableSet;
 import com.google.zetasql.ZetaSQLType.TypeKind;
 import com.google.zetasql.resolvedast.ResolvedColumn;
 import com.google.zetasql.resolvedast.ResolvedNodes.ResolvedTableScan;
 import java.util.List;
 import java.util.Properties;
 import org.apache.beam.sdk.extensions.sql.zetasql.TableResolution;
-import org.apache.calcite.config.CalciteConnectionConfigImpl;
-import org.apache.calcite.jdbc.CalciteSchema;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelOptTable;
-import org.apache.calcite.prepare.CalciteCatalogReader;
-import org.apache.calcite.prepare.RelOptTableImpl;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.RelRoot;
-import org.apache.calcite.rel.type.RelDataType;
-import org.apache.calcite.schema.SchemaPlus;
-import org.apache.calcite.schema.Table;
-import org.apache.calcite.schema.TranslatableTable;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.config.CalciteConnectionConfigImpl;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteSchema;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.prepare.CalciteCatalogReader;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.prepare.RelOptTableImpl;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelRoot;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.SchemaPlus;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Table;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.TranslatableTable;
 
 /** Converts table scan. */
 class TableScanConverter extends RelConverter<ResolvedTableScan> {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/WithRefScanConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/WithRefScanConverter.java
index 11180c6..6dceeef 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/WithRefScanConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/WithRefScanConverter.java
@@ -21,7 +21,7 @@
 import com.google.zetasql.resolvedast.ResolvedNodes.ResolvedWithRefScan;
 import java.util.Collections;
 import java.util.List;
-import org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
 
 /** Converts a call-site reference to a named WITH subquery. */
 class WithRefScanConverter extends RelConverter<ResolvedWithRefScan> {
diff --git a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/WithScanConverter.java b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/WithScanConverter.java
index 208c396..7159356 100644
--- a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/WithScanConverter.java
+++ b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/WithScanConverter.java
@@ -21,7 +21,7 @@
 import com.google.zetasql.resolvedast.ResolvedNodes.ResolvedWithScan;
 import java.util.Collections;
 import java.util.List;
-import org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
 
 /** Converts a named WITH. */
 class WithScanConverter extends RelConverter<ResolvedWithScan> {
diff --git a/sdks/java/extensions/sql/src/main/resources/org.apache.beam.sdks.java.extensions.sql.repackaged.org.codehaus.commons.compiler.properties b/sdks/java/extensions/sql/src/main/resources/org.apache.beam.vendor.calcite.v1_20_0.org.codehaus.commons.compiler.properties
similarity index 90%
rename from sdks/java/extensions/sql/src/main/resources/org.apache.beam.sdks.java.extensions.sql.repackaged.org.codehaus.commons.compiler.properties
rename to sdks/java/extensions/sql/src/main/resources/org.apache.beam.vendor.calcite.v1_20_0.org.codehaus.commons.compiler.properties
index 72a4eec..ab9a234 100644
--- a/sdks/java/extensions/sql/src/main/resources/org.apache.beam.sdks.java.extensions.sql.repackaged.org.codehaus.commons.compiler.properties
+++ b/sdks/java/extensions/sql/src/main/resources/org.apache.beam.vendor.calcite.v1_20_0.org.codehaus.commons.compiler.properties
@@ -15,4 +15,4 @@
 #  See the License for the specific language governing permissions and
 # limitations under the License.
 ################################################################################
-compilerFactory=org.apache.beam.sdks.java.extensions.sql.repackaged.org.codehaus.janino.CompilerFactory
+compilerFactory=org.apache.beam.vendor.calcite.v1_20_0.org.codehaus.janino.CompilerFactory
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamComplexTypeTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamComplexTypeTest.java
index d968e1f..24d23c9 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamComplexTypeTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamComplexTypeTest.java
@@ -31,8 +31,8 @@
 import org.apache.beam.sdk.transforms.Create;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
 import org.joda.time.DateTime;
 import org.joda.time.Duration;
 import org.joda.time.Instant;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlCastTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlCastTest.java
index 12b3a3c..01872a5 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlCastTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlCastTest.java
@@ -24,7 +24,6 @@
 import org.apache.beam.sdk.testing.PAssert;
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.transforms.Create;
-import org.apache.beam.sdk.transforms.SerializableFunctions;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
 import org.joda.time.DateTime;
@@ -46,10 +45,7 @@
     PCollection<Row> input =
         pipeline.apply(
             Create.of(Row.withSchema(INPUT_ROW_SCHEMA).addValues(1).addValue("20181018").build())
-                .withSchema(
-                    INPUT_ROW_SCHEMA,
-                    SerializableFunctions.identity(),
-                    SerializableFunctions.identity()));
+                .withRowSchema(INPUT_ROW_SCHEMA));
 
     Schema resultType =
         Schema.builder().addInt32Field("f_int").addNullableField("f_date", DATETIME).build();
@@ -78,10 +74,7 @@
     PCollection<Row> input =
         pipeline.apply(
             Create.of(Row.withSchema(INPUT_ROW_SCHEMA).addValues(1).addValue("20181018").build())
-                .withSchema(
-                    INPUT_ROW_SCHEMA,
-                    SerializableFunctions.identity(),
-                    SerializableFunctions.identity()));
+                .withRowSchema(INPUT_ROW_SCHEMA));
 
     Schema resultType = Schema.builder().addInt32Field("f_int").addDateTimeField("f_date").build();
 
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslAggregationCovarianceTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslAggregationCovarianceTest.java
index 3cd8ee6..edb765a 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslAggregationCovarianceTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslAggregationCovarianceTest.java
@@ -24,7 +24,6 @@
 import org.apache.beam.sdk.testing.PAssert;
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.transforms.Create;
-import org.apache.beam.sdk.transforms.SerializableFunctions;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
 import org.junit.Before;
@@ -59,11 +58,7 @@
                 2.0, 6, 4, 0, 8.0, 4.0, 1.0, 8, 4, 0)
             .getRows();
 
-    boundedInput =
-        pipeline.apply(
-            Create.of(rowsInTableB)
-                .withSchema(
-                    schema, SerializableFunctions.identity(), SerializableFunctions.identity()));
+    boundedInput = pipeline.apply(Create.of(rowsInTableB).withRowSchema(schema));
   }
 
   @Test
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslAggregationNullableTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslAggregationNullableTest.java
index 91e19db..91925f0 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslAggregationNullableTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslAggregationNullableTest.java
@@ -18,7 +18,6 @@
 package org.apache.beam.sdk.extensions.sql;
 
 import static org.apache.beam.sdk.extensions.sql.utils.RowAsserts.matchesScalar;
-import static org.apache.beam.sdk.transforms.SerializableFunctions.identity;
 import static org.junit.Assert.assertEquals;
 
 import java.util.List;
@@ -60,8 +59,7 @@
             .addRows(3, 2, 1)
             .getRows();
 
-    boundedInput =
-        PBegin.in(pipeline).apply(Create.of(rows).withSchema(schema, identity(), identity()));
+    boundedInput = PBegin.in(pipeline).apply(Create.of(rows).withRowSchema(schema));
   }
 
   @Test
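
The test changes in this and the surrounding files swap the three-argument `withSchema(schema, SerializableFunctions.identity(), SerializableFunctions.identity())` call for the single-argument `withRowSchema(schema)` overload, which suffices when the elements are already `Row`s. A minimal sketch of the newer pattern, using a hypothetical schema and query rather than values taken from this patch:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.extensions.sql.SqlTransform;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

public class WithRowSchemaSketch {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.create());

    // Hypothetical two-column schema; withRowSchema attaches it directly to Row elements.
    Schema schema =
        Schema.builder().addInt32Field("f_intGroupingKey").addInt32Field("f_intValue").build();

    PCollection<Row> rows =
        pipeline.apply(
            Create.of(
                    Row.withSchema(schema).addValues(0, 1).build(),
                    Row.withSchema(schema).addValues(0, 2).build())
                .withRowSchema(schema));

    // The schema'd PCollection can then be queried directly with Beam SQL.
    PCollection<Row> result =
        rows.apply(
            SqlTransform.query(
                "SELECT SUM(f_intValue) AS total FROM PCOLLECTION GROUP BY f_intGroupingKey"));

    pipeline.run().waitUntilFinish();
  }
}
```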
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslAggregationTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslAggregationTest.java
index f395a46..7591b73 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslAggregationTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslAggregationTest.java
@@ -36,7 +36,6 @@
 import org.apache.beam.sdk.testing.UsesTestStream;
 import org.apache.beam.sdk.transforms.Create;
 import org.apache.beam.sdk.transforms.SerializableFunction;
-import org.apache.beam.sdk.transforms.SerializableFunctions;
 import org.apache.beam.sdk.transforms.windowing.AfterPane;
 import org.apache.beam.sdk.transforms.windowing.DefaultTrigger;
 import org.apache.beam.sdk.transforms.windowing.FixedWindows;
@@ -50,6 +49,7 @@
 import org.joda.time.DateTime;
 import org.joda.time.Duration;
 import org.junit.Before;
+import org.junit.Ignore;
 import org.junit.Test;
 import org.junit.experimental.categories.Category;
 
@@ -104,13 +104,7 @@
             .getRows();
 
     boundedInput3 =
-        pipeline.apply(
-            "boundedInput3",
-            Create.of(rowsInTableB)
-                .withSchema(
-                    schemaInTableB,
-                    SerializableFunctions.identity(),
-                    SerializableFunctions.identity()));
+        pipeline.apply("boundedInput3", Create.of(rowsInTableB).withRowSchema(schemaInTableB));
   }
 
   /** GROUP-BY with single aggregation function with bounded PCollection. */
@@ -424,8 +418,7 @@
 
     PCollection<Row> input =
         pipeline.apply(
-            TestStream.create(
-                    inputSchema, SerializableFunctions.identity(), SerializableFunctions.identity())
+            TestStream.create(inputSchema)
                 .addElements(
                     Row.withSchema(inputSchema)
                         .addValues(1, parseTimestampWithoutTimeZone("2017-01-01 01:01:01"))
@@ -696,7 +689,7 @@
                             2, 4,
                             2, 5)
                         .getRows()))
-            .setSchema(schema, SerializableFunctions.identity(), SerializableFunctions.identity());
+            .setRowSchema(schema);
 
     String sql = "SELECT SUM(f_intValue) FROM PCOLLECTION GROUP BY f_intGroupingKey";
 
@@ -708,6 +701,38 @@
   }
 
   @Test
+  @Ignore("https://issues.apache.org/jira/browse/BEAM-8317")
+  public void testSupportsAggregationWithFilterWithoutProjection() throws Exception {
+    pipeline.enableAbandonedNodeEnforcement(false);
+
+    Schema schema =
+        Schema.builder().addInt32Field("f_intGroupingKey").addInt32Field("f_intValue").build();
+
+    PCollection<Row> inputRows =
+        pipeline
+            .apply(
+                Create.of(
+                    TestUtils.rowsBuilderOf(schema)
+                        .addRows(
+                            0, 1,
+                            0, 2,
+                            1, 3,
+                            2, 4,
+                            2, 5)
+                        .getRows()))
+            .setRowSchema(schema);
+
+    String sql =
+        "SELECT SUM(f_intValue) FROM PCOLLECTION WHERE f_intValue < 5 GROUP BY f_intGroupingKey";
+
+    PCollection<Row> result = inputRows.apply("sql", SqlTransform.query(sql));
+
+    PAssert.that(result).containsInAnyOrder(rowsWithSingleIntField("sum", Arrays.asList(3, 3, 4)));
+
+    pipeline.run();
+  }
+
+  @Test
   public void testSupportsNonGlobalWindowWithCustomTrigger() {
     DateTime startTime = parseTimestampWithoutTimeZone("2017-1-1 0:0:0");
 
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslAggregationVarianceTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslAggregationVarianceTest.java
index 5f0dc12..808b27a 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslAggregationVarianceTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslAggregationVarianceTest.java
@@ -24,7 +24,6 @@
 import org.apache.beam.sdk.testing.PAssert;
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.transforms.Create;
-import org.apache.beam.sdk.transforms.SerializableFunctions;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
 import org.junit.Before;
@@ -55,11 +54,7 @@
                 1, 1.0, 0, 4, 4.0, 0, 7, 7.0, 0, 13, 13.0, 0, 5, 5.0, 0, 10, 10.0, 0, 17, 17.0, 0)
             .getRows();
 
-    boundedInput =
-        pipeline.apply(
-            Create.of(rowsInTableB)
-                .withSchema(
-                    schema, SerializableFunctions.identity(), SerializableFunctions.identity()));
+    boundedInput = pipeline.apply(Create.of(rowsInTableB).withRowSchema(schema));
   }
 
   @Test
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslArrayTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslArrayTest.java
index eb90274..a9da87a 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslArrayTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslArrayTest.java
@@ -22,7 +22,6 @@
 import org.apache.beam.sdk.testing.PAssert;
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.transforms.Create;
-import org.apache.beam.sdk.transforms.SerializableFunctions;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PCollectionTuple;
 import org.apache.beam.sdk.values.Row;
@@ -113,13 +112,7 @@
     Row inputRow = Row.withSchema(INPUT_SCHEMA).addValues(1).addArray(Arrays.asList("111")).build();
 
     PCollection<Row> input =
-        pipeline.apply(
-            "boundedInput1",
-            Create.of(inputRow)
-                .withSchema(
-                    INPUT_SCHEMA,
-                    SerializableFunctions.identity(),
-                    SerializableFunctions.identity()));
+        pipeline.apply("boundedInput1", Create.of(inputRow).withRowSchema(INPUT_SCHEMA));
 
     Schema resultType = Schema.builder().addStringField("f_arrElem").build();
 
@@ -154,11 +147,7 @@
     PCollection<Row> input =
         pipeline.apply(
             "boundedInput1",
-            Create.empty(TypeDescriptor.of(Row.class))
-                .withSchema(
-                    INPUT_SCHEMA,
-                    SerializableFunctions.identity(),
-                    SerializableFunctions.identity()));
+            Create.empty(TypeDescriptor.of(Row.class)).withRowSchema(INPUT_SCHEMA));
 
     // Because we have a multi-part FROM the DSL considers it multi-input
     TupleTag<Row> mainTag = new TupleTag<Row>("main") {};
@@ -184,11 +173,7 @@
     PCollection<Row> input =
         pipeline.apply(
             "boundedInput1",
-            Create.empty(TypeDescriptor.of(Row.class))
-                .withSchema(
-                    INPUT_SCHEMA,
-                    SerializableFunctions.identity(),
-                    SerializableFunctions.identity()));
+            Create.empty(TypeDescriptor.of(Row.class)).withRowSchema(INPUT_SCHEMA));
 
     // Because we have a multi-part FROM the DSL considers it multi-input
     TupleTag<Row> mainTag = new TupleTag<Row>("main") {};
@@ -222,13 +207,7 @@
         Row.withSchema(INPUT_SCHEMA).addValues(13).addArray(Arrays.asList("444", "555")).build();
 
     PCollection<Row> input =
-        pipeline.apply(
-            "boundedInput1",
-            Create.of(row1, row2)
-                .withSchema(
-                    INPUT_SCHEMA,
-                    SerializableFunctions.identity(),
-                    SerializableFunctions.identity()));
+        pipeline.apply("boundedInput1", Create.of(row1, row2).withRowSchema(INPUT_SCHEMA));
 
     // Because we have a multi-part FROM the DSL considers it multi-input
     TupleTag<Row> mainTag = new TupleTag<Row>("main") {};
@@ -287,8 +266,7 @@
                                 Row.withSchema(elementSchema).addValues("CC", 33).build(),
                                 Row.withSchema(elementSchema).addValues("DD", 44).build()))
                         .build())
-                .withSchema(
-                    inputType, SerializableFunctions.identity(), SerializableFunctions.identity()));
+                .withRowSchema(inputType));
 
     PCollection<Row> result =
         input
@@ -343,8 +321,7 @@
                                 Row.withSchema(elementSchema).addValues("CC", 33).build(),
                                 Row.withSchema(elementSchema).addValues("DD", 44).build()))
                         .build())
-                .withSchema(
-                    inputType, SerializableFunctions.identity(), SerializableFunctions.identity()));
+                .withRowSchema(inputType));
 
     PCollection<Row> result =
         input
@@ -389,8 +366,7 @@
                                 Row.withSchema(elementSchema).addValues("CC", 33).build(),
                                 Row.withSchema(elementSchema).addValues("DD", 44).build()))
                         .build())
-                .withSchema(
-                    inputType, SerializableFunctions.identity(), SerializableFunctions.identity()));
+                .withRowSchema(inputType));
 
     PCollection<Row> result =
         input
@@ -417,7 +393,6 @@
                     .addValues(2)
                     .addArray(Arrays.asList("33", "44", "55"))
                     .build())
-            .withSchema(
-                INPUT_SCHEMA, SerializableFunctions.identity(), SerializableFunctions.identity()));
+            .withRowSchema(INPUT_SCHEMA));
   }
 }
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslBase.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslBase.java
index 8ad2b92..ad26d4a 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslBase.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslBase.java
@@ -29,7 +29,6 @@
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.testing.TestStream;
 import org.apache.beam.sdk.transforms.Create;
-import org.apache.beam.sdk.transforms.SerializableFunctions;
 import org.apache.beam.sdk.transforms.windowing.FixedWindows;
 import org.apache.beam.sdk.transforms.windowing.Window;
 import org.apache.beam.sdk.values.PBegin;
@@ -256,66 +255,34 @@
   @Before
   public void preparePCollections() {
     boundedInput1 =
-        pipeline.apply(
-            "boundedInput1",
-            Create.of(rowsInTableA)
-                .withSchema(
-                    schemaInTableA,
-                    SerializableFunctions.identity(),
-                    SerializableFunctions.identity()));
+        pipeline.apply("boundedInput1", Create.of(rowsInTableA).withRowSchema(schemaInTableA));
 
     boundedInput2 =
         pipeline.apply(
-            "boundedInput2",
-            Create.of(rowsInTableA.get(0))
-                .withSchema(
-                    schemaInTableA,
-                    SerializableFunctions.identity(),
-                    SerializableFunctions.identity()));
+            "boundedInput2", Create.of(rowsInTableA.get(0)).withRowSchema(schemaInTableA));
 
     boundedInputFloatDouble =
         pipeline.apply(
             "boundedInputFloatDouble",
-            Create.of(rowsOfFloatDouble)
-                .withSchema(
-                    schemaFloatDouble,
-                    SerializableFunctions.identity(),
-                    SerializableFunctions.identity()));
+            Create.of(rowsOfFloatDouble).withRowSchema(schemaFloatDouble));
 
     boundedInputBytes =
-        pipeline.apply(
-            "boundedInputBytes",
-            Create.of(rowsOfBytes)
-                .withSchema(
-                    schemaBytes,
-                    SerializableFunctions.identity(),
-                    SerializableFunctions.identity()));
+        pipeline.apply("boundedInputBytes", Create.of(rowsOfBytes).withRowSchema(schemaBytes));
 
     boundedInputBytesPaddingTest =
         pipeline.apply(
             "boundedInputBytesPaddingTest",
-            Create.of(rowsOfBytesPaddingTest)
-                .withSchema(
-                    schemaBytesPaddingTest,
-                    SerializableFunctions.identity(),
-                    SerializableFunctions.identity()));
+            Create.of(rowsOfBytesPaddingTest).withRowSchema(schemaBytesPaddingTest));
     boundedInputMonthly =
         pipeline.apply(
-            "boundedInputMonthly",
-            Create.of(monthlyRowsInTableA)
-                .withSchema(
-                    schemaInTableA,
-                    SerializableFunctions.identity(),
-                    SerializableFunctions.identity()));
+            "boundedInputMonthly", Create.of(monthlyRowsInTableA).withRowSchema(schemaInTableA));
 
     unboundedInput1 = prepareUnboundedPCollection1();
     unboundedInput2 = prepareUnboundedPCollection2();
   }
 
   private PCollection<Row> prepareUnboundedPCollection1() {
-    TestStream.Builder<Row> values =
-        TestStream.create(
-            schemaInTableA, SerializableFunctions.identity(), SerializableFunctions.identity());
+    TestStream.Builder<Row> values = TestStream.create(schemaInTableA);
 
     for (Row row : rowsInTableA) {
       values = values.advanceWatermarkTo(new Instant(row.getDateTime("f_timestamp")));
@@ -330,9 +297,7 @@
   }
 
   private PCollection<Row> prepareUnboundedPCollection2() {
-    TestStream.Builder<Row> values =
-        TestStream.create(
-            schemaInTableA, SerializableFunctions.identity(), SerializableFunctions.identity());
+    TestStream.Builder<Row> values = TestStream.create(schemaInTableA);
 
     Row row = rowsInTableA.get(0);
     values = values.advanceWatermarkTo(new Instant(row.getDateTime("f_timestamp")));
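The unbounded test inputs get the same treatment: TestStream.create(Schema) replaces the overload that took explicit identity conversion functions alongside the schema. A minimal sketch of that builder with an illustrative two-field schema, again assuming the DirectRunner is available:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.testing.TestStream;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;
import org.joda.time.DateTime;

public class TestStreamRowSketch {
  public static void main(String[] args) {
    Schema schema =
        Schema.builder().addInt32Field("f_int").addDateTimeField("f_timestamp").build();
    Row row =
        Row.withSchema(schema).addValues(1, new DateTime(2017, 1, 1, 1, 1, 1)).build();

    // The schema-aware overload infers a SchemaCoder for the Row elements.
    TestStream<Row> stream =
        TestStream.create(schema).addElements(row).advanceWatermarkToInfinity();

    Pipeline pipeline = Pipeline.create();
    PCollection<Row> unbounded = pipeline.apply(stream);
    pipeline.run().waitUntilFinish();
  }
}
```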
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslNestedRowsTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslNestedRowsTest.java
index 2f1f3c0..db6cc14 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslNestedRowsTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslNestedRowsTest.java
@@ -22,7 +22,6 @@
 import org.apache.beam.sdk.testing.PAssert;
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.transforms.Create;
-import org.apache.beam.sdk.transforms.SerializableFunctions;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
 import org.junit.Rule;
@@ -62,8 +61,7 @@
                         .addValues(
                             1, Row.withSchema(nestedSchema).addValues(312, "CC", 313).build())
                         .build())
-                .withSchema(
-                    inputType, SerializableFunctions.identity(), SerializableFunctions.identity()));
+                .withRowSchema(inputType));
 
     PCollection<Row> result =
         input
@@ -106,8 +104,7 @@
                         .addValues(
                             1, Row.withSchema(nestedSchema).addValues(312, "CC", 313).build())
                         .build())
-                .withSchema(
-                    inputType, SerializableFunctions.identity(), SerializableFunctions.identity()));
+                .withRowSchema(inputType));
 
     PCollection<Row> result =
         input
@@ -148,8 +145,7 @@
                         .addValues(
                             2, Row.withSchema(nestedSchema).addValues(412, "DD", 413).build())
                         .build())
-                .withSchema(
-                    inputType, SerializableFunctions.identity(), SerializableFunctions.identity()));
+                .withRowSchema(inputType));
 
     PCollection<Row> result =
         input
@@ -200,8 +196,7 @@
                                 .addValues(412, "DD", 413, Arrays.asList("three", "four"))
                                 .build())
                         .build())
-                .withSchema(
-                    inputType, SerializableFunctions.identity(), SerializableFunctions.identity()));
+                .withRowSchema(inputType));
 
     PCollection<Row> result =
         input
@@ -251,8 +246,7 @@
                                 .addValues(412, "DD", 413, Arrays.asList("three", "four"))
                                 .build())
                         .build())
-                .withSchema(
-                    inputType, SerializableFunctions.identity(), SerializableFunctions.identity()));
+                .withRowSchema(inputType));
 
     PCollection<Row> result =
         input
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslSqlStdOperatorsTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslSqlStdOperatorsTest.java
index c083aeb..28ac9f4 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslSqlStdOperatorsTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslSqlStdOperatorsTest.java
@@ -43,14 +43,14 @@
 import org.apache.beam.sdk.extensions.sql.integrationtest.BeamSqlBuiltinFunctionsIntegrationTestBase;
 import org.apache.beam.sdk.schemas.Schema;
 import org.apache.beam.sdk.schemas.Schema.FieldType;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Joiner;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Lists;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Ordering;
-import org.apache.calcite.runtime.SqlFunctions;
-import org.apache.calcite.sql.SqlKind;
-import org.apache.calcite.sql.SqlOperator;
-import org.apache.calcite.sql.fun.SqlStdOperatorTable;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Joiner;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.Lists;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.Ordering;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.runtime.SqlFunctions;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlKind;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlOperator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.fun.SqlStdOperatorTable;
 import org.junit.Ignore;
 import org.junit.Rule;
 import org.junit.Test;
@@ -58,7 +58,7 @@
 
 /**
  * DSL compliance tests for the row-level operators of {@link
- * org.apache.calcite.sql.fun.SqlStdOperatorTable}.
+ * org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.fun.SqlStdOperatorTable}.
  */
 public class BeamSqlDslSqlStdOperatorsTest extends BeamSqlBuiltinFunctionsIntegrationTestBase {
   private static final BigDecimal ZERO = BigDecimal.valueOf(0.0);
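From here down most of the churn is mechanical: Calcite, and the Guava it bundles, is now referenced through Beam's relocated org.apache.beam.vendor.calcite.v1_20_0 prefix, so the tests no longer compile against the unshaded org.apache.calcite packages. A minimal sketch showing that only the package prefix changes, assuming the vendored Calcite artifact is on the classpath:

```java
import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlKind;
import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlOperator;
import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.fun.SqlStdOperatorTable;

public class VendoredCalciteSketch {
  public static void main(String[] args) {
    // Identical Calcite API; only the import prefix differs from the unshaded jar.
    for (SqlOperator op : SqlStdOperatorTable.instance().getOperatorList()) {
      if (op.getKind() == SqlKind.PLUS) {
        System.out.println(op.getName()); // "+" (plus datetime-plus variants)
      }
    }
  }
}
```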
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslUdfUdafTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslUdfUdafTest.java
index 7c140ae..75e8a08 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslUdfUdafTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlDslUdfUdafTest.java
@@ -37,9 +37,9 @@
 import org.apache.beam.sdk.values.PCollectionTuple;
 import org.apache.beam.sdk.values.Row;
 import org.apache.beam.sdk.values.TupleTag;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
-import org.apache.calcite.linq4j.function.Parameter;
-import org.apache.calcite.schema.TranslatableTable;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.function.Parameter;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.TranslatableTable;
 import org.joda.time.Instant;
 import org.junit.Test;
 
@@ -174,7 +174,9 @@
     pipeline.run().waitUntilFinish();
   }
 
-  /** test {@link org.apache.calcite.schema.TableMacro} UDF. */
+  /**
+   * test {@link org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.TableMacro} UDF.
+   */
   @Test
   public void testTableMacroUdf() throws Exception {
     String sql1 = "SELECT * FROM table(range_udf(0, 3))";
@@ -345,7 +347,10 @@
     }
   }
 
-  /** UDF to test support for {@link org.apache.calcite.schema.TableMacro}. */
+  /**
+   * UDF to test support for {@link
+   * org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.TableMacro}.
+   */
   public static final class RangeUdf implements BeamSqlUdf {
     public static TranslatableTable eval(int startInclusive, int endExclusive) {
       Schema schema = Schema.of(Schema.Field.of("f0", Schema.FieldType.INT32));
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlExplainTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlExplainTest.java
index 3f5a8f0..3f0d2f2 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlExplainTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlExplainTest.java
@@ -21,9 +21,9 @@
 
 import org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider;
 import org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore;
-import org.apache.calcite.sql.parser.SqlParseException;
-import org.apache.calcite.tools.RelConversionException;
-import org.apache.calcite.tools.ValidationException;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.parser.SqlParseException;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.RelConversionException;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.ValidationException;
 import org.junit.Before;
 import org.junit.Ignore;
 
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlMapTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlMapTest.java
index 350a096..e175530 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlMapTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlMapTest.java
@@ -21,10 +21,9 @@
 import org.apache.beam.sdk.testing.PAssert;
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.transforms.Create;
-import org.apache.beam.sdk.transforms.SerializableFunctions;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
 import org.junit.Rule;
 import org.junit.Test;
 import org.junit.rules.ExpectedException;
@@ -145,9 +144,6 @@
                     .addValues(2)
                     .addValue(ImmutableMap.of("key33", 33, "key44", 44, "key55", 55))
                     .build())
-            .withSchema(
-                INPUT_ROW_TYPE,
-                SerializableFunctions.identity(),
-                SerializableFunctions.identity()));
+            .withRowSchema(INPUT_ROW_TYPE));
   }
 }
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlMultipleSchemasTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlMultipleSchemasTest.java
index f36b6d5..41f916b 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlMultipleSchemasTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/BeamSqlMultipleSchemasTest.java
@@ -29,7 +29,7 @@
 import org.apache.beam.sdk.values.PBegin;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
 import org.junit.Rule;
 import org.junit.Test;
 import org.junit.rules.ExpectedException;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/PubsubToBigqueryIT.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/PubsubToBigqueryIT.java
index b4d2f1f..dc73b20 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/PubsubToBigqueryIT.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/PubsubToBigqueryIT.java
@@ -34,8 +34,8 @@
 import org.apache.beam.sdk.schemas.Schema;
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.values.Row;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
 import org.joda.time.Duration;
 import org.joda.time.Instant;
 import org.junit.Rule;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/TestUtils.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/TestUtils.java
index b7d6791..3ab0ddd 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/TestUtils.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/TestUtils.java
@@ -17,7 +17,7 @@
  */
 package org.apache.beam.sdk.extensions.sql;
 
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Preconditions.checkArgument;
 
 import java.util.ArrayList;
 import java.util.Arrays;
@@ -27,7 +27,6 @@
 import org.apache.beam.sdk.schemas.Schema;
 import org.apache.beam.sdk.testing.TestStream;
 import org.apache.beam.sdk.transforms.DoFn;
-import org.apache.beam.sdk.transforms.SerializableFunctions;
 import org.apache.beam.sdk.values.PBegin;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PCollectionTuple;
@@ -209,9 +208,7 @@
         type = rows.get(0).getSchema();
       }
 
-      TestStream.Builder<Row> values =
-          TestStream.create(
-              type, SerializableFunctions.identity(), SerializableFunctions.identity());
+      TestStream.Builder<Row> values = TestStream.create(type);
 
       for (Row row : rows) {
         if (timestampField != null) {
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/JdbcDriverTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/JdbcDriverTest.java
index 0567908..5d01661 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/JdbcDriverTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/JdbcDriverTest.java
@@ -50,10 +50,10 @@
 import org.apache.beam.sdk.schemas.Schema;
 import org.apache.beam.sdk.util.ReleaseInfo;
 import org.apache.beam.sdk.values.Row;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
-import org.apache.calcite.jdbc.CalciteConnection;
-import org.apache.calcite.jdbc.CalciteSchema;
-import org.apache.calcite.schema.SchemaPlus;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteConnection;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteSchema;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.SchemaPlus;
 import org.joda.time.DateTime;
 import org.joda.time.Duration;
 import org.joda.time.ReadableInstant;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/parser/BeamDDLNestedTypesTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/parser/BeamDDLNestedTypesTest.java
index 39b5f0d..b472bba 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/parser/BeamDDLNestedTypesTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/parser/BeamDDLNestedTypesTest.java
@@ -32,7 +32,7 @@
 import org.apache.beam.sdk.extensions.sql.utils.QuickCheckGenerators.PrimitiveTypes;
 import org.apache.beam.sdk.schemas.Schema;
 import org.apache.beam.sdk.schemas.Schema.FieldType;
-import org.apache.calcite.sql.parser.SqlParseException;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.parser.SqlParseException;
 import org.junit.runner.RunWith;
 
 /**
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/planner/NodeStatsTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/planner/NodeStatsTest.java
index 10e0b61..9e442c5 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/planner/NodeStatsTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/planner/NodeStatsTest.java
@@ -21,11 +21,11 @@
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils;
 import org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable;
 import org.apache.beam.sdk.schemas.Schema;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.plan.volcano.RelSubset;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.SingleRel;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.volcano.RelSubset;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.SingleRel;
 import org.junit.Assert;
 import org.junit.BeforeClass;
 import org.junit.Test;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamAggregationRelTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamAggregationRelTest.java
index df9305b..3ebe01e 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamAggregationRelTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamAggregationRelTest.java
@@ -23,7 +23,7 @@
 import org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable;
 import org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable;
 import org.apache.beam.sdk.schemas.Schema;
-import org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
 import org.joda.time.DateTime;
 import org.joda.time.Duration;
 import org.junit.Assert;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCalcRelTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCalcRelTest.java
index ad64f0d..8b1c2dc 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCalcRelTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCalcRelTest.java
@@ -23,7 +23,7 @@
 import org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable;
 import org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable;
 import org.apache.beam.sdk.schemas.Schema;
-import org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
 import org.joda.time.DateTime;
 import org.joda.time.Duration;
 import org.junit.Assert;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCoGBKJoinRelBoundedVsBoundedTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCoGBKJoinRelBoundedVsBoundedTest.java
index f572b67..6859f96 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCoGBKJoinRelBoundedVsBoundedTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCoGBKJoinRelBoundedVsBoundedTest.java
@@ -25,7 +25,7 @@
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
 import org.hamcrest.core.StringContains;
 import org.junit.Assert;
 import org.junit.BeforeClass;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCoGBKJoinRelUnboundedVsUnboundedTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCoGBKJoinRelUnboundedVsUnboundedTest.java
index 1f73e85..f310265 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCoGBKJoinRelUnboundedVsUnboundedTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamCoGBKJoinRelUnboundedVsUnboundedTest.java
@@ -28,7 +28,7 @@
 import org.apache.beam.sdk.transforms.ParDo;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
 import org.joda.time.DateTime;
 import org.joda.time.Duration;
 import org.junit.Assert;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamEnumerableConverterTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamEnumerableConverterTest.java
index 7a0d04b..bbba865 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamEnumerableConverterTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamEnumerableConverterTest.java
@@ -21,7 +21,6 @@
 import static org.junit.Assert.assertFalse;
 import static org.junit.Assert.assertTrue;
 
-import com.google.common.collect.ImmutableList;
 import java.math.BigDecimal;
 import java.util.List;
 import org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics;
@@ -40,17 +39,18 @@
 import org.apache.beam.sdk.values.PDone;
 import org.apache.beam.sdk.values.POutput;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.adapter.java.JavaTypeFactory;
-import org.apache.calcite.jdbc.JavaTypeFactoryImpl;
-import org.apache.calcite.linq4j.Enumerable;
-import org.apache.calcite.linq4j.Enumerator;
-import org.apache.calcite.plan.RelOptCluster;
-import org.apache.calcite.plan.volcano.VolcanoPlanner;
-import org.apache.calcite.prepare.RelOptTableImpl;
-import org.apache.calcite.rel.type.RelDataType;
-import org.apache.calcite.rel.type.RelDataTypeSystem;
-import org.apache.calcite.rex.RexBuilder;
-import org.apache.calcite.rex.RexLiteral;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.java.JavaTypeFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.JavaTypeFactoryImpl;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.Enumerable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.Enumerator;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptCluster;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.volcano.VolcanoPlanner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.prepare.RelOptTableImpl;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeSystem;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexBuilder;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rex.RexLiteral;
 import org.junit.Test;
 import org.junit.experimental.categories.Category;
 import org.junit.runner.RunWith;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIOSourceRelTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIOSourceRelTest.java
index 22fb229..ff0d70f 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIOSourceRelTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIOSourceRelTest.java
@@ -24,8 +24,8 @@
 import org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable;
 import org.apache.beam.sdk.schemas.Schema;
 import org.apache.beam.sdk.testing.TestPipeline;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.metadata.RelMetadataQuery;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.metadata.RelMetadataQuery;
 import org.joda.time.DateTime;
 import org.joda.time.Duration;
 import org.junit.Assert;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIntersectRelTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIntersectRelTest.java
index 2b58272..d5acfab 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIntersectRelTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamIntersectRelTest.java
@@ -26,7 +26,7 @@
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
 import org.junit.Assert;
 import org.junit.BeforeClass;
 import org.junit.Rule;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamMinusRelTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamMinusRelTest.java
index c29eeb2..074c447 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamMinusRelTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamMinusRelTest.java
@@ -28,7 +28,7 @@
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
 import org.joda.time.DateTime;
 import org.joda.time.Duration;
 import org.junit.Assert;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSideInputJoinRelTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSideInputJoinRelTest.java
index 91043c3..39e2a73 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSideInputJoinRelTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSideInputJoinRelTest.java
@@ -28,7 +28,7 @@
 import org.apache.beam.sdk.transforms.ParDo;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
 import org.joda.time.DateTime;
 import org.joda.time.Duration;
 import org.junit.Assert;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSortRelTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSortRelTest.java
index 15cf8cb..bba4876 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSortRelTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamSortRelTest.java
@@ -25,7 +25,7 @@
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
 import org.joda.time.DateTime;
 import org.junit.Assert;
 import org.junit.Before;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUncollectRelTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUncollectRelTest.java
index d5b2857..640a1df 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUncollectRelTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUncollectRelTest.java
@@ -27,7 +27,7 @@
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
 import org.junit.Assert;
 import org.junit.Rule;
 import org.junit.Test;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUnionRelTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUnionRelTest.java
index 3ed476d..6f77751 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUnionRelTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamUnionRelTest.java
@@ -26,7 +26,7 @@
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
 import org.junit.Assert;
 import org.junit.BeforeClass;
 import org.junit.Rule;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamValuesRelTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamValuesRelTest.java
index 065b558..0787751 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamValuesRelTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rel/BeamValuesRelTest.java
@@ -25,7 +25,7 @@
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
 import org.junit.Assert;
 import org.junit.BeforeClass;
 import org.junit.Rule;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rule/JoinReorderingTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rule/JoinReorderingTest.java
index 9b2602f..2d0a1be 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rule/JoinReorderingTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/rule/JoinReorderingTest.java
@@ -31,43 +31,43 @@
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.options.PipelineOptionsFactory;
 import org.apache.beam.sdk.values.Row;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
-import org.apache.calcite.DataContext;
-import org.apache.calcite.adapter.enumerable.EnumerableConvention;
-import org.apache.calcite.adapter.enumerable.EnumerableRules;
-import org.apache.calcite.linq4j.Enumerable;
-import org.apache.calcite.linq4j.Linq4j;
-import org.apache.calcite.plan.ConventionTraitDef;
-import org.apache.calcite.plan.RelOptRule;
-import org.apache.calcite.plan.RelTraitSet;
-import org.apache.calcite.rel.RelCollationTraitDef;
-import org.apache.calcite.rel.RelCollations;
-import org.apache.calcite.rel.RelFieldCollation;
-import org.apache.calcite.rel.RelNode;
-import org.apache.calcite.rel.RelRoot;
-import org.apache.calcite.rel.core.Join;
-import org.apache.calcite.rel.core.TableScan;
-import org.apache.calcite.rel.rules.JoinCommuteRule;
-import org.apache.calcite.rel.rules.SortProjectTransposeRule;
-import org.apache.calcite.rel.type.RelDataType;
-import org.apache.calcite.rel.type.RelDataTypeFactory;
-import org.apache.calcite.schema.ScannableTable;
-import org.apache.calcite.schema.SchemaPlus;
-import org.apache.calcite.schema.Statistic;
-import org.apache.calcite.schema.Statistics;
-import org.apache.calcite.schema.Table;
-import org.apache.calcite.schema.impl.AbstractSchema;
-import org.apache.calcite.schema.impl.AbstractTable;
-import org.apache.calcite.sql.SqlNode;
-import org.apache.calcite.sql.parser.SqlParser;
-import org.apache.calcite.tools.FrameworkConfig;
-import org.apache.calcite.tools.Frameworks;
-import org.apache.calcite.tools.Planner;
-import org.apache.calcite.tools.Programs;
-import org.apache.calcite.tools.RuleSet;
-import org.apache.calcite.tools.RuleSets;
-import org.apache.calcite.util.ImmutableBitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.DataContext;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.EnumerableConvention;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.enumerable.EnumerableRules;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.Enumerable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.linq4j.Linq4j;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.ConventionTraitDef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelCollationTraitDef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelCollations;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelFieldCollation;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.RelRoot;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Join;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.TableScan;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.JoinCommuteRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.rules.SortProjectTransposeRule;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.ScannableTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.SchemaPlus;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Statistic;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Statistics;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Table;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.impl.AbstractSchema;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.impl.AbstractTable;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlNode;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.parser.SqlParser;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.FrameworkConfig;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.Frameworks;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.Planner;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.Programs;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.RuleSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.RuleSets;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.ImmutableBitSet;
 import org.junit.Assert;
 import org.junit.Test;
 
@@ -417,7 +417,8 @@
   }
 
   @Override
-  protected Map<String, org.apache.calcite.schema.Table> getTableMap() {
+  protected Map<String, org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Table>
+      getTableMap() {
     return tables;
   }
 
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/schema/BeamSqlRowCoderTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/schema/BeamSqlRowCoderTest.java
index 6f14065..3624d127 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/schema/BeamSqlRowCoderTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/schema/BeamSqlRowCoderTest.java
@@ -23,12 +23,11 @@
 import org.apache.beam.sdk.schemas.Schema;
 import org.apache.beam.sdk.schemas.SchemaCoder;
 import org.apache.beam.sdk.testing.CoderProperties;
-import org.apache.beam.sdk.transforms.SerializableFunctions;
 import org.apache.beam.sdk.values.Row;
-import org.apache.calcite.jdbc.JavaTypeFactoryImpl;
-import org.apache.calcite.rel.type.RelDataType;
-import org.apache.calcite.rel.type.RelDataTypeSystem;
-import org.apache.calcite.sql.type.SqlTypeName;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.JavaTypeFactoryImpl;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeSystem;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.SqlTypeName;
 import org.joda.time.DateTime;
 import org.junit.Test;
 
@@ -70,9 +69,7 @@
                 DateTime.now(),
                 true)
             .build();
-    Coder<Row> coder =
-        SchemaCoder.of(
-            beamSchema, SerializableFunctions.identity(), SerializableFunctions.identity());
+    Coder<Row> coder = SchemaCoder.of(beamSchema);
     CoderProperties.coderDecodeEncodeEqual(coder, row);
   }
 }
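SchemaCoder.of(beamSchema) above is the single-argument counterpart of the old three-argument form with identity conversions. A minimal round-trip sketch of that coder, using illustrative field names and in-memory streams:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import org.apache.beam.sdk.coders.Coder;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.schemas.SchemaCoder;
import org.apache.beam.sdk.values.Row;

public class SchemaCoderRoundTrip {
  public static void main(String[] args) throws Exception {
    Schema schema = Schema.builder().addInt64Field("f_long").addStringField("f_string").build();
    Row row = Row.withSchema(schema).addValues(42L, "hello").build();

    // Equivalent to the pre-change SchemaCoder.of(schema, identity, identity).
    Coder<Row> coder = SchemaCoder.of(schema);

    ByteArrayOutputStream out = new ByteArrayOutputStream();
    coder.encode(row, out);
    Row decoded = coder.decode(new ByteArrayInputStream(out.toByteArray()));
    System.out.println(decoded.equals(row)); // true
  }
}
```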
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/utils/CalciteUtilsTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/utils/CalciteUtilsTest.java
index 02349a5..50b6ab2 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/utils/CalciteUtilsTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/impl/utils/CalciteUtilsTest.java
@@ -24,11 +24,11 @@
 import java.util.Map;
 import java.util.stream.Collectors;
 import org.apache.beam.sdk.schemas.Schema;
-import org.apache.calcite.rel.type.RelDataType;
-import org.apache.calcite.rel.type.RelDataTypeFactory;
-import org.apache.calcite.rel.type.RelDataTypeSystem;
-import org.apache.calcite.sql.type.SqlTypeFactoryImpl;
-import org.apache.calcite.sql.type.SqlTypeName;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataType;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeSystem;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.SqlTypeFactoryImpl;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.SqlTypeName;
 import org.junit.Before;
 import org.junit.Test;
 
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/integrationtest/BeamSqlBuiltinFunctionsIntegrationTestBase.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/integrationtest/BeamSqlBuiltinFunctionsIntegrationTestBase.java
index 24f6a37..f025ee5 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/integrationtest/BeamSqlBuiltinFunctionsIntegrationTestBase.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/integrationtest/BeamSqlBuiltinFunctionsIntegrationTestBase.java
@@ -19,7 +19,7 @@
 
 import static org.apache.beam.sdk.extensions.sql.utils.DateTimeUtils.parseTimestampWithUTCTimeZone;
 import static org.apache.beam.sdk.extensions.sql.utils.RowAsserts.matchesScalar;
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Preconditions.checkArgument;
 import static org.junit.Assert.assertTrue;
 
 import com.google.auto.value.AutoValue;
@@ -46,14 +46,13 @@
 import org.apache.beam.sdk.transforms.Create;
 import org.apache.beam.sdk.transforms.MapElements;
 import org.apache.beam.sdk.transforms.PTransform;
-import org.apache.beam.sdk.transforms.SerializableFunctions;
 import org.apache.beam.sdk.values.PBegin;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PDone;
 import org.apache.beam.sdk.values.Row;
 import org.apache.beam.sdk.values.TypeDescriptors;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Iterables;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.Iterables;
 import org.joda.time.DateTime;
 import org.junit.Rule;
 
@@ -391,12 +390,7 @@
       public PDone expand(PBegin begin) {
         PCollection<Boolean> result =
             begin
-                .apply(
-                    Create.of(DUMMY_ROW)
-                        .withSchema(
-                            DUMMY_SCHEMA,
-                            SerializableFunctions.identity(),
-                            SerializableFunctions.identity()))
+                .apply(Create.of(DUMMY_ROW).withRowSchema(DUMMY_SCHEMA))
                 .apply(SqlTransform.query("SELECT " + expr))
                 .apply(MapElements.into(TypeDescriptors.booleans()).via(row -> row.getBoolean(0)));
 
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/integrationtest/BeamSqlDateFunctionsIntegrationTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/integrationtest/BeamSqlDateFunctionsIntegrationTest.java
index dd2a1db..0342cf0 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/integrationtest/BeamSqlDateFunctionsIntegrationTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/integrationtest/BeamSqlDateFunctionsIntegrationTest.java
@@ -17,7 +17,7 @@
  */
 package org.apache.beam.sdk.extensions.sql.integrationtest;
 
-import static org.apache.calcite.avatica.util.DateTimeUtils.MILLIS_PER_DAY;
+import static org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.avatica.util.DateTimeUtils.MILLIS_PER_DAY;
 import static org.junit.Assert.assertFalse;
 import static org.junit.Assert.assertTrue;
 
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryReadWriteIT.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryReadWriteIT.java
index ab43fc6..2c00edb 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryReadWriteIT.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryReadWriteIT.java
@@ -17,6 +17,7 @@
  */
 package org.apache.beam.sdk.extensions.sql.meta.provider.bigquery;
 
+import static org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable.METHOD_PROPERTY;
 import static org.apache.beam.sdk.extensions.sql.utils.DateTimeUtils.parseTimestampWithUTCTimeZone;
 import static org.apache.beam.sdk.schemas.Schema.FieldType.BOOLEAN;
 import static org.apache.beam.sdk.schemas.Schema.FieldType.BYTE;
@@ -42,16 +43,16 @@
 import org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils;
 import org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider;
 import org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider;
+import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
 import org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery;
 import org.apache.beam.sdk.schemas.Schema;
 import org.apache.beam.sdk.schemas.Schema.FieldType;
 import org.apache.beam.sdk.testing.PAssert;
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.transforms.Create;
-import org.apache.beam.sdk.transforms.SerializableFunctions;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
 import org.joda.time.Duration;
 import org.junit.Rule;
 import org.junit.Test;
@@ -155,6 +156,150 @@
   }
 
   @Test
+  public void testSQLRead_withExport() {
+    BeamSqlEnv sqlEnv = BeamSqlEnv.inMemory(new BigQueryTableProvider());
+
+    String createTableStatement =
+        "CREATE EXTERNAL TABLE TEST( \n"
+            + "   c_bigint BIGINT, \n"
+            + "   c_tinyint TINYINT, \n"
+            + "   c_smallint SMALLINT, \n"
+            + "   c_integer INTEGER, \n"
+            + "   c_float FLOAT, \n"
+            + "   c_double DOUBLE, \n"
+            + "   c_boolean BOOLEAN, \n"
+            + "   c_timestamp TIMESTAMP, \n"
+            + "   c_varchar VARCHAR, \n "
+            + "   c_char CHAR, \n"
+            + "   c_arr ARRAY<VARCHAR> \n"
+            + ") \n"
+            + "TYPE 'bigquery' \n"
+            + "LOCATION '"
+            + bigQueryTestingTypes.tableSpec()
+            + "' \n"
+            + "TBLPROPERTIES "
+            + "'{ "
+            + METHOD_PROPERTY
+            + ": \""
+            + Method.EXPORT.toString()
+            + "\" }'";
+    sqlEnv.executeDdl(createTableStatement);
+
+    String insertStatement =
+        "INSERT INTO TEST VALUES ("
+            + "9223372036854775807, "
+            + "127, "
+            + "32767, "
+            + "2147483647, "
+            + "1.0, "
+            + "1.0, "
+            + "TRUE, "
+            + "TIMESTAMP '2018-05-28 20:17:40.123', "
+            + "'varchar', "
+            + "'char', "
+            + "ARRAY['123', '456']"
+            + ")";
+
+    sqlEnv.parseQuery(insertStatement);
+    BeamSqlRelUtils.toPCollection(pipeline, sqlEnv.parseQuery(insertStatement));
+    pipeline.run().waitUntilFinish(Duration.standardMinutes(5));
+
+    String selectTableStatement = "SELECT * FROM TEST";
+    PCollection<Row> output =
+        BeamSqlRelUtils.toPCollection(readPipeline, sqlEnv.parseQuery(selectTableStatement));
+
+    PAssert.that(output)
+        .containsInAnyOrder(
+            row(
+                SOURCE_SCHEMA_TWO,
+                9223372036854775807L,
+                (byte) 127,
+                (short) 32767,
+                2147483647,
+                (float) 1.0,
+                1.0,
+                true,
+                parseTimestampWithUTCTimeZone("2018-05-28 20:17:40.123"),
+                "varchar",
+                "char",
+                Arrays.asList("123", "456")));
+    PipelineResult.State state = readPipeline.run().waitUntilFinish(Duration.standardMinutes(5));
+    assertEquals(state, State.DONE);
+  }
+
+  @Test
+  public void testSQLRead_withDirectRead() {
+    BeamSqlEnv sqlEnv = BeamSqlEnv.inMemory(new BigQueryTableProvider());
+
+    String createTableStatement =
+        "CREATE EXTERNAL TABLE TEST( \n"
+            + "   c_bigint BIGINT, \n"
+            + "   c_tinyint TINYINT, \n"
+            + "   c_smallint SMALLINT, \n"
+            + "   c_integer INTEGER, \n"
+            + "   c_float FLOAT, \n"
+            + "   c_double DOUBLE, \n"
+            + "   c_boolean BOOLEAN, \n"
+            + "   c_timestamp TIMESTAMP, \n"
+            + "   c_varchar VARCHAR, \n "
+            + "   c_char CHAR, \n"
+            + "   c_arr ARRAY<VARCHAR> \n"
+            + ") \n"
+            + "TYPE 'bigquery' \n"
+            + "LOCATION '"
+            + bigQueryTestingTypes.tableSpec()
+            + "' \n"
+            + "TBLPROPERTIES "
+            + "'{ "
+            + METHOD_PROPERTY
+            + ": \""
+            + Method.DIRECT_READ.toString()
+            + "\" }'";
+    sqlEnv.executeDdl(createTableStatement);
+
+    String insertStatement =
+        "INSERT INTO TEST VALUES ("
+            + "9223372036854775807, "
+            + "127, "
+            + "32767, "
+            + "2147483647, "
+            + "1.0, "
+            + "1.0, "
+            + "TRUE, "
+            + "TIMESTAMP '2018-05-28 20:17:40.123', "
+            + "'varchar', "
+            + "'char', "
+            + "ARRAY['123', '456']"
+            + ")";
+
+    sqlEnv.parseQuery(insertStatement);
+    BeamSqlRelUtils.toPCollection(pipeline, sqlEnv.parseQuery(insertStatement));
+    pipeline.run().waitUntilFinish(Duration.standardMinutes(5));
+
+    String selectTableStatement = "SELECT * FROM TEST";
+    PCollection<Row> output =
+        BeamSqlRelUtils.toPCollection(readPipeline, sqlEnv.parseQuery(selectTableStatement));
+
+    PAssert.that(output)
+        .containsInAnyOrder(
+            row(
+                SOURCE_SCHEMA_TWO,
+                9223372036854775807L,
+                (byte) 127,
+                (short) 32767,
+                2147483647,
+                (float) 1.0,
+                1.0,
+                true,
+                parseTimestampWithUTCTimeZone("2018-05-28 20:17:40.123"),
+                "varchar",
+                "char",
+                Arrays.asList("123", "456")));
+    PipelineResult.State state = readPipeline.run().waitUntilFinish(Duration.standardMinutes(5));
+    assertEquals(State.DONE, state);
+  }
+
+  @Test
   public void testSQLTypes() {
     BeamSqlEnv sqlEnv = BeamSqlEnv.inMemory(new BigQueryTableProvider());
 
@@ -267,10 +412,7 @@
   }
 
   private PCollection<Row> createPCollection(Pipeline pipeline, Row... rows) {
-    return pipeline.apply(
-        Create.of(Arrays.asList(rows))
-            .withSchema(
-                SOURCE_SCHEMA, SerializableFunctions.identity(), SerializableFunctions.identity()));
+    return pipeline.apply(Create.of(Arrays.asList(rows)).withRowSchema(SOURCE_SCHEMA));
   }
 
   private Row row(Schema schema, Object... values) {
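Note: the createPCollection change above swaps the verbose withSchema(schema, identity, identity) form for withRowSchema. A minimal sketch of the shorter form, using a hypothetical two-field schema in place of the test's SOURCE_SCHEMA constant:

import java.util.Arrays;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

class WithRowSchemaSketch {
  // Hypothetical schema; the test uses its own SOURCE_SCHEMA.
  private static final Schema SCHEMA =
      Schema.builder().addInt64Field("id").addStringField("name").build();

  static PCollection<Row> rows(Pipeline pipeline) {
    Row row = Row.withSchema(SCHEMA).addValues(1L, "one").build();
    // withRowSchema attaches the schema directly, replacing the
    // withSchema(schema, identity, identity) form removed in the hunk above.
    return pipeline.apply(Create.of(Arrays.asList(row)).withRowSchema(SCHEMA));
  }
}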
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryRowCountIT.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryRowCountIT.java
index 3a97754..5964521 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryRowCountIT.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryRowCountIT.java
@@ -38,7 +38,7 @@
 import org.apache.beam.sdk.schemas.Schema;
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.transforms.Create;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
 import org.junit.Rule;
 import org.junit.Test;
 import org.junit.runner.RunWith;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryTableProviderTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryTableProviderTest.java
index 47983e2..6d2eee0 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryTableProviderTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryTableProviderTest.java
@@ -17,14 +17,18 @@
  */
 package org.apache.beam.sdk.extensions.sql.meta.provider.bigquery;
 
+import static org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable.METHOD_PROPERTY;
 import static org.apache.beam.sdk.schemas.Schema.toSchema;
 import static org.junit.Assert.assertEquals;
 import static org.junit.Assert.assertNotNull;
+import static org.junit.Assert.assertThrows;
 import static org.junit.Assert.assertTrue;
 
+import com.alibaba.fastjson.JSON;
 import java.util.stream.Stream;
 import org.apache.beam.sdk.extensions.sql.BeamSqlTable;
 import org.apache.beam.sdk.extensions.sql.meta.Table;
+import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
 import org.apache.beam.sdk.schemas.Schema;
 import org.junit.Test;
 
@@ -49,6 +53,66 @@
     assertEquals("project:dataset.table", bqTable.bqLocation);
   }
 
+  @Test
+  public void testDefaultMethod_whenPropertiesAreNotSet() {
+    Table table = fakeTable("hello");
+    BigQueryTable sqlTable = (BigQueryTable) provider.buildBeamSqlTable(table);
+
+    assertEquals(Method.DEFAULT, sqlTable.method);
+  }
+
+  @Test
+  public void testSelectDefaultMethodExplicitly() {
+    Table table =
+        fakeTableWithProperties(
+            "hello", "{ " + METHOD_PROPERTY + ": " + "\"" + Method.DEFAULT.toString() + "\" }");
+    BigQueryTable sqlTable = (BigQueryTable) provider.buildBeamSqlTable(table);
+
+    assertEquals(Method.DEFAULT, sqlTable.method);
+  }
+
+  @Test
+  public void testSelectDirectReadMethod() {
+    Table table =
+        fakeTableWithProperties(
+            "hello", "{ " + METHOD_PROPERTY + ": " + "\"" + Method.DIRECT_READ.toString() + "\" }");
+    BigQueryTable sqlTable = (BigQueryTable) provider.buildBeamSqlTable(table);
+
+    assertEquals(Method.DIRECT_READ, sqlTable.method);
+  }
+
+  @Test
+  public void testSelectExportMethod() {
+    Table table =
+        fakeTableWithProperties(
+            "hello", "{ " + METHOD_PROPERTY + ": " + "\"" + Method.EXPORT.toString() + "\" }");
+    BigQueryTable sqlTable = (BigQueryTable) provider.buildBeamSqlTable(table);
+
+    assertEquals(Method.EXPORT, sqlTable.method);
+  }
+
+  @Test
+  public void testRuntimeExceptionThrown_whenAnInvalidPropertyIsSpecified() {
+    Table table = fakeTableWithProperties("hello", "{ " + METHOD_PROPERTY + ": \"blahblah\" }");
+
+    assertThrows(
+        RuntimeException.class,
+        () -> {
+          provider.buildBeamSqlTable(table);
+        });
+  }
+
+  @Test
+  public void testRuntimeExceptionThrown_whenAPropertyOfInvalidTypeIsSpecified() {
+    Table table = fakeTableWithProperties("hello", "{ " + METHOD_PROPERTY + ": 1337 }");
+
+    assertThrows(
+        RuntimeException.class,
+        () -> {
+          provider.buildBeamSqlTable(table);
+        });
+  }
+
   private static Table fakeTable(String name) {
     return Table.builder()
         .name(name)
@@ -62,4 +126,19 @@
         .type("bigquery")
         .build();
   }
+
+  private static Table fakeTableWithProperties(String name, String properties) {
+    return Table.builder()
+        .name(name)
+        .comment(name + " table")
+        .location("project:dataset.table")
+        .schema(
+            Stream.of(
+                    Schema.Field.nullable("id", Schema.FieldType.INT32),
+                    Schema.Field.nullable("name", Schema.FieldType.STRING))
+                .collect(toSchema()))
+        .type("bigquery")
+        .properties(JSON.parseObject(properties))
+        .build();
+  }
 }
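Note: the new tests above pin down the METHOD_PROPERTY contract: absent means DEFAULT, a valid method name selects that read method, and anything else fails at table-build time. A hypothetical resolver illustrating that contract; this is not the provider's actual code, and the JSON key is simply whatever METHOD_PROPERTY resolves to:

import com.alibaba.fastjson.JSONObject;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;

class MethodPropertySketch {
  // Sketch only: the real resolution lives in BigQueryTable/BigQueryTableProvider.
  static Method resolve(JSONObject properties, String methodProperty) {
    if (properties == null || !properties.containsKey(methodProperty)) {
      return Method.DEFAULT; // no property set, as in testDefaultMethod_whenPropertiesAreNotSet
    }
    // Method.valueOf throws IllegalArgumentException (a RuntimeException) for
    // unknown values such as "blahblah", matching the invalid-property tests.
    return Method.valueOf(properties.getString(methodProperty));
  }
}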
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryTestTableProvider.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryTestTableProvider.java
index b7ecea4..69fa9ed 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryTestTableProvider.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryTestTableProvider.java
@@ -17,7 +17,7 @@
  */
 package org.apache.beam.sdk.extensions.sql.meta.provider.bigquery;
 
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.MoreObjects.firstNonNull;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.MoreObjects.firstNonNull;
 
 import java.util.HashMap;
 import java.util.Map;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/kafka/BeamKafkaCSVTableTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/kafka/BeamKafkaCSVTableTest.java
index c407ff4..a15a149 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/kafka/BeamKafkaCSVTableTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/kafka/BeamKafkaCSVTableTest.java
@@ -36,11 +36,11 @@
 import org.apache.beam.sdk.values.KV;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.adapter.java.JavaTypeFactory;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.JavaTypeFactoryImpl;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.type.RelDataTypeSystem;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.type.SqlTypeName;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.calcite.adapter.java.JavaTypeFactory;
-import org.apache.calcite.jdbc.JavaTypeFactoryImpl;
-import org.apache.calcite.rel.type.RelDataTypeSystem;
-import org.apache.calcite.sql.type.SqlTypeName;
 import org.apache.commons.csv.CSVFormat;
 import org.junit.Assert;
 import org.junit.Rule;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/kafka/KafkaTableProviderTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/kafka/KafkaTableProviderTest.java
index 758fdcd..0f76daf 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/kafka/KafkaTableProviderTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/kafka/KafkaTableProviderTest.java
@@ -28,7 +28,7 @@
 import org.apache.beam.sdk.extensions.sql.BeamSqlTable;
 import org.apache.beam.sdk.extensions.sql.meta.Table;
 import org.apache.beam.sdk.schemas.Schema;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
 import org.junit.Test;
 
 /** Unit test for {@link KafkaTableProvider}. */
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/pubsub/PubsubJsonIT.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/pubsub/PubsubJsonIT.java
index 235ad7b..2896003 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/pubsub/PubsubJsonIT.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/pubsub/PubsubJsonIT.java
@@ -53,15 +53,14 @@
 import org.apache.beam.sdk.schemas.Schema;
 import org.apache.beam.sdk.schemas.SchemaCoder;
 import org.apache.beam.sdk.testing.TestPipeline;
-import org.apache.beam.sdk.transforms.SerializableFunctions;
 import org.apache.beam.sdk.util.common.ReflectHelpers;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableList;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableSet;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.jdbc.CalciteConnection;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Supplier;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableSet;
-import org.apache.calcite.jdbc.CalciteConnection;
 import org.joda.time.Duration;
 import org.joda.time.Instant;
 import org.junit.Ignore;
@@ -137,8 +136,7 @@
     queryOutput.apply(
         "waitForSuccess",
         resultSignal.signalSuccessWhen(
-            SchemaCoder.of(
-                PAYLOAD_SCHEMA, SerializableFunctions.identity(), SerializableFunctions.identity()),
+            SchemaCoder.of(PAYLOAD_SCHEMA),
             observedRows ->
                 observedRows.equals(
                     ImmutableSet.of(
@@ -209,8 +207,7 @@
     queryOutput.apply(
         "waitForSuccess",
         resultSignal.signalSuccessWhen(
-            SchemaCoder.of(
-                PAYLOAD_SCHEMA, SerializableFunctions.identity(), SerializableFunctions.identity()),
+            SchemaCoder.of(PAYLOAD_SCHEMA),
             observedRows ->
                 observedRows.equals(
                     ImmutableSet.of(
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/pubsub/PubsubMessageToRowTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/pubsub/PubsubMessageToRowTest.java
index f3280c2..370c214 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/pubsub/PubsubMessageToRowTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/pubsub/PubsubMessageToRowTest.java
@@ -22,7 +22,7 @@
 import static org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils.VARCHAR;
 import static org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubMessageToRow.DLQ_TAG;
 import static org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubMessageToRow.MAIN_TAG;
-import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Iterables.size;
+import static org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.Iterables.size;
 import static org.junit.Assert.assertEquals;
 
 import java.io.Serializable;
@@ -42,8 +42,8 @@
 import org.apache.beam.sdk.values.Row;
 import org.apache.beam.sdk.values.TimestampedValue;
 import org.apache.beam.sdk.values.TupleTagList;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableSet;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableMap;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.ImmutableSet;
 import org.joda.time.DateTime;
 import org.joda.time.Instant;
 import org.junit.Rule;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/text/TextTableProviderTest.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/text/TextTableProviderTest.java
index 2c3eeea..172bdb1 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/text/TextTableProviderTest.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/text/TextTableProviderTest.java
@@ -34,7 +34,7 @@
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
 import org.apache.beam.sdk.values.TypeDescriptors;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Charsets;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.base.Charsets;
 import org.junit.Rule;
 import org.junit.Test;
 import org.junit.rules.TemporaryFolder;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/utils/RowAsserts.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/utils/RowAsserts.java
index 36afc26..abbb4df 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/utils/RowAsserts.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/utils/RowAsserts.java
@@ -22,7 +22,7 @@
 
 import org.apache.beam.sdk.transforms.SerializableFunction;
 import org.apache.beam.sdk.values.Row;
-import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Iterables;
+import org.apache.beam.vendor.calcite.v1_20_0.com.google.common.collect.Iterables;
 
 /** Contains helpers to assert {@link Row}s. */
 public class RowAsserts {
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/zetasql/JoinCompoundIdentifiersTestZetaSQL.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/zetasql/JoinCompoundIdentifiersTestZetaSQL.java
index 3195a60..04589ab 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/zetasql/JoinCompoundIdentifiersTestZetaSQL.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/zetasql/JoinCompoundIdentifiersTestZetaSQL.java
@@ -37,14 +37,14 @@
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.Contexts;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.ConventionTraitDef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitDef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.SchemaPlus;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.FrameworkConfig;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.Frameworks;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
-import org.apache.calcite.plan.Contexts;
-import org.apache.calcite.plan.ConventionTraitDef;
-import org.apache.calcite.plan.RelTraitDef;
-import org.apache.calcite.schema.SchemaPlus;
-import org.apache.calcite.tools.FrameworkConfig;
-import org.apache.calcite.tools.Frameworks;
 import org.joda.time.Duration;
 import org.junit.Rule;
 import org.junit.Test;
diff --git a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/zetasql/ZetaSQLDialectSpecTestZetaSQL.java b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/zetasql/ZetaSQLDialectSpecTestZetaSQL.java
index 0c44a26..fb5028d 100644
--- a/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/zetasql/ZetaSQLDialectSpecTestZetaSQL.java
+++ b/sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/zetasql/ZetaSQLDialectSpecTestZetaSQL.java
@@ -72,15 +72,15 @@
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.Context;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.Contexts;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.ConventionTraitDef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelTraitDef;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.SchemaPlus;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.FrameworkConfig;
+import org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.tools.Frameworks;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap;
-import org.apache.calcite.plan.Context;
-import org.apache.calcite.plan.Contexts;
-import org.apache.calcite.plan.ConventionTraitDef;
-import org.apache.calcite.plan.RelTraitDef;
-import org.apache.calcite.schema.SchemaPlus;
-import org.apache.calcite.tools.FrameworkConfig;
-import org.apache.calcite.tools.Frameworks;
 import org.joda.time.DateTime;
 import org.joda.time.Duration;
 import org.joda.time.chrono.ISOChronology;
diff --git a/sdks/java/extensions/zetasketch/build.gradle b/sdks/java/extensions/zetasketch/build.gradle
index 157a193..30e8bc8 100644
--- a/sdks/java/extensions/zetasketch/build.gradle
+++ b/sdks/java/extensions/zetasketch/build.gradle
@@ -19,7 +19,7 @@
 import groovy.json.JsonOutput
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.extensions.zetasketch')
 
 description = "Apache Beam :: SDKs :: Java :: Extensions :: ZetaSketch"
 
diff --git a/sdks/java/fn-execution/build.gradle b/sdks/java/fn-execution/build.gradle
index bf4d5c4..ea46cff 100644
--- a/sdks/java/fn-execution/build.gradle
+++ b/sdks/java/fn-execution/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.fn')
 
 description = "Apache Beam :: SDKs :: Java :: Fn Execution"
 ext.summary = """Contains code shared across the Beam Java SDK Harness and Java Runners to execute using
diff --git a/sdks/java/fn-execution/src/main/java/org/apache/beam/sdk/fn/data/BeamFnDataBufferingOutboundObserver.java b/sdks/java/fn-execution/src/main/java/org/apache/beam/sdk/fn/data/BeamFnDataBufferingOutboundObserver.java
index c2cec10..02460bf 100644
--- a/sdks/java/fn-execution/src/main/java/org/apache/beam/sdk/fn/data/BeamFnDataBufferingOutboundObserver.java
+++ b/sdks/java/fn-execution/src/main/java/org/apache/beam/sdk/fn/data/BeamFnDataBufferingOutboundObserver.java
@@ -97,14 +97,14 @@
     // This will add an empty data block representing the end of stream.
     elements
         .addDataBuilder()
-        .setInstructionReference(outputLocation.getInstructionId())
-        .setPtransformId(outputLocation.getPTransformId());
+        .setInstructionId(outputLocation.getInstructionId())
+        .setTransformId(outputLocation.getTransformId());
 
     LOG.debug(
         "Closing stream for instruction {} and "
             + "transform {} having transmitted {} values {} bytes",
         outputLocation.getInstructionId(),
-        outputLocation.getPTransformId(),
+        outputLocation.getTransformId(),
         counter,
         byteCounter);
     outboundObserver.onNext(elements.build());
@@ -137,8 +137,8 @@
 
     elements
         .addDataBuilder()
-        .setInstructionReference(outputLocation.getInstructionId())
-        .setPtransformId(outputLocation.getPTransformId())
+        .setInstructionId(outputLocation.getInstructionId())
+        .setTransformId(outputLocation.getTransformId())
         .setData(bufferedElements.toByteString());
 
     byteCounter += bufferedElements.size();
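Note: both hunks above are mechanical renames of the Fn API Elements.Data builder setters. A small sketch of building a terminal (empty) data block with the new names, assuming the generated protos under org.apache.beam.model.fnexecution.v1 and placeholder ids:

import org.apache.beam.model.fnexecution.v1.BeamFnApi;

class ElementsDataSketch {
  public static void main(String[] args) {
    // instruction_reference/ptransform_id became instruction_id/transform_id;
    // leaving data unset marks the end of the stream, as in close() above.
    BeamFnApi.Elements elements =
        BeamFnApi.Elements.newBuilder()
            .addData(
                BeamFnApi.Elements.Data.newBuilder()
                    .setInstructionId("instruction-1")
                    .setTransformId("transform-1"))
            .build();
    System.out.println(elements);
  }
}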
diff --git a/sdks/java/fn-execution/src/main/java/org/apache/beam/sdk/fn/data/BeamFnDataGrpcMultiplexer.java b/sdks/java/fn-execution/src/main/java/org/apache/beam/sdk/fn/data/BeamFnDataGrpcMultiplexer.java
index 8bd669d..7ed83df 100644
--- a/sdks/java/fn-execution/src/main/java/org/apache/beam/sdk/fn/data/BeamFnDataGrpcMultiplexer.java
+++ b/sdks/java/fn-execution/src/main/java/org/apache/beam/sdk/fn/data/BeamFnDataGrpcMultiplexer.java
@@ -127,8 +127,7 @@
     public void onNext(BeamFnApi.Elements value) {
       for (BeamFnApi.Elements.Data data : value.getDataList()) {
         try {
-          LogicalEndpoint key =
-              LogicalEndpoint.of(data.getInstructionReference(), data.getPtransformId());
+          LogicalEndpoint key = LogicalEndpoint.of(data.getInstructionId(), data.getTransformId());
           CompletableFuture<Consumer<BeamFnApi.Elements.Data>> consumer = receiverFuture(key);
           if (!consumer.isDone()) {
             LOG.debug(
@@ -147,15 +146,15 @@
         } catch (ExecutionException | InterruptedException e) {
           LOG.error(
               "Client interrupted during handling of data for instruction {} and transform {}",
-              data.getInstructionReference(),
-              data.getPtransformId(),
+              data.getInstructionId(),
+              data.getTransformId(),
               e);
           outboundObserver.onError(e);
         } catch (RuntimeException e) {
           LOG.error(
               "Client failed to handle data for instruction {} and transform {}",
-              data.getInstructionReference(),
-              data.getPtransformId(),
+              data.getInstructionId(),
+              data.getTransformId(),
               e);
           outboundObserver.onError(e);
         }
diff --git a/sdks/java/fn-execution/src/main/java/org/apache/beam/sdk/fn/data/BeamFnDataInboundObserver.java b/sdks/java/fn-execution/src/main/java/org/apache/beam/sdk/fn/data/BeamFnDataInboundObserver.java
index e43d900..2ed5539 100644
--- a/sdks/java/fn-execution/src/main/java/org/apache/beam/sdk/fn/data/BeamFnDataInboundObserver.java
+++ b/sdks/java/fn-execution/src/main/java/org/apache/beam/sdk/fn/data/BeamFnDataInboundObserver.java
@@ -62,8 +62,8 @@
         LOG.debug(
             "Closing stream for instruction {} and "
                 + "transform {} having consumed {} values {} bytes",
-            t.getInstructionReference(),
-            t.getPtransformId(),
+            t.getInstructionId(),
+            t.getTransformId(),
             counter,
             byteCounter);
         readFuture.complete();
diff --git a/sdks/java/fn-execution/src/main/java/org/apache/beam/sdk/fn/data/LogicalEndpoint.java b/sdks/java/fn-execution/src/main/java/org/apache/beam/sdk/fn/data/LogicalEndpoint.java
index 57477db..2adbfa9 100644
--- a/sdks/java/fn-execution/src/main/java/org/apache/beam/sdk/fn/data/LogicalEndpoint.java
+++ b/sdks/java/fn-execution/src/main/java/org/apache/beam/sdk/fn/data/LogicalEndpoint.java
@@ -30,7 +30,7 @@
 
   public abstract String getInstructionId();
 
-  public abstract String getPTransformId();
+  public abstract String getTransformId();
 
   public static LogicalEndpoint of(String instructionId, String transformId) {
     return new AutoValue_LogicalEndpoint(instructionId, transformId);
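Note: for callers of LogicalEndpoint, the rename only touches the transform accessor. A minimal usage sketch with placeholder ids:

import org.apache.beam.sdk.fn.data.LogicalEndpoint;

class LogicalEndpointUsageSketch {
  public static void main(String[] args) {
    // The of(instructionId, transformId) factory is unchanged; only the
    // getter moved from getPTransformId() to getTransformId().
    LogicalEndpoint endpoint = LogicalEndpoint.of("instruction-1", "transform-1");
    System.out.println(endpoint.getInstructionId() + " / " + endpoint.getTransformId());
  }
}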
diff --git a/sdks/java/fn-execution/src/test/java/org/apache/beam/sdk/fn/data/BeamFnDataBufferingOutboundObserverTest.java b/sdks/java/fn-execution/src/test/java/org/apache/beam/sdk/fn/data/BeamFnDataBufferingOutboundObserverTest.java
index 6983a9d..cb1752c 100644
--- a/sdks/java/fn-execution/src/test/java/org/apache/beam/sdk/fn/data/BeamFnDataBufferingOutboundObserverTest.java
+++ b/sdks/java/fn-execution/src/test/java/org/apache/beam/sdk/fn/data/BeamFnDataBufferingOutboundObserverTest.java
@@ -140,8 +140,8 @@
         BeamFnApi.Elements.newBuilder(messageWithData(new byte[1]))
             .addData(
                 BeamFnApi.Elements.Data.newBuilder()
-                    .setInstructionReference(OUTPUT_LOCATION.getInstructionId())
-                    .setPtransformId(OUTPUT_LOCATION.getPTransformId()))
+                    .setInstructionId(OUTPUT_LOCATION.getInstructionId())
+                    .setTransformId(OUTPUT_LOCATION.getTransformId()))
             .build(),
         Iterables.get(values, 1));
   }
@@ -154,8 +154,8 @@
     return BeamFnApi.Elements.newBuilder()
         .addData(
             BeamFnApi.Elements.Data.newBuilder()
-                .setInstructionReference(OUTPUT_LOCATION.getInstructionId())
-                .setPtransformId(OUTPUT_LOCATION.getPTransformId())
+                .setInstructionId(OUTPUT_LOCATION.getInstructionId())
+                .setTransformId(OUTPUT_LOCATION.getTransformId())
                 .setData(output.toByteString()))
         .build();
   }
diff --git a/sdks/java/fn-execution/src/test/java/org/apache/beam/sdk/fn/data/BeamFnDataGrpcMultiplexerTest.java b/sdks/java/fn-execution/src/test/java/org/apache/beam/sdk/fn/data/BeamFnDataGrpcMultiplexerTest.java
index 5b4a426..bf1b1d3 100644
--- a/sdks/java/fn-execution/src/test/java/org/apache/beam/sdk/fn/data/BeamFnDataGrpcMultiplexerTest.java
+++ b/sdks/java/fn-execution/src/test/java/org/apache/beam/sdk/fn/data/BeamFnDataGrpcMultiplexerTest.java
@@ -44,16 +44,16 @@
       BeamFnApi.Elements.newBuilder()
           .addData(
               BeamFnApi.Elements.Data.newBuilder()
-                  .setInstructionReference(OUTPUT_LOCATION.getInstructionId())
-                  .setPtransformId(OUTPUT_LOCATION.getPTransformId())
+                  .setInstructionId(OUTPUT_LOCATION.getInstructionId())
+                  .setTransformId(OUTPUT_LOCATION.getTransformId())
                   .setData(ByteString.copyFrom(new byte[1])))
           .build();
   private static final BeamFnApi.Elements TERMINAL_ELEMENTS =
       BeamFnApi.Elements.newBuilder()
           .addData(
               BeamFnApi.Elements.Data.newBuilder()
-                  .setInstructionReference(OUTPUT_LOCATION.getInstructionId())
-                  .setPtransformId(OUTPUT_LOCATION.getPTransformId()))
+                  .setInstructionId(OUTPUT_LOCATION.getInstructionId())
+                  .setTransformId(OUTPUT_LOCATION.getTransformId()))
           .build();
 
   @Test
diff --git a/sdks/java/harness/build.gradle b/sdks/java/harness/build.gradle
index cc8ff4f..51c65bb 100644
--- a/sdks/java/harness/build.gradle
+++ b/sdks/java/harness/build.gradle
@@ -27,6 +27,7 @@
                         ":runners:core-java", ":runners:core-construction-java"]
 
 applyJavaNature(
+  automaticModuleName: 'org.apache.beam.fn.harness',
   validateShadowJar: false,
   testShadowJar: true,
   shadowClosure:
diff --git a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/SplittableProcessElementsRunner.java b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/SplittableProcessElementsRunner.java
index 4c97db8..ec2875e 100644
--- a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/SplittableProcessElementsRunner.java
+++ b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/SplittableProcessElementsRunner.java
@@ -231,13 +231,13 @@
       }
       BundleApplication primaryApplication =
           BundleApplication.newBuilder()
-              .setPtransformId(context.ptransformId)
+              .setTransformId(context.ptransformId)
               .setInputId(mainInputId)
               .setElement(primaryBytes.toByteString())
               .build();
       BundleApplication residualApplication =
           BundleApplication.newBuilder()
-              .setPtransformId(context.ptransformId)
+              .setTransformId(context.ptransformId)
               .setInputId(mainInputId)
               .setElement(residualBytes.toByteString())
               .build();
diff --git a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/control/ProcessBundleHandler.java b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/control/ProcessBundleHandler.java
index ca3d83c..8b1c5fd 100644
--- a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/control/ProcessBundleHandler.java
+++ b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/control/ProcessBundleHandler.java
@@ -220,7 +220,7 @@
     // process() calls will execute on this thread when queueingClient.drainAndBlock() is called.
     QueueingBeamFnDataClient queueingClient = new QueueingBeamFnDataClient(this.beamFnDataClient);
 
-    String bundleId = request.getProcessBundle().getProcessBundleDescriptorReference();
+    String bundleId = request.getProcessBundle().getProcessBundleDescriptorId();
     BeamFnApi.ProcessBundleDescriptor bundleDescriptor =
         (BeamFnApi.ProcessBundleDescriptor) fnApiRegistry.apply(bundleId);
 
@@ -264,13 +264,13 @@
             // Reset primaries and accumulate residuals.
             Multimap<String, BundleApplication> newPrimaries = ArrayListMultimap.create();
             for (BundleApplication primary : primaries) {
-              newPrimaries.put(primary.getPtransformId(), primary);
+              newPrimaries.put(primary.getTransformId(), primary);
             }
             allPrimaries.clear();
             allPrimaries.putAll(newPrimaries);
 
             for (DelayedBundleApplication residual : residuals) {
-              allResiduals.put(residual.getApplication().getPtransformId(), residual);
+              allResiduals.put(residual.getApplication().getTransformId(), residual);
             }
           };
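Note: the getProcessBundleDescriptorReference to getProcessBundleDescriptorId rename above has a mirror on the request-building side, exercised by the handler tests later in this patch. A minimal sketch, again assuming the generated protos under org.apache.beam.model.fnexecution.v1:

import org.apache.beam.model.fnexecution.v1.BeamFnApi;

class ProcessBundleRequestSketch {
  public static void main(String[] args) {
    BeamFnApi.InstructionRequest request =
        BeamFnApi.InstructionRequest.newBuilder()
            .setInstructionId("999L")
            .setProcessBundle(
                // process_bundle_descriptor_reference became process_bundle_descriptor_id
                BeamFnApi.ProcessBundleRequest.newBuilder().setProcessBundleDescriptorId("1L"))
            .build();
    System.out.println(request.getProcessBundle().getProcessBundleDescriptorId());
  }
}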
 
diff --git a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/data/BeamFnDataGrpcClient.java b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/data/BeamFnDataGrpcClient.java
index de85e38..375c9af 100644
--- a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/data/BeamFnDataGrpcClient.java
+++ b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/data/BeamFnDataGrpcClient.java
@@ -82,7 +82,7 @@
     LOG.debug(
         "Registering consumer for instruction {} and transform {}",
         inputLocation.getInstructionId(),
-        inputLocation.getPTransformId());
+        inputLocation.getTransformId());
 
     BeamFnDataGrpcMultiplexer client = getClientFor(apiServiceDescriptor);
     BeamFnDataInboundObserver<T> inboundObserver =
@@ -111,7 +111,7 @@
     LOG.debug(
         "Creating output consumer for instruction {} and transform {}",
         outputLocation.getInstructionId(),
-        outputLocation.getPTransformId());
+        outputLocation.getTransformId());
     Optional<Integer> bufferLimit = getBufferLimit(options);
     if (bufferLimit.isPresent()) {
       return BeamFnDataBufferingOutboundObserver.forLocationWithBufferLimit(
diff --git a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/data/QueueingBeamFnDataClient.java b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/data/QueueingBeamFnDataClient.java
index d666cf0..01cf0c7 100644
--- a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/data/QueueingBeamFnDataClient.java
+++ b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/data/QueueingBeamFnDataClient.java
@@ -61,7 +61,7 @@
     LOG.debug(
         "Registering consumer for instruction {} and transform {}",
         inputLocation.getInstructionId(),
-        inputLocation.getPTransformId());
+        inputLocation.getTransformId());
 
     QueueingFnDataReceiver<T> queueingConsumer = new QueueingFnDataReceiver<T>(consumer);
     InboundDataClient inboundDataClient =
@@ -133,7 +133,7 @@
     LOG.debug(
         "Creating output consumer for instruction {} and transform {}",
         outputLocation.getInstructionId(),
-        outputLocation.getPTransformId());
+        outputLocation.getTransformId());
     return this.mainClient.send(apiServiceDescriptor, outputLocation, coder);
   }
 
diff --git a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/state/BagUserState.java b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/state/BagUserState.java
index 11f16d4..b3e6f64 100644
--- a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/state/BagUserState.java
+++ b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/state/BagUserState.java
@@ -63,10 +63,10 @@
 
     StateRequest.Builder requestBuilder = StateRequest.newBuilder();
     requestBuilder
-        .setInstructionReference(instructionId)
+        .setInstructionId(instructionId)
         .getStateKeyBuilder()
         .getBagUserStateBuilder()
-        .setPtransformId(ptransformId)
+        .setTransformId(ptransformId)
         .setUserStateId(stateId)
         .setWindow(encodedWindow)
         .setKey(encodedKey);
diff --git a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/state/FnApiStateAccessor.java b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/state/FnApiStateAccessor.java
index b1224bd..26b0dfa 100644
--- a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/state/FnApiStateAccessor.java
+++ b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/state/FnApiStateAccessor.java
@@ -104,7 +104,11 @@
 
               ByteString.Output encodedKeyOut = ByteString.newOutput();
               try {
-                ((Coder) keyCoder).encode(((KV<?, ?>) element.getValue()).getKey(), encodedKeyOut);
+                ((Coder) keyCoder)
+                    .encode(
+                        ((KV<?, ?>) element.getValue()).getKey(),
+                        encodedKeyOut,
+                        Coder.Context.NESTED);
               } catch (IOException e) {
                 throw new IllegalStateException(e);
               }
@@ -164,7 +168,7 @@
     StateKey.Builder cacheKeyBuilder = StateKey.newBuilder();
     cacheKeyBuilder
         .getMultimapSideInputBuilder()
-        .setPtransformId(ptransformId)
+        .setTransformId(ptransformId)
         .setSideInputId(tag.getId())
         .setWindow(encodedWindow);
     return (T)
@@ -448,7 +452,7 @@
         .getBagUserStateBuilder()
         .setWindow(encodedCurrentWindowSupplier.get())
         .setKey(encodedCurrentKeySupplier.get())
-        .setPtransformId(ptransformId)
+        .setTransformId(ptransformId)
         .setUserStateId(stateId);
     return builder.build();
   }
diff --git a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/state/MultimapSideInput.java b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/state/MultimapSideInput.java
index 7ff0f7e..fff60e6 100644
--- a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/state/MultimapSideInput.java
+++ b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/state/MultimapSideInput.java
@@ -67,10 +67,10 @@
     }
     StateRequest.Builder requestBuilder = StateRequest.newBuilder();
     requestBuilder
-        .setInstructionReference(instructionId)
+        .setInstructionId(instructionId)
         .getStateKeyBuilder()
         .getMultimapSideInputBuilder()
-        .setPtransformId(ptransformId)
+        .setTransformId(ptransformId)
         .setSideInputId(sideInputId)
         .setWindow(encodedWindow)
         .setKey(output.toByteString());
diff --git a/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/FnApiDoFnRunnerTest.java b/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/FnApiDoFnRunnerTest.java
index 088148d..7a14b38 100644
--- a/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/FnApiDoFnRunnerTest.java
+++ b/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/FnApiDoFnRunnerTest.java
@@ -111,7 +111,7 @@
 
   private static final Logger LOG = LoggerFactory.getLogger(FnApiDoFnRunnerTest.class);
 
-  public static final String TEST_PTRANSFORM_ID = "pTransformId";
+  public static final String TEST_TRANSFORM_ID = "pTransformId";
 
   private static class ConcatCombineFn extends CombineFn<String, String, String> {
     @Override
@@ -177,7 +177,7 @@
     PCollection<KV<String, String>> valuePCollection =
         p.apply(Create.of(KV.of("unused", "unused")));
     PCollection<String> outputPCollection =
-        valuePCollection.apply(TEST_PTRANSFORM_ID, ParDo.of(new TestStatefulDoFn()));
+        valuePCollection.apply(TEST_TRANSFORM_ID, ParDo.of(new TestStatefulDoFn()));
 
     SdkComponents sdkComponents = SdkComponents.create(p.getOptions());
     RunnerApi.Pipeline pProto = PipelineTranslation.toProto(p, sdkComponents);
@@ -187,10 +187,7 @@
         pProto
             .getComponents()
             .getTransformsOrThrow(
-                pProto
-                    .getComponents()
-                    .getTransformsOrThrow(TEST_PTRANSFORM_ID)
-                    .getSubtransforms(0));
+                pProto.getComponents().getTransformsOrThrow(TEST_TRANSFORM_ID).getSubtransforms(0));
 
     FakeBeamFnStateClient fakeClient =
         new FakeBeamFnStateClient(
@@ -206,7 +203,7 @@
             metricsContainerRegistry, mock(ExecutionStateTracker.class));
     consumers.register(
         outputPCollectionId,
-        TEST_PTRANSFORM_ID,
+        TEST_TRANSFORM_ID,
         (FnDataReceiver) (FnDataReceiver<WindowedValue<String>>) mainOutputValues::add);
     PTransformFunctionRegistry startFunctionRegistry =
         new PTransformFunctionRegistry(
@@ -220,7 +217,7 @@
             PipelineOptionsFactory.create(),
             null /* beamFnDataClient */,
             fakeClient,
-            TEST_PTRANSFORM_ID,
+            TEST_TRANSFORM_ID,
             pTransform,
             Suppliers.ofInstance("57L")::get,
             pProto.getComponents().getPcollectionsMap(),
@@ -282,7 +279,7 @@
     return StateKey.newBuilder()
         .setBagUserState(
             StateKey.BagUserState.newBuilder()
-                .setPtransformId(TEST_PTRANSFORM_ID)
+                .setTransformId(TEST_TRANSFORM_ID)
                 .setUserStateId(userStateId)
                 .setKey(encode(key))
                 .setWindow(
@@ -334,7 +331,7 @@
     TupleTag<String> additionalOutput = new TupleTag<String>("additional") {};
     PCollectionTuple outputPCollection =
         valuePCollection.apply(
-            TEST_PTRANSFORM_ID,
+            TEST_TRANSFORM_ID,
             ParDo.of(
                     new TestSideInputDoFn(
                         defaultSingletonSideInputView,
@@ -354,7 +351,7 @@
         sdkComponents.registerPCollection(outputPCollection.get(additionalOutput));
 
     RunnerApi.PTransform pTransform =
-        pProto.getComponents().getTransformsOrThrow(TEST_PTRANSFORM_ID);
+        pProto.getComponents().getTransformsOrThrow(TEST_TRANSFORM_ID);
 
     ImmutableMap<StateKey, ByteString> stateData =
         ImmutableMap.of(
@@ -373,11 +370,11 @@
             metricsContainerRegistry, mock(ExecutionStateTracker.class));
     consumers.register(
         outputPCollectionId,
-        TEST_PTRANSFORM_ID,
+        TEST_TRANSFORM_ID,
         (FnDataReceiver) (FnDataReceiver<WindowedValue<String>>) mainOutputValues::add);
     consumers.register(
         additionalPCollectionId,
-        TEST_PTRANSFORM_ID,
+        TEST_TRANSFORM_ID,
         (FnDataReceiver) (FnDataReceiver<WindowedValue<String>>) additionalOutputValues::add);
     PTransformFunctionRegistry startFunctionRegistry =
         new PTransformFunctionRegistry(
@@ -391,7 +388,7 @@
             PipelineOptionsFactory.create(),
             null /* beamFnDataClient */,
             fakeClient,
-            TEST_PTRANSFORM_ID,
+            TEST_TRANSFORM_ID,
             pTransform,
             Suppliers.ofInstance("57L")::get,
             pProto.getComponents().getPcollectionsMap(),
@@ -478,7 +475,7 @@
         valuePCollection.apply(View.asIterable());
     PCollection<Iterable<String>> outputPCollection =
         valuePCollection.apply(
-            TEST_PTRANSFORM_ID,
+            TEST_TRANSFORM_ID,
             ParDo.of(new TestSideInputIsAccessibleForDownstreamCallersDoFn(iterableSideInputView))
                 .withSideInputs(iterableSideInputView));
 
@@ -491,10 +488,7 @@
         pProto
             .getComponents()
             .getTransformsOrThrow(
-                pProto
-                    .getComponents()
-                    .getTransformsOrThrow(TEST_PTRANSFORM_ID)
-                    .getSubtransforms(0));
+                pProto.getComponents().getTransformsOrThrow(TEST_TRANSFORM_ID).getSubtransforms(0));
 
     ImmutableMap<StateKey, ByteString> stateData =
         ImmutableMap.of(
@@ -514,7 +508,7 @@
             metricsContainerRegistry, mock(ExecutionStateTracker.class));
     consumers.register(
         Iterables.getOnlyElement(pTransform.getOutputsMap().values()),
-        TEST_PTRANSFORM_ID,
+        TEST_TRANSFORM_ID,
         (FnDataReceiver) (FnDataReceiver<WindowedValue<Iterable<String>>>) mainOutputValues::add);
     PTransformFunctionRegistry startFunctionRegistry =
         new PTransformFunctionRegistry(
@@ -528,7 +522,7 @@
             PipelineOptionsFactory.create(),
             null /* beamFnDataClient */,
             fakeClient,
-            TEST_PTRANSFORM_ID,
+            TEST_TRANSFORM_ID,
             pTransform,
             Suppliers.ofInstance("57L")::get,
             pProto.getComponents().getPcollectionsMap(),
@@ -587,7 +581,7 @@
         valuePCollection.apply(View.asIterable());
     PCollection<Iterable<String>> outputPCollection =
         valuePCollection.apply(
-            TEST_PTRANSFORM_ID,
+            TEST_TRANSFORM_ID,
             ParDo.of(new TestSideInputIsAccessibleForDownstreamCallersDoFn(iterableSideInputView))
                 .withSideInputs(iterableSideInputView));
 
@@ -600,10 +594,7 @@
         pProto
             .getComponents()
             .getTransformsOrThrow(
-                pProto
-                    .getComponents()
-                    .getTransformsOrThrow(TEST_PTRANSFORM_ID)
-                    .getSubtransforms(0));
+                pProto.getComponents().getTransformsOrThrow(TEST_TRANSFORM_ID).getSubtransforms(0));
 
     ImmutableMap<StateKey, ByteString> stateData =
         ImmutableMap.of(
@@ -623,7 +614,7 @@
             metricsContainerRegistry, mock(ExecutionStateTracker.class));
     consumers.register(
         Iterables.getOnlyElement(pTransform.getOutputsMap().values()),
-        TEST_PTRANSFORM_ID,
+        TEST_TRANSFORM_ID,
         (FnDataReceiver) (FnDataReceiver<WindowedValue<Iterable<String>>>) mainOutputValues::add);
     PTransformFunctionRegistry startFunctionRegistry =
         new PTransformFunctionRegistry(
@@ -637,7 +628,7 @@
             PipelineOptionsFactory.create(),
             null /* beamFnDataClient */,
             fakeClient,
-            TEST_PTRANSFORM_ID,
+            TEST_TRANSFORM_ID,
             pTransform,
             Suppliers.ofInstance("57L")::get,
             pProto.getComponents().getPcollectionsMap(),
@@ -686,7 +677,7 @@
         .setLabel(
             MonitoringInfoConstants.Labels.NAME,
             TestSideInputIsAccessibleForDownstreamCallersDoFn.USER_COUNTER_NAME);
-    builder.setLabel(MonitoringInfoConstants.Labels.PTRANSFORM, TEST_PTRANSFORM_ID);
+    builder.setLabel(MonitoringInfoConstants.Labels.PTRANSFORM, TEST_TRANSFORM_ID);
     builder.setInt64Value(2);
     expected.add(builder.build());
 
@@ -756,7 +747,7 @@
     PCollection<KV<String, String>> valuePCollection =
         p.apply(Create.of(KV.of("unused", "unused")));
     PCollection<String> outputPCollection =
-        valuePCollection.apply(TEST_PTRANSFORM_ID, ParDo.of(new TestTimerfulDoFn()));
+        valuePCollection.apply(TEST_TRANSFORM_ID, ParDo.of(new TestTimerfulDoFn()));
 
     SdkComponents sdkComponents = SdkComponents.create();
     sdkComponents.registerEnvironment(Environment.getDefaultInstance());
@@ -776,7 +767,7 @@
         pProto
             .getComponents()
             .getTransformsOrThrow(
-                pProto.getComponents().getTransformsOrThrow(TEST_PTRANSFORM_ID).getSubtransforms(0))
+                pProto.getComponents().getTransformsOrThrow(TEST_TRANSFORM_ID).getSubtransforms(0))
             .toBuilder()
             // We need to re-write the "output" PCollections that a runner would have inserted
             // on the way to an output sink.
@@ -800,16 +791,16 @@
             metricsContainerRegistry, mock(ExecutionStateTracker.class));
     consumers.register(
         outputPCollectionId,
-        TEST_PTRANSFORM_ID,
+        TEST_TRANSFORM_ID,
         (FnDataReceiver) (FnDataReceiver<WindowedValue<String>>) mainOutputValues::add);
     consumers.register(
         eventTimerOutputPCollectionId,
-        TEST_PTRANSFORM_ID,
+        TEST_TRANSFORM_ID,
         (FnDataReceiver)
             (FnDataReceiver<WindowedValue<KV<String, Timer>>>) eventTimerOutputValues::add);
     consumers.register(
         processingTimerOutputPCollectionId,
-        TEST_PTRANSFORM_ID,
+        TEST_TRANSFORM_ID,
         (FnDataReceiver)
             (FnDataReceiver<WindowedValue<KV<String, Timer>>>) processingTimerOutputValues::add);
 
@@ -825,7 +816,7 @@
             PipelineOptionsFactory.create(),
             null /* beamFnDataClient */,
             fakeClient,
-            TEST_PTRANSFORM_ID,
+            TEST_TRANSFORM_ID,
             pTransform,
             Suppliers.ofInstance("57L")::get,
             ImmutableMap.<String, RunnerApi.PCollection>builder()
@@ -967,7 +958,7 @@
     return StateKey.newBuilder()
         .setMultimapSideInput(
             StateKey.MultimapSideInput.newBuilder()
-                .setPtransformId(TEST_PTRANSFORM_ID)
+                .setTransformId(TEST_TRANSFORM_ID)
                 .setSideInputId(sideInputId)
                 .setKey(key)
                 .setWindow(windowKey))
diff --git a/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/control/ProcessBundleHandlerTest.java b/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/control/ProcessBundleHandlerTest.java
index 3406b33..54c1d1e 100644
--- a/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/control/ProcessBundleHandlerTest.java
+++ b/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/control/ProcessBundleHandlerTest.java
@@ -143,8 +143,7 @@
         BeamFnApi.InstructionRequest.newBuilder()
             .setInstructionId("999L")
             .setProcessBundle(
-                BeamFnApi.ProcessBundleRequest.newBuilder()
-                    .setProcessBundleDescriptorReference("1L"))
+                BeamFnApi.ProcessBundleRequest.newBuilder().setProcessBundleDescriptorId("1L"))
             .build());
 
     // Processing of transforms is performed in reverse order.
@@ -197,8 +196,7 @@
     handler.processBundle(
         BeamFnApi.InstructionRequest.newBuilder()
             .setProcessBundle(
-                BeamFnApi.ProcessBundleRequest.newBuilder()
-                    .setProcessBundleDescriptorReference("1L"))
+                BeamFnApi.ProcessBundleRequest.newBuilder().setProcessBundleDescriptorId("1L"))
             .build());
   }
 
@@ -245,8 +243,7 @@
     handler.processBundle(
         BeamFnApi.InstructionRequest.newBuilder()
             .setProcessBundle(
-                BeamFnApi.ProcessBundleRequest.newBuilder()
-                    .setProcessBundleDescriptorReference("1L"))
+                BeamFnApi.ProcessBundleRequest.newBuilder().setProcessBundleDescriptorId("1L"))
             .build());
   }
 
@@ -293,8 +290,7 @@
     handler.processBundle(
         BeamFnApi.InstructionRequest.newBuilder()
             .setProcessBundle(
-                BeamFnApi.ProcessBundleRequest.newBuilder()
-                    .setProcessBundleDescriptorReference("1L"))
+                BeamFnApi.ProcessBundleRequest.newBuilder().setProcessBundleDescriptorId("1L"))
             .build());
   }
 
@@ -331,7 +327,7 @@
                          // Simulate sleeping, which introduces a race that usually requires
                          // the ProcessBundleHandler to block.
                         Uninterruptibles.sleepUninterruptibly(500, TimeUnit.MILLISECONDS);
-                        switch (stateRequestBuilder.getInstructionReference()) {
+                        switch (stateRequestBuilder.getInstructionId()) {
                           case "SUCCESS":
                             completableFuture.complete(StateResponse.getDefaultInstance());
                             break;
@@ -378,18 +374,15 @@
 
                   private void doStateCalls(BeamFnStateClient beamFnStateClient) {
                     beamFnStateClient.handle(
-                        StateRequest.newBuilder().setInstructionReference("SUCCESS"),
-                        successfulResponse);
+                        StateRequest.newBuilder().setInstructionId("SUCCESS"), successfulResponse);
                     beamFnStateClient.handle(
-                        StateRequest.newBuilder().setInstructionReference("FAIL"),
-                        unsuccessfulResponse);
+                        StateRequest.newBuilder().setInstructionId("FAIL"), unsuccessfulResponse);
                   }
                 }));
     handler.processBundle(
         BeamFnApi.InstructionRequest.newBuilder()
             .setProcessBundle(
-                BeamFnApi.ProcessBundleRequest.newBuilder()
-                    .setProcessBundleDescriptorReference("1L"))
+                BeamFnApi.ProcessBundleRequest.newBuilder().setProcessBundleDescriptorId("1L"))
             .build());
 
     assertTrue(successfulResponse.isDone());
@@ -442,15 +435,14 @@
                     thrown.expect(IllegalStateException.class);
                     thrown.expectMessage("State API calls are unsupported");
                     beamFnStateClient.handle(
-                        StateRequest.newBuilder().setInstructionReference("SUCCESS"),
+                        StateRequest.newBuilder().setInstructionId("SUCCESS"),
                         new CompletableFuture<>());
                   }
                 }));
     handler.processBundle(
         BeamFnApi.InstructionRequest.newBuilder()
             .setProcessBundle(
-                BeamFnApi.ProcessBundleRequest.newBuilder()
-                    .setProcessBundleDescriptorReference("1L"))
+                BeamFnApi.ProcessBundleRequest.newBuilder().setProcessBundleDescriptorId("1L"))
             .build());
   }
 
diff --git a/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/data/BeamFnDataGrpcClientTest.java b/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/data/BeamFnDataGrpcClientTest.java
index ed2f8b5..6ebd961 100644
--- a/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/data/BeamFnDataGrpcClientTest.java
+++ b/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/data/BeamFnDataGrpcClientTest.java
@@ -77,8 +77,8 @@
           BeamFnApi.Elements.newBuilder()
               .addData(
                   BeamFnApi.Elements.Data.newBuilder()
-                      .setInstructionReference(ENDPOINT_A.getInstructionId())
-                      .setPtransformId(ENDPOINT_A.getPTransformId())
+                      .setInstructionId(ENDPOINT_A.getInstructionId())
+                      .setTransformId(ENDPOINT_A.getTransformId())
                       .setData(
                           ByteString.copyFrom(encodeToByteArray(CODER, valueInGlobalWindow("ABC")))
                               .concat(
@@ -89,22 +89,22 @@
           BeamFnApi.Elements.newBuilder()
               .addData(
                   BeamFnApi.Elements.Data.newBuilder()
-                      .setInstructionReference(ENDPOINT_A.getInstructionId())
-                      .setPtransformId(ENDPOINT_A.getPTransformId())
+                      .setInstructionId(ENDPOINT_A.getInstructionId())
+                      .setTransformId(ENDPOINT_A.getTransformId())
                       .setData(
                           ByteString.copyFrom(
                               encodeToByteArray(CODER, valueInGlobalWindow("GHI")))))
               .addData(
                   BeamFnApi.Elements.Data.newBuilder()
-                      .setInstructionReference(ENDPOINT_A.getInstructionId())
-                      .setPtransformId(ENDPOINT_A.getPTransformId()))
+                      .setInstructionId(ENDPOINT_A.getInstructionId())
+                      .setTransformId(ENDPOINT_A.getTransformId()))
               .build();
       ELEMENTS_B_1 =
           BeamFnApi.Elements.newBuilder()
               .addData(
                   BeamFnApi.Elements.Data.newBuilder()
-                      .setInstructionReference(ENDPOINT_B.getInstructionId())
-                      .setPtransformId(ENDPOINT_B.getPTransformId())
+                      .setInstructionId(ENDPOINT_B.getInstructionId())
+                      .setTransformId(ENDPOINT_B.getTransformId())
                       .setData(
                           ByteString.copyFrom(encodeToByteArray(CODER, valueInGlobalWindow("JKL")))
                               .concat(
@@ -112,8 +112,8 @@
                                       encodeToByteArray(CODER, valueInGlobalWindow("MNO"))))))
               .addData(
                   BeamFnApi.Elements.Data.newBuilder()
-                      .setInstructionReference(ENDPOINT_B.getInstructionId())
-                      .setPtransformId(ENDPOINT_B.getPTransformId()))
+                      .setInstructionId(ENDPOINT_B.getInstructionId())
+                      .setTransformId(ENDPOINT_B.getTransformId()))
               .build();
     } catch (Exception e) {
       throw new ExceptionInInitializerError(e);
diff --git a/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/data/BeamFnDataInboundObserverTest.java b/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/data/BeamFnDataInboundObserverTest.java
index 2bc985d..aa45df6 100644
--- a/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/data/BeamFnDataInboundObserverTest.java
+++ b/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/data/BeamFnDataInboundObserverTest.java
@@ -100,9 +100,7 @@
 
   private BeamFnApi.Elements.Data dataWith(String... values) throws Exception {
     BeamFnApi.Elements.Data.Builder builder =
-        BeamFnApi.Elements.Data.newBuilder()
-            .setInstructionReference("777L")
-            .setPtransformId("999L");
+        BeamFnApi.Elements.Data.newBuilder().setInstructionId("777L").setTransformId("999L");
     ByteString.Output output = ByteString.newOutput();
     for (String value : values) {
       CODER.encode(valueInGlobalWindow(value), output);
diff --git a/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/data/QueueingBeamFnDataClientTest.java b/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/data/QueueingBeamFnDataClientTest.java
index d2fb062..8bcacfa 100644
--- a/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/data/QueueingBeamFnDataClientTest.java
+++ b/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/data/QueueingBeamFnDataClientTest.java
@@ -86,8 +86,8 @@
           BeamFnApi.Elements.newBuilder()
               .addData(
                   BeamFnApi.Elements.Data.newBuilder()
-                      .setInstructionReference(ENDPOINT_A.getInstructionId())
-                      .setPtransformId(ENDPOINT_A.getPTransformId())
+                      .setInstructionId(ENDPOINT_A.getInstructionId())
+                      .setTransformId(ENDPOINT_A.getTransformId())
                       .setData(
                           ByteString.copyFrom(encodeToByteArray(CODER, valueInGlobalWindow("ABC")))
                               .concat(
@@ -98,22 +98,22 @@
           BeamFnApi.Elements.newBuilder()
               .addData(
                   BeamFnApi.Elements.Data.newBuilder()
-                      .setInstructionReference(ENDPOINT_A.getInstructionId())
-                      .setPtransformId(ENDPOINT_A.getPTransformId())
+                      .setInstructionId(ENDPOINT_A.getInstructionId())
+                      .setTransformId(ENDPOINT_A.getTransformId())
                       .setData(
                           ByteString.copyFrom(
                               encodeToByteArray(CODER, valueInGlobalWindow("GHI")))))
               .addData(
                   BeamFnApi.Elements.Data.newBuilder()
-                      .setInstructionReference(ENDPOINT_A.getInstructionId())
-                      .setPtransformId(ENDPOINT_A.getPTransformId()))
+                      .setInstructionId(ENDPOINT_A.getInstructionId())
+                      .setTransformId(ENDPOINT_A.getTransformId()))
               .build();
       ELEMENTS_B_1 =
           BeamFnApi.Elements.newBuilder()
               .addData(
                   BeamFnApi.Elements.Data.newBuilder()
-                      .setInstructionReference(ENDPOINT_B.getInstructionId())
-                      .setPtransformId(ENDPOINT_B.getPTransformId())
+                      .setInstructionId(ENDPOINT_B.getInstructionId())
+                      .setTransformId(ENDPOINT_B.getTransformId())
                       .setData(
                           ByteString.copyFrom(encodeToByteArray(CODER, valueInGlobalWindow("JKL")))
                               .concat(
@@ -121,8 +121,8 @@
                                       encodeToByteArray(CODER, valueInGlobalWindow("MNO"))))))
               .addData(
                   BeamFnApi.Elements.Data.newBuilder()
-                      .setInstructionReference(ENDPOINT_B.getInstructionId())
-                      .setPtransformId(ENDPOINT_B.getPTransformId()))
+                      .setInstructionId(ENDPOINT_B.getInstructionId())
+                      .setTransformId(ENDPOINT_B.getTransformId()))
               .build();
     } catch (Exception e) {
       throw new ExceptionInInitializerError(e);
diff --git a/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/state/BagUserStateTest.java b/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/state/BagUserStateTest.java
index 486c527..5b01c0f 100644
--- a/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/state/BagUserStateTest.java
+++ b/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/state/BagUserStateTest.java
@@ -119,7 +119,7 @@
     return StateKey.newBuilder()
         .setBagUserState(
             StateKey.BagUserState.newBuilder()
-                .setPtransformId("ptransformId")
+                .setTransformId("ptransformId")
                 .setUserStateId("stateId")
                 .setWindow(ByteString.copyFromUtf8("encodedWindow"))
                 .setKey(encode(id)))
diff --git a/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/state/BeamFnStateGrpcClientCacheTest.java b/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/state/BeamFnStateGrpcClientCacheTest.java
index 1f5bdcb..e1feac1 100644
--- a/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/state/BeamFnStateGrpcClientCacheTest.java
+++ b/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/state/BeamFnStateGrpcClientCacheTest.java
@@ -123,8 +123,8 @@
     CompletableFuture<StateResponse> successfulResponse = new CompletableFuture<>();
     CompletableFuture<StateResponse> unsuccessfulResponse = new CompletableFuture<>();
 
-    client.handle(StateRequest.newBuilder().setInstructionReference(SUCCESS), successfulResponse);
-    client.handle(StateRequest.newBuilder().setInstructionReference(FAIL), unsuccessfulResponse);
+    client.handle(StateRequest.newBuilder().setInstructionId(SUCCESS), successfulResponse);
+    client.handle(StateRequest.newBuilder().setInstructionId(FAIL), unsuccessfulResponse);
 
     // Wait for the client to connect.
     StreamObserver<StateResponse> outboundServerObserver = outboundServerObservers.take();
@@ -150,7 +150,7 @@
     BeamFnStateClient client = clientCache.forApiServiceDescriptor(apiServiceDescriptor);
 
     CompletableFuture<StateResponse> inflight = new CompletableFuture<>();
-    client.handle(StateRequest.newBuilder().setInstructionReference(SUCCESS), inflight);
+    client.handle(StateRequest.newBuilder().setInstructionId(SUCCESS), inflight);
 
     // Wait for the client to connect.
     StreamObserver<StateResponse> outboundServerObserver = outboundServerObservers.take();
@@ -167,7 +167,7 @@
 
     // Send a response after the client will have received an error.
     CompletableFuture<StateResponse> late = new CompletableFuture<>();
-    client.handle(StateRequest.newBuilder().setInstructionReference(SUCCESS), late);
+    client.handle(StateRequest.newBuilder().setInstructionId(SUCCESS), late);
 
     try {
       inflight.get();
@@ -182,7 +182,7 @@
     BeamFnStateClient client = clientCache.forApiServiceDescriptor(apiServiceDescriptor);
 
     CompletableFuture<StateResponse> inflight = new CompletableFuture<>();
-    client.handle(StateRequest.newBuilder().setInstructionReference(SUCCESS), inflight);
+    client.handle(StateRequest.newBuilder().setInstructionId(SUCCESS), inflight);
 
     // Wait for the client to connect.
     StreamObserver<StateResponse> outboundServerObserver = outboundServerObservers.take();
@@ -198,7 +198,7 @@
 
     // Send a response after the client will have received an error.
     CompletableFuture<StateResponse> late = new CompletableFuture<>();
-    client.handle(StateRequest.newBuilder().setInstructionReference(SUCCESS), late);
+    client.handle(StateRequest.newBuilder().setInstructionId(SUCCESS), late);
 
     try {
       inflight.get();
@@ -210,7 +210,7 @@
 
   private void handleServerRequest(
       StreamObserver<StateResponse> outboundObserver, StateRequest value) {
-    switch (value.getInstructionReference()) {
+    switch (value.getInstructionId()) {
       case SUCCESS:
         outboundObserver.onNext(StateResponse.newBuilder().setId(value.getId()).build());
         return;
diff --git a/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/state/MultimapSideInputTest.java b/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/state/MultimapSideInputTest.java
index b9d2f8f..9705267 100644
--- a/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/state/MultimapSideInputTest.java
+++ b/sdks/java/harness/src/test/java/org/apache/beam/fn/harness/state/MultimapSideInputTest.java
@@ -62,7 +62,7 @@
     return StateKey.newBuilder()
         .setMultimapSideInput(
             StateKey.MultimapSideInput.newBuilder()
-                .setPtransformId("ptransformId")
+                .setTransformId("ptransformId")
                 .setSideInputId("sideInputId")
                 .setWindow(ByteString.copyFromUtf8("encodedWindow"))
                 .setKey(encode(id)))
diff --git a/sdks/java/io/amazon-web-services/build.gradle b/sdks/java/io/amazon-web-services/build.gradle
index bbb7878..ca88447 100644
--- a/sdks/java/io/amazon-web-services/build.gradle
+++ b/sdks/java/io/amazon-web-services/build.gradle
@@ -19,7 +19,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.aws')
 
 description = "Apache Beam :: SDKs :: Java :: IO :: Amazon Web Services"
 ext.summary = "IO library to read and write Amazon Web Services services from Beam."
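
The applyJavaNature(automaticModuleName: ...) additions in this and the following build scripts stamp an Automatic-Module-Name entry into each jar's manifest, so the artifacts resolve under stable names on the Java 9+ module path instead of names derived from the jar file name. Below is a minimal consumer-side sketch of what that enables; the module and package names of the consumer are hypothetical and not part of this change.

// Hypothetical module-info.java of an application that puts Beam IO jars on the
// module path. The requires clauses resolve because the jars now declare
// Automatic-Module-Name (e.g. org.apache.beam.sdk.io.aws) in their manifests.
module com.example.pipelines {
  requires org.apache.beam.sdk.io.aws;
  requires org.apache.beam.sdk.io.kafka;
}
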
diff --git a/sdks/java/io/amazon-web-services2/build.gradle b/sdks/java/io/amazon-web-services2/build.gradle
index e8b3a7c..4d78434 100644
--- a/sdks/java/io/amazon-web-services2/build.gradle
+++ b/sdks/java/io/amazon-web-services2/build.gradle
@@ -19,7 +19,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.aws2')
 
 description = "Apache Beam :: SDKs :: Java :: IO :: Amazon Web Services 2"
 ext.summary = "IO library to read and write Amazon Web Services services from Beam."
diff --git a/sdks/java/io/amqp/build.gradle b/sdks/java/io/amqp/build.gradle
index d0667d6..a4c35a3 100644
--- a/sdks/java/io/amqp/build.gradle
+++ b/sdks/java/io/amqp/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.amqp')
 
 description = "Apache Beam :: SDKs :: Java :: IO :: AMQP"
 ext.summary = "IO to read and write using AMQP 1.0 protocol (http://www.amqp.org)."
diff --git a/sdks/java/io/bigquery-io-perf-tests/build.gradle b/sdks/java/io/bigquery-io-perf-tests/build.gradle
index 7fae1f0..ce27468 100644
--- a/sdks/java/io/bigquery-io-perf-tests/build.gradle
+++ b/sdks/java/io/bigquery-io-perf-tests/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature(exportJavadoc: false)
+applyJavaNature(exportJavadoc: false, publish: false)
 provideIntegrationTestingDependencies()
 enableJavaPerformanceTesting()
 
diff --git a/sdks/java/io/cassandra/build.gradle b/sdks/java/io/cassandra/build.gradle
index 70cbaa8..ca3015c4 100644
--- a/sdks/java/io/cassandra/build.gradle
+++ b/sdks/java/io/cassandra/build.gradle
@@ -19,7 +19,7 @@
 plugins { id 'org.apache.beam.module' }
 
 // Do not relocate guava to avoid issues with Cassandra's version.
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.cassandra')
 provideIntegrationTestingDependencies()
 enableJavaPerformanceTesting()
 
diff --git a/sdks/java/io/clickhouse/build.gradle b/sdks/java/io/clickhouse/build.gradle
index dd5dc94..ee8d382 100644
--- a/sdks/java/io/clickhouse/build.gradle
+++ b/sdks/java/io/clickhouse/build.gradle
@@ -21,6 +21,7 @@
   id 'ca.coglinc.javacc'
 }
 applyJavaNature(
+    automaticModuleName: 'org.apache.beam.sdk.io.clickhouse',
     // javacc generated code produces lint warnings
     disableLintWarnings: ['dep-ann']
 )
diff --git a/sdks/java/io/common/build.gradle b/sdks/java/io/common/build.gradle
index 078d2e3..a17e2b7 100644
--- a/sdks/java/io/common/build.gradle
+++ b/sdks/java/io/common/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature(exportJavadoc: false)
+applyJavaNature(exportJavadoc: false, automaticModuleName: 'org.apache.beam.sdk.io.common')
 
 description = "Apache Beam :: SDKs :: Java :: IO :: Common"
 ext.summary = "Code used by all Beam IOs"
diff --git a/sdks/java/io/elasticsearch-tests/elasticsearch-tests-2/build.gradle b/sdks/java/io/elasticsearch-tests/elasticsearch-tests-2/build.gradle
index e053498..0788f89 100644
--- a/sdks/java/io/elasticsearch-tests/elasticsearch-tests-2/build.gradle
+++ b/sdks/java/io/elasticsearch-tests/elasticsearch-tests-2/build.gradle
@@ -18,6 +18,7 @@
 
 plugins { id 'org.apache.beam.module' }
 applyJavaNature(
+    publish: false,
     archivesBaseName: 'beam-sdks-java-io-elasticsearch-tests-2'
 )
 provideIntegrationTestingDependencies()
diff --git a/sdks/java/io/elasticsearch-tests/elasticsearch-tests-5/build.gradle b/sdks/java/io/elasticsearch-tests/elasticsearch-tests-5/build.gradle
index fb4f5a8..543c0db 100644
--- a/sdks/java/io/elasticsearch-tests/elasticsearch-tests-5/build.gradle
+++ b/sdks/java/io/elasticsearch-tests/elasticsearch-tests-5/build.gradle
@@ -18,6 +18,7 @@
 
 plugins { id 'org.apache.beam.module' }
 applyJavaNature(
+    publish: false,
     archivesBaseName: 'beam-sdks-java-io-elasticsearch-tests-5'
 )
 provideIntegrationTestingDependencies()
diff --git a/sdks/java/io/elasticsearch-tests/elasticsearch-tests-6/build.gradle b/sdks/java/io/elasticsearch-tests/elasticsearch-tests-6/build.gradle
index 72829ad..93b59b5 100644
--- a/sdks/java/io/elasticsearch-tests/elasticsearch-tests-6/build.gradle
+++ b/sdks/java/io/elasticsearch-tests/elasticsearch-tests-6/build.gradle
@@ -18,6 +18,7 @@
 
 plugins { id 'org.apache.beam.module' }
 applyJavaNature(
+    publish: false,
     archivesBaseName: 'beam-sdks-java-io-elasticsearch-tests-6'
 )
 provideIntegrationTestingDependencies()
diff --git a/sdks/java/io/elasticsearch-tests/elasticsearch-tests-common/build.gradle b/sdks/java/io/elasticsearch-tests/elasticsearch-tests-common/build.gradle
index 8af1770..168fc5b 100644
--- a/sdks/java/io/elasticsearch-tests/elasticsearch-tests-common/build.gradle
+++ b/sdks/java/io/elasticsearch-tests/elasticsearch-tests-common/build.gradle
@@ -18,6 +18,7 @@
 
 plugins { id 'org.apache.beam.module' }
 applyJavaNature(
+    publish: false,
     archivesBaseName: 'beam-sdks-java-io-elasticsearch-tests-common'
 )
 
diff --git a/sdks/java/io/elasticsearch/build.gradle b/sdks/java/io/elasticsearch/build.gradle
index 4294583..6eca559 100644
--- a/sdks/java/io/elasticsearch/build.gradle
+++ b/sdks/java/io/elasticsearch/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.elasticsearch')
 
 description = "Apache Beam :: SDKs :: Java :: IO :: Elasticsearch"
 ext.summary = "IO to read and write on Elasticsearch"
diff --git a/sdks/java/io/file-based-io-tests/build.gradle b/sdks/java/io/file-based-io-tests/build.gradle
index 3b0503f..845a9a8 100644
--- a/sdks/java/io/file-based-io-tests/build.gradle
+++ b/sdks/java/io/file-based-io-tests/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature(exportJavadoc: false)
+applyJavaNature(exportJavadoc: false, publish: false)
 provideIntegrationTestingDependencies()
 enableJavaPerformanceTesting()
 
diff --git a/sdks/java/io/google-cloud-platform/build.gradle b/sdks/java/io/google-cloud-platform/build.gradle
index db0b225..0a9b8a9 100644
--- a/sdks/java/io/google-cloud-platform/build.gradle
+++ b/sdks/java/io/google-cloud-platform/build.gradle
@@ -20,6 +20,7 @@
 
 plugins { id 'org.apache.beam.module' }
 applyJavaNature(
+  automaticModuleName: 'org.apache.beam.sdk.io.gcp',
   enableSpotbugs: false,
 )
 
diff --git a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
index f17c390..06bf8c1 100644
--- a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
+++ b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
@@ -185,6 +185,12 @@
  * TypedRead#from(String)} and {@link TypedRead#fromQuery} respectively. Exactly one of these must
  * be specified.
  *
+ * <p>If you are reading from an authorized view with {@link TypedRead#fromQuery}, you need to use
+ * {@link TypedRead#withQueryLocation(String)} to set the location of the BigQuery job. Otherwise,
+ * Beam will try to determine that location by reading the metadata of the dataset that contains the
+ * underlying tables. With authorized views, that will result in a 403 error and the query will not
+ * be resolved.
+ *
  * <p><b>Type Conversion Table</b>
  *
  * <table border="1" cellspacing="1">
@@ -267,7 +273,7 @@
  * <p>Users can optionally specify a query priority using {@link TypedRead#withQueryPriority(
  * TypedRead.QueryPriority)} and a geographic location where the query will be executed using {@link
  * TypedRead#withQueryLocation(String)}. Query location must be specified for jobs that are not
- * executed in US or EU. See <a
+ * executed in US or EU, or if you are reading from an authorized view. See <a
  * href="https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/query">BigQuery Jobs:
  * query</a>.
  *
@@ -1448,8 +1454,9 @@
      * BigQuery geographic location where the query <a
      * href="https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs">job</a> will be
      * executed. If not specified, Beam tries to determine the location by examining the tables
-     * referenced by the query. Location must be specified for queries not executed in US or EU. See
-     * <a href="https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/query">BigQuery Jobs:
+     * referenced by the query. Location must be specified for queries not executed in US or EU, or
+     * when you are reading from an authorized view. See <a
+     * href="https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/query">BigQuery Jobs:
      * query</a>.
      */
     public TypedRead<T> withQueryLocation(String location) {
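
To make the new Javadoc guidance concrete, here is a minimal sketch of reading an authorized view with an explicitly set query location; the project, view, and "EU" location are placeholders, not values taken from this change.

// A minimal sketch, assuming an authorized view living in an EU-located dataset.
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;

public class AuthorizedViewReadSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    PCollection<TableRow> rows =
        p.apply(
            BigQueryIO.readTableRows()
                .fromQuery("SELECT user, score FROM `my-project.shared_views.authorized_view`")
                .usingStandardSql()
                // Set the job location explicitly instead of letting Beam infer it from
                // dataset metadata, which fails with a 403 for authorized views.
                .withQueryLocation("EU"));
    p.run().waitUntilFinish();
  }
}
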
diff --git a/sdks/java/io/hadoop-common/build.gradle b/sdks/java/io/hadoop-common/build.gradle
index beebd40..08f60c6 100644
--- a/sdks/java/io/hadoop-common/build.gradle
+++ b/sdks/java/io/hadoop-common/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.hadoop.common')
 
 description = "Apache Beam :: SDKs :: Java :: IO :: Hadoop Common"
 ext.summary = "Library to add shared Hadoop classes among Beam IOs."
diff --git a/sdks/java/io/hadoop-file-system/build.gradle b/sdks/java/io/hadoop-file-system/build.gradle
index 46b49df..8ebdc93 100644
--- a/sdks/java/io/hadoop-file-system/build.gradle
+++ b/sdks/java/io/hadoop-file-system/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.hdfs')
 
 description = "Apache Beam :: SDKs :: Java :: IO :: Hadoop File System"
 ext.summary = "Library to read and write Hadoop/HDFS file formats from Beam."
diff --git a/sdks/java/io/hadoop-format/build.gradle b/sdks/java/io/hadoop-format/build.gradle
index f66248a..20dba8d 100644
--- a/sdks/java/io/hadoop-format/build.gradle
+++ b/sdks/java/io/hadoop-format/build.gradle
@@ -19,7 +19,7 @@
 import groovy.json.JsonOutput
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.hadoop.format')
 provideIntegrationTestingDependencies()
 enableJavaPerformanceTesting()
 
diff --git a/sdks/java/io/hbase/build.gradle b/sdks/java/io/hbase/build.gradle
index 26498a1..882eb5e 100644
--- a/sdks/java/io/hbase/build.gradle
+++ b/sdks/java/io/hbase/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.hbase')
 provideIntegrationTestingDependencies()
 enableJavaPerformanceTesting()
 
diff --git a/sdks/java/io/hcatalog/build.gradle b/sdks/java/io/hcatalog/build.gradle
index 7977f6a..da0ee24 100644
--- a/sdks/java/io/hcatalog/build.gradle
+++ b/sdks/java/io/hcatalog/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.hcatalog')
 
 description = "Apache Beam :: SDKs :: Java :: IO :: HCatalog"
 ext.summary = "IO to read and write for HCatalog source."
diff --git a/sdks/java/io/jdbc/build.gradle b/sdks/java/io/jdbc/build.gradle
index 23c53e3..d8ce5f2 100644
--- a/sdks/java/io/jdbc/build.gradle
+++ b/sdks/java/io/jdbc/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.jdbc')
 provideIntegrationTestingDependencies()
 enableJavaPerformanceTesting()
 
diff --git a/sdks/java/io/jms/build.gradle b/sdks/java/io/jms/build.gradle
index f473c5a..8056106 100644
--- a/sdks/java/io/jms/build.gradle
+++ b/sdks/java/io/jms/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.jms')
 
 description = "Apache Beam :: SDKs :: Java :: IO :: JMS"
 ext.summary = """IO to read and write to JMS (Java Messaging Service)
diff --git a/sdks/java/io/kafka/build.gradle b/sdks/java/io/kafka/build.gradle
index 610e3a3..da8655e 100644
--- a/sdks/java/io/kafka/build.gradle
+++ b/sdks/java/io/kafka/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.kafka')
 provideIntegrationTestingDependencies()
 enableJavaPerformanceTesting()
 
diff --git a/sdks/java/io/kinesis/build.gradle b/sdks/java/io/kinesis/build.gradle
index 472ad02..a73c770 100644
--- a/sdks/java/io/kinesis/build.gradle
+++ b/sdks/java/io/kinesis/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.kinesis')
 provideIntegrationTestingDependencies()
 enableJavaPerformanceTesting()
 
diff --git a/sdks/java/io/kinesis/src/main/java/org/apache/beam/sdk/io/kinesis/KinesisIO.java b/sdks/java/io/kinesis/src/main/java/org/apache/beam/sdk/io/kinesis/KinesisIO.java
index c5152e4..9ea3ff5 100644
--- a/sdks/java/io/kinesis/src/main/java/org/apache/beam/sdk/io/kinesis/KinesisIO.java
+++ b/sdks/java/io/kinesis/src/main/java/org/apache/beam/sdk/io/kinesis/KinesisIO.java
@@ -31,6 +31,7 @@
 import com.google.auto.value.AutoValue;
 import com.google.common.util.concurrent.ListenableFuture;
 import java.io.IOException;
+import java.io.ObjectInputStream;
 import java.nio.ByteBuffer;
 import java.util.ArrayList;
 import java.util.Collections;
@@ -615,6 +616,7 @@
         putFutures = Collections.synchronizedList(new ArrayList<>());
         /** Keep only the first {@link MAX_NUM_FAILURES} occurred exceptions */
         failures = new LinkedBlockingDeque<>(MAX_NUM_FAILURES);
+        initKinesisProducer();
       }
 
       private synchronized void initKinesisProducer() {
@@ -630,7 +632,14 @@
         config.setCredentialsRefreshDelay(100);
 
         // Init Kinesis producer
-        producer = spec.getAWSClientsProvider().createKinesisProducer(config);
+        if (producer == null) {
+          producer = spec.getAWSClientsProvider().createKinesisProducer(config);
+        }
+      }
+
+      private void readObject(ObjectInputStream is) throws IOException, ClassNotFoundException {
+        is.defaultReadObject();
+        initKinesisProducer();
       }
 
       /**
@@ -754,6 +763,14 @@
                 i, logEntry.toString());
         throw new IOException(errorMessage);
       }
+
+      @Teardown
+      public void teardown() throws Exception {
+        if (producer != null && producer.getOutstandingRecordsCount() > 0) {
+          producer.flushSync();
+        }
+        producer = null;
+      }
     }
   }
 
diff --git a/sdks/java/io/kinesis/src/test/java/org/apache/beam/sdk/io/kinesis/KinesisProducerMock.java b/sdks/java/io/kinesis/src/test/java/org/apache/beam/sdk/io/kinesis/KinesisProducerMock.java
index 215beec..17c8c1d 100644
--- a/sdks/java/io/kinesis/src/test/java/org/apache/beam/sdk/io/kinesis/KinesisProducerMock.java
+++ b/sdks/java/io/kinesis/src/test/java/org/apache/beam/sdk/io/kinesis/KinesisProducerMock.java
@@ -125,6 +125,6 @@
 
   @Override
   public synchronized void flushSync() {
-    throw new UnsupportedOperationException("Not implemented");
+    flush();
   }
 }
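
The KinesisIO change above re-creates the producer when an instance is revived through Java deserialization and flushes outstanding records in @Teardown; the mock's flushSync is updated to match. Below is a minimal sketch of that general pattern for a DoFn holding a non-serializable client; WriterFnSketch and FakeClient are hypothetical names and not Beam's actual Kinesis writer.

// A minimal sketch: lazily create a non-serializable client, re-create it after
// Java deserialization, and flush it in @Teardown.
import java.io.IOException;
import java.io.ObjectInputStream;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.DoFn.ProcessElement;
import org.apache.beam.sdk.transforms.DoFn.Setup;
import org.apache.beam.sdk.transforms.DoFn.Teardown;

class WriterFnSketch extends DoFn<String, Void> {
  // The client is not serializable, so it is transient and created on demand.
  private transient FakeClient client;

  @Setup
  public void setup() {
    initClient();
  }

  private synchronized void initClient() {
    if (client == null) {
      client = new FakeClient();
    }
  }

  // Invoked by Java serialization; mirrors KinesisIO's readObject + initKinesisProducer.
  private void readObject(ObjectInputStream is) throws IOException, ClassNotFoundException {
    is.defaultReadObject();
    initClient();
  }

  @ProcessElement
  public void processElement(ProcessContext c) {
    client.write(c.element());
  }

  @Teardown
  public void teardown() {
    if (client != null) {
      client.flush(); // analogous to flushing outstanding Kinesis records
      client = null;
    }
  }

  /** Hypothetical stand-in for a non-serializable producer such as a KinesisProducer. */
  static class FakeClient {
    void write(String value) {}
    void flush() {}
  }
}
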
diff --git a/sdks/java/io/kudu/build.gradle b/sdks/java/io/kudu/build.gradle
index 6de6c16..8619d67 100644
--- a/sdks/java/io/kudu/build.gradle
+++ b/sdks/java/io/kudu/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature(exportJavadoc: false)
+applyJavaNature(exportJavadoc: false, automaticModuleName: 'org.apache.beam.sdk.io.kudu')
 provideIntegrationTestingDependencies()
 enableJavaPerformanceTesting()
 
diff --git a/sdks/java/io/mongodb/build.gradle b/sdks/java/io/mongodb/build.gradle
index 8d2fc88..d8ac11c 100644
--- a/sdks/java/io/mongodb/build.gradle
+++ b/sdks/java/io/mongodb/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.mongodb')
 provideIntegrationTestingDependencies()
 enableJavaPerformanceTesting()
 
diff --git a/sdks/java/io/mqtt/build.gradle b/sdks/java/io/mqtt/build.gradle
index 543a715..9d7b188 100644
--- a/sdks/java/io/mqtt/build.gradle
+++ b/sdks/java/io/mqtt/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.mqtt')
 
 description = "Apache Beam :: SDKs :: Java :: IO :: MQTT"
 ext.summary = "IO to read and write to a MQTT broker."
diff --git a/sdks/java/io/parquet/build.gradle b/sdks/java/io/parquet/build.gradle
index 2d1a3ee..7a038f3 100644
--- a/sdks/java/io/parquet/build.gradle
+++ b/sdks/java/io/parquet/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.parquet')
 
 description = "Apache Beam :: SDKs :: Java :: IO :: Parquet"
 ext.summary = "IO to read and write on Parquet storage format."
diff --git a/sdks/java/io/rabbitmq/build.gradle b/sdks/java/io/rabbitmq/build.gradle
index 24a6a2d..52802c5 100644
--- a/sdks/java/io/rabbitmq/build.gradle
+++ b/sdks/java/io/rabbitmq/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature(exportJavadoc: false)
+applyJavaNature(exportJavadoc: false, automaticModuleName: 'org.apache.beam.sdk.io.rabbitmq')
 
 description = "Apache Beam :: SDKs :: Java :: IO :: RabbitMQ"
 ext.summary = "IO to read and write to a RabbitMQ broker."
diff --git a/sdks/java/io/redis/build.gradle b/sdks/java/io/redis/build.gradle
index 93efea8..2626400 100644
--- a/sdks/java/io/redis/build.gradle
+++ b/sdks/java/io/redis/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.redis')
 
 description = "Apache Beam :: SDKs :: Java :: IO :: Redis"
 ext.summary ="IO to read and write on a Redis keystore."
diff --git a/sdks/java/io/solr/build.gradle b/sdks/java/io/solr/build.gradle
index d4f1efb..928d6db 100644
--- a/sdks/java/io/solr/build.gradle
+++ b/sdks/java/io/solr/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.solr')
 
 description = "Apache Beam :: SDKs :: Java :: IO :: Solr"
 ext.summary = "IO to read and write from/to Solr."
diff --git a/sdks/java/io/synthetic/build.gradle b/sdks/java/io/synthetic/build.gradle
index 41d8c4e..52c794b 100644
--- a/sdks/java/io/synthetic/build.gradle
+++ b/sdks/java/io/synthetic/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature(exportJavadoc: false)
+applyJavaNature(exportJavadoc: false, automaticModuleName: 'org.apache.beam.sdk.io.synthetic')
 
 description = "Apache Beam :: SDKs :: Java :: IO :: Synthetic"
 ext.summary = "Generators of Synthetic IO for Testing."
diff --git a/sdks/java/io/tika/build.gradle b/sdks/java/io/tika/build.gradle
index d10e04e..813a692 100644
--- a/sdks/java/io/tika/build.gradle
+++ b/sdks/java/io/tika/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.tika')
 
 description = "Apache Beam :: SDKs :: Java :: IO :: Tika"
 ext.summary = "Tika Input to parse files."
diff --git a/sdks/java/io/xml/build.gradle b/sdks/java/io/xml/build.gradle
index bbdd5bb..2cb9077 100644
--- a/sdks/java/io/xml/build.gradle
+++ b/sdks/java/io/xml/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.xml')
 
 description = "Apache Beam :: SDKs :: Java :: IO :: XML"
 ext.summary = "IO to read and write XML files."
diff --git a/sdks/java/javadoc/build.gradle b/sdks/java/javadoc/build.gradle
index 8d7486e..d3d4a97 100644
--- a/sdks/java/javadoc/build.gradle
+++ b/sdks/java/javadoc/build.gradle
@@ -24,7 +24,7 @@
  * used as part of the beam-site source tree.
  */
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(publish: false)
 description = "Apache Beam :: SDKs :: Java :: Aggregated Javadoc"
 
 for (p in rootProject.subprojects) {
diff --git a/sdks/java/maven-archetypes/examples/build.gradle b/sdks/java/maven-archetypes/examples/build.gradle
index 1574c5c..beec649 100644
--- a/sdks/java/maven-archetypes/examples/build.gradle
+++ b/sdks/java/maven-archetypes/examples/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature(exportJavadoc: false)
+applyJavaNature(exportJavadoc: false, automaticModuleName: 'org.apache.beam.maven.archetypes.examples')
 
 description = "Apache Beam :: SDKs :: Java :: Maven Archetypes :: Examples"
 ext.summary = """A Maven Archetype to create a project containing all the
diff --git a/sdks/java/maven-archetypes/examples/src/main/resources/archetype-resources/pom.xml b/sdks/java/maven-archetypes/examples/src/main/resources/archetype-resources/pom.xml
index c4eac22..eb9efc6 100644
--- a/sdks/java/maven-archetypes/examples/src/main/resources/archetype-resources/pom.xml
+++ b/sdks/java/maven-archetypes/examples/src/main/resources/archetype-resources/pom.xml
@@ -368,7 +368,7 @@
       <dependencies>
         <dependency>
           <groupId>org.apache.beam</groupId>
-          <artifactId>beam-runners-jet-experimental</artifactId>
+          <artifactId>beam-runners-jet</artifactId>
           <version>${beam.version}</version>
           <scope>runtime</scope>
         </dependency>
@@ -477,7 +477,7 @@
       <artifactId>hamcrest-core</artifactId>
       <version>${hamcrest.version}</version>
     </dependency>
-    
+
     <dependency>
       <groupId>org.hamcrest</groupId>
       <artifactId>hamcrest-library</artifactId>
@@ -497,7 +497,7 @@
       <version>${beam.version}</version>
       <scope>test</scope>
     </dependency>
-    
+
     <dependency>
       <groupId>org.mockito</groupId>
       <artifactId>mockito-core</artifactId>
diff --git a/sdks/java/maven-archetypes/starter/build.gradle b/sdks/java/maven-archetypes/starter/build.gradle
index 25c65e9..850c661 100644
--- a/sdks/java/maven-archetypes/starter/build.gradle
+++ b/sdks/java/maven-archetypes/starter/build.gradle
@@ -17,7 +17,7 @@
  */
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature(exportJavadoc: false)
+applyJavaNature(exportJavadoc: false, automaticModuleName: 'org.apache.beam.maven.archetypes.starter')
 
 description = "Apache Beam :: SDKs :: Java :: Maven Archetypes :: Starter"
 ext.summary = """A Maven archetype to create a simple starter pipeline to
diff --git a/sdks/java/testing/expansion-service/build.gradle b/sdks/java/testing/expansion-service/build.gradle
index 2caabd8..bfbcc23 100644
--- a/sdks/java/testing/expansion-service/build.gradle
+++ b/sdks/java/testing/expansion-service/build.gradle
@@ -18,7 +18,7 @@
 import com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar
 
 plugins { id 'org.apache.beam.module' }
-applyJavaNature()
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.expansion.service')
 
 description = "Apache Beam :: SDKs :: Java :: Test Expansion Service"
 ext.summary = """Testing Expansion Service used for executing cross-language transform tests."""
diff --git a/sdks/java/testing/load-tests/build.gradle b/sdks/java/testing/load-tests/build.gradle
index a36d42c..fbea1a7 100644
--- a/sdks/java/testing/load-tests/build.gradle
+++ b/sdks/java/testing/load-tests/build.gradle
@@ -18,6 +18,7 @@
 
 plugins { id 'org.apache.beam.module' }
 applyJavaNature(
+    publish: false,
     archivesBaseName: 'beam-sdks-java-load-tests',
     exportJavadoc: false
 )
diff --git a/sdks/java/testing/nexmark/build.gradle b/sdks/java/testing/nexmark/build.gradle
index d44ccbe..1fdfbed 100644
--- a/sdks/java/testing/nexmark/build.gradle
+++ b/sdks/java/testing/nexmark/build.gradle
@@ -18,6 +18,7 @@
 
 plugins { id 'org.apache.beam.module' }
 applyJavaNature(
+    automaticModuleName: 'org.apache.beam.sdk.nexmark',
     exportJavadoc: false,
     archivesBaseName: 'beam-sdks-java-nexmark'
 )
@@ -56,7 +57,7 @@
   compile project(path: ":sdks:java:core", configuration: "shadow")
   compile project(":sdks:java:io:google-cloud-platform")
   compile project(":sdks:java:extensions:google-cloud-platform-core")
-  compile project(path: ":sdks:java:extensions:sql", configuration: "shadow")
+  compile project(":sdks:java:extensions:sql")
   compile project(":sdks:java:io:kafka")
   compile project(":sdks:java:testing:test-utils")
   compile library.java.google_api_services_bigquery
@@ -68,7 +69,6 @@
   compile library.java.slf4j_api
   compile library.java.commons_lang3
   compile library.java.kafka_clients
-  compile project(path: ":runners:direct-java", configuration: "shadow")
   provided library.java.junit
   provided library.java.hamcrest_core
   testRuntimeClasspath library.java.slf4j_jdk14
@@ -102,7 +102,7 @@
 //
 // Parameters:
 //   -Pnexmark.runner
-//       Specify a runner subproject, such as ":runners:spark" or ":runners:flink:1.5"
+//       Specify a runner subproject, such as ":runners:spark" or ":runners:flink:1.8"
 //       Defaults to ":runners:direct-java"
 //
 //   -Pnexmark.args
diff --git a/sdks/java/testing/nexmark/src/test/java/org/apache/beam/sdk/nexmark/model/sql/RowSizeTest.java b/sdks/java/testing/nexmark/src/test/java/org/apache/beam/sdk/nexmark/model/sql/RowSizeTest.java
index 5e783d0..aa1709a 100644
--- a/sdks/java/testing/nexmark/src/test/java/org/apache/beam/sdk/nexmark/model/sql/RowSizeTest.java
+++ b/sdks/java/testing/nexmark/src/test/java/org/apache/beam/sdk/nexmark/model/sql/RowSizeTest.java
@@ -29,7 +29,6 @@
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.testing.TestStream;
 import org.apache.beam.sdk.transforms.SerializableFunction;
-import org.apache.beam.sdk.transforms.SerializableFunctions;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.Row;
 import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Iterables;
@@ -90,11 +89,7 @@
   public void testParDoConvertsToRecordSize() throws Exception {
     PCollection<Row> rows =
         testPipeline.apply(
-            TestStream.create(
-                    SchemaCoder.of(
-                        ROW_TYPE,
-                        SerializableFunctions.identity(),
-                        SerializableFunctions.identity()))
+            TestStream.create(SchemaCoder.of(ROW_TYPE))
                 .addElements(ROW)
                 .advanceWatermarkToInfinity());
 
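
The simplified SchemaCoder.of(schema) call above no longer requires explicit to/from-Row conversion functions when the elements are already Rows. A minimal standalone sketch of the same pattern follows; the schema fields and values are illustrative only.

// A minimal sketch of building a TestStream of Rows with SchemaCoder.of(schema).
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.schemas.SchemaCoder;
import org.apache.beam.sdk.testing.TestStream;
import org.apache.beam.sdk.values.Row;

public class RowTestStreamSketch {
  public static void main(String[] args) {
    Schema schema = Schema.builder().addStringField("name").addInt32Field("count").build();
    Row row = Row.withSchema(schema).addValues("beam", 42).build();

    // No identity conversion functions are needed for Row elements anymore.
    TestStream<Row> stream =
        TestStream.create(SchemaCoder.of(schema))
            .addElements(row)
            .advanceWatermarkToInfinity();

    System.out.println(stream.getEvents().size() + " event(s) queued for " + row);
  }
}
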
diff --git a/sdks/java/testing/test-utils/build.gradle b/sdks/java/testing/test-utils/build.gradle
index b2f50b3..45b007d 100644
--- a/sdks/java/testing/test-utils/build.gradle
+++ b/sdks/java/testing/test-utils/build.gradle
@@ -19,6 +19,7 @@
 plugins { id 'org.apache.beam.module' }
 applyJavaNature(
     exportJavadoc: false,
+    automaticModuleName: 'org.apache.beam.sdk.testutils',
     archivesBaseName: 'beam-sdks-java-test-utils'
 )
 
diff --git a/sdks/python/apache_beam/coders/coders.py b/sdks/python/apache_beam/coders/coders.py
index 208b359..216b432 100644
--- a/sdks/python/apache_beam/coders/coders.py
+++ b/sdks/python/apache_beam/coders/coders.py
@@ -88,6 +88,14 @@
     """Decodes the given byte string into the corresponding object."""
     raise NotImplementedError('Decode not implemented: %s.' % self)
 
+  def encode_nested(self, value):
+    """Uses the underlying implementation to encode in nested format."""
+    return self.get_impl().encode_nested(value)
+
+  def decode_nested(self, encoded):
+    """Uses the underlying implementation to decode in nested format."""
+    return self.get_impl().decode_nested(encoded)
+
   def is_deterministic(self):
     """Whether this coder is guaranteed to encode values deterministically.
 
diff --git a/sdks/python/apache_beam/io/mongodbio.py b/sdks/python/apache_beam/io/mongodbio.py
index 39fec8d..6004ca1 100644
--- a/sdks/python/apache_beam/io/mongodbio.py
+++ b/sdks/python/apache_beam/io/mongodbio.py
@@ -203,8 +203,8 @@
     res['uri'] = self.uri
     res['database'] = self.db
     res['collection'] = self.coll
-    res['filter'] = self.filter
-    res['project'] = self.projection
+    res['filter'] = json.dumps(self.filter)
+    res['projection'] = str(self.projection)
     res['mongo_client_spec'] = json.dumps(self.spec)
     return res
 
diff --git a/sdks/python/apache_beam/io/parquetio.py b/sdks/python/apache_beam/io/parquetio.py
index 4f0a2ef..a4e894cd 100644
--- a/sdks/python/apache_beam/io/parquetio.py
+++ b/sdks/python/apache_beam/io/parquetio.py
@@ -37,6 +37,8 @@
 from apache_beam.io.iobase import RangeTracker
 from apache_beam.io.iobase import Read
 from apache_beam.io.iobase import Write
+from apache_beam.transforms import DoFn
+from apache_beam.transforms import ParDo
 from apache_beam.transforms import PTransform
 
 try:
@@ -46,13 +48,87 @@
   pa = None
   pq = None
 
-__all__ = ['ReadFromParquet', 'ReadAllFromParquet', 'WriteToParquet']
+__all__ = ['ReadFromParquet', 'ReadAllFromParquet', 'ReadFromParquetBatched',
+           'ReadAllFromParquetBatched', 'WriteToParquet']
+
+
+class _ArrowTableToRowDictionaries(DoFn):
+  """ A DoFn that consumes an Arrow table and yields a python dictionary for
+  each row in the table."""
+  def process(self, table):
+    num_rows = table.num_rows
+    data_items = table.to_pydict().items()
+    for n in range(num_rows):
+      row = {}
+      for column, values in data_items:
+        row[column] = values[n]
+      yield row
+
+
+class ReadFromParquetBatched(PTransform):
+  """A :class:`~apache_beam.transforms.ptransform.PTransform` for reading
+     Parquet files as a `PCollection` of `pyarrow.Table`. This `PTransform` is
+     currently experimental. No backward-compatibility guarantees."""
+
+  def __init__(self, file_pattern=None, min_bundle_size=0,
+               validate=True, columns=None):
+    """ Initializes :class:`~ReadFromParquetBatched`
+
+    An alternative to :class:`~ReadFromParquet` that yields each row group from
+    the Parquet file as a `pyarrow.Table`.  These Table instances can be
+    processed directly, or converted to a pandas DataFrame for processing.  For
+    more information on supported types and schema, please see the pyarrow
+    documentation.
+
+    .. testcode::
+
+      with beam.Pipeline() as p:
+        dataframes = p \\
+            | 'Read' >> beam.io.ReadFromParquetBatched('/mypath/mypqfiles*') \\
+            | 'Convert to pandas' >> beam.Map(lambda table: table.to_pandas())
+
+    .. NOTE: We're not actually interested in this error, but if we get here,
+       it means that the way of calling this transform hasn't changed.
+
+    .. testoutput::
+      :hide:
+
+      Traceback (most recent call last):
+       ...
+      IOError: No files found based on the file pattern
+
+    See also: :class:`~ReadFromParquet`.
+
+    Args:
+      file_pattern (str): the file glob to read
+      min_bundle_size (int): the minimum size in bytes, to be considered when
+        splitting the input into bundles.
+      validate (bool): flag to verify that the files exist during the pipeline
+        creation time.
+      columns (List[str]): list of columns that will be read from files.
+        A column name may be a prefix of a nested field, e.g. 'a' will select
+        'a.b', 'a.c', and 'a.d.e'
+    """
+
+    super(ReadFromParquetBatched, self).__init__()
+    self._source = _create_parquet_source(
+        file_pattern,
+        min_bundle_size,
+        validate=validate,
+        columns=columns,
+    )
+
+  def expand(self, pvalue):
+    return pvalue.pipeline | Read(self._source)
+
+  def display_data(self):
+    return {'source_dd': self._source}
 
 
 class ReadFromParquet(PTransform):
   """A :class:`~apache_beam.transforms.ptransform.PTransform` for reading
-     Parquet files. This `PTransform` is currently experimental. No
-     backward-compatibility guarantees."""
+     Parquet files as a `PCollection` of dictionaries. This `PTransform` is
+     currently experimental. No backward-compatibility guarantees."""
 
   def __init__(self, file_pattern=None, min_bundle_size=0,
                validate=True, columns=None):
@@ -87,8 +163,9 @@
     that are of simple types will be mapped into corresponding Python types.
     Records that are of complex types like list and struct will be mapped to
     Python list and dictionary respectively. For more information on supported
-    types and schema, please see the pyarrow document.
+    types and schema, please see the pyarrow documentation.
 
+    See also: :class:`~ReadFromParquetBatched`.
 
     Args:
       file_pattern (str): the file glob to read
@@ -99,29 +176,29 @@
       columns (List[str]): list of columns that will be read from files.
         A column name may be a prefix of a nested field, e.g. 'a' will select
         'a.b', 'a.c', and 'a.d.e'
-"""
+    """
     super(ReadFromParquet, self).__init__()
     self._source = _create_parquet_source(
         file_pattern,
         min_bundle_size,
         validate=validate,
-        columns=columns
+        columns=columns,
     )
 
   def expand(self, pvalue):
-    return pvalue.pipeline | Read(self._source)
+    return pvalue | Read(self._source) | ParDo(_ArrowTableToRowDictionaries())
 
   def display_data(self):
     return {'source_dd': self._source}
 
 
-class ReadAllFromParquet(PTransform):
+class ReadAllFromParquetBatched(PTransform):
   """A ``PTransform`` for reading ``PCollection`` of Parquet files.
 
    Uses source ``_ParquetSource`` to read a ``PCollection`` of Parquet files or
-   file patterns and produce a ``PCollection`` of Parquet records. This
-   ``PTransform`` is currently experimental. No backward-compatibility
-   guarantees.
+   file patterns and produce a ``PCollection`` of ``pyarrow.Table``, one for
+   each Parquet file row group. This ``PTransform`` is currently experimental.
+   No backward-compatibility guarantees.
   """
 
   DEFAULT_DESIRED_BUNDLE_SIZE = 64 * 1024 * 1024  # 64MB
@@ -141,7 +218,7 @@
                        may be a prefix of a nested field, e.g. 'a' will select
                        'a.b', 'a.c', and 'a.d.e'
     """
-    super(ReadAllFromParquet, self).__init__()
+    super(ReadAllFromParquetBatched, self).__init__()
     source_from_file = partial(
         _create_parquet_source,
         min_bundle_size=min_bundle_size,
@@ -157,6 +234,14 @@
     return pvalue | self.label >> self._read_all_files
 
 
+class ReadAllFromParquet(PTransform):
+  def __init__(self, **kwargs):
+    self._read_batches = ReadAllFromParquetBatched(**kwargs)
+
+  def expand(self, pvalue):
+    return pvalue | self._read_batches | ParDo(_ArrowTableToRowDictionaries())
+
+
 def _create_parquet_source(file_pattern=None,
                            min_bundle_size=0,
                            validate=False,
@@ -166,7 +251,7 @@
         file_pattern=file_pattern,
         min_bundle_size=min_bundle_size,
         validate=validate,
-        columns=columns
+        columns=columns,
     )
 
 
@@ -245,13 +330,7 @@
         else:
           next_block_start = range_tracker.stop_position()
 
-        num_rows = table.num_rows
-        data_items = table.to_pydict().items()
-        for n in range(num_rows):
-          row = {}
-          for column, values in data_items:
-            row[column] = values[n]
-          yield row
+        yield table
 
 
 class WriteToParquet(PTransform):
diff --git a/sdks/python/apache_beam/io/parquetio_test.py b/sdks/python/apache_beam/io/parquetio_test.py
index e7cf3f4..d9b488d 100644
--- a/sdks/python/apache_beam/io/parquetio_test.py
+++ b/sdks/python/apache_beam/io/parquetio_test.py
@@ -35,7 +35,9 @@
 from apache_beam.io import source_test_utils
 from apache_beam.io.iobase import RangeTracker
 from apache_beam.io.parquetio import ReadAllFromParquet
+from apache_beam.io.parquetio import ReadAllFromParquetBatched
 from apache_beam.io.parquetio import ReadFromParquet
+from apache_beam.io.parquetio import ReadFromParquetBatched
 from apache_beam.io.parquetio import WriteToParquet
 from apache_beam.io.parquetio import _create_parquet_sink
 from apache_beam.io.parquetio import _create_parquet_source
@@ -113,6 +115,23 @@
       col_list.append(column)
     return col_list
 
+  def _records_as_arrow(self, schema=None, count=None):
+    if schema is None:
+      schema = self.SCHEMA
+
+    if count is None:
+      count = len(self.RECORDS)
+
+    len_records = len(self.RECORDS)
+    data = []
+    for i in range(count):
+      data.append(self.RECORDS[i % len_records])
+    col_data = self._record_to_columns(data, schema)
+    col_array = [
+        pa.array(c, schema.types[cn]) for cn, c in enumerate(col_data)
+    ]
+    return pa.Table.from_arrays(col_array, schema.names)
+
   def _write_data(self,
                   directory=None,
                   schema=None,
@@ -120,26 +139,12 @@
                   row_group_size=1000,
                   codec='none',
                   count=None):
-    if schema is None:
-      schema = self.SCHEMA
-
     if directory is None:
       directory = self.temp_dir
 
-    if count is None:
-      count = len(self.RECORDS)
-
     with tempfile.NamedTemporaryFile(
         delete=False, dir=directory, prefix=prefix) as f:
-      len_records = len(self.RECORDS)
-      data = []
-      for i in range(count):
-        data.append(self.RECORDS[i % len_records])
-      col_data = self._record_to_columns(data, schema)
-      col_array = [
-          pa.array(c, schema.types[cn]) for cn, c in enumerate(col_data)
-      ]
-      table = pa.Table.from_arrays(col_array, schema.names)
+      table = self._records_as_arrow(schema, count)
       pq.write_table(
           table, f, row_group_size=row_group_size, compression=codec,
           use_deprecated_int96_timestamps=True
@@ -177,12 +182,12 @@
 
   def test_read_without_splitting(self):
     file_name = self._write_data()
-    expected_result = self.RECORDS
+    expected_result = [self._records_as_arrow()]
     self._run_parquet_test(file_name, None, None, False, expected_result)
 
   def test_read_with_splitting(self):
     file_name = self._write_data()
-    expected_result = self.RECORDS
+    expected_result = [self._records_as_arrow()]
     self._run_parquet_test(file_name, None, 100, True, expected_result)
 
   def test_source_display_data(self):
@@ -205,12 +210,19 @@
       ReadFromParquet(
           file_name,
           validate=False)
-    dd = DisplayData.create_from(read)
+    read_batched = \
+      ReadFromParquetBatched(
+          file_name,
+          validate=False)
 
     expected_items = [
         DisplayDataItemMatcher('compression', 'auto'),
         DisplayDataItemMatcher('file_pattern', file_name)]
-    hc.assert_that(dd.items, hc.contains_inanyorder(*expected_items))
+
+    hc.assert_that(DisplayData.create_from(read).items,
+                   hc.contains_inanyorder(*expected_items))
+    hc.assert_that(DisplayData.create_from(read_batched).items,
+                   hc.contains_inanyorder(*expected_items))
 
   def test_sink_display_data(self):
     file_name = 'some_parquet_sink'
@@ -271,6 +283,8 @@
       path = dst.name
       # pylint: disable=c-extension-no-member
       with self.assertRaises(pl.ArrowInvalid):
+        # Should throw an error "ArrowInvalid: Casting from timestamp[ns] to
+        # timestamp[us] would lose data"
         with TestPipeline() as p:
           _ = p \
           | Create(self.RECORDS) \
@@ -293,6 +307,21 @@
             | Map(json.dumps)
         assert_that(readback, equal_to([json.dumps(r) for r in self.RECORDS]))
 
+  def test_batched_read(self):
+    with tempfile.NamedTemporaryFile() as dst:
+      path = dst.name
+      with TestPipeline() as p:
+        _ = p \
+        | Create(self.RECORDS) \
+        | WriteToParquet(
+            path, self.SCHEMA, num_shards=1, shard_name_template='')
+      with TestPipeline() as p:
+        # json used for stable sortability
+        readback = \
+            p \
+            | ReadFromParquetBatched(path)
+        assert_that(readback, equal_to([self._records_as_arrow()]))
+
   @parameterized.expand([
       param(compression_type='snappy'),
       param(compression_type='gzip'),
@@ -318,18 +347,28 @@
         assert_that(readback, equal_to([json.dumps(r) for r in self.RECORDS]))
 
   def test_read_reentrant(self):
-    file_name = self._write_data()
+    file_name = self._write_data(count=6, row_group_size=3)
     source = _create_parquet_source(file_name)
     source_test_utils.assert_reentrant_reads_succeed((source, None, None))
 
   def test_read_without_splitting_multiple_row_group(self):
-    file_name = self._write_data(count=12000)
-    expected_result = self.RECORDS * 2000
+    file_name = self._write_data(count=12000, row_group_size=1000)
+    # We expect 12000 elements, split into batches of 1000 elements. Create
+    # a list of pa.Table instances to model this expectation
+    expected_result = [
+        pa.Table.from_batches([batch]) for batch in self._records_as_arrow(
+            count=12000).to_batches(chunksize=1000)
+    ]
     self._run_parquet_test(file_name, None, None, False, expected_result)
 
   def test_read_with_splitting_multiple_row_group(self):
-    file_name = self._write_data(count=12000)
-    expected_result = self.RECORDS * 2000
+    file_name = self._write_data(count=12000, row_group_size=1000)
+    # We expect 12000 elements, split into batches of 1000 elements. Create
+    # a list of pa.Table instances to model this expectation
+    expected_result = [
+        pa.Table.from_batches([batch]) for batch in self._records_as_arrow(
+            count=12000).to_batches(chunksize=1000)
+    ]
     self._run_parquet_test(file_name, None, 10000, True, expected_result)
 
   def test_dynamic_work_rebalancing(self):
@@ -370,9 +409,11 @@
   def test_int96_type_conversion(self):
     file_name = self._write_data(
         count=120, row_group_size=20, schema=self.SCHEMA96)
+    orig = self._records_as_arrow(count=120, schema=self.SCHEMA96)
     expected_result = [
-        self._convert_to_timestamped_record(x) for x in self.RECORDS
-    ] * 20
+        pa.Table.from_batches([batch])
+        for batch in orig.to_batches(chunksize=20)
+    ]
     self._run_parquet_test(file_name, None, None, False, expected_result)
 
   def test_split_points(self):
@@ -397,16 +438,17 @@
     # When reading records of the first group, range_tracker.split_points()
     # should return (0, iobase.RangeTracker.SPLIT_POINTS_UNKNOWN)
     self.assertEqual(
-        split_points_report[:10],
-        [(0, RangeTracker.SPLIT_POINTS_UNKNOWN)] * 10)
-
-    # When reading records of last group, range_tracker.split_points() should
-    # return (3, 1)
-    self.assertEqual(split_points_report[-10:], [(3, 1)] * 10)
+        split_points_report,
+        [(0, RangeTracker.SPLIT_POINTS_UNKNOWN),
+         (1, RangeTracker.SPLIT_POINTS_UNKNOWN),
+         (2, RangeTracker.SPLIT_POINTS_UNKNOWN),
+         (3, 1),
+        ])
 
   def test_selective_columns(self):
     file_name = self._write_data()
-    expected_result = [{'name': r['name']} for r in self.RECORDS]
+    orig = self._records_as_arrow()
+    expected_result = [pa.Table.from_arrays([orig.column('name')])]
     self._run_parquet_test(file_name, ['name'], None, False, expected_result)
 
   def test_sink_transform_multiple_row_group(self):
@@ -430,6 +472,13 @@
           | ReadAllFromParquet(),
           equal_to(self.RECORDS))
 
+    with TestPipeline() as p:
+      assert_that(
+          p \
+          | Create([path]) \
+          | ReadAllFromParquetBatched(),
+          equal_to([self._records_as_arrow()]))
+
   def test_read_all_from_parquet_many_single_files(self):
     path1 = self._write_data()
     path2 = self._write_data()
@@ -440,6 +489,12 @@
           | Create([path1, path2, path3]) \
           | ReadAllFromParquet(),
           equal_to(self.RECORDS * 3))
+    with TestPipeline() as p:
+      assert_that(
+          p \
+          | Create([path1, path2, path3]) \
+          | ReadAllFromParquetBatched(),
+          equal_to([self._records_as_arrow()] * 3))
 
   def test_read_all_from_parquet_file_pattern(self):
     file_pattern = self._write_pattern(5)
@@ -449,6 +504,12 @@
           | Create([file_pattern]) \
           | ReadAllFromParquet(),
           equal_to(self.RECORDS * 5))
+    with TestPipeline() as p:
+      assert_that(
+          p \
+          | Create([file_pattern]) \
+          | ReadAllFromParquetBatched(),
+          equal_to([self._records_as_arrow()] * 5))
 
   def test_read_all_from_parquet_many_file_patterns(self):
     file_pattern1 = self._write_pattern(5)
@@ -460,6 +521,12 @@
           | Create([file_pattern1, file_pattern2, file_pattern3]) \
           | ReadAllFromParquet(),
           equal_to(self.RECORDS * 10))
+    with TestPipeline() as p:
+      assert_that(
+          p \
+          | Create([file_pattern1, file_pattern2, file_pattern3]) \
+          | ReadAllFromParquetBatched(),
+          equal_to([self._records_as_arrow()] * 10))
 
 
 if __name__ == '__main__':
diff --git a/sdks/python/apache_beam/options/pipeline_options.py b/sdks/python/apache_beam/options/pipeline_options.py
index 95e1bc8..fb65e26 100644
--- a/sdks/python/apache_beam/options/pipeline_options.py
+++ b/sdks/python/apache_beam/options/pipeline_options.py
@@ -680,6 +680,16 @@
          'enabled with this flag. Please sync with the owners of the runner '
          'before enabling any experiments.'))
 
+    parser.add_argument(
+        '--number_of_worker_harness_threads',
+        type=int,
+        default=None,
+        help=
+        ('Number of threads per worker to use on the runner. If left '
+         'unspecified, the runner will compute an appropriate number of '
+         'threads to use. Currently only enabled for DataflowRunner when '
+         'experiment \'use_unified_worker\' is enabled.'))
+
   def add_experiment(self, experiment):
     # pylint: disable=access-member-before-definition
     if self.experiments is None:
diff --git a/sdks/python/apache_beam/pipeline_test.py b/sdks/python/apache_beam/pipeline_test.py
index d1d9d0d..e01e100 100644
--- a/sdks/python/apache_beam/pipeline_test.py
+++ b/sdks/python/apache_beam/pipeline_test.py
@@ -602,6 +602,28 @@
           p | Create([1, 2]) | beam.Map(lambda _, t=DoFn.TimestampParam: t),
           equal_to([MIN_TIMESTAMP, MIN_TIMESTAMP]))
 
+  def test_incomparable_default(self):
+
+    class IncomparableType(object):
+
+      def __eq__(self, other):
+        raise RuntimeError()
+
+      def __ne__(self, other):
+        raise RuntimeError()
+
+      def __hash__(self):
+        raise RuntimeError()
+
+    # Ensure that we don't use default values in a context where they must be
+    # comparable (see BEAM-8301).
+    pipeline = TestPipeline()
+    pcoll = (pipeline
+             | beam.Create([None])
+             | Map(lambda e, x=IncomparableType(): (e, type(x).__name__)))
+    assert_that(pcoll, equal_to([(None, 'IncomparableType')]))
+    pipeline.run()
+
 
 class Bacon(PipelineOptions):
 
diff --git a/sdks/python/apache_beam/portability/python_urns.py b/sdks/python/apache_beam/portability/python_urns.py
index 980ca68..358c9b3 100644
--- a/sdks/python/apache_beam/portability/python_urns.py
+++ b/sdks/python/apache_beam/portability/python_urns.py
@@ -38,8 +38,8 @@
 EMBEDDED_PYTHON = "beam:env:embedded_python:v1"
 
 # Invoke UserFns in process, but over GRPC channels.
-# Payload: (optional) Number of worker threads, as a decimal string.
-# (Used for testing.)
+# Payload: (optional) Number of worker threads, followed by ',' and the size of
+# the state cache, as a decimal string, e.g. '2,1000'.
 EMBEDDED_PYTHON_GRPC = "beam:env:embedded_python_grpc:v1"
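
A sketch of building this payload for an embedded gRPC environment, mirroring the FnApiRunner test setup further below (values illustrative: 2 worker threads, a 100-entry state cache):

from apache_beam.portability import python_urns
from apache_beam.portability.api import beam_runner_api_pb2

env = beam_runner_api_pb2.Environment(
    urn=python_urns.EMBEDDED_PYTHON_GRPC,
    payload=b'2,100')  # '<num_worker_threads>,<state_cache_size>'
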
 
 # Instantiate SDK harness via a command line provided in the payload.
diff --git a/sdks/python/apache_beam/runners/common.py b/sdks/python/apache_beam/runners/common.py
index cd77112..541959a 100644
--- a/sdks/python/apache_beam/runners/common.py
+++ b/sdks/python/apache_beam/runners/common.py
@@ -175,11 +175,11 @@
       elif isinstance(v, core.DoFn.TimerParam):
         self.timer_args_to_replace[kw] = v.timer_spec
         self.has_userstate_arguments = True
-      elif v == core.DoFn.TimestampParam:
+      elif core.DoFn.TimestampParam == v:
         self.timestamp_arg_name = kw
-      elif v == core.DoFn.WindowParam:
+      elif core.DoFn.WindowParam == v:
         self.window_arg_name = kw
-      elif v == core.DoFn.KeyParam:
+      elif core.DoFn.KeyParam == v:
         self.key_arg_name = kw
       elif isinstance(v, core.DoFn.RestrictionParam):
         self.restriction_provider = v.restriction_provider
@@ -498,16 +498,16 @@
     # Fill the OtherPlaceholders for context, key, window or timestamp
     remaining_args_iter = iter(input_args[args_to_pick:])
     for a, d in zip(arg_names[-len(default_arg_values):], default_arg_values):
-      if d == core.DoFn.ElementParam:
+      if core.DoFn.ElementParam == d:
         args_with_placeholders.append(ArgPlaceholder(d))
-      elif d == core.DoFn.KeyParam:
+      elif core.DoFn.KeyParam == d:
         self.is_key_param_required = True
         args_with_placeholders.append(ArgPlaceholder(d))
-      elif d == core.DoFn.WindowParam:
+      elif core.DoFn.WindowParam == d:
         args_with_placeholders.append(ArgPlaceholder(d))
-      elif d == core.DoFn.TimestampParam:
+      elif core.DoFn.TimestampParam == d:
         args_with_placeholders.append(ArgPlaceholder(d))
-      elif d == core.DoFn.SideInputParam:
+      elif core.DoFn.SideInputParam == d:
         # If no more args are present then the value must be passed via kwarg
         try:
           args_with_placeholders.append(next(remaining_args_iter))
@@ -518,7 +518,7 @@
         args_with_placeholders.append(ArgPlaceholder(d))
       elif isinstance(d, core.DoFn.TimerParam):
         args_with_placeholders.append(ArgPlaceholder(d))
-      elif d == core.DoFn.BundleFinalizerParam:
+      elif isinstance(d, type) and core.DoFn.BundleFinalizerParam == d:
         args_with_placeholders.append(ArgPlaceholder(d))
       else:
         # If no more args are present then the value must be passed via kwarg
@@ -627,13 +627,13 @@
              'instead, got \'%s\'.') % (windowed_value.value,))
 
     for i, p in self.placeholders:
-      if p == core.DoFn.ElementParam:
+      if core.DoFn.ElementParam == p:
         args_for_process[i] = windowed_value.value
-      elif p == core.DoFn.KeyParam:
+      elif core.DoFn.KeyParam == p:
         args_for_process[i] = key
-      elif p == core.DoFn.WindowParam:
+      elif core.DoFn.WindowParam == p:
         args_for_process[i] = window
-      elif p == core.DoFn.TimestampParam:
+      elif core.DoFn.TimestampParam == p:
         args_for_process[i] = windowed_value.timestamp
       elif isinstance(p, core.DoFn.StateParam):
         args_for_process[i] = (
@@ -641,7 +641,7 @@
       elif isinstance(p, core.DoFn.TimerParam):
         args_for_process[i] = (
             self.user_state_context.get_timer(p.timer_spec, key, window))
-      elif p == core.DoFn.BundleFinalizerParam:
+      elif core.DoFn.BundleFinalizerParam == p:
         args_for_process[i] = self.bundle_finalizer_param
 
     if additional_kwargs:
diff --git a/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py b/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py
index fb91d97..b53f1aa 100644
--- a/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py
+++ b/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py
@@ -252,6 +252,9 @@
       pool.subnetwork = self.worker_options.subnetwork
     pool.workerHarnessContainerImage = (
         get_container_image_from_options(options))
+    if self.debug_options.number_of_worker_harness_threads:
+      pool.numThreadsPerWorker = (
+          self.debug_options.number_of_worker_harness_threads)
     if self.worker_options.use_public_ips is not None:
       if self.worker_options.use_public_ips:
         pool.ipConfiguration = (
diff --git a/sdks/python/apache_beam/runners/dataflow/internal/apiclient_test.py b/sdks/python/apache_beam/runners/dataflow/internal/apiclient_test.py
index fc465a3..b548be7 100644
--- a/sdks/python/apache_beam/runners/dataflow/internal/apiclient_test.py
+++ b/sdks/python/apache_beam/runners/dataflow/internal/apiclient_test.py
@@ -288,6 +288,18 @@
         env.proto.workerPools[0].ipConfiguration,
         dataflow.WorkerPool.IpConfigurationValueValuesEnum.WORKER_IP_PRIVATE)
 
+  def test_number_of_worker_harness_threads(self):
+    pipeline_options = PipelineOptions(
+        ['--temp_location', 'gs://any-location/temp',
+         '--number_of_worker_harness_threads', '2'])
+    env = apiclient.Environment([],
+                                pipeline_options,
+                                '2.0.0',
+                                FAKE_PIPELINE_URL)
+    self.assertEqual(
+        env.proto.workerPools[0].numThreadsPerWorker,
+        2)
+
   @mock.patch('apache_beam.runners.dataflow.internal.apiclient.'
               'beam_version.__version__', '2.2.0')
   def test_harness_override_present_in_released_sdks(self):
diff --git a/sdks/python/apache_beam/runners/portability/flink_runner.py b/sdks/python/apache_beam/runners/portability/flink_runner.py
index a904a3a..76c15ef 100644
--- a/sdks/python/apache_beam/runners/portability/flink_runner.py
+++ b/sdks/python/apache_beam/runners/portability/flink_runner.py
@@ -24,7 +24,7 @@
 from apache_beam.runners.portability import job_server
 from apache_beam.runners.portability import portable_runner
 
-PUBLISHED_FLINK_VERSIONS = ['1.6', '1.7', '1.8']
+PUBLISHED_FLINK_VERSIONS = ['1.7', '1.8']
 
 
 class FlinkRunner(portable_runner.PortableRunner):
@@ -42,6 +42,7 @@
                         help='Flink version to use.')
     parser.add_argument('--flink_job_server_jar',
                         help='Path or URL to a flink jobserver jar.')
+    parser.add_argument('--artifacts_dir', default=None,
+                        help='The location to stage artifact files; defaults '
+                        'to a temporary directory if unset.')
 
 
 class FlinkJarJobServer(job_server.JavaJarJobServer):
@@ -51,18 +52,20 @@
     self._jar = options.flink_job_server_jar
     self._master_url = options.flink_master_url
     self._flink_version = options.flink_version
+    self._artifacts_dir = options.artifacts_dir
 
   def path_to_jar(self):
     if self._jar:
       return self._jar
     else:
-      return self.path_to_gradle_target_jar(
+      return self.path_to_beam_jar(
           'runners:flink:%s:job-server:shadowJar' % self._flink_version)
 
   def java_arguments(self, job_port, artifacts_dir):
     return [
         '--flink-master-url', self._master_url,
-        '--artifacts-dir', artifacts_dir,
+        '--artifacts-dir', (self._artifacts_dir
+                            if self._artifacts_dir else artifacts_dir),
         '--job-port', job_port,
         '--artifact-port', 0,
         '--expansion-port', 0
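
A sketch (option values illustrative) of supplying the new --artifacts_dir flag; when it is left unset, the job server falls back to its temporary artifacts directory, as the fallback expression above shows:

from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    '--runner', 'FlinkRunner',
    '--flink_version', '1.8',
    '--artifacts_dir', '/tmp/beam-artifacts',  # hypothetical path
])
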
diff --git a/sdks/python/apache_beam/runners/portability/flink_runner_test.py b/sdks/python/apache_beam/runners/portability/flink_runner_test.py
index 5e94d9e..397297b 100644
--- a/sdks/python/apache_beam/runners/portability/flink_runner_test.py
+++ b/sdks/python/apache_beam/runners/portability/flink_runner_test.py
@@ -30,6 +30,7 @@
 import apache_beam as beam
 from apache_beam import Impulse
 from apache_beam import Map
+from apache_beam import Pipeline
 from apache_beam.io.external.generate_sequence import GenerateSequence
 from apache_beam.io.external.kafka import ReadFromKafka
 from apache_beam.io.external.kafka import WriteToKafka
@@ -140,7 +141,7 @@
       options = super(FlinkRunnerTest, self).create_options()
       options.view_as(DebugOptions).experiments = [
           'beam_fn_api'] + extra_experiments
-      options._all_options['parallelism'] = 1
+      options._all_options['parallelism'] = 2
       options._all_options['shutdown_sources_on_final_watermark'] = True
       options.view_as(PortableOptions).environment_type = (
           environment_type.upper())
@@ -231,28 +232,27 @@
         def process(self, v):
           self.counter.inc()
 
-      p = self.create_pipeline()
+      options = self.create_options()
+      # Test only supports parallelism of 1
+      options._all_options['parallelism'] = 1
       n = 100
-
-      # pylint: disable=expression-not-assigned
-      p \
-      | beam.Create(list(range(n))) \
-      | beam.ParDo(DoFn())
-
-      result = p.run()
-      result.wait_until_finish()
+      with Pipeline(self.get_runner(), options) as p:
+        # pylint: disable=expression-not-assigned
+        (p
+         | beam.Create(list(range(n)))
+         | beam.ParDo(DoFn()))
 
       with open(self.test_metrics_path, 'r') as f:
         lines = [line for line in f.readlines() if counter_name in line]
         self.assertEqual(
             len(lines), 1,
-            msg='Expected 1 line matching "%s":\n%s' % (
+            msg='Expected 1 line matching "{}":\n{}'.format(
                 counter_name, '\n'.join(lines))
         )
         line = lines[0]
         self.assertTrue(
-            '%s: 100' % counter_name in line,
-            msg='Failed to find expected counter %s in line %s' % (
+            '{}: {}'.format(counter_name, n) in line,
+            msg='Failed to find expected counter {} in line {}'.format(
                 counter_name, line)
         )
 
diff --git a/sdks/python/apache_beam/runners/portability/fn_api_runner.py b/sdks/python/apache_beam/runners/portability/fn_api_runner.py
index 9b9cd1c..78e774f 100644
--- a/sdks/python/apache_beam/runners/portability/fn_api_runner.py
+++ b/sdks/python/apache_beam/runners/portability/fn_api_runner.py
@@ -70,6 +70,8 @@
 from apache_beam.runners.worker import data_plane
 from apache_beam.runners.worker import sdk_worker
 from apache_beam.runners.worker.channel_factory import GRPCChannelFactory
+from apache_beam.runners.worker.sdk_worker import _Future
+from apache_beam.runners.worker.statecache import StateCache
 from apache_beam.transforms import trigger
 from apache_beam.transforms.window import GlobalWindows
 from apache_beam.utils import profiler
@@ -82,6 +84,11 @@
     beam.coders.coders.GlobalWindowCoder()).get_impl().encode_nested(
         beam.transforms.window.GlobalWindows.windowed_value(b''))
 
+# State caching is enabled in the fn_api_runner for testing, except for one
+# test which runs without state caching (FnApiRunnerTestWithDisabledCaching).
+# The cache is disabled in production for other runners.
+STATE_CACHE_SIZE = 100
+
 
 class ControlConnection(object):
 
@@ -483,15 +490,15 @@
       for key, window, elements_data in elements_by_window.encoded_items():
         state_key = beam_fn_api_pb2.StateKey(
             multimap_side_input=beam_fn_api_pb2.StateKey.MultimapSideInput(
-                ptransform_id=transform_id,
+                transform_id=transform_id,
                 side_input_id=tag,
                 window=window,
                 key=key))
-        worker_handler.state.blocking_append(state_key, elements_data)
+        worker_handler.state.append_raw(state_key, elements_data)
 
   def _run_bundle_multiple_times_for_testing(
       self, worker_handler_list, process_bundle_descriptor, data_input,
-      data_output, get_input_coder_callable):
+      data_output, get_input_coder_callable, cache_token_generator):
 
     # all workers share state, so use any worker_handler.
     worker_handler = worker_handler_list[0]
@@ -501,7 +508,9 @@
         ParallelBundleManager(
             worker_handler_list, lambda pcoll_id: [],
             get_input_coder_callable, process_bundle_descriptor,
-            self._progress_frequency, k, num_workers=self._num_workers
+            self._progress_frequency, k,
+            num_workers=self._num_workers,
+            cache_token_generator=cache_token_generator
         ).process_bundle(data_input, data_output)
       finally:
         worker_handler.state.restore()
@@ -548,11 +557,11 @@
       for delayed_application in split.residual_roots:
         deferred_inputs[
             input_for_callable(
-                delayed_application.application.ptransform_id,
+                delayed_application.application.transform_id,
                 delayed_application.application.input_id)
         ].append(delayed_application.application.element)
       for channel_split in split.channel_splits:
-        coder_impl = get_input_coder_callable(channel_split.ptransform_id)
+        coder_impl = get_input_coder_callable(channel_split.transform_id)
         # TODO(SDF): This requires deterministic ordering of buffer iteration.
         # TODO(SDF): The return split is in terms of indices.  Ideally,
         # a runner could map these back to actual positions to effectively
@@ -567,15 +576,15 @@
 
         # Decode and recode to split the encoded buffer by element index.
         all_elements = list(coder_impl.decode_all(b''.join(last_sent[
-            channel_split.ptransform_id])))
+            channel_split.transform_id])))
         residual_elements = all_elements[
             channel_split.first_residual_element : prev_stops.get(
-                channel_split.ptransform_id, len(all_elements)) + 1]
+                channel_split.transform_id, len(all_elements)) + 1]
         if residual_elements:
-          deferred_inputs[channel_split.ptransform_id].append(
+          deferred_inputs[channel_split.transform_id].append(
               coder_impl.encode_all(residual_elements))
         prev_stops[
-            channel_split.ptransform_id] = channel_split.last_primary_element
+            channel_split.transform_id] = channel_split.last_primary_element
 
   @staticmethod
   def _extract_stage_data_endpoints(
@@ -640,7 +649,7 @@
       out = create_OutputStream()
       for element in values:
         element_coder_impl.encode_to_stream(element, out, True)
-      worker_handler.state.blocking_append(
+      worker_handler.state.append_raw(
           beam_fn_api_pb2.StateKey(
               runner=beam_fn_api_pb2.StateKey.Runner(key=token)),
           out.get())
@@ -724,28 +733,30 @@
           ).coder_id
       ]].get_impl()
 
-    self._run_bundle_multiple_times_for_testing(worker_handler_list,
-                                                process_bundle_descriptor,
-                                                data_input,
-                                                data_output,
-                                                get_input_coder_impl)
+    # Change cache token across bundle repeats
+    cache_token_generator = FnApiRunner.get_cache_token_generator(static=False)
+
+    self._run_bundle_multiple_times_for_testing(
+        worker_handler_list, process_bundle_descriptor, data_input, data_output,
+        get_input_coder_impl, cache_token_generator=cache_token_generator)
 
     bundle_manager = ParallelBundleManager(
         worker_handler_list, get_buffer, get_input_coder_impl,
         process_bundle_descriptor, self._progress_frequency,
-        num_workers=self._num_workers)
+        num_workers=self._num_workers,
+        cache_token_generator=cache_token_generator)
 
     result, splits = bundle_manager.process_bundle(data_input, data_output)
 
-    def input_for(ptransform_id, input_id):
+    def input_for(transform_id, input_id):
       input_pcoll = process_bundle_descriptor.transforms[
-          ptransform_id].inputs[input_id]
+          transform_id].inputs[input_id]
       for read_id, proto in process_bundle_descriptor.transforms.items():
         if (proto.spec.urn == bundle_processor.DATA_INPUT_URN
             and input_pcoll in proto.outputs.values()):
           return read_id
       raise RuntimeError(
-          'No IO transform feeds %s' % ptransform_id)
+          'No IO transform feeds %s' % transform_id)
 
     last_result = result
     last_sent = data_input
@@ -760,7 +771,7 @@
       for delayed_application in last_result.process_bundle.residual_roots:
         deferred_inputs[
             input_for(
-                delayed_application.application.ptransform_id,
+                delayed_application.application.transform_id,
                 delayed_application.application.input_id)
         ].append(delayed_application.application.element)
 
@@ -916,7 +927,7 @@
     def process_instruction_id(self, unused_instruction_id):
       yield
 
-    def blocking_get(self, state_key, continuation_token=None):
+    def get_raw(self, state_key, continuation_token=None):
       with self._lock:
         full_state = self._state[self._to_key(state_key)]
         if self._use_continuation_tokens:
@@ -937,13 +948,22 @@
           assert not continuation_token
           return b''.join(full_state), None
 
-    def blocking_append(self, state_key, data):
+    def append_raw(self, state_key, data):
       with self._lock:
         self._state[self._to_key(state_key)].append(data)
+      return _Future.done()
 
-    def blocking_clear(self, state_key):
+    def clear(self, state_key):
       with self._lock:
-        del self._state[self._to_key(state_key)]
+        try:
+          del self._state[self._to_key(state_key)]
+        except KeyError:
+          # This may happen with the caching layer across bundles. Caching may
+          # skip this storage layer for a blocking_get(key) request. Without
+          # the caching, the state for a key would be initialized via the
+          # defaultdict that _state uses.
+          pass
+      return _Future.done()
 
     @staticmethod
     def _to_key(state_key):
@@ -955,23 +975,23 @@
 
     def State(self, request_stream, context=None):
       # Note that this eagerly mutates state, assuming any failures are fatal.
-      # Thus it is safe to ignore instruction_reference.
+      # Thus it is safe to ignore instruction_id.
       for request in request_stream:
         request_type = request.WhichOneof('request')
         if request_type == 'get':
-          data, continuation_token = self._state.blocking_get(
+          data, continuation_token = self._state.get_raw(
               request.state_key, request.get.continuation_token)
           yield beam_fn_api_pb2.StateResponse(
               id=request.id,
               get=beam_fn_api_pb2.StateGetResponse(
                   data=data, continuation_token=continuation_token))
         elif request_type == 'append':
-          self._state.blocking_append(request.state_key, request.append.data)
+          self._state.append_raw(request.state_key, request.append.data)
           yield beam_fn_api_pb2.StateResponse(
               id=request.id,
               append=beam_fn_api_pb2.StateAppendResponse())
         elif request_type == 'clear':
-          self._state.blocking_clear(request.state_key)
+          self._state.clear(request.state_key)
           yield beam_fn_api_pb2.StateResponse(
               id=request.id,
               clear=beam_fn_api_pb2.StateClearResponse())
@@ -992,6 +1012,46 @@
       """Does nothing."""
       pass
 
+  @staticmethod
+  def get_cache_token_generator(static=True):
+    """A generator for cache tokens.
+       :arg static If True, the generator always returns the same cache token.
+                   If False, it returns a new cache token on each call.
+       :return A generator that returns a cache token on next(generator).
+    """
+    def generate_token(identifier):
+      return beam_fn_api_pb2.ProcessBundleRequest.CacheToken(
+          user_state=beam_fn_api_pb2
+          .ProcessBundleRequest.CacheToken.UserState(),
+          token="cache_token_{}".format(identifier).encode("utf-8"))
+
+    class StaticGenerator(object):
+      def __init__(self):
+        self._token = generate_token(1)
+
+      def __iter__(self):
+        # pylint: disable=non-iterator-returned
+        return self
+
+      def __next__(self):
+        return self._token
+
+    class DynamicGenerator(object):
+      def __init__(self):
+        self._counter = 0
+        self._lock = threading.Lock()
+
+      def __iter__(self):
+        # pylint: disable=non-iterator-returned
+        return self
+
+      def __next__(self):
+        with self._lock:
+          self._counter += 1
+          return generate_token(self._counter)
+
+    return StaticGenerator() if static else DynamicGenerator()
+
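
A rough sketch of the generator contract documented above: the static generator hands back the same token forever, while the dynamic one mints a fresh token per call (used when bundles are re-run for testing):

from apache_beam.runners.portability.fn_api_runner import FnApiRunner

static_gen = FnApiRunner.get_cache_token_generator(static=True)
assert next(static_gen) is next(static_gen)    # one token, reused

dynamic_gen = FnApiRunner.get_cache_token_generator(static=False)
assert next(dynamic_gen) != next(dynamic_gen)  # new token on every call
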
 
 class WorkerHandler(object):
   """worker_handler for a worker.
@@ -1069,10 +1129,11 @@
         self, data_plane.InMemoryDataChannel(), state, provision_info)
     self.control_conn = self
     self.data_conn = self.data_plane_handler
-
     self.worker = sdk_worker.SdkWorker(
         sdk_worker.BundleProcessorCache(
-            FnApiRunner.SingletonStateHandlerFactory(self.state),
+            FnApiRunner.SingletonStateHandlerFactory(
+                sdk_worker.CachingMaterializingStateHandler(
+                    StateCache(STATE_CACHE_SIZE), state)),
             data_plane.InMemoryDataChannelFactory(
                 self.data_plane_handler.inverse()),
             {}))
@@ -1303,15 +1364,21 @@
 
 @WorkerHandler.register_environment(python_urns.EMBEDDED_PYTHON_GRPC, bytes)
 class EmbeddedGrpcWorkerHandler(GrpcWorkerHandler):
-  def __init__(self, num_workers_payload, state, provision_info, grpc_server):
+  def __init__(self, payload, state, provision_info, grpc_server):
     super(EmbeddedGrpcWorkerHandler, self).__init__(state, provision_info,
                                                     grpc_server)
-    self._num_threads = int(num_workers_payload) if num_workers_payload else 1
+    if payload:
+      num_workers, state_cache_size = payload.decode('ascii').split(',')
+      self._num_threads = int(num_workers)
+      self._state_cache_size = int(state_cache_size)
+    else:
+      self._num_threads = 1
+      self._state_cache_size = STATE_CACHE_SIZE
 
   def start_worker(self):
     self.worker = sdk_worker.SdkHarness(
         self.control_address, worker_count=self._num_threads,
-        worker_id=self.worker_id)
+        state_cache_size=self._state_cache_size, worker_id=self.worker_id)
     self.worker_thread = threading.Thread(
         name='run_worker', target=self.worker.run)
     self.worker_thread.daemon = True
@@ -1512,7 +1579,8 @@
 
   def __init__(
       self, worker_handler_list, get_buffer, get_input_coder_impl,
-      bundle_descriptor, progress_frequency=None, skip_registration=False):
+      bundle_descriptor, progress_frequency=None, skip_registration=False,
+      cache_token_generator=FnApiRunner.get_cache_token_generator()):
     """Set up a bundle manager.
 
     Args:
@@ -1530,6 +1598,7 @@
     self._registered = skip_registration
     self._progress_frequency = progress_frequency
     self._worker_handler = None
+    self._cache_token_generator = cache_token_generator
 
   def _send_input_to_worker(self,
                             process_bundle_id,
@@ -1599,7 +1668,7 @@
         split_request = beam_fn_api_pb2.InstructionRequest(
             process_bundle_split=
             beam_fn_api_pb2.ProcessBundleSplitRequest(
-                instruction_reference=process_bundle_id,
+                instruction_id=process_bundle_id,
                 desired_splits={
                     read_transform_id:
                     beam_fn_api_pb2.ProcessBundleSplitRequest.DesiredSplit(
@@ -1653,7 +1722,8 @@
     process_bundle_req = beam_fn_api_pb2.InstructionRequest(
         instruction_id=process_bundle_id,
         process_bundle=beam_fn_api_pb2.ProcessBundleRequest(
-            process_bundle_descriptor_reference=self._bundle_descriptor.id))
+            process_bundle_descriptor_id=self._bundle_descriptor.id,
+            cache_tokens=[next(self._cache_token_generator)]))
     result_future = self._worker_handler.control_conn.push(process_bundle_req)
 
     split_results = []
@@ -1670,10 +1740,10 @@
           expected_outputs.keys(),
           abort_callback=lambda: (result_future.is_done()
                                   and result_future.get().error)):
-        if output.ptransform_id in expected_outputs:
+        if output.transform_id in expected_outputs:
           with BundleManager._lock:
             self._get_buffer(
-                expected_outputs[output.ptransform_id]).append(output.data)
+                expected_outputs[output.transform_id]).append(output.data)
 
       logging.debug('Wait for the bundle %s to finish.' % process_bundle_id)
       result = result_future.get()
@@ -1685,7 +1755,7 @@
       finalize_request = beam_fn_api_pb2.InstructionRequest(
           finalize_bundle=
           beam_fn_api_pb2.FinalizeBundleRequest(
-              instruction_reference=process_bundle_id
+              instruction_id=process_bundle_id
           ))
       self._worker_handler.control_conn.push(finalize_request)
 
@@ -1697,10 +1767,11 @@
   def __init__(
       self, worker_handler_list, get_buffer, get_input_coder_impl,
       bundle_descriptor, progress_frequency=None, skip_registration=False,
-      **kwargs):
+      cache_token_generator=None, **kwargs):
     super(ParallelBundleManager, self).__init__(
         worker_handler_list, get_buffer, get_input_coder_impl,
-        bundle_descriptor, progress_frequency, skip_registration)
+        bundle_descriptor, progress_frequency, skip_registration,
+        cache_token_generator=cache_token_generator)
     self._num_workers = kwargs.pop('num_workers', 1)
 
   def process_bundle(self, inputs, expected_outputs):
@@ -1715,7 +1786,8 @@
       for result, split_result in executor.map(lambda part: BundleManager(
           self._worker_handler_list, self._get_buffer,
           self._get_input_coder_impl, self._bundle_descriptor,
-          self._progress_frequency, self._registered).process_bundle(
+          self._progress_frequency, self._registered,
+          cache_token_generator=self._cache_token_generator).process_bundle(
               part, expected_outputs), part_inputs):
 
         split_result_list += split_result
@@ -1764,7 +1836,7 @@
             beam_fn_api_pb2.InstructionRequest(
                 process_bundle_progress=
                 beam_fn_api_pb2.ProcessBundleProgressRequest(
-                    instruction_reference=self._instruction_id))).get()
+                    instruction_id=self._instruction_id))).get()
         self._latest_progress = progress_result.process_bundle_progress
         if self._callback:
           self._callback(self._latest_progress)
@@ -1836,9 +1908,9 @@
 
   def _to_metric_key(self, monitoring_info):
     # Right now this assumes that all metrics have a PTRANSFORM
-    ptransform_id = monitoring_info.labels['PTRANSFORM']
+    transform_id = monitoring_info.labels['PTRANSFORM']
     namespace, name = monitoring_infos.parse_namespace_and_name(monitoring_info)
-    return MetricKey(ptransform_id, MetricName(namespace, name))
+    return MetricKey(transform_id, MetricName(namespace, name))
 
   def query(self, filter=None):
     counters = [metrics.execution.MetricResult(k, v, v)
diff --git a/sdks/python/apache_beam/runners/portability/fn_api_runner_test.py b/sdks/python/apache_beam/runners/portability/fn_api_runner_test.py
index 7667f18..125b856 100644
--- a/sdks/python/apache_beam/runners/portability/fn_api_runner_test.py
+++ b/sdks/python/apache_beam/runners/portability/fn_api_runner_test.py
@@ -1176,7 +1176,18 @@
         runner=fn_api_runner.FnApiRunner(
             default_environment=beam_runner_api_pb2.Environment(
                 urn=python_urns.EMBEDDED_PYTHON_GRPC,
-                payload=b'2')))
+                payload=b'2,%d' % fn_api_runner.STATE_CACHE_SIZE)))
+
+
+class FnApiRunnerTestWithDisabledCaching(FnApiRunnerTest):
+
+  def create_pipeline(self):
+    return beam.Pipeline(
+        runner=fn_api_runner.FnApiRunner(
+            default_environment=beam_runner_api_pb2.Environment(
+                urn=python_urns.EMBEDDED_PYTHON_GRPC,
+                # number of workers, state cache size
+                payload=b'2,0')))
 
 
 class FnApiRunnerTestWithMultiWorkers(FnApiRunnerTest):
diff --git a/sdks/python/apache_beam/runners/portability/job_server.py b/sdks/python/apache_beam/runners/portability/job_server.py
index e70b6d9..04edde8 100644
--- a/sdks/python/apache_beam/runners/portability/job_server.py
+++ b/sdks/python/apache_beam/runners/portability/job_server.py
@@ -18,23 +18,19 @@
 from __future__ import absolute_import
 
 import atexit
-import logging
 import os
 import shutil
 import signal
-import socket
 import subprocess
 import sys
 import tempfile
 import threading
-import time
 
 import grpc
-from future.moves.urllib.error import URLError
-from future.moves.urllib.request import urlopen
 
 from apache_beam.portability.api import beam_job_api_pb2_grpc
 from apache_beam.runners.portability import local_job_service
+from apache_beam.utils import subprocess_server
 from apache_beam.version import __version__ as beam_version
 
 
@@ -97,61 +93,26 @@
 class SubprocessJobServer(JobServer):
   """An abstract base class for JobServers run as an external process."""
   def __init__(self):
-    self._process_lock = threading.RLock()
-    self._process = None
     self._local_temp_root = None
+    self._server = None
 
   def subprocess_cmd_and_endpoint(self):
     raise NotImplementedError(type(self))
 
   def start(self):
-    with self._process_lock:
-      if self._process:
-        self.stop()
+    if self._server is None:
+      self._local_temp_root = tempfile.mkdtemp(prefix='beam-temp')
       cmd, endpoint = self.subprocess_cmd_and_endpoint()
-      logging.debug("Starting job service with %s", cmd)
-      try:
-        self._process = subprocess.Popen([str(arg) for arg in cmd])
-        self._local_temp_root = tempfile.mkdtemp(prefix='beam-temp')
-        wait_secs = .1
-        channel = grpc.insecure_channel(endpoint)
-        channel_ready = grpc.channel_ready_future(channel)
-        while True:
-          if self._process.poll() is not None:
-            logging.error("Starting job service with %s", cmd)
-            raise RuntimeError(
-                'Job service failed to start up with error %s' %
-                self._process.poll())
-          try:
-            channel_ready.result(timeout=wait_secs)
-            break
-          except (grpc.FutureTimeoutError, grpc._channel._Rendezvous):
-            wait_secs *= 1.2
-            logging.log(logging.WARNING if wait_secs > 1 else logging.DEBUG,
-                        'Waiting for jobs grpc channel to be ready at %s.',
-                        endpoint)
-        return beam_job_api_pb2_grpc.JobServiceStub(channel)
-      except:  # pylint: disable=bare-except
-        logging.exception("Error bringing up job service")
-        self.stop()
-        raise
+      port = int(endpoint.split(':')[-1])
+      self._server = subprocess_server.SubprocessServer(
+          beam_job_api_pb2_grpc.JobServiceStub, cmd, port=port)
+    return self._server.start()
 
   def stop(self):
-    with self._process_lock:
-      if not self._process:
-        return
-      for _ in range(5):
-        if self._process.poll() is not None:
-          break
-        logging.debug("Sending SIGINT to job_server")
-        self._process.send_signal(signal.SIGINT)
-        time.sleep(1)
-      if self._process.poll() is None:
-        self._process.kill()
-      self._process = None
-      if self._local_temp_root:
-        shutil.rmtree(self._local_temp_root)
-        self._local_temp_root = None
+    if self._local_temp_root:
+      shutil.rmtree(self._local_temp_root)
+      self._local_temp_root = None
+    return self._server.stop()
 
   def local_temp_dir(self, **kwargs):
     return tempfile.mkdtemp(dir=self._local_temp_root, **kwargs)
@@ -168,66 +129,23 @@
   def path_to_jar(self):
     raise NotImplementedError(type(self))
 
-  @classmethod
-  def path_to_gradle_target_jar(cls, target):
-    gradle_package = target[:target.rindex(':')]
-    jar_name = '-'.join([
-        'beam', gradle_package.replace(':', '-'), beam_version + '.jar'])
+  @staticmethod
+  def path_to_beam_jar(gradle_target):
+    return subprocess_server.JavaJarServer.path_to_beam_jar(gradle_target)
 
-    if beam_version.endswith('.dev'):
-      # TODO: Attempt to use nightly snapshots?
-      project_root = os.path.sep.join(__file__.split(os.path.sep)[:-6])
-      dev_path = os.path.join(
-          project_root,
-          gradle_package.replace(':', os.path.sep),
-          'build',
-          'libs',
-          jar_name.replace('.dev', '').replace('.jar', '-SNAPSHOT.jar'))
-      if os.path.exists(dev_path):
-        logging.warning(
-            'Using pre-built job server snapshot at %s', dev_path)
-        return dev_path
-      else:
-        raise RuntimeError(
-            'Please build the job server with \n  cd %s; ./gradlew %s' % (
-                os.path.abspath(project_root), target))
-    else:
-      return '/'.join([
-          cls.MAVEN_REPOSITORY,
-          'beam-' + gradle_package.replace(':', '-'),
-          beam_version,
-          jar_name])
+  @staticmethod
+  def local_jar(url):
+    return subprocess_server.JavaJarServer.local_jar(url)
 
   def subprocess_cmd_and_endpoint(self):
     jar_path = self.local_jar(self.path_to_jar())
     artifacts_dir = self.local_temp_dir(prefix='artifacts')
-    job_port, = _pick_port(None)
+    job_port, = subprocess_server.pick_port(None)
     return (
         ['java', '-jar', jar_path] + list(
             self.java_arguments(job_port, artifacts_dir)),
         'localhost:%s' % job_port)
 
-  def local_jar(self, url):
-    # TODO: Verify checksum?
-    if os.path.exists(url):
-      return url
-    else:
-      logging.warning('Downloading job server jar from %s' % url)
-      cached_jar = os.path.join(self.JAR_CACHE, os.path.basename(url))
-      if not os.path.exists(cached_jar):
-        if not os.path.exists(self.JAR_CACHE):
-          os.makedirs(self.JAR_CACHE)
-          # TODO: Clean up this cache according to some policy.
-        try:
-          url_read = urlopen(url)
-          with open(cached_jar + '.tmp', 'wb') as jar_write:
-            shutil.copyfileobj(url_read, jar_write, length=1 << 20)
-          os.rename(cached_jar + '.tmp', cached_jar)
-        except URLError as e:
-          raise RuntimeError(
-              'Unable to fetch remote job server jar at %s: %s' % (url, e))
-      return cached_jar
-
 
 class DockerizedJobServer(SubprocessJobServer):
   """
@@ -260,8 +178,9 @@
            "-v", ':'.join([docker_path, "/bin/docker"]),
            "-v", "/var/run/docker.sock:/var/run/docker.sock"]
 
-    self.job_port, self.artifact_port, self.expansion_port = _pick_port(
-        self.job_port, self.artifact_port, self.expansion_port)
+    self.job_port, self.artifact_port, self.expansion_port = (
+        subprocess_server.pick_port(
+            self.job_port, self.artifact_port, self.expansion_port))
 
     args = ['--job-host', self.job_host,
             '--job-port', str(self.job_port),
@@ -287,27 +206,3 @@
     cmd.append(job_server_image_name)
 
     return cmd + args, '%s:%s' % (self.job_host, self.job_port)
-
-
-def _pick_port(*ports):
-  """
-  Returns a list of ports, same length as input ports list, but replaces
-  all None or 0 ports with a random free port.
-  """
-  sockets = []
-
-  def find_free_port(port):
-    if port:
-      return port
-    else:
-      s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
-      sockets.append(s)
-      s.bind(('localhost', 0))
-      _, free_port = s.getsockname()
-      return free_port
-
-  ports = list(map(find_free_port, ports))
-  # Close sockets only now to avoid the same port to be chosen twice
-  for s in sockets:
-    s.close()
-  return ports
diff --git a/sdks/python/apache_beam/runners/portability/portable_runner.py b/sdks/python/apache_beam/runners/portability/portable_runner.py
index 8651c62..1e0c591 100644
--- a/sdks/python/apache_beam/runners/portability/portable_runner.py
+++ b/sdks/python/apache_beam/runners/portability/portable_runner.py
@@ -177,6 +177,7 @@
       portable_options.environment_config, server = (
           worker_pool_main.BeamFnExternalWorkerPoolServicer.start(
               sdk_worker_main._get_worker_count(options),
+              state_cache_size=sdk_worker_main._get_state_cache_size(options),
               use_process=use_loopback_process_worker))
       cleanup_callbacks = [functools.partial(server.stop, 1)]
     else:
diff --git a/sdks/python/apache_beam/runners/portability/portable_runner_test.py b/sdks/python/apache_beam/runners/portability/portable_runner_test.py
index 427b713..dbe393f 100644
--- a/sdks/python/apache_beam/runners/portability/portable_runner_test.py
+++ b/sdks/python/apache_beam/runners/portability/portable_runner_test.py
@@ -179,6 +179,8 @@
     # Override the default environment type for testing.
     options.view_as(PortableOptions).environment_type = (
         python_urns.EMBEDDED_PYTHON)
+    # Enable caching (disabled by default)
+    options.view_as(DebugOptions).add_experiment('state_cache_size=100')
     return options
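
For reference, a sketch of the programmatic way these tests opt in to the (default-off) SDK worker state cache; the size value is illustrative, and the worker reads it back via sdk_worker_main as wired up in portable_runner.py above:

from apache_beam.options.pipeline_options import DebugOptions, PipelineOptions

options = PipelineOptions()
options.view_as(DebugOptions).add_experiment('state_cache_size=100')
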
 
   def create_pipeline(self):
@@ -193,6 +195,7 @@
   def create_options(self):
     options = super(PortableRunnerOptimized, self).create_options()
     options.view_as(DebugOptions).add_experiment('pre_optimize=all')
+    options.view_as(DebugOptions).add_experiment('state_cache_size=100')
     return options
 
 
@@ -201,7 +204,8 @@
   @classmethod
   def setUpClass(cls):
     cls._worker_address, cls._worker_server = (
-        worker_pool_main.BeamFnExternalWorkerPoolServicer.start())
+        worker_pool_main.BeamFnExternalWorkerPoolServicer.start(
+            state_cache_size=100))
 
   @classmethod
   def tearDownClass(cls):
@@ -224,6 +228,8 @@
     options.view_as(PortableOptions).environment_config = (
         b'%s -m apache_beam.runners.worker.sdk_worker_main' %
         sys.executable.encode('ascii')).decode('utf-8')
+    # Enable caching (disabled by default)
+    options.view_as(DebugOptions).add_experiment('state_cache_size=100')
     return options
 
   @classmethod
diff --git a/sdks/python/apache_beam/runners/worker/bundle_processor.py b/sdks/python/apache_beam/runners/worker/bundle_processor.py
index e9dbfef..cb685bf 100644
--- a/sdks/python/apache_beam/runners/worker/bundle_processor.py
+++ b/sdks/python/apache_beam/runners/worker/bundle_processor.py
@@ -199,26 +199,19 @@
 
 
 class _StateBackedIterable(object):
-  def __init__(self, state_handler, state_key, coder_or_impl):
+  def __init__(self, state_handler, state_key, coder_or_impl,
+               is_cached=False):
     self._state_handler = state_handler
     self._state_key = state_key
     if isinstance(coder_or_impl, coders.Coder):
       self._coder_impl = coder_or_impl.get_impl()
     else:
       self._coder_impl = coder_or_impl
+    self._is_cached = is_cached
 
   def __iter__(self):
-    # This is the continuation token this might be useful
-    data, continuation_token = self._state_handler.blocking_get(self._state_key)
-    while True:
-      input_stream = coder_impl.create_InputStream(data)
-      while input_stream.size() > 0:
-        yield self._coder_impl.decode_from_stream(input_stream, True)
-      if not continuation_token:
-        break
-      else:
-        data, continuation_token = self._state_handler.blocking_get(
-            self._state_key, continuation_token)
+    return self._state_handler.blocking_get(
+        self._state_key, self._coder_impl, is_cached=self._is_cached)
 
   def __reduce__(self):
     return list, (list(self),)
@@ -244,7 +237,7 @@
     if target_window not in self._cache:
       state_key = beam_fn_api_pb2.StateKey(
           multimap_side_input=beam_fn_api_pb2.StateKey.MultimapSideInput(
-              ptransform_id=self._transform_id,
+              transform_id=self._transform_id,
               side_input_id=self._tag,
               window=self._target_window_coder.encode(target_window),
               key=b''))
@@ -294,6 +287,7 @@
 
 
 class CombiningValueRuntimeState(userstate.CombiningValueRuntimeState):
+
   def __init__(self, underlying_bag_state, combinefn):
     self._combinefn = combinefn
     self._underlying_bag_state = underlying_bag_state
@@ -347,8 +341,8 @@
 coder_impl.FastPrimitivesCoderImpl.register_iterable_like_type(_ConcatIterable)
 
 
-# TODO(BEAM-5428): Implement cross-bundle state caching.
 class SynchronousBagRuntimeState(userstate.BagRuntimeState):
+
   def __init__(self, state_handler, state_key, value_coder):
     self._state_handler = state_handler
     self._state_key = state_key
@@ -359,7 +353,8 @@
   def read(self):
     return _ConcatIterable(
         [] if self._cleared else _StateBackedIterable(
-            self._state_handler, self._state_key, self._value_coder),
+            self._state_handler, self._state_key, self._value_coder,
+            is_cached=True),
         self._added_elements)
 
   def add(self, value):
@@ -370,17 +365,20 @@
     self._added_elements = []
 
   def _commit(self):
+    to_await = None
     if self._cleared:
-      self._state_handler.blocking_clear(self._state_key)
+      to_await = self._state_handler.clear(self._state_key, is_cached=True)
     if self._added_elements:
-      value_coder_impl = self._value_coder.get_impl()
-      out = coder_impl.create_OutputStream()
-      for element in self._added_elements:
-        value_coder_impl.encode_to_stream(element, out, True)
-      self._state_handler.blocking_append(self._state_key, out.get())
+      to_await = self._state_handler.extend(
+          self._state_key,
+          self._value_coder.get_impl(),
+          self._added_elements,
+          is_cached=True)
+    if to_await:
+      # To commit, we need to wait on the last state request future to complete.
+      to_await.get()
 
 
-# TODO(BEAM-5428): Implement cross-bundle state caching.
 class SynchronousSetRuntimeState(userstate.SetRuntimeState):
 
   def __init__(self, state_handler, state_key, value_coder):
@@ -393,17 +391,17 @@
   def _compact_data(self, rewrite=True):
     accumulator = set(_ConcatIterable(
         set() if self._cleared else _StateBackedIterable(
-            self._state_handler, self._state_key, self._value_coder),
+            self._state_handler, self._state_key, self._value_coder,
+            is_cached=True),
         self._added_elements))
 
     if rewrite and accumulator:
-      self._state_handler.blocking_clear(self._state_key)
-
-      value_coder_impl = self._value_coder.get_impl()
-      out = coder_impl.create_OutputStream()
-      for element in accumulator:
-        value_coder_impl.encode_to_stream(element, out, True)
-      self._state_handler.blocking_append(self._state_key, out.get())
+      self._state_handler.clear(self._state_key, is_cached=True)
+      self._state_handler.extend(
+          self._state_key,
+          self._value_coder.get_impl(),
+          accumulator,
+          is_cached=True)
 
       # Since everything is already committed, we can safely reinitialize
       # added_elements here.
@@ -417,7 +415,7 @@
   def add(self, value):
     if self._cleared:
       # This is a good time to explicitly clear.
-      self._state_handler.blocking_clear(self._state_key)
+      self._state_handler.clear(self._state_key, is_cached=True)
       self._cleared = False
 
     self._added_elements.add(value)
@@ -430,13 +428,13 @@
 
   def _commit(self):
     if self._cleared:
-      self._state_handler.blocking_clear(self._state_key)
+      self._state_handler.clear(self._state_key, is_cached=True).get()
     if self._added_elements:
-      value_coder_impl = self._value_coder.get_impl()
-      out = coder_impl.create_OutputStream()
-      for element in self._added_elements:
-        value_coder_impl.encode_to_stream(element, out, True)
-      self._state_handler.blocking_append(self._state_key, out.get())
+      self._state_handler.extend(
+          self._state_key,
+          self._value_coder.get_impl(),
+          self._added_elements,
+          is_cached=True).get()
 
 
 class OutputTimer(object):
@@ -505,10 +503,11 @@
           self._state_handler,
           state_key=beam_fn_api_pb2.StateKey(
               bag_user_state=beam_fn_api_pb2.StateKey.BagUserState(
-                  ptransform_id=self._transform_id,
+                  transform_id=self._transform_id,
                   user_state_id=state_spec.name,
                   window=self._window_coder.encode(window),
-                  key=self._key_coder.encode(key))),
+                  # State keys are expected in nested encoding format
+                  key=self._key_coder.encode_nested(key))),
           value_coder=state_spec.coder)
       if isinstance(state_spec, userstate.BagStateSpec):
         return bag_state
@@ -519,10 +518,11 @@
           self._state_handler,
           state_key=beam_fn_api_pb2.StateKey(
               bag_user_state=beam_fn_api_pb2.StateKey.BagUserState(
-                  ptransform_id=self._transform_id,
+                  transform_id=self._transform_id,
                   user_state_id=state_spec.name,
                   window=self._window_coder.encode(window),
-                  key=self._key_coder.encode(key))),
+                  # State keys are expected in nested encoding format
+                  key=self._key_coder.encode_nested(key))),
           value_coder=state_spec.coder)
     else:
       raise NotImplementedError(state_spec)
@@ -660,7 +660,7 @@
         for data in data_channel.input_elements(
             instruction_id, expected_transforms):
           input_op_by_transform_id[
-              data.ptransform_id].process_encoded(data.data)
+              data.transform_id].process_encoded(data.data)
 
       # Finish all operations.
       for op in self.ops.values():
@@ -707,14 +707,14 @@
                     self.delayed_bundle_application(*element_residual))
               split_response.channel_splits.extend([
                   beam_fn_api_pb2.ProcessBundleSplitResponse.ChannelSplit(
-                      ptransform_id=op.transform_id,
+                      transform_id=op.transform_id,
                       last_primary_element=primary_end,
                       first_residual_element=residual_start)])
 
     return split_response
 
   def delayed_bundle_application(self, op, deferred_remainder):
-    ptransform_id, main_input_tag, main_input_coder, outputs = op.input_info
+    transform_id, main_input_tag, main_input_coder, outputs = op.input_info
     # TODO(SDF): For non-root nodes, need main_input_coder + residual_coder.
     element_and_restriction, watermark = deferred_remainder
     if watermark:
@@ -725,7 +725,7 @@
       output_watermarks = None
     return beam_fn_api_pb2.DelayedBundleApplication(
         application=beam_fn_api_pb2.BundleApplication(
-            ptransform_id=ptransform_id,
+            transform_id=transform_id,
             input_id=main_input_tag,
             output_watermarks=output_watermarks,
             element=main_input_coder.get_impl().encode_nested(
diff --git a/sdks/python/apache_beam/runners/worker/data_plane.py b/sdks/python/apache_beam/runners/worker/data_plane.py
index 8502f4e..8324e6b 100644
--- a/sdks/python/apache_beam/runners/worker/data_plane.py
+++ b/sdks/python/apache_beam/runners/worker/data_plane.py
@@ -145,7 +145,7 @@
                      abort_callback=None):
     other_inputs = []
     for data in self._inputs:
-      if data.instruction_reference == instruction_id:
+      if data.instruction_id == instruction_id:
         if data.data:
           yield data
       else:
@@ -156,8 +156,8 @@
     def add_to_inverse_output(data):
       self._inverse._inputs.append(  # pylint: disable=protected-access
           beam_fn_api_pb2.Elements.Data(
-              instruction_reference=instruction_id,
-              ptransform_id=transform_id,
+              instruction_id=instruction_id,
+              transform_id=transform_id,
               data=data))
     return ClosableOutputStream(
         add_to_inverse_output, flush_callback=add_to_inverse_output)
@@ -220,10 +220,10 @@
             t, v, tb = self._exc_info
             raise_(t, v, tb)
         else:
-          if not data.data and data.ptransform_id in expected_transforms:
-            done_transforms.append(data.ptransform_id)
+          if not data.data and data.transform_id in expected_transforms:
+            done_transforms.append(data.transform_id)
           else:
-            assert data.ptransform_id not in done_transforms
+            assert data.transform_id not in done_transforms
             yield data
     finally:
       # Instruction_ids are not reusable so Clean queue once we are done with
@@ -235,8 +235,8 @@
       if data:
         self._to_send.put(
             beam_fn_api_pb2.Elements.Data(
-                instruction_reference=instruction_id,
-                ptransform_id=transform_id,
+                instruction_id=instruction_id,
+                transform_id=transform_id,
                 data=data))
 
     def close_callback(data):
@@ -244,8 +244,8 @@
       # End of stream marker.
       self._to_send.put(
           beam_fn_api_pb2.Elements.Data(
-              instruction_reference=instruction_id,
-              ptransform_id=transform_id,
+              instruction_id=instruction_id,
+              transform_id=transform_id,
               data=b''))
     return ClosableOutputStream(
         close_callback, flush_callback=add_to_send_queue)
@@ -271,7 +271,7 @@
     try:
       for elements in elements_iterator:
         for data in elements.data:
-          self._receiving_queue(data.instruction_reference).put(data)
+          self._receiving_queue(data.instruction_id).put(data)
     except:  # pylint: disable=bare-except
       if not self._closed:
         logging.exception('Failed to read inputs in the data plane.')
diff --git a/sdks/python/apache_beam/runners/worker/data_plane_test.py b/sdks/python/apache_beam/runners/worker/data_plane_test.py
index 5f2831c..d11390a 100644
--- a/sdks/python/apache_beam/runners/worker/data_plane_test.py
+++ b/sdks/python/apache_beam/runners/worker/data_plane_test.py
@@ -109,8 +109,8 @@
     self.assertEqual(
         list(to_channel.input_elements('0', [transform_1])),
         [beam_fn_api_pb2.Elements.Data(
-            instruction_reference='0',
-            ptransform_id=transform_1,
+            instruction_id='0',
+            transform_id=transform_1,
             data=b'abc')])
 
     # Multiple interleaved writes to multiple instructions.
@@ -119,19 +119,19 @@
     self.assertEqual(
         list(to_channel.input_elements('1', [transform_1])),
         [beam_fn_api_pb2.Elements.Data(
-            instruction_reference='1',
-            ptransform_id=transform_1,
+            instruction_id='1',
+            transform_id=transform_1,
             data=b'abc')])
     send('2', transform_2, b'ghi')
     self.assertEqual(
         list(to_channel.input_elements('2', [transform_1, transform_2])),
         [beam_fn_api_pb2.Elements.Data(
-            instruction_reference='2',
-            ptransform_id=transform_1,
+            instruction_id='2',
+            transform_id=transform_1,
             data=b'def'),
          beam_fn_api_pb2.Elements.Data(
-             instruction_reference='2',
-             ptransform_id=transform_2,
+             instruction_id='2',
+             transform_id=transform_2,
              data=b'ghi')])
 
 
diff --git a/sdks/python/apache_beam/runners/worker/sdk_worker.py b/sdks/python/apache_beam/runners/worker/sdk_worker.py
index 3dfaed6..9bcfe97 100644
--- a/sdks/python/apache_beam/runners/worker/sdk_worker.py
+++ b/sdks/python/apache_beam/runners/worker/sdk_worker.py
@@ -37,11 +37,13 @@
 from future.utils import raise_
 from future.utils import with_metaclass
 
+from apache_beam.coders import coder_impl
 from apache_beam.portability.api import beam_fn_api_pb2
 from apache_beam.portability.api import beam_fn_api_pb2_grpc
 from apache_beam.runners.worker import bundle_processor
 from apache_beam.runners.worker import data_plane
 from apache_beam.runners.worker.channel_factory import GRPCChannelFactory
+from apache_beam.runners.worker.statecache import StateCache
 from apache_beam.runners.worker.worker_id_interceptor import WorkerIdInterceptor
 
 
@@ -50,7 +52,11 @@
   SCHEDULING_DELAY_THRESHOLD_SEC = 5*60  # 5 Minutes
 
   def __init__(
-      self, control_address, worker_count, credentials=None, worker_id=None,
+      self, control_address, worker_count,
+      credentials=None,
+      worker_id=None,
+      # Caching is disabled by default
+      state_cache_size=0,
       profiler_factory=None):
     self._alive = True
     self._worker_count = worker_count
@@ -71,7 +77,8 @@
         self._control_channel, WorkerIdInterceptor(self._worker_id))
     self._data_channel_factory = data_plane.GrpcClientDataChannelFactory(
         credentials, self._worker_id)
-    self._state_handler_factory = GrpcStateHandlerFactory(credentials)
+    self._state_handler_factory = GrpcStateHandlerFactory(state_cache_size,
+                                                          credentials)
     self._profiler_factory = profiler_factory
     self._fns = {}
     # BundleProcessor cache across all workers.
@@ -106,8 +113,8 @@
     for _ in range(self._worker_count):
       # SdkHarness manage function registration and share self._fns with all
       # the workers. This is needed because function registration (register)
-      # and exceution(process_bundle) are send over different request and we
-      # do not really know which woker is going to process bundle
+      # and execution (process_bundle) are sent over different requests and we
+      # do not really know which worker is going to process the bundle
       # for a function till we get process_bundle request. Moreover
       # same function is reused by different process bundle calls and
       # potentially get executed by different worker. Hence we need a
@@ -208,10 +215,10 @@
   def _request_process_bundle_action(self, request):
 
     def task():
-      instruction_reference = getattr(
-          request, request.WhichOneof('request')).instruction_reference
+      instruction_id = getattr(
+          request, request.WhichOneof('request')).instruction_id
       # only process progress/split request when a bundle is in processing.
-      if (instruction_reference in
+      if (instruction_id in
           self._bundle_processor_cache.active_bundle_processors):
         self._execute(
             lambda: self.progress_worker.do_instruction(request), request)
@@ -219,9 +226,9 @@
         self._execute(lambda: beam_fn_api_pb2.InstructionResponse(
             instruction_id=request.instruction_id, error=(
                 'Process bundle request not yet scheduled for instruction {}' if
-                instruction_reference in self._unscheduled_process_bundle else
+                instruction_id in self._unscheduled_process_bundle else
                 'Unknown process bundle instruction {}').format(
-                    instruction_reference)), request)
+                    instruction_id)), request)
 
     self._progress_thread_pool.submit(task)
 
@@ -331,7 +338,8 @@
 
 class SdkWorker(object):
 
-  def __init__(self, bundle_processor_cache, profiler_factory=None):
+  def __init__(self, bundle_processor_cache,
+               profiler_factory=None):
     self.bundle_processor_cache = bundle_processor_cache
     self.profiler_factory = profiler_factory
 
@@ -360,10 +368,10 @@
 
   def process_bundle(self, request, instruction_id):
     bundle_processor = self.bundle_processor_cache.get(
-        instruction_id, request.process_bundle_descriptor_reference)
+        instruction_id, request.process_bundle_descriptor_id)
     try:
       with bundle_processor.state_handler.process_instruction_id(
-          instruction_id):
+          instruction_id, request.cache_tokens):
         with self.maybe_profile(instruction_id):
           delayed_applications, requests_finalization = (
               bundle_processor.process_bundle(instruction_id))
@@ -385,7 +393,7 @@
 
   def process_bundle_split(self, request, instruction_id):
     processor = self.bundle_processor_cache.lookup(
-        request.instruction_reference)
+        request.instruction_id)
     if processor:
       return beam_fn_api_pb2.InstructionResponse(
           instruction_id=instruction_id,
@@ -398,7 +406,7 @@
   def process_bundle_progress(self, request, instruction_id):
     # It is an error to get progress for a not-in-flight bundle.
     processor = self.bundle_processor_cache.lookup(
-        request.instruction_reference)
+        request.instruction_id)
     return beam_fn_api_pb2.InstructionResponse(
         instruction_id=instruction_id,
         process_bundle_progress=beam_fn_api_pb2.ProcessBundleProgressResponse(
@@ -407,16 +415,16 @@
 
   def finalize_bundle(self, request, instruction_id):
     processor = self.bundle_processor_cache.lookup(
-        request.instruction_reference)
+        request.instruction_id)
     if processor:
       try:
         finalize_response = processor.finalize_bundle()
-        self.bundle_processor_cache.release(request.instruction_reference)
+        self.bundle_processor_cache.release(request.instruction_id)
         return beam_fn_api_pb2.InstructionResponse(
             instruction_id=instruction_id,
             finalize_bundle=finalize_response)
       except:
-        self.bundle_processor_cache.discard(request.instruction_reference)
+        self.bundle_processor_cache.discard(request.instruction_id)
         raise
     else:
       return beam_fn_api_pb2.InstructionResponse(
@@ -459,11 +467,12 @@
   Caches the created channels by ``state descriptor url``.
   """
 
-  def __init__(self, credentials=None):
+  def __init__(self, state_cache_size, credentials=None):
     self._state_handler_cache = {}
     self._lock = threading.Lock()
     self._throwing_state_handler = ThrowingStateHandler()
     self._credentials = credentials
+    self._state_cache = StateCache(state_cache_size)
 
   def create_state_handler(self, api_service_descriptor):
     if not api_service_descriptor:
@@ -489,8 +498,10 @@
           # Add workerId to the grpc channel
           grpc_channel = grpc.intercept_channel(grpc_channel,
                                                 WorkerIdInterceptor())
-          self._state_handler_cache[url] = GrpcStateHandler(
-              beam_fn_api_pb2_grpc.BeamFnStateStub(grpc_channel))
+          self._state_handler_cache[url] = CachingMaterializingStateHandler(
+              self._state_cache,
+              GrpcStateHandler(
+                  beam_fn_api_pb2_grpc.BeamFnStateStub(grpc_channel)))
     return self._state_handler_cache[url]
 
   def close(self):
@@ -498,28 +509,26 @@
     for _, state_handler in self._state_handler_cache.items():
       state_handler.done()
     self._state_handler_cache.clear()
+    self._state_cache.evict_all()
 
 
 class ThrowingStateHandler(object):
   """A state handler that errors on any requests."""
 
-  def blocking_get(self, state_key, instruction_reference):
+  def blocking_get(self, state_key, coder):
     raise RuntimeError(
         'Unable to handle state requests for ProcessBundleDescriptor without '
-        'out state ApiServiceDescriptor for instruction %s and state key %s.'
-        % (state_key, instruction_reference))
+        'state ApiServiceDescriptor for state key %s.' % state_key)
 
-  def blocking_append(self, state_key, data, instruction_reference):
+  def append(self, state_key, coder, elements):
     raise RuntimeError(
         'Unable to handle state requests for ProcessBundleDescriptor without '
-        'out state ApiServiceDescriptor for instruction %s and state key %s.'
-        % (state_key, instruction_reference))
+        'state ApiServiceDescriptor for state key %s.' % state_key)
 
-  def blocking_clear(self, state_key, instruction_reference):
+  def clear(self, state_key):
     raise RuntimeError(
         'Unable to handle state requests for ProcessBundleDescriptor without '
-        'out state ApiServiceDescriptor for instruction %s and state key %s.'
-        % (state_key, instruction_reference))
+        'state ApiServiceDescriptor for state key %s.' % state_key)
 
 
 class GrpcStateHandler(object):
@@ -527,7 +536,6 @@
   _DONE = object()
 
   def __init__(self, state_stub):
-    self._lock = threading.Lock()
     self._state_stub = state_stub
     self._requests = queue.Queue()
     self._responses_by_id = {}
@@ -562,7 +570,8 @@
     def pull_responses():
       try:
         for response in responses:
-          self._responses_by_id[response.id].set(response)
+          future = self._responses_by_id.pop(response.id)
+          future.set(response)
           if self._done:
             break
       except:  # pylint: disable=bare-except
@@ -577,7 +586,7 @@
     self._done = True
     self._requests.put(self._DONE)
 
-  def blocking_get(self, state_key, continuation_token=None):
+  def get_raw(self, state_key, continuation_token=None):
     response = self._blocking_request(
         beam_fn_api_pb2.StateRequest(
             state_key=state_key,
@@ -585,31 +594,34 @@
                 continuation_token=continuation_token)))
     return response.get.data, response.get.continuation_token
 
-  def blocking_append(self, state_key, data):
-    self._blocking_request(
+  def append_raw(self, state_key, data):
+    return self._request(
         beam_fn_api_pb2.StateRequest(
             state_key=state_key,
             append=beam_fn_api_pb2.StateAppendRequest(data=data)))
 
-  def blocking_clear(self, state_key):
-    self._blocking_request(
+  def clear(self, state_key):
+    return self._request(
         beam_fn_api_pb2.StateRequest(
             state_key=state_key,
             clear=beam_fn_api_pb2.StateClearRequest()))
 
-  def _blocking_request(self, request):
+  def _request(self, request):
     request.id = self._next_id()
-    request.instruction_reference = self._context.process_instruction_id
+    request.instruction_id = self._context.process_instruction_id
     self._responses_by_id[request.id] = future = _Future()
     self._requests.put(request)
-    while not future.wait(timeout=1):
+    return future
+
+  def _blocking_request(self, request):
+    req_future = self._request(request)
+    while not req_future.wait(timeout=1):
       if self._exc_info:
         t, v, tb = self._exc_info
         raise_(t, v, tb)
       elif self._done:
         raise RuntimeError()
-    del self._responses_by_id[request.id]
-    response = future.get()
+    response = req_future.get()
     if response.error:
       raise RuntimeError(response.error)
     else:
@@ -620,6 +632,101 @@
     return str(self._last_id)
 
 
+class CachingMaterializingStateHandler(object):
+  """ A State handler which retrieves and caches state. """
+
+  def __init__(self, global_state_cache, underlying_state):
+    self._underlying = underlying_state
+    self._state_cache = global_state_cache
+    self._context = threading.local()
+
+  @contextlib.contextmanager
+  def process_instruction_id(self, bundle_id, cache_tokens):
+    if getattr(self._context, 'cache_token', None) is not None:
+      raise RuntimeError(
+          'Cache tokens already set to %s' % self._context.cache_token)
+    # TODO Also handle cache tokens for side input, if present:
+    # https://issues.apache.org/jira/browse/BEAM-8298
+    user_state_cache_token = None
+    for cache_token_struct in cache_tokens:
+      if cache_token_struct.HasField("user_state"):
+        # There should only be one user state token present
+        assert not user_state_cache_token
+        user_state_cache_token = cache_token_struct.token
+    try:
+      self._context.cache_token = user_state_cache_token
+      with self._underlying.process_instruction_id(bundle_id):
+        yield
+    finally:
+      self._context.cache_token = None
+
+  def blocking_get(self, state_key, coder, is_cached=False):
+    if not self._should_be_cached(is_cached):
+      # Cache disabled / no cache token. Can't do a lookup/store in the cache.
+      # Fall back to lazily materializing the state, one element at a time.
+      return self._materialize_iter(state_key, coder)
+    # Cache lookup
+    cache_state_key = self._convert_to_cache_key(state_key)
+    cached_value = self._state_cache.get(cache_state_key,
+                                         self._context.cache_token)
+    if cached_value is None:
+      # Cache miss, need to retrieve from the Runner
+      # TODO If caching is enabled, this materializes the entire state.
+      # Further size estimation or the use of the continuation token on the
+      # runner side could fall back to materializing one item at a time.
+      # https://jira.apache.org/jira/browse/BEAM-8297
+      materialized = cached_value = list(
+          self._materialize_iter(state_key, coder))
+      self._state_cache.put(
+          cache_state_key,
+          self._context.cache_token,
+          materialized)
+    return iter(cached_value)
+
+  def extend(self, state_key, coder, elements, is_cached=False):
+    if self._should_be_cached(is_cached):
+      # Update the cache
+      cache_key = self._convert_to_cache_key(state_key)
+      self._state_cache.extend(cache_key, self._context.cache_token, elements)
+    # Write to state handler
+    out = coder_impl.create_OutputStream()
+    for element in elements:
+      coder.encode_to_stream(element, out, True)
+    return self._underlying.append_raw(state_key, out.get())
+
+  def clear(self, state_key, is_cached=False):
+    if self._should_be_cached(is_cached):
+      cache_key = self._convert_to_cache_key(state_key)
+      self._state_cache.clear(cache_key, self._context.cache_token)
+    return self._underlying.clear(state_key)
+
+  def done(self):
+    self._underlying.done()
+
+  def _materialize_iter(self, state_key, coder):
+    """Materializes the state lazily, one element at a time.
+       :return: A generator which returns the next element when advanced.
+    """
+    continuation_token = None
+    while True:
+      data, continuation_token = \
+          self._underlying.get_raw(state_key, continuation_token)
+      input_stream = coder_impl.create_InputStream(data)
+      while input_stream.size() > 0:
+        yield coder.decode_from_stream(input_stream, True)
+      if not continuation_token:
+        break
+
+  def _should_be_cached(self, request_is_cached):
+    return (self._state_cache.is_cache_enabled() and
+            request_is_cached and
+            self._context.cache_token)
+
+  @staticmethod
+  def _convert_to_cache_key(state_key):
+    return state_key.SerializeToString()
+
+
 class _Future(object):
   """A simple future object to implement blocking requests.
   """
@@ -639,3 +746,11 @@
   def set(self, value):
     self._value = value
     self._event.set()
+
+  @classmethod
+  def done(cls):
+    if not hasattr(cls, 'DONE'):
+      done_future = _Future()
+      done_future.set(None)
+      cls.DONE = done_future
+    return cls.DONE
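
For orientation, here is a minimal sketch (not part of the change) of the read path that `CachingMaterializingStateHandler` adds above: `blocking_get()` consults the shared `StateCache` first and only falls back to the underlying handler (normally `GrpcStateHandler`) on a miss. `FakeUnderlying` is a hypothetical stand-in, and the thread-local cache token is set directly for brevity instead of going through `process_instruction_id()`.

```python
from apache_beam.coders import VarIntCoder
from apache_beam.coders import coder_impl
from apache_beam.portability.api import beam_fn_api_pb2
from apache_beam.runners.worker import sdk_worker
from apache_beam.runners.worker.statecache import StateCache


class FakeUnderlying(object):
  """Hypothetical in-memory stand-in for GrpcStateHandler's get_raw."""

  def __init__(self):
    self.reads = 0

  def get_raw(self, state_key, continuation_token=None):
    self.reads += 1
    out = coder_impl.create_OutputStream()
    VarIntCoder().get_impl().encode_to_stream(42, out, True)
    return out.get(), None  # a single page of data, no continuation token


underlying = FakeUnderlying()
handler = sdk_worker.CachingMaterializingStateHandler(
    StateCache(10), underlying)
# Normally process_instruction_id() extracts the token from the
# ProcessBundleRequest's cache_tokens; set it directly here for brevity.
handler._context.cache_token = b'a-cache-token'

state_key = beam_fn_api_pb2.StateKey()
coder = VarIntCoder().get_impl()
assert list(handler.blocking_get(state_key, coder, is_cached=True)) == [42]
assert list(handler.blocking_get(state_key, coder, is_cached=True)) == [42]
assert underlying.reads == 1  # the second read was served from the cache
```
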
diff --git a/sdks/python/apache_beam/runners/worker/sdk_worker_main.py b/sdks/python/apache_beam/runners/worker/sdk_worker_main.py
index 94ce343..c6cb8ed 100644
--- a/sdks/python/apache_beam/runners/worker/sdk_worker_main.py
+++ b/sdks/python/apache_beam/runners/worker/sdk_worker_main.py
@@ -149,6 +149,7 @@
         control_address=service_descriptor.url,
         worker_count=_get_worker_count(sdk_pipeline_options),
         worker_id=_worker_id,
+        state_cache_size=_get_state_cache_size(sdk_pipeline_options),
         profiler_factory=profiler.Profile.factory_from_options(
             sdk_pipeline_options.view_as(ProfilingOptions))
     ).run()
@@ -205,6 +206,28 @@
   return 12
 
 
+def _get_state_cache_size(pipeline_options):
+  """Defines the upper number of state items to cache.
+
+  Note: state_cache_size is an experimental flag and might not be available in
+  future releases.
+
+  Returns:
+    an int indicating the maximum number of items to cache.
+      Default is 0 (disabled).
+  """
+  experiments = pipeline_options.view_as(DebugOptions).experiments
+  experiments = experiments if experiments else []
+
+  for experiment in experiments:
+    # There should only be one match, so return from within the loop
+    if re.match(r'state_cache_size=', experiment):
+      return int(
+          re.match(r'state_cache_size=(?P<state_cache_size>.*)',
+                   experiment).group('state_cache_size'))
+  return 0
+
+
 def _load_main_session(semi_persistent_directory):
   """Loads a pickled main session from the path specified."""
   if semi_persistent_directory:
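
A small, hedged example of how the new experimental flag is expected to be supplied and parsed; the value 1000 is arbitrary.

```python
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.runners.worker import sdk_worker_main

# The cache size is read from the 'experiments' debug option.
options = PipelineOptions(['--experiments=state_cache_size=1000'])
assert sdk_worker_main._get_state_cache_size(options) == 1000

# Without the experiment, caching stays disabled (size 0).
assert sdk_worker_main._get_state_cache_size(PipelineOptions([])) == 0
```
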
diff --git a/sdks/python/apache_beam/runners/worker/sdk_worker_test.py b/sdks/python/apache_beam/runners/worker/sdk_worker_test.py
index 9b094b7..71263a8 100644
--- a/sdks/python/apache_beam/runners/worker/sdk_worker_test.py
+++ b/sdks/python/apache_beam/runners/worker/sdk_worker_test.py
@@ -100,7 +100,8 @@
       server.start()
 
       harness = sdk_worker.SdkHarness(
-          "localhost:%s" % test_port, worker_count=worker_count)
+          "localhost:%s" % test_port, worker_count=worker_count,
+          state_cache_size=100)
       harness.run()
 
       for worker in harness.workers.queue:
diff --git a/sdks/python/apache_beam/runners/worker/statecache.py b/sdks/python/apache_beam/runners/worker/statecache.py
new file mode 100644
index 0000000..a4902c6
--- /dev/null
+++ b/sdks/python/apache_beam/runners/worker/statecache.py
@@ -0,0 +1,122 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+"""A module for caching state reads/writes in Beam applications."""
+from __future__ import absolute_import
+
+import collections
+import logging
+from threading import Lock
+
+
+class StateCache(object):
+  """ Cache for Beam state access, scoped by state key and cache_token.
+
+  For a given state_key, caches a (cache_token, value) tuple and allows to
+    a) read from the cache (get),
+           if the currently stored cache_token matches the provided
+    a) write to the cache (put),
+           storing the new value alongside with a cache token
+    c) append to the cache (extend),
+           if the currently stored cache_token matches the provided
+
+  The operations on the cache are thread-safe for use by multiple workers.
+
+  :arg max_entries: The maximum number of entries to store in the cache.
+  TODO Memory-based caching: https://issues.apache.org/jira/browse/BEAM-8297
+  """
+
+  def __init__(self, max_entries):
+    logging.info('Creating state cache with size %s', max_entries)
+    self._cache = self.LRUCache(max_entries, (None, None))
+    self._lock = Lock()
+
+  def get(self, state_key, cache_token):
+    assert cache_token and self.is_cache_enabled()
+    with self._lock:
+      token, value = self._cache.get(state_key)
+    return value if token == cache_token else None
+
+  def put(self, state_key, cache_token, value):
+    assert cache_token and self.is_cache_enabled()
+    with self._lock:
+      return self._cache.put(state_key, (cache_token, value))
+
+  def extend(self, state_key, cache_token, elements):
+    assert cache_token and self.is_cache_enabled()
+    with self._lock:
+      token, value = self._cache.get(state_key)
+      if token in [cache_token, None]:
+        if value is None:
+          value = []
+        value.extend(elements)
+        self._cache.put(state_key, (cache_token, value))
+      else:
+        # Discard cached state if tokens do not match
+        self._cache.evict(state_key)
+
+  def clear(self, state_key, cache_token):
+    assert cache_token and self.is_cache_enabled()
+    with self._lock:
+      token, _ = self._cache.get(state_key)
+      if token in [cache_token, None]:
+        self._cache.put(state_key, (cache_token, []))
+      else:
+        # Discard cached state if tokens do not match
+        self._cache.evict(state_key)
+
+  def evict(self, state_key):
+    assert self.is_cache_enabled()
+    with self._lock:
+      self._cache.evict(state_key)
+
+  def evict_all(self):
+    with self._lock:
+      self._cache.evict_all()
+
+  def is_cache_enabled(self):
+    return self._cache._max_entries > 0
+
+  def __len__(self):
+    return len(self._cache)
+
+  class LRUCache(object):
+
+    def __init__(self, max_entries, default_entry):
+      self._max_entries = max_entries
+      self._default_entry = default_entry
+      self._cache = collections.OrderedDict()
+
+    def get(self, key):
+      value = self._cache.pop(key, self._default_entry)
+      if value != self._default_entry:
+        self._cache[key] = value
+      return value
+
+    def put(self, key, value):
+      self._cache[key] = value
+      while len(self._cache) > self._max_entries:
+        self._cache.popitem(last=False)
+
+    def evict(self, key):
+      self._cache.pop(key, self._default_entry)
+
+    def evict_all(self):
+      self._cache.clear()
+
+    def __len__(self):
+      return len(self._cache)
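
The cache semantics above can be illustrated with a short, self-contained sketch; keys, tokens and values are placeholders.

```python
from apache_beam.runners.worker.statecache import StateCache

cache = StateCache(2)  # at most two state keys are retained (LRU)
cache.put('key1', 'token-A', ['v1'])
assert cache.get('key1', 'token-A') == ['v1']   # matching token: cache hit
assert cache.get('key1', 'token-B') is None     # stale token: treated as a miss
cache.extend('key1', 'token-A', ['v2'])         # appends under the same token
assert cache.get('key1', 'token-A') == ['v1', 'v2']
cache.put('key2', 'token-A', ['x'])
cache.put('key3', 'token-A', ['y'])             # exceeds max_entries, evicts 'key1'
assert cache.get('key1', 'token-A') is None
```
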
diff --git a/sdks/python/apache_beam/runners/worker/statecache_test.py b/sdks/python/apache_beam/runners/worker/statecache_test.py
new file mode 100644
index 0000000..8fedeaf
--- /dev/null
+++ b/sdks/python/apache_beam/runners/worker/statecache_test.py
@@ -0,0 +1,155 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+"""Tests for state caching."""
+from __future__ import absolute_import
+
+import logging
+import unittest
+
+from apache_beam.runners.worker.statecache import StateCache
+
+
+class StateCacheTest(unittest.TestCase):
+
+  def test_empty_cache_get(self):
+    cache = StateCache(5)
+    self.assertEqual(cache.get("key", 'cache_token'), None)
+    with self.assertRaises(Exception):
+      self.assertEqual(cache.get("key", None), None)
+
+  def test_put_get(self):
+    cache = StateCache(5)
+    cache.put("key", "cache_token", "value")
+    self.assertEqual(len(cache), 1)
+    self.assertEqual(cache.get("key", "cache_token"), "value")
+    self.assertEqual(cache.get("key", "cache_token2"), None)
+    with self.assertRaises(Exception):
+      self.assertEqual(cache.get("key", None), None)
+
+  def test_overwrite(self):
+    cache = StateCache(2)
+    cache.put("key", "cache_token", "value")
+    cache.put("key", "cache_token2", "value2")
+    self.assertEqual(len(cache), 1)
+    self.assertEqual(cache.get("key", "cache_token"), None)
+    self.assertEqual(cache.get("key", "cache_token2"), "value2")
+
+  def test_extend(self):
+    cache = StateCache(3)
+    cache.put("key", "cache_token", ['val'])
+    # test extend for existing key
+    cache.extend("key", "cache_token", ['yet', 'another', 'val'])
+    self.assertEqual(len(cache), 1)
+    self.assertEqual(cache.get("key", "cache_token"),
+                     ['val', 'yet', 'another', 'val'])
+    # test extend without existing key
+    cache.extend("key2", "cache_token", ['another', 'val'])
+    self.assertEqual(len(cache), 2)
+    self.assertEqual(cache.get("key2", "cache_token"), ['another', 'val'])
+    # test eviction in case the cache token changes
+    cache.extend("key2", "new_token", ['new_value'])
+    self.assertEqual(cache.get("key2", "new_token"), None)
+    self.assertEqual(len(cache), 1)
+
+  def test_clear(self):
+    cache = StateCache(5)
+    cache.clear("new-key", "cache_token")
+    cache.put("key", "cache_token", ["value"])
+    self.assertEqual(len(cache), 2)
+    self.assertEqual(cache.get("new-key", "new_token"), None)
+    self.assertEqual(cache.get("key", "cache_token"), ['value'])
+    # test clear without existing key/token
+    cache.clear("non-existing", "token")
+    self.assertEqual(len(cache), 3)
+    self.assertEqual(cache.get("non-existing", "token"), [])
+    # test eviction in case the cache token changes
+    cache.clear("new-key", "wrong_token")
+    self.assertEqual(len(cache), 2)
+    self.assertEqual(cache.get("new-key", "cache_token"), None)
+    self.assertEqual(cache.get("new-key", "wrong_token"), None)
+
+  def test_max_size(self):
+    cache = StateCache(2)
+    cache.put("key", "cache_token", "value")
+    cache.put("key2", "cache_token", "value")
+    self.assertEqual(len(cache), 2)
+    cache.put("key2", "cache_token", "value")
+    self.assertEqual(len(cache), 2)
+    cache.put("key", "cache_token", "value")
+    self.assertEqual(len(cache), 2)
+
+  def test_evict_all(self):
+    cache = StateCache(5)
+    cache.put("key", "cache_token", "value")
+    cache.put("key2", "cache_token", "value2")
+    self.assertEqual(len(cache), 2)
+    cache.evict_all()
+    self.assertEqual(len(cache), 0)
+    self.assertEqual(cache.get("key", "cache_token"), None)
+    self.assertEqual(cache.get("key2", "cache_token"), None)
+
+  def test_lru(self):
+    cache = StateCache(5)
+    cache.put("key", "cache_token", "value")
+    cache.put("key2", "cache_token2", "value2")
+    cache.put("key3", "cache_token", "value0")
+    cache.put("key3", "cache_token", "value3")
+    cache.put("key4", "cache_token4", "value4")
+    cache.put("key5", "cache_token", "value0")
+    cache.put("key5", "cache_token", ["value5"])
+    self.assertEqual(len(cache), 5)
+    self.assertEqual(cache.get("key", "cache_token"), "value")
+    self.assertEqual(cache.get("key2", "cache_token2"), "value2")
+    self.assertEqual(cache.get("key3", "cache_token"), "value3")
+    self.assertEqual(cache.get("key4", "cache_token4"), "value4")
+    self.assertEqual(cache.get("key5", "cache_token"), ["value5"])
+    # insert another key to trigger cache eviction
+    cache.put("key6", "cache_token2", "value7")
+    self.assertEqual(len(cache), 5)
+    # least recently used key should be gone ("key")
+    self.assertEqual(cache.get("key", "cache_token"), None)
+    # trigger a read on "key2"
+    cache.get("key2", "cache_token")
+    # insert another key to trigger cache eviction
+    cache.put("key7", "cache_token", "value7")
+    self.assertEqual(len(cache), 5)
+    # least recently used key should be gone ("key3")
+    self.assertEqual(cache.get("key3", "cache_token"), None)
+    # trigger a put on "key2"
+    cache.put("key2", "cache_token", "put")
+    self.assertEqual(len(cache), 5)
+    # insert another key to trigger cache eviction
+    cache.put("key8", "cache_token", "value8")
+    self.assertEqual(len(cache), 5)
+    # least recently used key should be gone ("key4")
+    self.assertEqual(cache.get("key4", "cache_token"), None)
+    # make "key5" used by appending to it
+    cache.extend("key5", "cache_token", ["another"])
+    # least recently used key should be gone ("key6")
+    self.assertEqual(cache.get("key6", "cache_token"), None)
+
+  def test_is_cached_enabled(self):
+    cache = StateCache(1)
+    self.assertEqual(cache.is_cache_enabled(), True)
+    cache = StateCache(0)
+    self.assertEqual(cache.is_cache_enabled(), False)
+
+
+if __name__ == '__main__':
+  logging.getLogger().setLevel(logging.INFO)
+  unittest.main()
diff --git a/sdks/python/apache_beam/runners/worker/worker_pool_main.py b/sdks/python/apache_beam/runners/worker/worker_pool_main.py
index 94e8ec5..ef9e005 100644
--- a/sdks/python/apache_beam/runners/worker/worker_pool_main.py
+++ b/sdks/python/apache_beam/runners/worker/worker_pool_main.py
@@ -47,21 +47,26 @@
 class BeamFnExternalWorkerPoolServicer(
     beam_fn_api_pb2_grpc.BeamFnExternalWorkerPoolServicer):
 
-  def __init__(self, worker_threads, use_process=False,
-               container_executable=None):
+  def __init__(self, worker_threads,
+               use_process=False,
+               container_executable=None,
+               state_cache_size=0):
     self._worker_threads = worker_threads
     self._use_process = use_process
     self._container_executable = container_executable
+    self._state_cache_size = state_cache_size
     self._worker_processes = {}
 
   @classmethod
   def start(cls, worker_threads=1, use_process=False, port=0,
-            container_executable=None):
+            state_cache_size=0, container_executable=None):
     worker_server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))
     worker_address = 'localhost:%s' % worker_server.add_insecure_port(
         '[::]:%s' % port)
-    worker_pool = cls(worker_threads, use_process=use_process,
-                      container_executable=container_executable)
+    worker_pool = cls(worker_threads,
+                      use_process=use_process,
+                      container_executable=container_executable,
+                      state_cache_size=state_cache_size)
     beam_fn_api_pb2_grpc.add_BeamFnExternalWorkerPoolServicer_to_server(
         worker_pool,
         worker_server)
@@ -81,10 +86,17 @@
         command = ['python', '-c',
                    'from apache_beam.runners.worker.sdk_worker '
                    'import SdkHarness; '
-                   'SdkHarness("%s",worker_count=%d,worker_id="%s").run()' % (
+                   'SdkHarness('
+                   '"%s",'
+                   'worker_count=%d,'
+                   'worker_id="%s",'
+                   'state_cache_size=%d'
+                   ')'
+                   '.run()' % (
                        start_worker_request.control_endpoint.url,
                        self._worker_threads,
-                       start_worker_request.worker_id)]
+                       start_worker_request.worker_id,
+                       self._state_cache_size)]
         if self._container_executable:
           # command as per container spec
           # the executable is responsible to handle concurrency
@@ -109,7 +121,8 @@
         worker = sdk_worker.SdkHarness(
             start_worker_request.control_endpoint.url,
             worker_count=self._worker_threads,
-            worker_id=start_worker_request.worker_id)
+            worker_id=start_worker_request.worker_id,
+            state_cache_size=self._state_cache_size)
         worker_thread = threading.Thread(
             name='run_worker_%s' % start_worker_request.worker_id,
             target=worker.run)
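
As a hedged sketch, a worker pool with caching enabled might be brought up as shown below; the thread count and cache size are arbitrary.

```python
from apache_beam.runners.worker import worker_pool_main

# start() brings up a gRPC servicer for the pool; state_cache_size is now
# forwarded to every SdkHarness that the pool creates.
worker_pool_main.BeamFnExternalWorkerPoolServicer.start(
    worker_threads=2, state_cache_size=100)
```
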
diff --git a/sdks/python/apache_beam/testing/util.py b/sdks/python/apache_beam/testing/util.py
index 32c16db..6d77ee7 100644
--- a/sdks/python/apache_beam/testing/util.py
+++ b/sdks/python/apache_beam/testing/util.py
@@ -121,15 +121,21 @@
 
     # Try to compare actual and expected by sorting. This fails with a
     # TypeError in Python 3 if different types are present in the same
-    # collection.
+    # collection. It can also raise false negatives for types that don't have
+    # a deterministic sort order, like pyarrow Tables as of 0.14.1.
     try:
       sorted_expected = sorted(expected)
       sorted_actual = sorted(actual)
       if sorted_expected != sorted_actual:
         raise BeamAssertException(
             'Failed assert: %r == %r' % (sorted_expected, sorted_actual))
-    # Fall back to slower method which works for different types on Python 3.
-    except TypeError:
+    # Slower method, used in two cases:
+    # 1) If sorted expected != actual, use this method to verify the inequality.
+    #    This ensures we don't raise any false negatives for types that don't
+    #    have a deterministic sort order.
+    # 2) As a fallback if we encounter a TypeError in Python 3. This method
+    #    works on collections that contain elements of different types.
+    except (BeamAssertException, TypeError):
       for element in actual:
         try:
           expected_list.remove(element)
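
The practical effect of the change above is that equal_to now re-verifies a failed sorted comparison with the slower multiset check, so order-insensitive matching keeps working even when elements cannot be sorted deterministically. A tiny illustration with arbitrary values:

```python
from apache_beam.testing.util import equal_to

# Mixed types cannot be sorted on Python 3; the fallback path still matches.
matcher = equal_to([1, 'a', (2, 3)])
matcher([(2, 3), 'a', 1])  # raises BeamAssertException only on a mismatch
```
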
diff --git a/sdks/python/apache_beam/transforms/core.py b/sdks/python/apache_beam/transforms/core.py
index 7b0c323..a6e6669 100644
--- a/sdks/python/apache_beam/transforms/core.py
+++ b/sdks/python/apache_beam/transforms/core.py
@@ -329,7 +329,7 @@
   args = [name for name, p in signature.parameters.items()
           if p.kind in _SUPPORTED_ARG_TYPES]
   defaults = [p.default for p in signature.parameters.values()
-              if p.kind in _SUPPORTED_ARG_TYPES and p.default != p.empty]
+              if p.kind in _SUPPORTED_ARG_TYPES and p.default is not p.empty]
 
   return args, defaults
 
diff --git a/sdks/python/apache_beam/transforms/external.py b/sdks/python/apache_beam/transforms/external.py
index fd79fcf..75fe766 100644
--- a/sdks/python/apache_beam/transforms/external.py
+++ b/sdks/python/apache_beam/transforms/external.py
@@ -45,6 +45,7 @@
 try:
   import grpc
   from apache_beam.portability.api import beam_expansion_api_pb2_grpc
+  from apache_beam.utils import subprocess_server
 except ImportError:
   grpc = None
 # pylint: enable=wrong-import-order, wrong-import-position, ungrouped-imports
@@ -227,16 +228,24 @@
   _EXPANDED_TRANSFORM_UNIQUE_NAME = 'root'
   _IMPULSE_PREFIX = 'impulse'
 
-  def __init__(self, urn, payload, endpoint=None):
-    endpoint = endpoint or DEFAULT_EXPANSION_SERVICE
-    if grpc is None and isinstance(endpoint, str):
+  def __init__(self, urn, payload, expansion_service=None):
+    """Wrapper for an external transform with the given urn and payload.
+
+    :param urn: the unique beam identifier for this transform
+    :param payload: the payload, either as a byte string or a PayloadBuilder
+    :param expansion_service: an expansion service implementing the beam
+        ExpansionService protocol, either as an object with an Expand method
+        or an address (as a str) to a grpc server that provides this method.
+    """
+    expansion_service = expansion_service or DEFAULT_EXPANSION_SERVICE
+    if grpc is None and isinstance(expansion_service, str):
       raise NotImplementedError('Grpc required for external transforms.')
-    # TODO: Start an endpoint given an environment?
     self._urn = urn
-    self._payload = payload.payload() \
-      if isinstance(payload, PayloadBuilder) \
-      else payload
-    self._endpoint = endpoint
+    self._payload = (
+        payload.payload()
+        if isinstance(payload, PayloadBuilder)
+        else payload)
+    self._expansion_service = expansion_service
     self._namespace = self._fresh_namespace()
 
   def __post_init__(self, expansion_service):
@@ -305,12 +314,12 @@
         namespace=self._namespace,
         transform=transform_proto)
 
-    if isinstance(self._endpoint, str):
-      with grpc.insecure_channel(self._endpoint) as channel:
+    if isinstance(self._expansion_service, str):
+      with grpc.insecure_channel(self._expansion_service) as channel:
         response = beam_expansion_api_pb2_grpc.ExpansionServiceStub(
             channel).Expand(request)
     else:
-      response = self._endpoint.Expand(request, None)
+      response = self._expansion_service.Expand(request, None)
 
     if response.error:
       raise RuntimeError(response.error)
@@ -409,6 +418,44 @@
             for tag, pcoll in self._expanded_transform.outputs.items()})
 
 
+class JavaJarExpansionService(object):
+  """An expansion service based on an Java Jar file.
+
+  This can be passed into an ExternalTransform as the expansion_service
+  argument, which will cause a subprocess to be spawned from this jar to
+  expand the transform.
+  """
+  def __init__(self, path_to_jar, extra_args=None):
+    if extra_args is None:
+      extra_args = ['{{PORT}}']
+    self._path_to_jar = path_to_jar
+    self._extra_args = extra_args
+
+  def Expand(self, request, context):
+    self._path_to_jar = subprocess_server.JavaJarServer.local_jar(
+        self._path_to_jar)
+    # Consider memoizing these servers (with some timeout).
+    with subprocess_server.JavaJarServer(
+        beam_expansion_api_pb2_grpc.ExpansionServiceStub,
+        self._path_to_jar,
+        self._extra_args) as service:
+      return service.Expand(request, context)
+
+
+class BeamJarExpansionService(JavaJarExpansionService):
+  """An expansion service based on an Beam Java Jar file.
+
+  Attempts to use a locally-build copy of the jar based on the gradle target,
+  if it exists, otherwise attempts to download it (with caching) from the
+  apache maven repository.
+  """
+  def __init__(self, gradle_target, extra_args=None, gradle_appendix=None):
+    path_to_jar = subprocess_server.JavaJarServer.path_to_beam_jar(
+        gradle_target,
+        gradle_appendix)
+    super(BeamJarExpansionService, self).__init__(path_to_jar, extra_args)
+
+
 def memoize(func):
   cache = {}
 
diff --git a/sdks/python/apache_beam/transforms/external_test.py b/sdks/python/apache_beam/transforms/external_test.py
index 6576419..ba315f9 100644
--- a/sdks/python/apache_beam/transforms/external_test.py
+++ b/sdks/python/apache_beam/transforms/external_test.py
@@ -20,6 +20,7 @@
 from __future__ import absolute_import
 
 import argparse
+import logging
 import os
 import subprocess
 import sys
@@ -310,6 +311,9 @@
   def test_java_expansion_portable_runner(self):
     ExternalTransformTest.expansion_service_port = os.environ.get(
         'EXPANSION_PORT')
+    if ExternalTransformTest.expansion_service_port:
+      ExternalTransformTest.expansion_service_port = int(
+          ExternalTransformTest.expansion_service_port)
 
     ExternalTransformTest.run_pipeline_with_portable_runner(None)
 
@@ -348,7 +352,7 @@
 
   @staticmethod
   def run_pipeline(
-      pipeline_options, expansion_service_port, wait_until_finish=True):
+      pipeline_options, expansion_service, wait_until_finish=True):
     # The actual definitions of these transforms is in
     # org.apache.beam.runners.core.construction.TestExpansionService.
     TEST_COUNT_URN = "beam:transforms:xlang:count"
@@ -357,15 +361,18 @@
     # Run a simple count-filtered-letters pipeline.
     p = TestPipeline(options=pipeline_options)
 
-    address = 'localhost:%s' % str(expansion_service_port)
+    if isinstance(expansion_service, int):
+      # Only the port was specified.
+      expansion_service = 'localhost:%s' % str(expansion_service)
+
     res = (
         p
         | beam.Create(list('aaabccxyyzzz'))
         | beam.Map(unicode)
         # TODO(BEAM-6587): Use strings directly rather than ints.
         | beam.Map(lambda x: int(ord(x)))
-        | beam.ExternalTransform(TEST_FILTER_URN, b'middle', address)
-        | beam.ExternalTransform(TEST_COUNT_URN, None, address)
+        | beam.ExternalTransform(TEST_FILTER_URN, b'middle', expansion_service)
+        | beam.ExternalTransform(TEST_COUNT_URN, None, expansion_service)
         # # TODO(BEAM-6587): Remove when above is removed.
         | beam.Map(lambda kv: (chr(kv[0]), kv[1]))
         | beam.Map(lambda kv: '%s: %s' % kv))
@@ -378,9 +385,12 @@
 
 
 if __name__ == '__main__':
+  logging.getLogger().setLevel(logging.INFO)
   parser = argparse.ArgumentParser()
   parser.add_argument('--expansion_service_jar')
   parser.add_argument('--expansion_service_port')
+  parser.add_argument('--expansion_service_target')
+  parser.add_argument('--expansion_service_target_appendix')
   known_args, pipeline_args = parser.parse_known_args(sys.argv)
 
   if known_args.expansion_service_jar:
@@ -390,6 +400,13 @@
         known_args.expansion_service_port)
     pipeline_options = PipelineOptions(pipeline_args)
     ExternalTransformTest.run_pipeline_with_portable_runner(pipeline_options)
+  elif known_args.expansion_service_target:
+    pipeline_options = PipelineOptions(pipeline_args)
+    ExternalTransformTest.run_pipeline(
+        pipeline_options,
+        beam.transforms.external.BeamJarExpansionService(
+            known_args.expansion_service_target,
+            gradle_appendix=known_args.expansion_service_target_appendix))
   else:
     sys.argv = pipeline_args
     unittest.main()
diff --git a/sdks/python/apache_beam/typehints/decorators.py b/sdks/python/apache_beam/typehints/decorators.py
index 054ee8d..4c485fe 100644
--- a/sdks/python/apache_beam/typehints/decorators.py
+++ b/sdks/python/apache_beam/typehints/decorators.py
@@ -557,7 +557,7 @@
         bound_args[param.name] = _ANY_VAR_POSITIONAL
       elif param.kind == param.VAR_KEYWORD:
         bound_args[param.name] = _ANY_VAR_KEYWORD
-      elif param.default != param.empty:
+      elif param.default is not param.empty:
         # Declare unbound parameters with defaults to be Any.
         bound_args[param.name] = typehints.Any
       else:
diff --git a/sdks/python/apache_beam/utils/subprocess_server.py b/sdks/python/apache_beam/utils/subprocess_server.py
new file mode 100644
index 0000000..737e220
--- /dev/null
+++ b/sdks/python/apache_beam/utils/subprocess_server.py
@@ -0,0 +1,227 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+from __future__ import absolute_import
+
+import logging
+import os
+import shutil
+import signal
+import socket
+import subprocess
+import tempfile
+import threading
+import time
+
+import grpc
+from future.moves.urllib.error import URLError
+from future.moves.urllib.request import urlopen
+
+from apache_beam.version import __version__ as beam_version
+
+
+class SubprocessServer(object):
+  """An abstract base class for running GRPC Servers as an external process.
+
+  This class acts as a context manager that starts up a server, provides a
+  stub to connect to it, and then shuts the server down.  For example::
+
+      with SubprocessServer(GrpcStubClass, [executable, arg, ...]) as stub:
+          stub.CallService(...)
+  """
+  def __init__(self, stub_class, cmd, port=None):
+    """Creates the server object.
+
+    :param stub_class: the auto-generated GRPC client stub class used for
+        connecting to the GRPC service
+    :param cmd: command (including arguments) for starting up the server,
+        suitable for passing to `subprocess.Popen`.
+    :param port: (optional) the port at which the subprocess will serve its
+        service.  If not given, one will be randomly chosen and the special
+        string "{{PORT}}" will be substituted in the command line arguments
+        with the chosen port.
+    """
+    self._process_lock = threading.RLock()
+    self._process = None
+    self._stub_class = stub_class
+    self._cmd = [str(arg) for arg in cmd]
+    self._port = port
+
+  def __enter__(self):
+    return self.start()
+
+  def __exit__(self, *unused_args):
+    self.stop()
+
+  def start(self):
+    with self._process_lock:
+      if self._process:
+        self.stop()
+      if self._port:
+        port = self._port
+        cmd = self._cmd
+      else:
+        port, = pick_port(None)
+        cmd = [arg.replace('{{PORT}}', str(port)) for arg in self._cmd]
+      endpoint = 'localhost:%s' % port
+      logging.warn("Starting service with %s", str(cmd).replace("',", "'"))
+      try:
+        self._process = subprocess.Popen(cmd)
+        wait_secs = .1
+        channel = grpc.insecure_channel(endpoint)
+        channel_ready = grpc.channel_ready_future(channel)
+        while True:
+          if self._process.poll() is not None:
+            logging.error("Starting job service with %s", cmd)
+            raise RuntimeError(
+                'Service failed to start up with error %s' %
+                self._process.poll())
+          try:
+            channel_ready.result(timeout=wait_secs)
+            break
+          except (grpc.FutureTimeoutError, grpc._channel._Rendezvous):
+            wait_secs *= 1.2
+            logging.log(logging.WARNING if wait_secs > 1 else logging.DEBUG,
+                        'Waiting for grpc channel to be ready at %s.',
+                        endpoint)
+        return self._stub_class(channel)
+      except:  # pylint: disable=bare-except
+        logging.exception("Error bringing up service")
+        self.stop()
+        raise
+
+  def stop(self):
+    with self._process_lock:
+      if not self._process:
+        return
+      for _ in range(5):
+        if self._process.poll() is not None:
+          break
+        logging.debug("Sending SIGINT to job_server")
+        self._process.send_signal(signal.SIGINT)
+        time.sleep(1)
+      if self._process.poll() is None:
+        self._process.kill()
+      self._process = None
+
+  def local_temp_dir(self, **kwargs):
+    return tempfile.mkdtemp(dir=self._local_temp_root, **kwargs)
+
+
+class JavaJarServer(SubprocessServer):
+
+  APACHE_REPOSITORY = 'https://repo.maven.apache.org/maven2'
+  BEAM_GROUP_ID = 'org.apache.beam'
+  JAR_CACHE = os.path.expanduser("~/.apache_beam/cache/jars")
+
+  def __init__(self, stub_class, path_to_jar, java_arguments):
+    super(JavaJarServer, self).__init__(
+        stub_class, ['java', '-jar', path_to_jar] + list(java_arguments))
+
+  @classmethod
+  def jar_name(cls, artifact_id, version, classifier=None, appendix=None):
+    return '-'.join(filter(
+        None, [artifact_id, appendix, version, classifier]))  + '.jar'
+
+  @classmethod
+  def path_to_maven_jar(
+      cls,
+      artifact_id,
+      group_id,
+      version,
+      repository=APACHE_REPOSITORY,
+      classifier=None):
+    return '/'.join([
+        repository,
+        group_id.replace('.', '/'),
+        artifact_id,
+        version,
+        cls.jar_name(artifact_id, version, classifier)])
+
+  @classmethod
+  def path_to_beam_jar(cls, gradle_target, appendix=None):
+    gradle_package = gradle_target.strip(':')[:gradle_target.rindex(':')]
+    artifact_id = 'beam-' + gradle_package.replace(':', '-')
+    project_root = os.path.sep.join(
+        os.path.abspath(__file__).split(os.path.sep)[:-5])
+    local_path = os.path.join(
+        project_root,
+        gradle_package.replace(':', os.path.sep),
+        'build',
+        'libs',
+        cls.jar_name(
+            artifact_id,
+            beam_version.replace('.dev', ''),
+            classifier='SNAPSHOT',
+            appendix=appendix))
+    if os.path.exists(local_path):
+      logging.info('Using pre-built snapshot at %s', local_path)
+      return local_path
+    elif '.dev' in beam_version:
+      # TODO: Attempt to use nightly snapshots?
+      raise RuntimeError(
+          'Please build the server with \n  cd %s; ./gradlew %s' % (
+              os.path.abspath(project_root), gradle_target))
+    else:
+      return cls.path_to_maven_jar(
+          artifact_id, cls.BEAM_GROUP_ID, beam_version, cls.APACHE_REPOSITORY)
+
+  @classmethod
+  def local_jar(cls, url):
+    # TODO: Verify checksum?
+    if os.path.exists(url):
+      return url
+    else:
+      logging.warning('Downloading job server jar from %s' % url)
+      cached_jar = os.path.join(cls.JAR_CACHE, os.path.basename(url))
+      if not os.path.exists(cached_jar):
+        if not os.path.exists(cls.JAR_CACHE):
+          os.makedirs(cls.JAR_CACHE)
+          # TODO: Clean up this cache according to some policy.
+        try:
+          url_read = urlopen(url)
+          with open(cached_jar + '.tmp', 'wb') as jar_write:
+            shutil.copyfileobj(url_read, jar_write, length=1 << 20)
+          os.rename(cached_jar + '.tmp', cached_jar)
+        except URLError as e:
+          raise RuntimeError(
+              'Unable to fetch remote job server jar at %s: %s' % (url, e))
+      return cached_jar
+
+
+def pick_port(*ports):
+  """
+  Returns a list of ports, same length as input ports list, but replaces
+  all None or 0 ports with a random free port.
+  """
+  sockets = []
+
+  def find_free_port(port):
+    if port:
+      return port
+    else:
+      s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
+      sockets.append(s)
+      s.bind(('localhost', 0))
+      _, free_port = s.getsockname()
+      return free_port
+
+  ports = list(map(find_free_port, ports))
+  # Close the sockets only now to avoid the same port being chosen twice
+  for s in sockets:
+    s.close()
+  return ports
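
A brief, hedged illustration of the helpers added above: pick_port reserves free local ports, and SubprocessServer (per its docstring) wraps an external gRPC service as a context manager. The stub class and server command in the commented part are hypothetical.

```python
from apache_beam.utils import subprocess_server

explicit, free = subprocess_server.pick_port(8099, None)
assert explicit == 8099 and free > 0  # None was replaced by a free port

# Hypothetical usage; "{{PORT}}" is substituted with a freshly picked port:
# with subprocess_server.SubprocessServer(
#     my_service_pb2_grpc.MyServiceStub,
#     ['/path/to/server', '--port={{PORT}}']) as stub:
#   stub.SomeCall(request)
```
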
diff --git a/sdks/python/scripts/generate_pydoc.sh b/sdks/python/scripts/generate_pydoc.sh
index 5decf4a..eab8aad 100755
--- a/sdks/python/scripts/generate_pydoc.sh
+++ b/sdks/python/scripts/generate_pydoc.sh
@@ -120,7 +120,7 @@
 intersphinx_mapping = {
   'python': ('https://docs.python.org/2', None),
   'hamcrest': ('https://pyhamcrest.readthedocs.io/en/stable/', None),
-  'google-cloud': ('https://google-cloud-python.readthedocs.io/en/stable/', None),
+  'google-cloud-datastore': ('https://googleapis.dev/python/datastore/latest/', None),
 }
 
 # Since private classes are skipped by sphinx, if there is any cross reference
diff --git a/sdks/python/setup.py b/sdks/python/setup.py
index 1971e62..3e42485 100644
--- a/sdks/python/setup.py
+++ b/sdks/python/setup.py
@@ -105,7 +105,8 @@
     'avro>=1.8.1,<2.0.0; python_version < "3.0"',
     'avro-python3>=1.8.1,<2.0.0; python_version >= "3.0"',
     'crcmod>=1.7,<2.0',
-    'dill>=0.2.9,<0.4.0',
+    # Dill doesn't guarantee compatibility between releases within a minor version.
+    'dill>=0.3.0,<0.3.1',
     'fastavro>=0.21.4,<0.22',
     'funcsigs>=1.0.2,<2; python_version < "3.0"',
     'future>=0.16.0,<1.0.0',
@@ -125,7 +126,6 @@
     'pytz>=2018.3',
     # [BEAM-5628] Beam VCF IO is not supported in Python 3.
     'pyvcf>=0.6.8,<0.7.0; python_version < "3.0"',
-    'pyyaml>=3.12,<4.0.0',
     'typing>=3.6.0,<3.7.0; python_version < "3.5.0"',
     ]
 
@@ -142,6 +142,7 @@
     'pandas>=0.23.4,<0.25',
     'parameterized>=0.6.0,<0.7.0',
     'pyhamcrest>=1.9,<2.0',
+    'pyyaml>=3.12,<6.0.0',
     'tenacity>=5.0.2,<6.0',
     ]
 
diff --git a/sdks/python/test-suites/direct/py35/build.gradle b/sdks/python/test-suites/direct/py35/build.gradle
index b2ab5f0..c8b672d 100644
--- a/sdks/python/test-suites/direct/py35/build.gradle
+++ b/sdks/python/test-suites/direct/py35/build.gradle
@@ -54,3 +54,34 @@
     }
   }
 }
+
+task mongodbioIT {
+  dependsOn 'setupVirtualenv'
+
+  Random r = new Random()
+  def port = r.nextInt(1000) + 27017
+  def containerName = "mongoioit" + port
+
+  def options = [
+          "--mongo_uri=mongodb://localhost:" + port
+  ]
+
+  // Pull the latest mongodb docker image and run
+  doFirst {
+    exec {
+      executable 'sh'
+      args '-c', "docker pull mongo && docker run --name ${containerName} -p ${port}:27017 -d mongo:latest"
+    }
+  }
+
+  doLast {
+    exec {
+      executable 'sh'
+      args '-c', ". ${envdir}/bin/activate && pip install -e ${rootDir}/sdks/python/[test] && python -m apache_beam.io.mongodbio_it_test ${options.join(' ')}"
+    }
+    exec {
+      executable 'sh'
+      args '-c', "docker stop ${containerName} && docker rm ${containerName}"
+    }
+  }
+}
diff --git a/sdks/python/test-suites/portable/common.gradle b/sdks/python/test-suites/portable/common.gradle
new file mode 100644
index 0000000..690c09d
--- /dev/null
+++ b/sdks/python/test-suites/portable/common.gradle
@@ -0,0 +1,80 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+def pythonRootDir = "${rootDir}/sdks/python"
+def pythonContainerSuffix = project.ext.pythonVersion == '2.7' ? '2' : project.ext.pythonVersion.replace('.', '')
+def pythonContainerTask = ":sdks:python:container:py${pythonContainerSuffix}:docker"
+
+class CompatibilityMatrixConfig {
+  // Execute batch or streaming pipelines.
+  boolean streaming = false
+  // Execute on Docker or Process based environment.
+  SDK_WORKER_TYPE workerType = SDK_WORKER_TYPE.DOCKER
+
+  enum SDK_WORKER_TYPE {
+    DOCKER, PROCESS, LOOPBACK
+  }
+
+  // Whether to pre-optimize the pipeline with the Python optimizer.
+  boolean preOptimize = false
+}
+
+def flinkCompatibilityMatrix = {
+  def config = it ? it as CompatibilityMatrixConfig : new CompatibilityMatrixConfig()
+  def workerType = config.workerType.name()
+  def streaming = config.streaming
+  def environment_config = config.workerType == CompatibilityMatrixConfig.SDK_WORKER_TYPE.PROCESS ? "--environment_config='{\"command\": \"${buildDir.absolutePath}/sdk_worker.sh\"}'" : ""
+  def name = "flinkCompatibilityMatrix${streaming ? 'Streaming' : 'Batch'}${config.preOptimize ? 'PreOptimize' : ''}${workerType}"
+  def extra_experiments = []
+  if (config.preOptimize)
+    extra_experiments.add('pre_optimize=all')
+  tasks.create(name: name) {
+    dependsOn 'setupVirtualenv'
+    dependsOn ':runners:flink:1.8:job-server:shadowJar'
+    if (workerType.toLowerCase() == 'docker')
+      dependsOn pythonContainerTask
+    else if (workerType.toLowerCase() == 'process')
+      dependsOn 'createProcessWorker'
+    doLast {
+      exec {
+        executable 'sh'
+        args '-c', ". ${envdir}/bin/activate && cd ${pythonRootDir} && pip install -e .[test] && python -m apache_beam.runners.portability.flink_runner_test --flink_job_server_jar=${project(":runners:flink:1.8:job-server:").shadowJar.archivePath} --environment_type=${workerType} ${environment_config} ${streaming ? '--streaming' : ''} ${extra_experiments ? '--extra_experiments=' + extra_experiments.join(',') : ''}"
+      }
+    }
+  }
+}
+
+task flinkCompatibilityMatrixDocker() {
+  dependsOn flinkCompatibilityMatrix(streaming: false)
+  dependsOn flinkCompatibilityMatrix(streaming: true)
+}
+
+task flinkCompatibilityMatrixProcess() {
+  dependsOn flinkCompatibilityMatrix(streaming: false, workerType: CompatibilityMatrixConfig.SDK_WORKER_TYPE.PROCESS)
+  dependsOn flinkCompatibilityMatrix(streaming: true, workerType: CompatibilityMatrixConfig.SDK_WORKER_TYPE.PROCESS)
+}
+
+task flinkCompatibilityMatrixLoopback() {
+  dependsOn flinkCompatibilityMatrix(streaming: false, workerType: CompatibilityMatrixConfig.SDK_WORKER_TYPE.LOOPBACK)
+  dependsOn flinkCompatibilityMatrix(streaming: true, workerType: CompatibilityMatrixConfig.SDK_WORKER_TYPE.LOOPBACK)
+  dependsOn flinkCompatibilityMatrix(streaming: true, workerType: CompatibilityMatrixConfig.SDK_WORKER_TYPE.LOOPBACK, preOptimize: true)
+}
+
+task flinkValidatesRunner() {
+  dependsOn 'flinkCompatibilityMatrixLoopback'
+}
diff --git a/sdks/python/test-suites/portable/py2/build.gradle b/sdks/python/test-suites/portable/py2/build.gradle
index 9a87f70..5ceac52 100644
--- a/sdks/python/test-suites/portable/py2/build.gradle
+++ b/sdks/python/test-suites/portable/py2/build.gradle
@@ -28,12 +28,14 @@
 addPortableWordCountTasks()
 
 task preCommitPy2() {
-  dependsOn ':runners:flink:1.5:job-server-container:docker'
+  dependsOn ':runners:flink:1.8:job-server-container:docker'
   dependsOn ':sdks:python:container:py2:docker'
   dependsOn portableWordCountBatch
   dependsOn portableWordCountStreaming
 }
 
+// TODO: Move the rest of this file into ../common.gradle.
+
 // Before running this, you need to:
 //
 // 1. Build the SDK container:
@@ -43,12 +45,12 @@
 // 2. Either a) or b)
 //  a) If you want the Job Server to run in a Docker container:
 //
-//    ./gradlew :runners:flink:1.5:job-server-container:docker
+//    ./gradlew :runners:flink:1.8:job-server-container:docker
 //
 //  b) Otherwise, start a local JobService, for example, the Portable Flink runner
 //    (in a separate shell since it continues to run):
 //
-//    ./gradlew :runners:flink:1.5:job-server:runShadow
+//    ./gradlew :runners:flink:1.8:job-server:runShadow
 //
 // Then you can run this example:
 //
@@ -72,10 +74,9 @@
   dependsOn ':sdks:java:testing:expansion-service:buildTestExpansionServiceJar'
 
   doLast {
-    def testServiceExpansionJar = project(":sdks:java:testing:expansion-service:").buildTestExpansionServiceJar.archivePath
     def options = [
-        "--expansion_service_port=8096",
-        "--expansion_service_jar=${testServiceExpansionJar}",
+        "--expansion_service_target=sdks:java:testing:expansion-service:buildTestExpansionServiceJar",
+        "--expansion_service_target_appendix=testExpansionService",
     ]
     exec {
       executable 'sh'
@@ -86,7 +87,7 @@
 
 task crossLanguagePythonJavaFlink {
   dependsOn 'setupVirtualenv'
-  dependsOn ':runners:flink:1.5:job-server-container:docker'
+  dependsOn ':runners:flink:1.8:job-server-container:docker'
   dependsOn ':sdks:python:container:py2:docker'
   dependsOn ':sdks:java:container:docker'
   dependsOn ':sdks:java:testing:expansion-service:buildTestExpansionServiceJar'
@@ -111,7 +112,7 @@
 
 task crossLanguagePortableWordCount {
   dependsOn 'setupVirtualenv'
-  dependsOn ':runners:flink:1.5:job-server-container:docker'
+  dependsOn ':runners:flink:1.8:job-server-container:docker'
   dependsOn ':sdks:python:container:py2:docker'
   dependsOn ':sdks:java:container:docker'
   dependsOn ':sdks:java:testing:expansion-service:buildTestExpansionServiceJar'
@@ -190,63 +191,4 @@
   }
 }
 
-/*************************************************************************************************/
-
-class CompatibilityMatrixConfig {
-  // Execute batch or streaming pipelines.
-  boolean streaming = false
-  // Execute on Docker or Process based environment.
-  SDK_WORKER_TYPE workerType = SDK_WORKER_TYPE.DOCKER
-
-  enum SDK_WORKER_TYPE {
-    DOCKER, PROCESS, LOOPBACK
-  }
-
-  // Whether to pre-optimize the pipeline with the Python optimizer.
-  boolean preOptimize = false
-}
-
-def flinkCompatibilityMatrix = {
-  def config = it ? it as CompatibilityMatrixConfig : new CompatibilityMatrixConfig()
-  def workerType = config.workerType.name()
-  def streaming = config.streaming
-  def environment_config = config.workerType == CompatibilityMatrixConfig.SDK_WORKER_TYPE.PROCESS ? "--environment_config='{\"command\": \"${buildDir.absolutePath}/sdk_worker.sh\"}'" : ""
-  def name = "flinkCompatibilityMatrix${streaming ? 'Streaming' : 'Batch'}${config.preOptimize ? 'PreOptimize' : ''}${workerType}"
-  def extra_experiments = []
-  if (config.preOptimize)
-    extra_experiments.add('pre_optimize=all')
-  tasks.create(name: name) {
-    dependsOn 'setupVirtualenv'
-    dependsOn ':runners:flink:1.5:job-server:shadowJar'
-    if (workerType.toLowerCase() == 'docker')
-      dependsOn ':sdks:python:container:py2:docker'
-    else if (workerType.toLowerCase() == 'process')
-      dependsOn 'createProcessWorker'
-    doLast {
-      exec {
-        executable 'sh'
-        args '-c', ". ${envdir}/bin/activate && cd ${pythonRootDir} && pip install -e .[test] && python -m apache_beam.runners.portability.flink_runner_test --flink_job_server_jar=${project(":runners:flink:1.5:job-server:").shadowJar.archivePath} --environment_type=${workerType} ${environment_config} ${streaming ? '--streaming' : ''} ${extra_experiments ? '--extra_experiments=' + extra_experiments.join(',') : ''}"
-      }
-    }
-  }
-}
-
-task flinkCompatibilityMatrixDocker() {
-  dependsOn flinkCompatibilityMatrix(streaming: false)
-  dependsOn flinkCompatibilityMatrix(streaming: true)
-}
-
-task flinkCompatibilityMatrixProcess() {
-  dependsOn flinkCompatibilityMatrix(streaming: false, workerType: CompatibilityMatrixConfig.SDK_WORKER_TYPE.PROCESS)
-  dependsOn flinkCompatibilityMatrix(streaming: true, workerType: CompatibilityMatrixConfig.SDK_WORKER_TYPE.PROCESS)
-}
-
-task flinkCompatibilityMatrixLoopback() {
-  dependsOn flinkCompatibilityMatrix(streaming: false, workerType: CompatibilityMatrixConfig.SDK_WORKER_TYPE.LOOPBACK)
-  dependsOn flinkCompatibilityMatrix(streaming: true, workerType: CompatibilityMatrixConfig.SDK_WORKER_TYPE.LOOPBACK)
-  dependsOn flinkCompatibilityMatrix(streaming: true, workerType: CompatibilityMatrixConfig.SDK_WORKER_TYPE.LOOPBACK, preOptimize: true)
-}
-
-task flinkValidatesRunner() {
-  dependsOn 'flinkCompatibilityMatrixLoopback'
-}
+apply from: "../common.gradle"
diff --git a/sdks/python/test-suites/portable/py35/build.gradle b/sdks/python/test-suites/portable/py35/build.gradle
index 389fbb4..b0d670c 100644
--- a/sdks/python/test-suites/portable/py35/build.gradle
+++ b/sdks/python/test-suites/portable/py35/build.gradle
@@ -20,13 +20,13 @@
 applyPythonNature()
 // Required to setup a Python 3.5 virtualenv.
 pythonVersion = '3.5'
+apply from: "../common.gradle"
 
 addPortableWordCountTasks()
 
 task preCommitPy35() {
-    dependsOn ':runners:flink:1.5:job-server-container:docker'
+    dependsOn ':runners:flink:1.8:job-server-container:docker'
     dependsOn ':sdks:python:container:py35:docker'
     dependsOn portableWordCountBatch
     dependsOn portableWordCountStreaming
 }
-
diff --git a/sdks/python/test-suites/portable/py36/build.gradle b/sdks/python/test-suites/portable/py36/build.gradle
index 8a4d947..70fbdce 100644
--- a/sdks/python/test-suites/portable/py36/build.gradle
+++ b/sdks/python/test-suites/portable/py36/build.gradle
@@ -20,11 +20,12 @@
 applyPythonNature()
 // Required to setup a Python 3.6 virtualenv.
 pythonVersion = '3.6'
+apply from: "../common.gradle"
 
 addPortableWordCountTasks()
 
 task preCommitPy36() {
-    dependsOn ':runners:flink:1.5:job-server-container:docker'
+    dependsOn ':runners:flink:1.8:job-server-container:docker'
     dependsOn ':sdks:python:container:py36:docker'
     dependsOn portableWordCountBatch
     dependsOn portableWordCountStreaming
diff --git a/sdks/python/test-suites/portable/py37/build.gradle b/sdks/python/test-suites/portable/py37/build.gradle
index 3bb1038..fa2ead2 100644
--- a/sdks/python/test-suites/portable/py37/build.gradle
+++ b/sdks/python/test-suites/portable/py37/build.gradle
@@ -20,11 +20,12 @@
 applyPythonNature()
 // Required to setup a Python 3.7 virtualenv.
 pythonVersion = '3.7'
+apply from: "../common.gradle"
 
 addPortableWordCountTasks()
 
 task preCommitPy37() {
-    dependsOn ':runners:flink:1.5:job-server-container:docker'
+    dependsOn ':runners:flink:1.8:job-server-container:docker'
     dependsOn ':sdks:python:container:py37:docker'
     dependsOn portableWordCountBatch
     dependsOn portableWordCountStreaming
diff --git a/settings.gradle b/settings.gradle
index 0830374..343c638 100644
--- a/settings.gradle
+++ b/settings.gradle
@@ -31,14 +31,6 @@
 include ":runners:direct-java"
 include ":runners:extensions-java:metrics"
 /* Begin Flink Runner related settings */
-// Flink 1.5 (with Scala 2.11 suffix)
-include ":runners:flink:1.5"
-include ":runners:flink:1.5:job-server"
-include ":runners:flink:1.5:job-server-container"
-// Flink 1.6
-include ":runners:flink:1.6"
-include ":runners:flink:1.6:job-server"
-include ":runners:flink:1.6:job-server-container"
 // Flink 1.7
 include ":runners:flink:1.7"
 include ":runners:flink:1.7:job-server"
@@ -53,7 +45,7 @@
 include ":runners:google-cloud-dataflow-java:examples"
 include ":runners:google-cloud-dataflow-java:examples-streaming"
 include ":runners:java-fn-execution"
-include ":runners:jet-experimental"
+include ":runners:jet"
 include ":runners:local-java"
 include ":runners:reference:java"
 include ":runners:spark"
@@ -150,8 +142,8 @@
 include ":vendor:grpc-1_21_0"
 include ":vendor:bytebuddy-1_9_3"
 include ":vendor:calcite-1_20_0"
-include ":vendor:sdks-java-extensions-protobuf"
 include ":vendor:guava-26_0-jre"
+include ":vendor:sdks-java-extensions-protobuf"
 include ":website"
 include ":runners:google-cloud-dataflow-java:worker:legacy-worker"
 include ":runners:google-cloud-dataflow-java:worker"
diff --git a/vendor/sdks-java-extensions-protobuf/build.gradle b/vendor/sdks-java-extensions-protobuf/build.gradle
index 8cc4285..e3f0c94 100644
--- a/vendor/sdks-java-extensions-protobuf/build.gradle
+++ b/vendor/sdks-java-extensions-protobuf/build.gradle
@@ -18,6 +18,7 @@
 
 plugins { id 'org.apache.beam.module' }
 applyJavaNature(
+  automaticModuleName: 'org.apache.beam.vendor.sdks.java.extensions.protobuf',
   exportJavadoc: false,
   shadowClosure: {
     dependencies {
diff --git a/website/notebooks/generate.py b/website/notebooks/generate.py
index a01922b..76e69a7 100644
--- a/website/notebooks/generate.py
+++ b/website/notebooks/generate.py
@@ -29,16 +29,20 @@
 # This creates the output notebooks in the `examples/notebooks` directory.
 # You have to commit the generated notebooks after generating them.
 
-import argparse
+import logging
 import md2ipynb
 import nbformat
 import os
+import sys
 import yaml
 
 docs_logo_url = 'https://beam.apache.org/images/logos/full-color/name-bottom/beam-logo-full-color-name-bottom-100.png'
 
 
-def run(docs, variables=None, inputs_dir='.', outputs_dir='.', imports_dir='.'):
+def run(docs, variables=None,
+        inputs_dir='.', outputs_dir='.', imports_dir='.', include_dir='.'):
+
+  errors = []
   for basename, doc in docs.items():
     languages=doc.get('languages', 'py java go').split()
     for lang in languages:
@@ -54,27 +58,53 @@
       imports[0].insert(0, os.path.join(imports_dir, 'license.md'))
 
       # Create a new notebook from the Markdown file contents.
+      input_file = basename + '.md'
       ipynb_file = '/'.join([outputs_dir, '{}-{}.ipynb'.format(basename, lang)])
-      notebook = md2ipynb.new_notebook(
-          input_file=os.path.join(inputs_dir, basename + '.md'),
-          variables=variables,
-          imports=imports,
-          notebook_title=doc.get('title', os.path.basename(basename).replace('-', ' ')),
-          keep_classes=['language-' + lang, 'shell-sh'],
-          docs_url='https://beam.apache.org/' + basename.replace('-', ''),
-          docs_logo_url=docs_logo_url,
-          github_ipynb_url='https://github.com/apache/beam/blob/master/' + ipynb_file,
-      )
+      try:
+        notebook = md2ipynb.new_notebook(
+            input_file=os.path.join(inputs_dir, input_file),
+            variables=variables,
+            imports=imports,
+            notebook_title=doc.get('title', os.path.basename(basename).replace('-', ' ')),
+            keep_classes=['language-' + lang, 'shell-sh'],
+            docs_url='https://beam.apache.org/' + basename.replace('-', ''),
+            docs_logo_url=docs_logo_url,
+            github_ipynb_url='https://github.com/apache/beam/blob/master/' + ipynb_file,
+            include_dir=include_dir,
+        )
+        logging.info('{} succeeded'.format(input_file))
 
-      # Write the notebook to file.
-      output_dir = os.path.dirname(ipynb_file)
-      if not os.path.exists(output_dir):
-        os.makedirs(output_dir)
-      with open(ipynb_file, 'w') as f:
-        nbformat.write(notebook, f)
+        # Write the notebook to file.
+        output_dir = os.path.dirname(ipynb_file)
+        if not os.path.exists(output_dir):
+          os.makedirs(output_dir)
+        with open(ipynb_file, 'w') as f:
+          nbformat.write(notebook, f)
+      except Exception as e:
+        logging.error('{} failed: {}'.format(input_file, e))
+        errors.append((input_file, e))
+
+  if errors:
+    import traceback
+    sys.stdout.flush()
+    sys.stderr.flush()
+    print('')
+    print('=' * 60)
+    print(' Errors')
+    for input_file, e in errors:
+      print('')
+      print(input_file)
+      print('-' * len(input_file))
+      traceback.print_tb(e.__traceback__)
+
+  print('')
+  print('{} files processed ({} succeeded, {} failed)'.format(
+    len(docs), len(docs) - len(errors), len(errors)))
 
 
 if __name__ == '__main__':
+  logging.basicConfig(level=logging.INFO)
+
   script_dir = os.path.dirname(os.path.realpath(__file__))
   root_dir = os.path.realpath(os.path.join(script_dir, '..', '..'))
 
@@ -87,7 +117,11 @@
     variables = {'site': yaml.load(f.read())}
     variables['site']['baseurl'] = variables['site']['url']
 
-  inputs_dir = os.path.join(root_dir, 'website', 'src')
-  outputs_dir = os.path.join(root_dir, 'examples', 'notebooks')
-  imports_dir = os.path.join(script_dir, 'imports')
-  run(docs, variables, inputs_dir, outputs_dir, imports_dir)
+  run(
+      docs=docs,
+      variables=variables,
+      inputs_dir=os.path.join(root_dir, 'website', 'src'),
+      outputs_dir=os.path.join(root_dir, 'examples', 'notebooks'),
+      imports_dir=os.path.join(script_dir, 'imports'),
+      include_dir=os.path.join(root_dir, 'website', 'src', '_includes'),
+  )
diff --git a/website/src/_includes/button-pydoc.md b/website/src/_includes/button-pydoc.md
new file mode 100644
index 0000000..c0135aa
--- /dev/null
+++ b/website/src/_includes/button-pydoc.md
@@ -0,0 +1,23 @@
+<!--
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+-->
+
+{% capture button_url %}https://beam.apache.org/releases/pydoc/current/{{ include.path }}.html#{{ include.path }}.{{ include.class }}{% endcapture %}
+
+{% include button.md
+  url=button_url
+  logo="https://beam.apache.org/images/logos/sdks/python.png"
+  text="Pydoc"
+%}
+
+<br><br><br>
diff --git a/website/src/_includes/button.md b/website/src/_includes/button.md
new file mode 100644
index 0000000..0413771
--- /dev/null
+++ b/website/src/_includes/button.md
@@ -0,0 +1,21 @@
+<!--
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+-->
+
+<div>
+<table align="left" style="margin-right:1em">
+  <td>
+    <a class="button" target="_blank" href="{{ include.url }}">{% if include.logo %}<img src="{{ include.logo }}" width="32px" height="32px" alt="{{ include.text }}" /> {% endif %}{{ include.text }}</a>
+  </td>
+</table>
+</div>
diff --git a/website/src/_includes/buttons-code-snippet.md b/website/src/_includes/buttons-code-snippet.md
new file mode 100644
index 0000000..055a730
--- /dev/null
+++ b/website/src/_includes/buttons-code-snippet.md
@@ -0,0 +1,32 @@
+<!--
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+-->
+
+{% capture notebook_url %}https://colab.research.google.com/github/{{ site.branch_repo }}/{{ include.notebook }}{% endcapture %}
+
+{% capture code_url %}https://github.com/{{ site.branch_repo }}/{{ include.code }}{% endcapture %}
+
+{:.notebook-skip}
+{% include button.md
+  url=notebook_url
+  logo="https://github.com/googlecolab/open_in_colab/raw/master/images/icon32.png"
+  text="Run code now"
+%}
+
+{% include button.md
+  url=code_url
+  logo="https://www.tensorflow.org/images/GitHub-Mark-32px.png"
+  text="View source code"
+%}
+
+<br><br><br>
diff --git a/website/src/_includes/section-menu/documentation.html b/website/src/_includes/section-menu/documentation.html
index a496ff3..0a56d8c 100644
--- a/website/src/_includes/section-menu/documentation.html
+++ b/website/src/_includes/section-menu/documentation.html
@@ -247,10 +247,11 @@
 
   <ul class="section-nav-list">
     <li><a href="{{ site.baseurl }}/documentation/patterns/overview/">Overview</a></li>
-    <li><a href="{{ site.baseurl }}/documentation/patterns/file-processing-patterns/">File processing patterns</a></li>
-    <li><a href="{{ site.baseurl }}/documentation/patterns/side-input-patterns/">Side input patterns</a></li>
-    <li><a href="{{ site.baseurl }}/documentation/patterns/pipeline-option-patterns/">Pipeline option patterns</a></li>
-    <li><a href="{{ site.baseurl }}/documentation/patterns/custom-io-patterns/">Custom I/O patterns</a></li>
+    <li><a href="{{ site.baseurl }}/documentation/patterns/file-processing/">File processing</a></li>
+    <li><a href="{{ site.baseurl }}/documentation/patterns/side-inputs/">Side inputs</a></li>
+    <li><a href="{{ site.baseurl }}/documentation/patterns/pipeline-options/">Pipeline options</a></li>
+    <li><a href="{{ site.baseurl }}/documentation/patterns/custom-io/">Custom I/O</a></li>
+    <li><a href="{{ site.baseurl }}/documentation/patterns/custom-windows/">Custom windows</a></li>
   </ul>
 </li>
 
diff --git a/website/src/contribute/release-guide.md b/website/src/contribute/release-guide.md
index ddf9e4a..03fb6cc 100644
--- a/website/src/contribute/release-guide.md
+++ b/website/src/contribute/release-guide.md
@@ -993,7 +993,7 @@
   ```
   Flink Local Runner
   ```
-  ./gradlew :runners:flink:1.5:runQuickstartJavaFlinkLocal \
+  ./gradlew :runners:flink:1.8:runQuickstartJavaFlinkLocal \
   -Prepourl=https://repository.apache.org/content/repositories/orgapachebeam-${KEY} \
   -Pver=${RELEASE_VERSION}
   ```
diff --git a/website/src/documentation/dsls/sql/shell.md b/website/src/documentation/dsls/sql/shell.md
index 69326e5..1317575 100644
--- a/website/src/documentation/dsls/sql/shell.md
+++ b/website/src/documentation/dsls/sql/shell.md
@@ -31,7 +31,7 @@
 To use Beam SQL shell, you must first clone the [Beam SDK repository](https://github.com/apache/beam). Then, from the root of the repository clone, execute the following commands to run the shell:
 
 ```
-./gradlew -p sdks/java/extensions/sql/shell -Pbeam.sql.shell.bundled=':runners:flink:1.5,:sdks:java:io:kafka' installDist
+./gradlew -p sdks/java/extensions/sql/shell -Pbeam.sql.shell.bundled=':runners:flink:1.8,:sdks:java:io:kafka' installDist
 
 ./sdks/java/extensions/sql/shell/build/install/shell/bin/shell
 ```
@@ -119,7 +119,7 @@
 1.  Make sure the SQL shell includes the desired runner. Add the corresponding project id to the `-Pbeam.sql.shell.bundled` parameter of the Gradle invocation ([source code](https://github.com/apache/beam/blob/master/sdks/java/extensions/sql/shell/build.gradle), [project ids](https://github.com/apache/beam/blob/master/settings.gradle)). For example, use the following command to include Flink runner and KafkaIO:
 
     ```
-    ./gradlew -p sdks/java/extensions/sql/shell -Pbeam.sql.shell.bundled=':runners:flink:1.5,:sdks:java:io:kafka' installDist
+    ./gradlew -p sdks/java/extensions/sql/shell -Pbeam.sql.shell.bundled=':runners:flink:1.8,:sdks:java:io:kafka' installDist
     ```
 
     _Note: You can bundle multiple runners (using a comma-separated list) or other additional components in the same manner. For example, you can add support for more I/Os._
@@ -145,7 +145,7 @@
 You can also build your own standalone package for SQL shell using `distZip` or `distTar` tasks. For example:
 
 ```
-./gradlew -p sdks/java/extensions/sql/shell -Pbeam.sql.shell.bundled=':runners:flink:1.5,:sdks:java:io:kafka' distZip
+./gradlew -p sdks/java/extensions/sql/shell -Pbeam.sql.shell.bundled=':runners:flink:1.8,:sdks:java:io:kafka' distZip
 
 ls ./sdks/java/extensions/sql/shell/build/distributions/
 beam-sdks-java-extensions-sql-shell-2.6.0-SNAPSHOT.tar beam-sdks-java-extensions-sql-shell-2.6.0-SNAPSHOT.zip
diff --git a/website/src/documentation/io/testing.md b/website/src/documentation/io/testing.md
index b00d6df..1e945ae 100644
--- a/website/src/documentation/io/testing.md
+++ b/website/src/documentation/io/testing.md
@@ -141,117 +141,20 @@
 However, when working locally, there is no requirement to use Kubernetes. All of the test infrastructure allows you to pass in connection info, so developers can use their preferred hosting infrastructure for local development.
 
 
-### Running integration tests {#running-integration-tests}
+### Running integration tests on your machine {#running-integration-tests-on-your-machine}
 
-The high level steps for running an integration test are:
+You can always run the IO integration tests on your own machine. The high-level steps for running an integration test are:
 1.  Set up the data store corresponding to the test being run.
 1.  Run the test, passing it connection info from the just created data store.
 1.  Clean up the data store.
 
-Since setting up data stores and running the tests involves a number of steps, and we wish to time these tests when running performance benchmarks, we use PerfKit Benchmarker to manage the process end to end. With a single command, you can go from an empty Kubernetes cluster to a running integration test.
 
-However, **PerfKit Benchmarker is not required for running integration tests**. Therefore, we have listed the steps for both using PerfKit Benchmarker, and manually running the tests below.
-
-
-#### Using PerfKit Benchmarker {#using-perfkit-benchmarker}
-
-Prerequisites:
-1.  [Install PerfKit Benchmarker](https://github.com/GoogleCloudPlatform/PerfKitBenchmarker)
-1.  Have a running Kubernetes cluster you can connect to locally using kubectl. We recommend using Google Kubernetes Engine - it's proven working for all the use cases we tested.  
-
-You won’t need to invoke PerfKit Benchmarker directly. Run `./gradlew performanceTest` task in project's root directory, passing kubernetes scripts of your choice (located in .test_infra/kubernetes directory). It will setup PerfKitBenchmarker for you.  
-
-Example run with the [Direct]({{ site.baseurl }}/documentation/runners/direct/) runner:
-```
-./gradlew performanceTest -DpkbLocation="/Users/me/PerfKitBenchmarker/pkb.py" -DintegrationTestPipelineOptions='["--numberOfRecords=1000"]' -DitModule=sdks/java/io/jdbc/ -DintegrationTest=org.apache.beam.sdk.io.jdbc.JdbcIOIT -DkubernetesScripts="/Users/me/beam/.test-infra/kubernetes/postgres/postgres-service-for-local-dev.yml" -DbeamITOptions="/Users/me/beam/.test-infra/kubernetes/postgres/pkb-config-local.yml" -DintegrationTestRunner=direct
-```
-
-
-Example run with the [Google Cloud Dataflow]({{ site.baseurl }}/documentation/runners/dataflow/) runner:
-```
-./gradlew performanceTest -DpkbLocation="/Users/me/PerfKitBenchmarker/pkb.py" -DintegrationTestPipelineOptions='["--numberOfRecords=1000", "--project=GOOGLE_CLOUD_PROJECT", "--tempRoot=GOOGLE_STORAGE_BUCKET"]' -DitModule=sdks/java/io/jdbc/ -DintegrationTest=org.apache.beam.sdk.io.jdbc.JdbcIOIT -DkubernetesScripts="/Users/me/beam/.test-infra/kubernetes/postgres/postgres-service-for-local-dev.yml" -DbeamITOptions="/Users/me/beam/.test-infra/kubernetes/postgres/pkb-config-local.yml" -DintegrationTestRunner=dataflow
-```
-
-Example run with the HDFS filesystem and Cloud Dataflow runner:
-
-```
-./gradlew performanceTest -DpkbLocation="/Users/me/PerfKitBenchmarker/pkb.py" -DintegrationTestPipelineOptions='["--numberOfRecords=100000", "--project=GOOGLE_CLOUD_PROJECT", "--tempRoot=GOOGLE_STORAGE_BUCKET"]' -DitModule=sdks/java/io/file-based-io-tests/ -DintegrationTest=org.apache.beam.sdk.io.text.TextIOIT -DkubernetesScripts=".test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml,.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster-for-local-dev.yml" -DbeamITOptions=".test-infra/kubernetes/hadoop/LargeITCluster/pkb-config.yml" -DintegrationTestRunner=dataflow -DbeamExtraProperties='[filesystem=hdfs]'
-```
-
-NOTE: When using Direct runner along with HDFS cluster, please set `export HADOOP_USER_NAME=root` before runnning `performanceTest` task.
-
-Parameter descriptions:
-
-
-<table class="table">
-  <thead>
-    <tr>
-     <td>
-      <strong>Option</strong>
-     </td>
-     <td>
-       <strong>Function</strong>
-     </td>
-    </tr>
-  </thead>
-  <tbody>
-    <tr>
-     <td>-DpkbLocation
-     </td>
-     <td>Path to PerfKit Benchmarker project.
-     </td>
-    </tr>
-    <tr>
-     <td>-DintegrationTestPipelineOptions
-     </td>
-     <td>Passes pipeline options directly to the test being run. Note that some pipeline options may be runner specific (like "--project" or "--tempRoot"). 
-     </td>
-    </tr>
-    <tr>
-     <td>-DitModule
-     </td>
-     <td>Specifies the project submodule of the I/O to test.
-     </td>
-    </tr>
-    <tr>
-     <td>-DintegrationTest
-     </td>
-     <td>Specifies the test to be run (fully qualified reference to class/test method).
-     </td>
-    </tr>
-    <tr>
-     <td>-DkubernetesScripts
-     </td>
-     <td>Paths to scripts with necessary kubernetes infrastructure.
-     </td>
-    </tr>
-    <tr>
-      <td>-DbeamITOptions
-      </td>
-      <td>Path to file with Benchmark configuration (static and dynamic pipeline options. See below for description).
-      </td>
-    </tr>
-    <tr>
-      <td>-DintegrationTestRunner
-      </td>
-      <td>Runner to be used for running the test. Currently possible options are: direct, dataflow.
-      </td>
-    </tr>
-    <tr>
-      <td>-DbeamExtraProperties
-      </td>
-      <td>Any other "extra properties" to be passed to Gradle, eg. "'[filesystem=hdfs]'". 
-      </td>
-    </tr>
-  </tbody>
-</table>
-
-#### Without PerfKit Benchmarker {#without-perfkit-benchmarker}
+#### Data store setup/cleanup {#datastore-setup-cleanup}
 
 If you're using Kubernetes scripts to host data stores, make sure you can connect to your cluster locally using kubectl. If you have your own data stores already set up, you just need to execute step 3 from the list below.
 
 1.  Set up the data store corresponding to the test you wish to run. You can find Kubernetes scripts for all currently supported data stores in [.test-infra/kubernetes](https://github.com/apache/beam/tree/master/.test-infra/kubernetes).
-    1.  In some cases, there is a setup script (*.sh). In other cases, you can just run ``kubectl create -f [scriptname]`` to create the data store.
+    1.  In some cases, there is a dedicated setup script (*.sh). In other cases, you can just run ``kubectl create -f [scriptname]`` to create the data store. You can also let the [kubernetes.sh](https://github.com/apache/beam/blob/master/.test-infra/kubernetes/kubernetes.sh) script perform some standard steps for you.
     1.  Convention dictates there will be:
         1.  A yml script for the data store itself, plus a `NodePort` service. The `NodePort` service opens a port to the data store for anyone who connects to the Kubernetes cluster's machines from within same subnetwork. Such scripts are typically useful when running the scripts on Minikube Kubernetes Engine.
         1.  A separate script, with LoadBalancer service. Such service will expose an _external ip_ for the datastore. Such scripts are needed when external access is required (eg. on Jenkins). 
@@ -266,10 +169,9 @@
     1.  JDBC: `kubectl delete -f .test-infra/kubernetes/postgres/postgres.yml`
     1.  Elasticsearch: `bash .test-infra/kubernetes/elasticsearch/teardown.sh`
 
-##### integrationTest Task {#integration-test-task}
+#### Running a particular test {#running-a-test}
 
-Since `performanceTest` task involved running PerfkitBenchmarker, we can't use it to run the tests manually. For such purposes a more "low-level" task called `integrationTest` was introduced.  
-
+`integrationTest` is a dedicated Gradle task for running IO integration tests.
 
 Example usage on Cloud Dataflow runner: 
 
@@ -335,9 +237,11 @@
   </tbody>
 </table>
 
-#### Running Integration Tests on Pull Requests {#running-on-pull-requests}
+### Running Integration Tests on Pull Requests {#running-integration-tests-on-pull-requests}
 
-Thanks to [ghprb](https://github.com/janinko/ghprb) plugin it is possible to run Jenkins jobs when specific phrase is typed in a Github Pull Request's comment. Integration tests that have Jenkins job defined can be triggered this way. You can run integration tests using these phrases:
+Most of the IO integration tests have dedicated Jenkins jobs that run periodically to collect metrics and avoid regressions. Thanks to the [ghprb](https://github.com/janinko/ghprb) plugin, it is also possible to trigger these jobs on demand once a specific phrase is typed in a GitHub Pull Request's comment. This way you can check whether your contribution to a certain IO is an improvement or whether it makes things worse (hopefully not!).
+
+To run the IO integration tests, type one of the following comments in your Pull Request:
 
 <table class="table">
   <thead>
@@ -437,7 +341,7 @@
 
 ### Performance testing dashboard {#performance-testing-dashboard}
 
-We measure the performance of IOITs by gathering test execution times from Jenkins jobs that run periodically. The consequent results are stored in a database (BigQuery), therefore we can display them in a form of plots. 
+As mentioned before, we measure the performance of IOITs by gathering test execution times from Jenkins jobs that run periodically. The results are stored in a database (BigQuery), so we can display them as plots.
 
 The dashboard gathering all the results is available here: [Performance Testing Dashboard](https://s.apache.org/io-test-dashboards)
 
@@ -446,9 +350,9 @@
 There are three components necessary to implement an integration test:
 *   **Test code**: the code that does the actual testing: interacting with the I/O transform, reading and writing data, and verifying the data.
 *   **Kubernetes scripts**: a Kubernetes script that sets up the data store that will be used by the test code.
-*   **Integrate with PerfKit Benchmarker**: this allows users to easily invoke PerfKit Benchmarker, creating the Kubernetes resources and running the test code.
+*   **Jenkins jobs**: a Jenkins Job DSL script that performs all the necessary steps: setting up the data store, running the test, and cleaning up afterwards.
 
-These three pieces are discussed in detail below.
+These three components are discussed in detail below.
 
 #### Test Code {#test-code}
 
@@ -492,255 +396,16 @@
         1.  Official Docker images, because they have security fixes and guaranteed maintenance.
         1.  Non-official Docker images, or images from other providers that have good maintainers (e.g. [quay.io](http://quay.io/)).
 
+#### Jenkins jobs {#jenkins-jobs}
 
-#### Integrate with PerfKit Benchmarker {#integrate-with-perfkit-benchmarker}
+You can find examples of existing IOIT Jenkins job definitions in the [.test-infra/jenkins](https://github.com/apache/beam/tree/master/.test-infra/jenkins) directory. Look for files called job_PerformanceTests_*.groovy. The most prominent examples are:
+* [JDBC](https://github.com/apache/beam/blob/master/.test-infra/jenkins/job_PerformanceTests_JDBC.groovy) IOIT job
+* [MongoDB](https://github.com/apache/beam/blob/master/.test-infra/jenkins/job_PerformanceTests_MongoDBIO_IT.groovy) IOIT job
+* [File-based](https://github.com/apache/beam/blob/master/.test-infra/jenkins/job_PerformanceTests_FileBasedIO_IT.groovy) IOIT jobs
+    
+Notice that there is a utility class that helps you create these jobs without forgetting important steps or repeating code. See [Kubernetes.groovy](https://github.com/apache/beam/blob/master/.test-infra/jenkins/Kubernetes.groovy) for more details.
 
-To allow developers to easily invoke your I/O integration test, you should create a PerfKit Benchmarker benchmark configuration file for the data store. Each pipeline option needed by the integration test should have a configuration entry. This is to be passed to perfkit via "beamITOptions" option in "performanceTest" task (described above). The goal is that a checked in config has defaults such that other developers can run the test without changing the configuration.
-
-
-#### Defining the benchmark configuration file {#defining-the-benchmark-configuration-file}
-
-The benchmark configuration file is a yaml file that defines the set of pipeline options for a specific data store. Some of these pipeline options are **static** - they are known ahead of time, before the data store is created (e.g. username/password). Others options are **dynamic** - they are only known once the data store is created (or after we query the Kubernetes cluster for current status).
-
-All known cases of dynamic pipeline options are for extracting the IP address that the test needs to connect to. For I/O integration tests, we must allow users to specify:
-
-
-
-*   The type of the IP address to get (load balancer/node address)
-*   The pipeline option to pass that IP address to
-*   How to find the Kubernetes resource with that value (ie. what load balancer service name? what node selector?)
-
-The style of dynamic pipeline options used here should support a variety of other types of values derived from Kubernetes, but we do not have specific examples.
-
-The dynamic pipeline options are:
-
-
-<table class="table">
-  <thead>
-    <tr>
-     <td>
-       <strong>Type name</strong>
-     </td>
-     <td>
-       <strong>Meaning</strong>
-     </td>
-     <td>
-       <strong>Selector field name</strong>
-     </td>
-     <td>
-       <strong>Selector field value</strong>
-     </td>
-    </tr>
-  </thead>
-  <tbody>
-    <tr>
-     <td>NodePortIp
-     </td>
-     <td>We will be using the IP address of a k8s NodePort service, the value will be an IP address of a Pod
-     </td>
-     <td>podLabel
-     </td>
-     <td>A kubernetes label selector for a pod whose IP address can be used to connect to
-     </td>
-    </tr>
-    <tr>
-     <td>LoadBalancerIp
-     </td>
-     <td>We will be using the IP address of a k8s LoadBalancer, the value will be an IP address of the load balancer
-     </td>
-     <td>serviceName
-     </td>
-     <td>The name of the LoadBalancer kubernetes service.
-     </td>
-    </tr>
-  </tbody>
-</table>
-
-#### Benchmark configuration files: full example configuration file {#benchmark-configuration-files-full-example-configuration-file}
-
-A configuration file will look like this:
-```
-static_pipeline_options:
-  -postgresUser: postgres
-  -postgresPassword: postgres
-dynamic_pipeline_options:
-  - paramName: PostgresIp
-    type: NodePortIp
-    podLabel: app=postgres
-```
-
-
-and may contain the following elements:
-
-
-<table class="table">
-  <thead>
-    <tr>
-     <td><strong>Configuration element</strong>
-     </td>
-     <td><strong>Description and how to change when adding a new test</strong>
-     </td>
-    </tr>
-  </thead>
-  <tbody>
-    <tr>
-     <td>static_pipeline_options
-     </td>
-     <td>The set of preconfigured pipeline options.
-     </td>
-    </tr>
-    <tr>
-     <td>dynamic_pipeline_options
-     </td>
-     <td>The set of pipeline options that PerfKit Benchmarker will determine at runtime.
-     </td>
-    </tr>
-    <tr>
-     <td>dynamic_pipeline_options.name
-     </td>
-     <td>The name of the parameter to be passed to gradle's invocation of the I/O integration test.
-     </td>
-    </tr>
-    <tr>
-     <td>dynamic_pipeline_options.type
-     </td>
-     <td>The method of determining the value of the pipeline options.
-     </td>
-    </tr>
-    <tr>
-     <td>dynamic_pipeline_options - other attributes
-     </td>
-     <td>These vary depending on the type of the dynamic pipeline option - see the table of dynamic pipeline options for a description.
-     </td>
-    </tr>
-  </tbody>
-</table>
-
-
-
-#### Customizing PerfKit Benchmarker behaviour {#customizing-perf-kit-benchmarker-behaviour}
-
-In most cases, to run the _performanceTest_ task it is sufficient to pass the properties described above, which makes it easy to use. However, users can customize Perfkit Benchmarker's behavior even more by pasing some extra Gradle properties:
-
-
-<table class="table">
-  <thead>
-    <tr>
-     <td><strong>PerfKit Benchmarker Parameter</strong>
-     </td>
-     <td><strong>Corresponding Gradle property</strong>
-     </td>
-     <td><strong>Default value</strong>
-     </td>
-     <td><strong>Description</strong>
-     </td>
-    </tr>
-  </thead>
-  <tbody>
-    <tr>
-     <td>dpb_log_level
-     </td>
-     <td>-DlogLevel
-     </td>
-     <td>INFO
-     </td>
-     <td>Data Processing Backend's log level.
-     </td>
-    </tr>
-    <tr>
-     <td>gradle_binary
-     </td>
-     <td>-DgradleBinary
-     </td>
-     <td>./gradlew
-     </td>
-     <td>Path to gradle binary.
-     </td>
-    </tr>
-    <tr>
-     <td>official
-     </td>
-     <td>-Dofficial
-     </td>
-     <td>false
-     </td>
-     <td>If true, the benchmark results are marked as "official" and can be displayed on PerfKitExplorer dashboards.
-     </td>
-    </tr>
-    <tr>
-     <td>benchmarks
-     </td>
-     <td>-Dbenchmarks
-     </td>
-     <td>beam_integration_benchmark
-     </td>
-     <td>Defines the PerfKit Benchmarker benchmark to run. This is same for all I/O integration tests.
-     </td>
-    </tr>
-    <tr>
-     <td>beam_prebuilt
-     </td>
-     <td>-DbeamPrebuilt
-     </td>
-     <td>true
-     </td>
-     <td>If false, PerfKit Benchmarker runs the build task before running the tests.
-     </td>
-    </tr>
-    <tr>
-     <td>beam_sdk
-     </td>
-     <td>-DbeamSdk
-     </td>
-     <td>java
-     </td>
-     <td>Beam's sdk to be used by PerfKit Benchmarker.
-     </td>
-    </tr>
-    <tr>
-     <td>beam_timeout
-     </td>
-     <td>-DitTimeout
-     </td>
-     <td>1200
-     </td>
-     <td>Timeout (in seconds) after which PerfKit Benchmarker will stop executing the benchmark (and will fail).
-     </td>
-    </tr>
-    <tr>
-     <td>kubeconfig
-     </td>
-     <td>-Dkubeconfig
-     </td>
-     <td>~/.kube/config
-     </td>
-     <td>Path to kubernetes configuration file.
-     </td>
-    </tr>
-    <tr>
-     <td>kubectl
-     </td>
-     <td>-Dkubectl
-     </td>
-     <td>kubectl
-     </td>
-     <td>Path to kubernetes executable.
-     </td>
-    </tr>
-    <tr>
-     <td>beam_extra_properties
-     </td>
-     <td>-DbeamExtraProperties
-     </td>
-     <td>(empty string)
-     </td>
-     <td>Any additional properties to be appended to benchmark execution command.
-     </td>
-    </tr>
-  </tbody>
-</table>
-
-#### Small Scale and Large Scale Integration Tests {#small-scale-and-large-scale-integration-tests}
+### Small Scale and Large Scale Integration Tests {#small-scale-and-large-scale-integration-tests}
 
 Apache Beam expects that it can run integration tests in multiple configurations:
 *   Small scale
diff --git a/website/src/documentation/patterns/custom-io-patterns.md b/website/src/documentation/patterns/custom-io.md
similarity index 97%
rename from website/src/documentation/patterns/custom-io-patterns.md
rename to website/src/documentation/patterns/custom-io.md
index 98825f8..d816010 100644
--- a/website/src/documentation/patterns/custom-io-patterns.md
+++ b/website/src/documentation/patterns/custom-io.md
@@ -2,7 +2,7 @@
 layout: section
 title: "Custom I/O patterns"
 section_menu: section-menu/documentation.html
-permalink: /documentation/patterns/custom-io-patterns/
+permalink: /documentation/patterns/custom-io/
 ---
 <!--
 Licensed under the Apache License, Version 2.0 (the "License");
diff --git a/website/src/documentation/patterns/custom-windows.md b/website/src/documentation/patterns/custom-windows.md
new file mode 100644
index 0000000..c3ef84a
--- /dev/null
+++ b/website/src/documentation/patterns/custom-windows.md
@@ -0,0 +1,114 @@
+---
+layout: section
+title: "Custom window patterns"
+section_menu: section-menu/documentation.html
+permalink: /documentation/patterns/custom-windows/
+---
+<!--
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+-->
+
+# Custom window patterns
+The samples on this page demonstrate common custom window patterns. You can create custom windows with [`WindowFn` functions]({{ site.baseurl }}/documentation/programming-guide/#provided-windowing-functions). For more information, see the [programming guide section on windowing]({{ site.baseurl }}/documentation/programming-guide/#windowing).
+
+**Note**: Custom merging windows aren't supported in Python (with fnapi).
+
+## Using data to dynamically set session window gaps
+
+You can modify the [`assignWindows`](https://beam.apache.org/releases/javadoc/current/index.html?org/apache/beam/sdk/transforms/windowing/SlidingWindows.html) function to use data-driven gaps, then window incoming data into sessions.
+
+Within `assignWindows`, access the current element through `WindowFn.AssignContext.element()`. The original, fixed-duration `assignWindows` function is:
+
+```java
+{% github_sample /apache/beam/blob/master/examples/java/src/main/java/org/apache/beam/examples/snippets/Snippets.java tag:CustomSessionWindow1
+%}
+```
+
+### Creating data-driven gaps
+To create data-driven gaps, add the following snippets to the `assignWindows` function:
+- A default value for when the custom gap is not present in the data 
+- A way to set the attribute from the main pipeline as a method of the custom windows
+
+For example, the following function assigns each element to a window that starts at the element's timestamp and spans `gapDuration`:
+
+```java
+{% github_sample /apache/beam/blob/master/examples/java/src/main/java/org/apache/beam/examples/snippets/Snippets.java tag:CustomSessionWindow3
+%}
+```
+
+Then, set the `gapDuration` field in a windowing function:
+
+```java
+{% github_sample /apache/beam/blob/master/examples/java/src/main/java/org/apache/beam/examples/snippets/Snippets.java tag:CustomSessionWindow2
+%}
+```
+
+### Windowing messages into sessions
+After creating data-driven gaps, you can window incoming data into the new, custom sessions.
+
+First, set the session length to the gap duration:
+
+```java
+{% github_sample /apache/beam/blob/master/examples/java/src/main/java/org/apache/beam/examples/snippets/Snippets.java tag:CustomSessionWindow4
+%}
+```
+
+Lastly, window data into sessions in your pipeline:
+
+```java
+{% github_sample /apache/beam/blob/master/examples/java/src/main/java/org/apache/beam/examples/snippets/Snippets.java tag:CustomSessionWindow6
+%}
+```
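+
+For reference, here is a minimal, self-contained sketch of such a data-driven session `WindowFn`. It is an illustration only, not the exact code behind the snippet tags above: the class name `DynamicSessions`, the `withDefaultGapDuration` factory, and the regex-based gap extraction are assumptions made for this sketch.
+
+```java
+import java.util.Collection;
+import java.util.Collections;
+import java.util.regex.Matcher;
+import java.util.regex.Pattern;
+import org.apache.beam.sdk.coders.Coder;
+import org.apache.beam.sdk.transforms.windowing.IntervalWindow;
+import org.apache.beam.sdk.transforms.windowing.MergeOverlappingIntervalWindows;
+import org.apache.beam.sdk.transforms.windowing.WindowFn;
+import org.apache.beam.sdk.transforms.windowing.WindowMappingFn;
+import org.joda.time.Duration;
+
+// Hypothetical session WindowFn whose gap is read from each element (a JSON string).
+public class DynamicSessions extends WindowFn<String, IntervalWindow> {
+
+  // Gap used when an element does not carry its own "gap" attribute.
+  private final Duration defaultGapDuration;
+
+  private DynamicSessions(Duration defaultGapDuration) {
+    this.defaultGapDuration = defaultGapDuration;
+  }
+
+  public static DynamicSessions withDefaultGapDuration(Duration gapDuration) {
+    return new DynamicSessions(gapDuration);
+  }
+
+  @Override
+  public Collection<IntervalWindow> assignWindows(AssignContext c) {
+    // Use the element's own gap (in seconds) if present, otherwise the default.
+    Duration gap = defaultGapDuration;
+    Matcher m = Pattern.compile("\"gap\"\\s*:\\s*\"(\\d+)\"").matcher(c.element());
+    if (m.find()) {
+      gap = Duration.standardSeconds(Long.parseLong(m.group(1)));
+    }
+    // Each element initially gets its own window of [timestamp, timestamp + gap).
+    return Collections.singletonList(new IntervalWindow(c.timestamp(), gap));
+  }
+
+  @Override
+  public void mergeWindows(MergeContext c) throws Exception {
+    // Merging overlapping windows is what turns per-element windows into sessions.
+    MergeOverlappingIntervalWindows.mergeWindows(c);
+  }
+
+  @Override
+  public boolean isCompatible(WindowFn<?, ?> other) {
+    return other instanceof DynamicSessions;
+  }
+
+  @Override
+  public Coder<IntervalWindow> windowCoder() {
+    return IntervalWindow.getCoder();
+  }
+
+  @Override
+  public WindowMappingFn<IntervalWindow> getDefaultWindowMappingFn() {
+    // Merging windows such as sessions cannot be used for side inputs.
+    throw new UnsupportedOperationException("DynamicSessions is not allowed in side inputs");
+  }
+}
+```
+
+A pipeline would then apply it with `Window.into(DynamicSessions.withDefaultGapDuration(Duration.standardSeconds(10)))`, mirroring the snippets above.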
+
+### Example data and windows
+The following test data tallies two users' scores with and without the `gap` attribute:
+
+```
+.apply("Create data", Create.timestamped(
+            TimestampedValue.of("{\"user\":\"user-1\",\"score\":\"12\",\"gap\":\"5\"}", new Instant()),
+            TimestampedValue.of("{\"user\":\"user-2\",\"score\":\"4\"}", new Instant()),
+            TimestampedValue.of("{\"user\":\"user-1\",\"score\":\"-3\",\"gap\":\"5\"}", new Instant().plus(2000)),
+            TimestampedValue.of("{\"user\":\"user-1\",\"score\":\"2\",\"gap\":\"5\"}", new Instant().plus(9000)),
+            TimestampedValue.of("{\"user\":\"user-1\",\"score\":\"7\",\"gap\":\"5\"}", new Instant().plus(12000)),
+            TimestampedValue.of("{\"user\":\"user-2\",\"score\":\"10\"}", new Instant().plus(12000)))
+        .withCoder(StringUtf8Coder.of()))
+```
+
+The diagram below visualizes the test data:
+
+![Two sets of data and the standard and dynamic sessions with which the data is windowed.]( {{ "/images/standard-vs-dynamic-sessions.png" | prepend: site.baseurl }})
+
+#### Standard sessions
+
+Standard sessions use the following windows and scores:
+```
+user=user-2, score=4, window=[2019-05-26T13:28:49.122Z..2019-05-26T13:28:59.122Z)
+user=user-1, score=18, window=[2019-05-26T13:28:48.582Z..2019-05-26T13:29:12.774Z)
+user=user-2, score=10, window=[2019-05-26T13:29:03.367Z..2019-05-26T13:29:13.367Z)
+```
+
+User #2 sees two events separated by 12 seconds. With standard sessions, the gap defaults to 10 seconds; the two scores fall into different sessions, so they aren't added together.
+
+User #1 sees four events, separated by two, seven, and three seconds, respectively. Since none of the gaps is greater than the default, the four events fall into the same standard session and are added together (18 points).
+
+#### Dynamic sessions
+The dynamic sessions specify a five-second gap, so they use the following windows and scores:
+
+```
+user=user-2, score=4, window=[2019-05-26T14:30:22.969Z..2019-05-26T14:30:32.969Z)
+user=user-1, score=9, window=[2019-05-26T14:30:22.429Z..2019-05-26T14:30:30.553Z)
+user=user-1, score=9, window=[2019-05-26T14:30:33.276Z..2019-05-26T14:30:41.849Z)
+user=user-2, score=10, window=[2019-05-26T14:30:37.357Z..2019-05-26T14:30:47.357Z)
+```
+
+With dynamic sessions, User #1 gets different scores. The third message arrives seven seconds after the second message, so it's grouped into a different session. The large, 18-point session is split into two 9-point sessions.
\ No newline at end of file
diff --git a/website/src/documentation/patterns/file-processing-patterns.md b/website/src/documentation/patterns/file-processing.md
similarity index 98%
rename from website/src/documentation/patterns/file-processing-patterns.md
rename to website/src/documentation/patterns/file-processing.md
index b579db8..592a58b 100644
--- a/website/src/documentation/patterns/file-processing-patterns.md
+++ b/website/src/documentation/patterns/file-processing.md
@@ -2,7 +2,7 @@
 layout: section
 title: "File processing patterns"
 section_menu: section-menu/documentation.html
-permalink: /documentation/patterns/file-processing-patterns/
+permalink: /documentation/patterns/file-processing/
 ---
 <!--
 Licensed under the Apache License, Version 2.0 (the "License");
diff --git a/website/src/documentation/patterns/overview.md b/website/src/documentation/patterns/overview.md
index d676b2e..8ecca0b 100644
--- a/website/src/documentation/patterns/overview.md
+++ b/website/src/documentation/patterns/overview.md
@@ -23,17 +23,20 @@
 Pipeline patterns demonstrate common Beam use cases. Pipeline patterns are based on real-world Beam deployments. Each pattern has a description, examples, and a solution or pseudocode.
 
 **File processing patterns** - Patterns for reading from and writing to files
-* [Processing files as they arrive]({{ site.baseurl }}/documentation/patterns/file-processing-patterns/#processing-files-as-they-arrive)
-* [Accessing filenames]({{ site.baseurl }}/documentation/patterns/file-processing-patterns/#accessing-filenames)
+* [Processing files as they arrive]({{ site.baseurl }}/documentation/patterns/file-processing/#processing-files-as-they-arrive)
+* [Accessing filenames]({{ site.baseurl }}/documentation/patterns/file-processing/#accessing-filenames)
 
 **Side input patterns** - Patterns for processing supplementary data
-* [Slowly updating global window side inputs]({{ site.baseurl }}/documentation/patterns/side-input-patterns/#slowly-updating-global-window-side-inputs)
+* [Slowly updating global window side inputs]({{ site.baseurl }}/documentation/patterns/side-inputs/#slowly-updating-global-window-side-inputs)
 
 **Pipeline option patterns** - Patterns for configuring pipelines
-* [Retroactively logging runtime parameters]({{ site.baseurl }}/documentation/patterns/pipeline-option-patterns/#retroactively-logging-runtime-parameters)
+* [Retroactively logging runtime parameters]({{ site.baseurl }}/documentation/patterns/pipeline-options/#retroactively-logging-runtime-parameters)
 
-**Custom I/O patterns**
-* [Choosing between built-in and custom connectors]({{ site.baseurl }}/documentation/patterns/custom-io-patterns/#choosing-between-built-in-and-custom-connectors)
+**Custom I/O patterns** - Patterns for pipeline I/O
+* [Choosing between built-in and custom connectors]({{ site.baseurl }}/documentation/patterns/custom-io/#choosing-between-built-in-and-custom-connectors)
+
+**Custom window patterns** - Patterns for windowing functions
+* [Using data to dynamically set session window gaps]({{ site.baseurl }}/documentation/patterns/custom-windows/#using-data-to-dynamically-set-session-window-gaps)
 
 ## Contributing a pattern
 
diff --git a/website/src/documentation/patterns/pipeline-option-patterns.md b/website/src/documentation/patterns/pipeline-options.md
similarity index 96%
rename from website/src/documentation/patterns/pipeline-option-patterns.md
rename to website/src/documentation/patterns/pipeline-options.md
index 71d24f6..84d1bf5 100644
--- a/website/src/documentation/patterns/pipeline-option-patterns.md
+++ b/website/src/documentation/patterns/pipeline-options.md
@@ -2,7 +2,7 @@
 layout: section
 title: "Pipeline option patterns"
 section_menu: section-menu/documentation.html
-permalink: /documentation/patterns/pipeline-option-patterns/
+permalink: /documentation/patterns/pipeline-options/
 ---
 <!--
 Licensed under the Apache License, Version 2.0 (the "License");
diff --git a/website/src/documentation/patterns/side-input-patterns.md b/website/src/documentation/patterns/side-inputs.md
similarity index 97%
rename from website/src/documentation/patterns/side-input-patterns.md
rename to website/src/documentation/patterns/side-inputs.md
index dc58fd1..854c276 100644
--- a/website/src/documentation/patterns/side-input-patterns.md
+++ b/website/src/documentation/patterns/side-inputs.md
@@ -2,7 +2,7 @@
 layout: section
 title: "Side input patterns"
 section_menu: section-menu/documentation.html
-permalink: /documentation/patterns/side-input-patterns/
+permalink: /documentation/patterns/side-inputs/
 ---
 <!--
 Licensed under the Apache License, Version 2.0 (the "License");
diff --git a/website/src/documentation/runners/direct.md b/website/src/documentation/runners/direct.md
index f61619f..2e763bf 100644
--- a/website/src/documentation/runners/direct.md
+++ b/website/src/documentation/runners/direct.md
@@ -82,4 +82,75 @@
 
 If your pipeline uses an unbounded data source or sink, you must set the `streaming` option to `true`.
 
+### Execution Mode
 
+The Python [FnApiRunner](https://beam.apache.org/contribute/runner-guide/#the-fn-api) supports multi-threading and multi-processing modes.
+
+#### Setting parallelism
+
+The number of threads or subprocesses is defined by setting the `direct_num_workers` option. There are several ways to set this option.
+
+* Passing it on the command line when executing a pipeline.
+```
+python wordcount.py --input xx --output xx --direct_num_workers 2
+```
+
+* Setting with `PipelineOptions`.
+```
+from apache_beam.options.pipeline_options import PipelineOptions
+pipeline_options = PipelineOptions(['--direct_num_workers', '2'])
+```
+
+* Adding it to an existing `PipelineOptions` object.
+```
+from apache_beam.options.pipeline_options import DirectOptions
+pipeline_options = PipelineOptions(xxx)
+pipeline_options.view_as(DirectOptions).direct_num_workers = 2
+```
+
+#### Running with multi-threading mode
+
+```
+import argparse
+
+import apache_beam as beam
+from apache_beam.options.pipeline_options import PipelineOptions
+from apache_beam.runners.portability import fn_api_runner
+from apache_beam.portability.api import beam_runner_api_pb2
+from apache_beam.portability import python_urns
+
+parser = argparse.ArgumentParser()
+parser.add_argument(...)
+known_args, pipeline_args = parser.parse_known_args()
+pipeline_options = PipelineOptions(pipeline_args)
+
+p = beam.Pipeline(options=pipeline_options,
+      runner=fn_api_runner.FnApiRunner(
+          default_environment=beam_runner_api_pb2.Environment(
+              urn=python_urns.EMBEDDED_PYTHON_GRPC)))
+```
+
+#### Running with multi-processing mode
+
+```
+import argparse
+import sys
+
+import apache_beam as beam
+from apache_beam.options.pipeline_options import PipelineOptions
+from apache_beam.runners.portability import fn_api_runner
+from apache_beam.portability.api import beam_runner_api_pb2
+from apache_beam.portability import python_urns
+
+parser = argparse.ArgumentParser()
+parser.add_argument(...)
+known_args, pipeline_args = parser.parse_known_args()
+pipeline_options = PipelineOptions(pipeline_args)
+
+p = beam.Pipeline(options=pipeline_options,
+      runner=fn_api_runner.FnApiRunner(
+          default_environment=beam_runner_api_pb2.Environment(
+              urn=python_urns.SUBPROCESS_SDK,
+              payload=b'%s -m apache_beam.runners.worker.sdk_worker_main'
+                        % sys.executable.encode('ascii'))))
+```
diff --git a/website/src/documentation/runners/flink.md b/website/src/documentation/runners/flink.md
index 9bcb149..515d8e2 100644
--- a/website/src/documentation/runners/flink.md
+++ b/website/src/documentation/runners/flink.md
@@ -39,8 +39,8 @@
 
 It is important to understand that the Flink Runner comes in two flavors:
 
-1. A *legacy Runner* which supports only Java (and other JVM-based languages)
-2. A *portable Runner* which supports Java/Python/Go
+1. The original *classic Runner* which supports only Java (and other JVM-based languages)
+2. The newer *portable Runner* which supports Java/Python/Go
 
 You may ask why there are two Runners?
 
@@ -49,8 +49,8 @@
 architecture of the Runners had to be changed significantly to support executing
 pipelines written in other languages.
 
-If your applications only use Java, then you should currently go with the legacy
-Runner. Eventually, the portable Runner will replace the legacy Runner because
+If your applications only use Java, then you should currently go with the classic
+Runner. Eventually, the portable Runner will replace the classic Runner because
 it contains the generalized framework for executing Java, Python, Go, and more
 languages in the future.
 
@@ -59,14 +59,14 @@
 portability, please visit the [Portability page]({{site.baseurl
 }}/roadmap/portability/).
 
-Consequently, this guide is split into two parts to document the legacy and
+Consequently, this guide is split into two parts to document the classic and
 the portable functionality of the Flink Runner. Please use the switcher below to
 select the appropriate Runner:
 
 <nav class="language-switcher">
   <strong>Adapt for:</strong>
   <ul>
-    <li data-type="language-java">Legacy (Java)</li>
+    <li data-type="language-java">Classic (Java)</li>
     <li data-type="language-py">Portable (Java/Python/Go)</li>
   </ul>
 </nav>
@@ -103,12 +103,33 @@
   <th>Artifact Id</th>
 </tr>
 <tr>
-  <td>>=2.13.0</td>
+  <td rowspan="2">2.17.0</td>
   <td>1.8.x</td>
   <td>beam-runners-flink-1.8</td>
 </tr>
 <tr>
-  <td rowspan="3">>=2.10.0</td>
+  <td>1.7.x</td>
+  <td>beam-runners-flink-1.7</td>
+</tr>
+<tr>
+  <td rowspan="4">2.13.0 - 2.16.0</td>
+  <td>1.8.x</td>
+  <td>beam-runners-flink-1.8</td>
+</tr>
+<tr>
+  <td>1.7.x</td>
+  <td>beam-runners-flink-1.7</td>
+</tr>
+<tr>
+  <td>1.6.x</td>
+  <td>beam-runners-flink-1.6</td>
+</tr>
+<tr>
+  <td>1.5.x</td>
+  <td>beam-runners-flink_2.11</td>
+</tr>
+<tr>
+  <td rowspan="3">2.10.0 - 2.16.0</td>
   <td>1.7.x</td>
   <td>beam-runners-flink-1.7</td>
 </tr>
@@ -250,7 +271,7 @@
 available. To run a pipeline on an embedded Flink cluster:
 </span>
 
-<span class="language-py">1. Start the JobService endpoint: `./gradlew :runners:flink:1.5:job-server:runShadow`
+<span class="language-py">1. Start the JobService endpoint: `./gradlew :runners:flink:1.8:job-server:runShadow`
 </span>
 
 <span class="language-py">
@@ -283,7 +304,7 @@
 <span class="language-py">1. Start a Flink cluster which exposes the Rest interface on `localhost:8081` by default.
 </span>
 
-<span class="language-py">2. Start JobService with Flink Rest endpoint: `./gradlew :runners:flink:1.5:job-server:runShadow -PflinkMasterUrl=localhost:8081`.
+<span class="language-py">2. Start JobService with Flink Rest endpoint: `./gradlew :runners:flink:1.8:job-server:runShadow -PflinkMasterUrl=localhost:8081`.
 </span>
 
 <span class="language-py">3. Submit the pipeline as above.
@@ -608,7 +629,7 @@
 
 The [Beam Capability Matrix]({{ site.baseurl
 }}/documentation/runners/capability-matrix/) documents the
-capabilities of the legacy Flink Runner.
+capabilities of the classic Flink Runner.
 
 The [Portable Capability
 Matrix](https://s.apache.org/apache-beam-portability-support-table) documents
diff --git a/website/src/documentation/sdks/nexmark.md b/website/src/documentation/sdks/nexmark.md
index 22b381a..d5230da 100644
--- a/website/src/documentation/sdks/nexmark.md
+++ b/website/src/documentation/sdks/nexmark.md
@@ -149,7 +149,7 @@
 
     -P nexmark.runner
 	The Gradle project name of the runner, such as ":runners:direct-java" or
-	":runners:flink:1.5. The project names can be found in the root
+	":runners:flink:1.8. The project names can be found in the root
         `settings.gradle`.
 
 Test data is deterministically synthesized on demand. The test
@@ -557,7 +557,7 @@
 Batch Mode:
 
     ./gradlew :sdks:java:testing:nexmark:run \
-        -Pnexmark.runner=":runners:flink:1.5" \
+        -Pnexmark.runner=":runners:flink:1.8" \
         -Pnexmark.args="
             --runner=FlinkRunner
             --suite=SMOKE
@@ -570,7 +570,7 @@
 Streaming Mode:
 
     ./gradlew :sdks:java:testing:nexmark:run \
-        -Pnexmark.runner=":runners:flink:1.5" \
+        -Pnexmark.runner=":runners:flink:1.8" \
         -Pnexmark.args="
             --runner=FlinkRunner
             --suite=SMOKE
diff --git a/website/src/documentation/sdks/python-streaming.md b/website/src/documentation/sdks/python-streaming.md
index 37c6935..ea08da9 100644
--- a/website/src/documentation/sdks/python-streaming.md
+++ b/website/src/documentation/sdks/python-streaming.md
@@ -183,18 +183,13 @@
 - Custom source API
 - Splittable `DoFn` API
 - Handling of late data
-- User-defined custom `WindowFn`
+- User-defined custom merging `WindowFn` (with fnapi)
 
 ### DataflowRunner specific features
 
 Additionally, `DataflowRunner` does not currently support the following Cloud
 Dataflow specific features with Python streaming execution.
 
-- Streaming autoscaling
-- Updating existing pipelines
 - Cloud Dataflow Templates
-- Some monitoring features, such as msec counters, display data, metrics, and
-  element counts for transforms. However, logging, watermarks, and element
-  counts for sources are supported.
 
 
diff --git a/website/src/documentation/transforms/python/element-wise/filter.md b/website/src/documentation/transforms/python/element-wise/filter.md
index 384327a..346ffb4 100644
--- a/website/src/documentation/transforms/python/element-wise/filter.md
+++ b/website/src/documentation/transforms/python/element-wise/filter.md
@@ -24,17 +24,7 @@
 localStorage.setItem('language', 'language-py')
 </script>
 
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.core.html#apache_beam.transforms.core.Filter">
-      <img src="https://beam.apache.org/images/logos/sdks/python.png"
-          width="32px" height="32px" alt="Pydoc" />
-      Pydoc
-    </a>
-  </td>
-</table>
-<br><br><br>
+{% include button-pydoc.md path="apache_beam.transforms.core" class="Filter" %}
 
 Given a predicate, filter out all elements that don't satisfy that predicate.
 May also be used to filter based on an inequality with a given value based
@@ -61,29 +51,10 @@
 ```
 {% github_sample /apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter_test.py tag:perennials %}```
 
-{:.notebook-skip}
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://colab.research.google.com/github/{{ site.branch_repo }}/examples/notebooks/documentation/transforms/python/element-wise/filter-py.ipynb">
-      <img src="https://github.com/googlecolab/open_in_colab/raw/master/images/icon32.png"
-        width="32px" height="32px" alt="Run in Colab" />
-      Run code now
-    </a>
-  </td>
-</table>
-
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://github.com/{{ site.branch_repo }}/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py">
-      <img src="https://www.tensorflow.org/images/GitHub-Mark-32px.png"
-        width="32px" height="32px" alt="View source code" />
-      View source code
-    </a>
-  </td>
-</table>
-<br><br><br>
+{% include buttons-code-snippet.md
+  notebook="examples/notebooks/documentation/transforms/python/element-wise/filter-py.ipynb"
+  code="sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py"
+%}
 
 ### Example 2: Filtering with a lambda function
 
@@ -99,29 +70,10 @@
 ```
 {% github_sample /apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter_test.py tag:perennials %}```
 
-{:.notebook-skip}
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://colab.research.google.com/github/{{ site.branch_repo }}/examples/notebooks/documentation/transforms/python/element-wise/filter-py.ipynb">
-      <img src="https://github.com/googlecolab/open_in_colab/raw/master/images/icon32.png"
-        width="32px" height="32px" alt="Run code now" />
-      Run code now
-    </a>
-  </td>
-</table>
-
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://github.com/{{ site.branch_repo }}/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py">
-      <img src="https://www.tensorflow.org/images/GitHub-Mark-32px.png"
-        width="32px" height="32px" alt="View source code" />
-      View source code
-    </a>
-  </td>
-</table>
-<br><br><br>
+{% include buttons-code-snippet.md
+  notebook="examples/notebooks/documentation/transforms/python/element-wise/filter-py.ipynb"
+  code="sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py"
+%}
 
 ### Example 3: Filtering with multiple arguments
 
@@ -140,29 +92,10 @@
 ```
 {% github_sample /apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter_test.py tag:perennials %}```
 
-{:.notebook-skip}
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://colab.research.google.com/github/{{ site.branch_repo }}/examples/notebooks/documentation/transforms/python/element-wise/filter-py.ipynb">
-      <img src="https://github.com/googlecolab/open_in_colab/raw/master/images/icon32.png"
-        width="32px" height="32px" alt="Run in Colab" />
-      Run code now
-    </a>
-  </td>
-</table>
-
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://github.com/{{ site.branch_repo }}/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py">
-      <img src="https://www.tensorflow.org/images/GitHub-Mark-32px.png"
-        width="32px" height="32px" alt="View source code" />
-      View source code
-    </a>
-  </td>
-</table>
-<br><br><br>
+{% include buttons-code-snippet.md
+  notebook="examples/notebooks/documentation/transforms/python/element-wise/filter-py.ipynb"
+  code="sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py"
+%}
 
 ### Example 4: Filtering with side inputs as singletons
 
@@ -182,29 +115,10 @@
 ```
 {% github_sample /apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter_test.py tag:perennials %}```
 
-{:.notebook-skip}
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://colab.research.google.com/github/{{ site.branch_repo }}/examples/notebooks/documentation/transforms/python/element-wise/filter-py.ipynb">
-      <img src="https://github.com/googlecolab/open_in_colab/raw/master/images/icon32.png"
-        width="32px" height="32px" alt="Run in Colab" />
-      Run code now
-    </a>
-  </td>
-</table>
-
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://github.com/{{ site.branch_repo }}/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py">
-      <img src="https://www.tensorflow.org/images/GitHub-Mark-32px.png"
-        width="32px" height="32px" alt="View source code" />
-      View source code
-    </a>
-  </td>
-</table>
-<br><br><br>
+{% include buttons-code-snippet.md
+  notebook="examples/notebooks/documentation/transforms/python/element-wise/filter-py.ipynb"
+  code="sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py"
+%}
 
 ### Example 5: Filtering with side inputs as iterators
 
@@ -222,29 +136,10 @@
 ```
 {% github_sample /apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter_test.py tag:valid_plants %}```
 
-{:.notebook-skip}
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://colab.research.google.com/github/{{ site.branch_repo }}/examples/notebooks/documentation/transforms/python/element-wise/filter-py.ipynb">
-      <img src="https://github.com/googlecolab/open_in_colab/raw/master/images/icon32.png"
-        width="32px" height="32px" alt="Run in Colab" />
-      Run code now
-    </a>
-  </td>
-</table>
-
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://github.com/{{ site.branch_repo }}/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py">
-      <img src="https://www.tensorflow.org/images/GitHub-Mark-32px.png"
-        width="32px" height="32px" alt="View source code" />
-      View source code
-    </a>
-  </td>
-</table>
-<br><br><br>
+{% include buttons-code-snippet.md
+  notebook="examples/notebooks/documentation/transforms/python/element-wise/filter-py.ipynb"
+  code="sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py"
+%}
 
 > **Note**: You can pass the `PCollection` as a *list* with `beam.pvalue.AsList(pcollection)`,
 > but this requires that all the elements fit into memory.
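
To make the note above concrete, here is a minimal sketch of passing a side input to `Filter` as a list; the pipeline, labels, and element values are illustrative and are not taken from the linked snippet.

```
import apache_beam as beam

with beam.Pipeline() as pipeline:
  # Side input: durations considered perennial (illustrative values).
  perennials = pipeline | 'Perennial durations' >> beam.Create(['perennial'])

  (pipeline
   | 'Plants' >> beam.Create([
       {'name': 'Strawberry', 'duration': 'perennial'},
       {'name': 'Carrot', 'duration': 'biennial'},
   ])
   # AsList materializes the side input as an in-memory list, so all of its
   # elements must fit into memory, as the note above explains.
   | 'Keep perennials' >> beam.Filter(
       lambda plant, durations: plant['duration'] in durations,
       durations=beam.pvalue.AsList(perennials))
   | beam.Map(print))
```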
@@ -266,29 +161,10 @@
 ```
 {% github_sample /apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter_test.py tag:perennials %}```
 
-{:.notebook-skip}
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://colab.research.google.com/github/{{ site.branch_repo }}/examples/notebooks/documentation/transforms/python/element-wise/filter-py.ipynb">
-      <img src="https://github.com/googlecolab/open_in_colab/raw/master/images/icon32.png"
-        width="32px" height="32px" alt="Run in Colab" />
-      Run code now
-    </a>
-  </td>
-</table>
-
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://github.com/{{ site.branch_repo }}/sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py">
-      <img src="https://www.tensorflow.org/images/GitHub-Mark-32px.png"
-        width="32px" height="32px" alt="View source code" />
-      View source code
-    </a>
-  </td>
-</table>
-<br><br><br>
+{% include buttons-code-snippet.md
+  notebook="examples/notebooks/documentation/transforms/python/element-wise/filter-py.ipynb"
+  code="sdks/python/apache_beam/examples/snippets/transforms/element_wise/filter.py"
+%}
 
 ## Related transforms
 
@@ -297,14 +173,4 @@
 * [ParDo]({{ site.baseurl }}/documentation/transforms/python/elementwise/pardo) is the most general element-wise mapping
   operation, and includes other abilities such as multiple output collections and side-inputs.
 
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.core.html#apache_beam.transforms.core.Filter">
-      <img src="https://beam.apache.org/images/logos/sdks/python.png"
-          width="32px" height="32px" alt="Pydoc" />
-      Pydoc
-    </a>
-  </td>
-</table>
-<br><br><br>
+{% include button-pydoc.md path="apache_beam.transforms.core" class="Filter" %}
diff --git a/website/src/documentation/transforms/python/element-wise/flatmap.md b/website/src/documentation/transforms/python/element-wise/flatmap.md
index 58eb826..d2e861a 100644
--- a/website/src/documentation/transforms/python/element-wise/flatmap.md
+++ b/website/src/documentation/transforms/python/element-wise/flatmap.md
@@ -24,17 +24,7 @@
 localStorage.setItem('language', 'language-py')
 </script>
 
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.core.html#apache_beam.transforms.core.FlatMap">
-      <img src="https://beam.apache.org/images/logos/sdks/python.png"
-          width="32px" height="32px" alt="Pydoc" />
-      Pydoc
-    </a>
-  </td>
-</table>
-<br><br><br>
+{% include button-pydoc.md path="apache_beam.transforms.core" class="FlatMap" %}
 
 Applies a simple 1-to-many mapping function over each element in the collection.
 The many elements are flattened into the resulting collection.
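
As a rough illustration of that 1-to-many behavior, here is a short sketch with made-up values; it is not the content of the snippets referenced below.

```
import apache_beam as beam

with beam.Pipeline() as pipeline:
  (pipeline
   | 'Gardening plans' >> beam.Create(['Strawberry Carrot', 'Eggplant Tomato'])
   # Each input string yields several words; FlatMap flattens them all
   # into a single output PCollection.
   | 'Split words' >> beam.FlatMap(str.split)
   | beam.Map(print))
```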
@@ -62,29 +52,10 @@
 ```
 {% github_sample /apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map_test.py tag:plants %}```
 
-{:.notebook-skip}
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://colab.research.google.com/github/{{ site.branch_repo }}/examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb">
-      <img src="https://github.com/googlecolab/open_in_colab/raw/master/images/icon32.png"
-        width="32px" height="32px" alt="Run code now" />
-      Run code now
-    </a>
-  </td>
-</table>
-
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://github.com/{{ site.branch_repo }}/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py">
-      <img src="https://www.tensorflow.org/images/GitHub-Mark-32px.png"
-        width="32px" height="32px" alt="View source code" />
-      View source code
-    </a>
-  </td>
-</table>
-<br><br><br>
+{% include buttons-code-snippet.md
+  notebook="examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb"
+  code="sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py"
+%}
 
 ### Example 2: FlatMap with a function
 
@@ -100,29 +71,10 @@
 ```
 {% github_sample /apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map_test.py tag:plants %}```
 
-{:.notebook-skip}
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://colab.research.google.com/github/{{ site.branch_repo }}/examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb">
-      <img src="https://github.com/googlecolab/open_in_colab/raw/master/images/icon32.png"
-        width="32px" height="32px" alt="Run code now" />
-      Run code now
-    </a>
-  </td>
-</table>
-
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://github.com/{{ site.branch_repo }}/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py">
-      <img src="https://www.tensorflow.org/images/GitHub-Mark-32px.png"
-        width="32px" height="32px" alt="View source code" />
-      View source code
-    </a>
-  </td>
-</table>
-<br><br><br>
+{% include buttons-code-snippet.md
+  notebook="examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb"
+  code="sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py"
+%}
 
 ### Example 3: FlatMap with a lambda function
 
@@ -140,29 +92,10 @@
 ```
 {% github_sample /apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map_test.py tag:plants %}```
 
-{:.notebook-skip}
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://colab.research.google.com/github/{{ site.branch_repo }}/examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb">
-      <img src="https://github.com/googlecolab/open_in_colab/raw/master/images/icon32.png"
-        width="32px" height="32px" alt="Run code now" />
-      Run code now
-    </a>
-  </td>
-</table>
-
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://github.com/{{ site.branch_repo }}/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py">
-      <img src="https://www.tensorflow.org/images/GitHub-Mark-32px.png"
-        width="32px" height="32px" alt="View source code" />
-      View source code
-    </a>
-  </td>
-</table>
-<br><br><br>
+{% include buttons-code-snippet.md
+  notebook="examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb"
+  code="sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py"
+%}
 
 ### Example 4: FlatMap with a generator
 
@@ -180,29 +113,10 @@
 ```
 {% github_sample /apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map_test.py tag:plants %}```
 
-{:.notebook-skip}
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://colab.research.google.com/github/{{ site.branch_repo }}/examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb">
-      <img src="https://github.com/googlecolab/open_in_colab/raw/master/images/icon32.png"
-        width="32px" height="32px" alt="Run code now" />
-      Run code now
-    </a>
-  </td>
-</table>
-
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://github.com/{{ site.branch_repo }}/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py">
-      <img src="https://www.tensorflow.org/images/GitHub-Mark-32px.png"
-        width="32px" height="32px" alt="View source code" />
-      View source code
-    </a>
-  </td>
-</table>
-<br><br><br>
+{% include buttons-code-snippet.md
+  notebook="examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb"
+  code="sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py"
+%}
 
 ### Example 5: FlatMapTuple for key-value pairs
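
Roughly, `FlatMapTuple` unpacks each key-value tuple into separate arguments of the callable. A minimal sketch with illustrative values (not the linked snippet itself):

```
import apache_beam as beam

with beam.Pipeline() as pipeline:
  (pipeline
   | beam.Create([('Strawberry', 'perennial'), ('Carrot', 'biennial')])
   # Each (key, value) tuple is unpacked into the lambda's two parameters.
   | 'Format' >> beam.FlatMapTuple(
       lambda name, duration: ['{}: {}'.format(name, duration)])
   | beam.Map(print))
```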
 
@@ -219,29 +133,10 @@
 ```
 {% github_sample /apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map_test.py tag:plants %}```
 
-{:.notebook-skip}
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://colab.research.google.com/github/{{ site.branch_repo }}/examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb">
-      <img src="https://github.com/googlecolab/open_in_colab/raw/master/images/icon32.png"
-        width="32px" height="32px" alt="Run code now" />
-      Run code now
-    </a>
-  </td>
-</table>
-
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://github.com/{{ site.branch_repo }}/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py">
-      <img src="https://www.tensorflow.org/images/GitHub-Mark-32px.png"
-        width="32px" height="32px" alt="View source code" />
-      View source code
-    </a>
-  </td>
-</table>
-<br><br><br>
+{% include buttons-code-snippet.md
+  notebook="examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb"
+  code="sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py"
+%}
 
 ### Example 6: FlatMap with multiple arguments
 
@@ -260,29 +155,10 @@
 ```
 {% github_sample /apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map_test.py tag:plants %}```
 
-{:.notebook-skip}
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://colab.research.google.com/github/{{ site.branch_repo }}/examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb">
-      <img src="https://github.com/googlecolab/open_in_colab/raw/master/images/icon32.png"
-        width="32px" height="32px" alt="Run code now" />
-      Run code now
-    </a>
-  </td>
-</table>
-
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://github.com/{{ site.branch_repo }}/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py">
-      <img src="https://www.tensorflow.org/images/GitHub-Mark-32px.png"
-        width="32px" height="32px" alt="View source code" />
-      View source code
-    </a>
-  </td>
-</table>
-<br><br><br>
+{% include buttons-code-snippet.md
+  notebook="examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb"
+  code="sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py"
+%}
 
 ### Example 7: FlatMap with side inputs as singletons
 
@@ -302,29 +178,10 @@
 ```
 {% github_sample /apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map_test.py tag:plants %}```
 
-{:.notebook-skip}
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://colab.research.google.com/github/{{ site.branch_repo }}/examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb">
-      <img src="https://github.com/googlecolab/open_in_colab/raw/master/images/icon32.png"
-        width="32px" height="32px" alt="Run code now" />
-      Run code now
-    </a>
-  </td>
-</table>
-
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://github.com/{{ site.branch_repo }}/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py">
-      <img src="https://www.tensorflow.org/images/GitHub-Mark-32px.png"
-        width="32px" height="32px" alt="View source code" />
-      View source code
-    </a>
-  </td>
-</table>
-<br><br><br>
+{% include buttons-code-snippet.md
+  notebook="examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb"
+  code="sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py"
+%}
 
 ### Example 8: FlatMap with side inputs as iterators
 
@@ -342,29 +199,10 @@
 ```
 {% github_sample /apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map_test.py tag:valid_plants %}```
 
-{:.notebook-skip}
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://colab.research.google.com/github/{{ site.branch_repo }}/examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb">
-      <img src="https://github.com/googlecolab/open_in_colab/raw/master/images/icon32.png"
-        width="32px" height="32px" alt="Run code now" />
-      Run code now
-    </a>
-  </td>
-</table>
-
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://github.com/{{ site.branch_repo }}/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py">
-      <img src="https://www.tensorflow.org/images/GitHub-Mark-32px.png"
-        width="32px" height="32px" alt="View source code" />
-      View source code
-    </a>
-  </td>
-</table>
-<br><br><br>
+{% include buttons-code-snippet.md
+  notebook="examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb"
+  code="sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py"
+%}
 
 > **Note**: You can pass the `PCollection` as a *list* with `beam.pvalue.AsList(pcollection)`,
 > but this requires that all the elements fit into memory.
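
The iterator counterpart used in Example 8 is `beam.pvalue.AsIter`, which exposes the side input as an iterable rather than an in-memory list. A hedged sketch with illustrative values, not taken from the linked snippet:

```
import apache_beam as beam

with beam.Pipeline() as pipeline:
  valid_durations = pipeline | 'Valid durations' >> beam.Create(
      ['annual', 'biennial', 'perennial'])

  (pipeline
   | 'Plants' >> beam.Create([('Strawberry', 'perennial'), ('Weed', 'unknown')])
   # AsIter provides the side input as an iterable, avoiding the in-memory
   # list that AsList would build; emit the plant only if its duration is valid.
   | 'Keep valid' >> beam.FlatMap(
       lambda plant, durations: [plant] if plant[1] in durations else [],
       durations=beam.pvalue.AsIter(valid_durations))
   | beam.Map(print))
```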
@@ -386,29 +224,10 @@
 ```
 {% github_sample /apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map_test.py tag:valid_plants %}```
 
-{:.notebook-skip}
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://colab.research.google.com/github/{{ site.branch_repo }}/examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb">
-      <img src="https://github.com/googlecolab/open_in_colab/raw/master/images/icon32.png"
-        width="32px" height="32px" alt="Run code now" />
-      Run code now
-    </a>
-  </td>
-</table>
-
-<table align="left" style="margin-right:1em">
-  <td>
-    <a class="button" target="_blank"
-        href="https://github.com/{{ site.branch_repo }}/sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py">
-      <img src="https://www.tensorflow.org/images/GitHub-Mark-32px.png"
-        width="32px" height="32px" alt="View source code" />
-      View source code
-    </a>
-  </td>
-</table>
-<br><br><br>
+{% include buttons-code-snippet.md
+  notebook="examples/notebooks/documentation/transforms/python/element-wise/flatmap-py.ipynb"
+  code="sdks/python/apache_beam/examples/snippets/transforms/element_wise/flat_map.py"
+%}
 
 ## Related transforms
 
@@ -418,14 +237,4 @@
   operation, and includes other abilities such as multiple output collections and side-inputs. 
 * [Map]({{ site.baseurl }}/documentation/transforms/python/elementwise/map) behaves the same, but produces exactly one output for each input.
 
-<table>
-  <td>
-    <a class="button" target="_blank"
-        href="https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.core.html#apache_beam.transforms.core.FlatMap">
-      <img src="https://beam.apache.org/images/logos/sdks/python.png"
-          width="32px" height="32px" alt="Pydoc" />
-      Pydoc
-    </a>
-  </td>
-</table>
-<br><br><br>
+{% include button-pydoc.md path="apache_beam.transforms.core" class="FlatMap" %}
diff --git a/website/src/images/standard-vs-dynamic-sessions.png b/website/src/images/standard-vs-dynamic-sessions.png
new file mode 100644
index 0000000..832a181
--- /dev/null
+++ b/website/src/images/standard-vs-dynamic-sessions.png
Binary files differ