[FLINK-21021][python] Bump Beam to 2.27.0 (#14741)

diff --git a/docs/content.zh/docs/deployment/cli.md b/docs/content.zh/docs/deployment/cli.md
index 7feda99..354b0a6 100644
--- a/docs/content.zh/docs/deployment/cli.md
+++ b/docs/content.zh/docs/deployment/cli.md
@@ -346,10 +346,10 @@
 Currently, users are able to submit a PyFlink job via the CLI. Unlike Java job
 submission, it does not require specifying a JAR file path or an entry main class.
 
-<span class="label label-info">Note</span> When submitting Python job via `flink run`, Flink will run the command "python". Please run the following command to confirm that the python executable in current environment points to a supported Python version of 3.5+.
+<span class="label label-info">Note</span> When submitting a Python job via `flink run`, Flink will run the command "python". Please run the following command to confirm that the python executable in the current environment points to a supported Python version of 3.6+.
 ```bash
 $ python --version
-# the version printed here must be 3.5+
+# the version printed here must be 3.6+
 ```
 
 The following commands show different PyFlink job submission use-cases:
diff --git a/docs/content.zh/docs/dev/python/datastream_tutorial.md b/docs/content.zh/docs/dev/python/datastream_tutorial.md
index 559d1e8..68ffcee 100644
--- a/docs/content.zh/docs/dev/python/datastream_tutorial.md
+++ b/docs/content.zh/docs/dev/python/datastream_tutorial.md
@@ -50,7 +50,7 @@
 首先,你需要在你的电脑上准备以下环境:
 
 * Java 8 or 11
-* Python 3.5, 3.6 or 3.7
+* Python 3.6, 3.7 or 3.8
 
 使用 Python DataStream API 需要安装 PyFlink,PyFlink 发布在 [PyPI](https://pypi.org/project/apache-flink/)上,可以通过 `pip` 快速安装。 
 
diff --git a/docs/content.zh/docs/dev/python/installation.md b/docs/content.zh/docs/dev/python/installation.md
index 70affeb..793f6c1 100644
--- a/docs/content.zh/docs/dev/python/installation.md
+++ b/docs/content.zh/docs/dev/python/installation.md
@@ -29,11 +29,11 @@
 
 
 ## 环境要求
-<span class="label label-info">注意</span> PyFlink需要特定的Python版本(3.5, 3.6, 3.7 或 3.8)。请运行以下命令,以确保Python版本满足要求。
+<span class="label label-info">注意</span> PyFlink需要特定的Python版本(3.6, 3.7 或 3.8)。请运行以下命令,以确保Python版本满足要求。
 
 ```bash
 $ python --version
-# the version printed here must be 3.5, 3.6, 3.7 or 3.8
+# the version printed here must be 3.6, 3.7 or 3.8
 ```
 
 ## 环境设置
diff --git a/docs/content.zh/docs/dev/python/table/udfs/python_udfs.md b/docs/content.zh/docs/dev/python/table/udfs/python_udfs.md
index c762202..cee3cd6 100644
--- a/docs/content.zh/docs/dev/python/table/udfs/python_udfs.md
+++ b/docs/content.zh/docs/dev/python/table/udfs/python_udfs.md
@@ -28,7 +28,7 @@
 
 用户自定义函数是重要的功能,因为它们极大地扩展了Python Table API程序的表达能力。
 
-**注意:** 要执行Python用户自定义函数,客户端和集群端都需要安装Python版本(3.5、3.6、3.7 或 3.8),并安装PyFlink。
+**注意:** 要执行Python用户自定义函数,客户端和集群端都需要安装Python版本(3.6、3.7 或 3.8),并安装PyFlink。
 
 
 
diff --git a/docs/content.zh/docs/dev/python/table/udfs/vectorized_python_udfs.md b/docs/content.zh/docs/dev/python/table/udfs/vectorized_python_udfs.md
index 10e457e..52b585c 100644
--- a/docs/content.zh/docs/dev/python/table/udfs/vectorized_python_udfs.md
+++ b/docs/content.zh/docs/dev/python/table/udfs/vectorized_python_udfs.md
@@ -32,7 +32,7 @@
 向量化用户自定义函数的定义,与[非向量化用户自定义函数]({{< ref "docs/dev/python/table/udfs/python_udfs" >}})具有相似的方式,
 用户只需要在调用`udf`或者`udaf`装饰器时添加一个额外的参数`func_type="pandas"`,将其标记为一个向量化用户自定义函数即可。
 
-**注意:**要执行Python UDF,需要安装PyFlink的Python版本(3.5、3.6、3.7 或 3.8)。客户端和群集端都需要安装它。
+**注意:**要执行Python UDF,需要安装PyFlink的Python版本(3.6、3.7 或 3.8)。客户端和群集端都需要安装它。
 
 
 
diff --git a/docs/content.zh/docs/dev/python/table_api_tutorial.md b/docs/content.zh/docs/dev/python/table_api_tutorial.md
index 9458b49..51d7382 100644
--- a/docs/content.zh/docs/dev/python/table_api_tutorial.md
+++ b/docs/content.zh/docs/dev/python/table_api_tutorial.md
@@ -52,7 +52,7 @@
 如果要继续我们的旅程,您需要一台具有以下功能的计算机:
 
 * Java 8 or 11
-* Python 3.5, 3.6 or 3.7
+* Python 3.6, 3.7 or 3.8
 
 使用Python Table API需要安装PyFlink,它已经被发布到 [PyPi](https://pypi.org/project/apache-flink/),您可以通过如下方式安装PyFlink:
 
diff --git a/docs/content.zh/docs/dev/table/sqlClient.md b/docs/content.zh/docs/dev/table/sqlClient.md
index 46e20a3..08a0831 100644
--- a/docs/content.zh/docs/dev/table/sqlClient.md
+++ b/docs/content.zh/docs/dev/table/sqlClient.md
@@ -225,8 +225,8 @@
                                            python UDF worker (e.g.:
                                            --pyExecutable
                                            /usr/local/bin/python3). The python
-                                           UDF worker depends on Python 3.5+,
-                                           Apache Beam (version == 2.23.0), Pip
+                                           UDF worker depends on Python 3.6+,
+                                           Apache Beam (version == 2.27.0), Pip
                                            (version >= 7.1.0) and SetupTools
                                            (version >= 37.0.0). Please ensure
                                            that the specified environment meets
diff --git a/docs/content.zh/docs/flinkDev/building.md b/docs/content.zh/docs/flinkDev/building.md
index bd927a7..2735a3a 100644
--- a/docs/content.zh/docs/flinkDev/building.md
+++ b/docs/content.zh/docs/flinkDev/building.md
@@ -71,11 +71,11 @@
 
     如果想构建一个可用于 pip 安装的 PyFlink 包,需要先构建 Flink 工程,如 [构建 Flink](#build-flink) 中所述。
 
-2. Python 的版本为 3.5, 3.6, 3.7 或者 3.8.
+2. Python 的版本为 3.6, 3.7 或者 3.8.
 
     ```shell
     $ python --version
-    # the version printed here must be 3.5, 3.6, 3.7 or 3.8
+    # the version printed here must be 3.6, 3.7 or 3.8
     ```
 
 3. 构建 PyFlink 的 Cython 扩展模块(可选的)
diff --git a/docs/content/docs/deployment/cli.md b/docs/content/docs/deployment/cli.md
index e04b247..efbcba8 100644
--- a/docs/content/docs/deployment/cli.md
+++ b/docs/content/docs/deployment/cli.md
@@ -344,10 +344,10 @@
 Currently, users are able to submit a PyFlink job via the CLI. Unlike Java job
 submission, it does not require specifying a JAR file path or an entry main class.
 
-<span class="label label-info">Note</span> When submitting Python job via `flink run`, Flink will run the command "python". Please run the following command to confirm that the python executable in current environment points to a supported Python version of 3.5+.
+<span class="label label-info">Note</span> When submitting a Python job via `flink run`, Flink will run the command "python". Please run the following command to confirm that the python executable in the current environment points to a supported Python version of 3.6+.
 ```bash
 $ python --version
-# the version printed here must be 3.5+
+# the version printed here must be 3.6+
 ```
 
 The following commands show different PyFlink job submission use-cases:
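For scripted submissions, the same check can be performed from Python before invoking `flink run`. A minimal sketch, mirroring the guard that `pyflink/__init__.py` applies on import (updated later in this patch):

```python
import sys

# PyFlink now requires Python 3.6+; fail fast before calling `flink run`.
if sys.version_info < (3, 6):
    raise RuntimeError(
        "PyFlink requires Python 3.6+, found %d.%d"
        % (sys.version_info.major, sys.version_info.minor))
```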
diff --git a/docs/content/docs/dev/python/datastream_tutorial.md b/docs/content/docs/dev/python/datastream_tutorial.md
index 28581ae..d565bc6 100644
--- a/docs/content/docs/dev/python/datastream_tutorial.md
+++ b/docs/content/docs/dev/python/datastream_tutorial.md
@@ -49,7 +49,7 @@
 If you want to follow along, you will require a computer with: 
 
 * Java 8 or 11
-* Python 3.5, 3.6 or 3.7
+* Python 3.6, 3.7 or 3.8
 
 Using Python DataStream API requires installing PyFlink, which is available on [PyPI](https://pypi.org/project/apache-flink/) and can be easily installed using `pip`. 
 
diff --git a/docs/content/docs/dev/python/installation.md b/docs/content/docs/dev/python/installation.md
index 291e16a..8517c13 100644
--- a/docs/content/docs/dev/python/installation.md
+++ b/docs/content/docs/dev/python/installation.md
@@ -29,12 +29,12 @@
 ## Environment Requirements
 
 {{< hint info >}}
-Python version (3.5, 3.6, 3.7 or 3.8) is required for PyFlink. Please run the following command to make sure that it meets the requirements:
+Python 3.6, 3.7 or 3.8 is required for PyFlink. Please run the following command to make sure that your Python version meets the requirement:
 {{< /hint >}}
 
 ```bash
 $ python --version
-# the version printed here must be 3.5, 3.6, 3.7 or 3.8
+# the version printed here must be 3.6, 3.7 or 3.8
 ```
 
 ## Environment Setup
diff --git a/docs/content/docs/dev/python/table/udfs/python_udfs.md b/docs/content/docs/dev/python/table/udfs/python_udfs.md
index 1af2414..fb5a1e1 100644
--- a/docs/content/docs/dev/python/table/udfs/python_udfs.md
+++ b/docs/content/docs/dev/python/table/udfs/python_udfs.md
@@ -28,7 +28,7 @@
 
 User-defined functions are important features, because they significantly extend the expressiveness of Python Table API programs.
 
-**NOTE:** Python UDF execution requires Python version (3.5, 3.6, 3.7 or 3.8) with PyFlink installed. It's required on both the client side and the cluster side. 
+**NOTE:** Python UDF execution requires Python 3.6, 3.7 or 3.8 with PyFlink installed. It is required on both the client side and the cluster side.
 
 ## Scalar Functions
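For context, a minimal scalar UDF that runs under this requirement; the function name `add` and the `BIGINT` types are illustrative, not part of this patch:

```python
from pyflink.table import DataTypes
from pyflink.table.udf import udf

# Executed in the Python UDF worker, which after this change requires
# Python 3.6+ and Apache Beam 2.27.0.
@udf(input_types=[DataTypes.BIGINT(), DataTypes.BIGINT()],
     result_type=DataTypes.BIGINT())
def add(i, j):
    return i + j
```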
 
diff --git a/docs/content/docs/dev/python/table/udfs/vectorized_python_udfs.md b/docs/content/docs/dev/python/table/udfs/vectorized_python_udfs.md
index ce34160..0e64a60 100644
--- a/docs/content/docs/dev/python/table/udfs/vectorized_python_udfs.md
+++ b/docs/content/docs/dev/python/table/udfs/vectorized_python_udfs.md
@@ -33,7 +33,7 @@
 [non-vectorized user-defined functions]({{< ref "docs/dev/python/table/udfs/python_udfs" >}}) on how to define vectorized user-defined functions.
 Users only need to add an extra parameter `func_type="pandas"` in the decorator `udf` or `udaf` to mark it as a vectorized user-defined function.
 
-**NOTE:** Python UDF execution requires Python version (3.5, 3.6, 3.7 or 3.8) with PyFlink installed. It's required on both the client side and the cluster side. 
+**NOTE:** Python UDF execution requires Python 3.6, 3.7 or 3.8 with PyFlink installed. It is required on both the client side and the cluster side.
 
 ## Vectorized Scalar Functions
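As above, a minimal vectorized counterpart; `pandas_add` is an illustrative name:

```python
import pandas as pd

from pyflink.table import DataTypes
from pyflink.table.udf import udf

# func_type="pandas" marks the function as vectorized: it receives
# pandas.Series arguments and must return a Series of the same length.
@udf(result_type=DataTypes.BIGINT(), func_type="pandas")
def pandas_add(i: pd.Series, j: pd.Series) -> pd.Series:
    return i + j
```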
 
diff --git a/docs/content/docs/dev/python/table_api_tutorial.md b/docs/content/docs/dev/python/table_api_tutorial.md
index 3e43047..39551eb 100644
--- a/docs/content/docs/dev/python/table_api_tutorial.md
+++ b/docs/content/docs/dev/python/table_api_tutorial.md
@@ -51,7 +51,7 @@
 If you want to follow along, you will require a computer with: 
 
 * Java 8 or 11
-* Python 3.5, 3.6 or 3.7
+* Python 3.6, 3.7 or 3.8
 
 Using Python Table API requires installing PyFlink, which is available on [PyPI](https://pypi.org/project/apache-flink/) and can be easily installed using `pip`. 
 
diff --git a/docs/content/docs/dev/table/sqlClient.md b/docs/content/docs/dev/table/sqlClient.md
index 4bfe2c4..025a132 100644
--- a/docs/content/docs/dev/table/sqlClient.md
+++ b/docs/content/docs/dev/table/sqlClient.md
@@ -224,8 +224,8 @@
                                            python UDF worker (e.g.:
                                            --pyExecutable
                                            /usr/local/bin/python3). The python
-                                           UDF worker depends on Python 3.5+,
-                                           Apache Beam (version == 2.23.0), Pip
+                                           UDF worker depends on Python 3.6+,
+                                           Apache Beam (version == 2.27.0), Pip
                                            (version >= 7.1.0) and SetupTools
                                            (version >= 37.0.0). Please ensure
                                            that the specified environment meets
diff --git a/docs/content/docs/flinkDev/building.md b/docs/content/docs/flinkDev/building.md
index b4d4af6..8525e39 100644
--- a/docs/content/docs/flinkDev/building.md
+++ b/docs/content/docs/flinkDev/building.md
@@ -66,11 +66,11 @@
 
     If you want to build a PyFlink package that can be used for pip installation, you need to build the Flink project first, as described in [Build Flink](#build-flink).
 
-2. Python version(3.5, 3.6, 3.7 or 3.8) is required
+2. Python version (3.6, 3.7 or 3.8) is required
 
     ```shell
     $ python --version
-    # the version printed here must be 3.5, 3.6, 3.7 or 3.8
+    # the version printed here must be 3.6, 3.7 or 3.8
     ```
 
 3. Build PyFlink with Cython extension support (optional)
diff --git a/docs/layouts/shortcodes/generated/python_configuration.html b/docs/layouts/shortcodes/generated/python_configuration.html
index e50ca14..7487a4f 100644
--- a/docs/layouts/shortcodes/generated/python_configuration.html
+++ b/docs/layouts/shortcodes/generated/python_configuration.html
@@ -24,7 +24,7 @@
             <td><h5>python.executable</h5></td>
             <td style="word-wrap: break-word;">"python"</td>
             <td>String</td>
-            <td>Specify the path of the python interpreter used to execute the python UDF worker. The python UDF worker depends on Python 3.5+, Apache Beam (version == 2.23.0), Pip (version &gt;= 7.1.0) and SetupTools (version &gt;= 37.0.0). Please ensure that the specified environment meets the above requirements. The option is equivalent to the command line option "-pyexec".</td>
+            <td>Specify the path of the python interpreter used to execute the python UDF worker. The python UDF worker depends on Python 3.6+, Apache Beam (version == 2.27.0), Pip (version &gt;= 7.1.0) and SetupTools (version &gt;= 37.0.0). Please ensure that the specified environment meets the above requirements. The option is equivalent to the command line option "-pyexec".</td>
         </tr>
         <tr>
             <td><h5>python.files</h5></td>
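The same option can be set programmatically; a sketch assuming a streaming Table environment, with the interpreter path taken from the example in the description above:

```python
from pyflink.table import EnvironmentSettings, StreamTableEnvironment

env_settings = EnvironmentSettings.new_instance().in_streaming_mode().build()
t_env = StreamTableEnvironment.create(environment_settings=env_settings)

# Equivalent to the "-pyexec" command line option; the interpreter it points
# to must be Python 3.6+ with Apache Beam 2.27.0 installed.
t_env.get_config().get_configuration().set_string(
    "python.executable", "/usr/local/bin/python3")
```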
diff --git a/flink-clients/src/main/java/org/apache/flink/client/cli/CliFrontendParser.java b/flink-clients/src/main/java/org/apache/flink/client/cli/CliFrontendParser.java
index ce4acc9..8334435 100644
--- a/flink-clients/src/main/java/org/apache/flink/client/cli/CliFrontendParser.java
+++ b/flink-clients/src/main/java/org/apache/flink/client/cli/CliFrontendParser.java
@@ -240,7 +240,7 @@
                     true,
                     "Specify the path of the python interpreter used to execute the python UDF worker "
                             + "(e.g.: --pyExecutable /usr/local/bin/python3). "
-                            + "The python UDF worker depends on Python 3.5+, Apache Beam (version == 2.23.0), "
+                            + "The python UDF worker depends on Python 3.6+, Apache Beam (version == 2.27.0), "
                             + "Pip (version >= 7.1.0) and SetupTools (version >= 37.0.0). "
                             + "Please ensure that the specified environment meets the above requirements.");
 
diff --git a/flink-python/README.md b/flink-python/README.md
index ec87028..d14aaab 100644
--- a/flink-python/README.md
+++ b/flink-python/README.md
@@ -16,7 +16,7 @@
 
 ## Python Requirements
 
-Apache Flink Python API depends on Py4J (currently version 0.10.8.1), CloudPickle (currently version 1.2.2), python-dateutil(currently version 2.8.0), Apache Beam (currently version 2.23.0) and jsonpickle (currently 1.2).
+The Apache Flink Python API depends on Py4J (currently version 0.10.8.1), CloudPickle (currently version 1.2.2), python-dateutil (currently version 2.8.0), Apache Beam (currently version 2.27.0) and jsonpickle (currently version 1.2).
 
 ## Development Notices
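A quick way to confirm the pinned versions in a given environment, assuming each package exposes `__version__` (apache-beam, py4j and cloudpickle all do):

```python
import apache_beam
import cloudpickle
import py4j

# After this change the expected Beam version is 2.27.0 (previously 2.23.0).
print("apache-beam:", apache_beam.__version__)
print("py4j:", py4j.__version__)
print("cloudpickle:", cloudpickle.__version__)
```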
 
diff --git a/flink-python/dev/build-wheels.sh b/flink-python/dev/build-wheels.sh
index d37c830..1d22ca2 100755
--- a/flink-python/dev/build-wheels.sh
+++ b/flink-python/dev/build-wheels.sh
@@ -19,7 +19,7 @@
 dev/lint-python.sh -s py_env
 
 PY_ENV_DIR=`pwd`/dev/.conda/envs
-py_env=("3.5" "3.6" "3.7" "3.8")
+py_env=("3.6" "3.7" "3.8")
 ## 2. install dependency
 for ((i=0;i<${#py_env[@]};i++)) do
     ${PY_ENV_DIR}/${py_env[i]}/bin/pip install -r dev/dev-requirements.txt
diff --git a/flink-python/dev/dev-requirements.txt b/flink-python/dev/dev-requirements.txt
index 08faa45..0c95755 100755
--- a/flink-python/dev/dev-requirements.txt
+++ b/flink-python/dev/dev-requirements.txt
@@ -14,5 +14,5 @@
 # limitations under the License.
 setuptools>=18.0
 wheel
-apache-beam==2.23.0
+apache-beam==2.27.0
 cython==0.29.16
diff --git a/flink-python/dev/lint-python.sh b/flink-python/dev/lint-python.sh
index 1d28da3..7ccf4ac 100755
--- a/flink-python/dev/lint-python.sh
+++ b/flink-python/dev/lint-python.sh
@@ -222,7 +222,7 @@
     if [[ ${BUILD_REASON} = 'IndividualCI' ]]; then
         py_env=("3.8")
     else
-        py_env=("3.5" "3.6" "3.7" "3.8")
+        py_env=("3.6" "3.7" "3.8")
     fi
     for ((i=0;i<${#py_env[@]};i++)) do
         if [ -d "$CURRENT_DIR/.conda/envs/${py_env[i]}" ]; then
@@ -397,7 +397,7 @@
     print_function "STEP" "install miniconda... [SUCCESS]"
 
     # step-3 install python environment which includes
-    # 3.5 3.6 3.7 3.8
+    # 3.6 3.7 3.8
     if [ $STEP -lt 3 ] && [ `need_install_component "py_env"` = true ]; then
         print_function "STEP" "installing python environment..."
         install_py_env
@@ -772,7 +772,7 @@
 -l          list all checks supported.
 Examples:
   ./lint-python -s basic        =>  install environment with basic components.
-  ./lint-python -s py_env       =>  install environment with python env(3.5,3.6,3.7,3.8).
+  ./lint-python -s py_env       =>  install environment with python env(3.6,3.7,3.8).
   ./lint-python -s all          =>  install environment with all components such as python env,tox,flake8,sphinx,mypy etc.
   ./lint-python -s tox,flake8   =>  install environment with tox,flake8.
   ./lint-python -s tox -f       =>  reinstall environment with tox.
diff --git a/flink-python/pom.xml b/flink-python/pom.xml
index d893664..cc9dee3 100644
--- a/flink-python/pom.xml
+++ b/flink-python/pom.xml
@@ -242,6 +242,13 @@
 				<artifactId>protobuf-java</artifactId>
 				<version>${protoc.version}</version>
 			</dependency>
+
+			<dependency>
+				<groupId>org.conscrypt</groupId>
+				<artifactId>conscrypt-openjdk-uber</artifactId>
+				<version>2.5.1</version>
+				<scope>runtime</scope>
+			</dependency>
 		</dependencies>
 	</dependencyManagement>
 
diff --git a/flink-python/pyflink/__init__.py b/flink-python/pyflink/__init__.py
index fce89be..9b751de 100644
--- a/flink-python/pyflink/__init__.py
+++ b/flink-python/pyflink/__init__.py
@@ -18,9 +18,9 @@
 import sys
 from functools import wraps
 
-if sys.version_info < (3, 5):
+if sys.version_info < (3, 6):
     raise RuntimeError(
-        'Python versions prior to 3.5 are not supported for PyFlink [' +
+        'Python versions prior to 3.6 are not supported for PyFlink [' +
         str(sys.version_info) + '].')
 
 
diff --git a/flink-python/pyflink/datastream/stream_execution_environment.py b/flink-python/pyflink/datastream/stream_execution_environment.py
index e5150f7..3ec0807 100644
--- a/flink-python/pyflink/datastream/stream_execution_environment.py
+++ b/flink-python/pyflink/datastream/stream_execution_environment.py
@@ -491,7 +491,7 @@
         .. note::
 
             Please make sure the uploaded python environment matches the platform that the cluster
-            is running on and that the python version must be 3.5 or higher.
+            is running on and that the python version is 3.6 or higher.
 
         .. note::
 
@@ -540,11 +540,11 @@
         .. note::
 
             Please make sure the uploaded python environment matches the platform that the cluster
-            is running on and that the python version must be 3.5 or higher.
+            is running on and that the python version is 3.6 or higher.
 
         .. note::
 
-            The python udf worker depends on Apache Beam (version == 2.23.0).
+            The python udf worker depends on Apache Beam (version == 2.27.0).
             Please ensure that the specified environment meets the above requirements.
 
         :param python_exec: The path of python interpreter.
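Typical usage of the methods whose docstrings change above; the archive path and its internal layout are illustrative:

```python
from pyflink.datastream import StreamExecutionEnvironment

env = StreamExecutionEnvironment.get_execution_environment()

# Ship a Python environment to the cluster and point the UDF worker at it.
# The interpreter inside the archive must be Python 3.6+ with Beam 2.27.0.
env.add_python_archive("/path/to/py_env.zip")
env.set_python_executable("py_env.zip/py_env/bin/python")
```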
diff --git a/flink-python/pyflink/table/table_config.py b/flink-python/pyflink/table/table_config.py
index 57a5abf..7305102 100644
--- a/flink-python/pyflink/table/table_config.py
+++ b/flink-python/pyflink/table/table_config.py
@@ -352,11 +352,11 @@
         .. note::
 
             Please make sure the uploaded python environment matches the platform that the cluster
-            is running on and that the python version must be 3.5 or higher.
+            is running on and that the python version is 3.6 or higher.
 
         .. note::
 
-            The python udf worker depends on Apache Beam (version == 2.23.0).
+            The python udf worker depends on Apache Beam (version == 2.27.0).
             Please ensure that the specified environment meets the above requirements.
 
         :param python_exec: The path of python interpreter.
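And the Table API counterpart via `TableConfig`, using the same example path:

```python
from pyflink.table import TableConfig

table_config = TableConfig()

# Same contract as on the DataStream side: the target interpreter must be
# Python 3.6+ and have Apache Beam 2.27.0 installed.
table_config.set_python_executable("/usr/local/bin/python3")
```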
diff --git a/flink-python/pyflink/table/tests/test_pandas_conversion.py b/flink-python/pyflink/table/tests/test_pandas_conversion.py
index f767fb2..4f20e24 100644
--- a/flink-python/pyflink/table/tests/test_pandas_conversion.py
+++ b/flink-python/pyflink/table/tests/test_pandas_conversion.py
@@ -138,7 +138,11 @@
         result_pdf = table.to_pandas()
         result_pdf.index = self.pdf.index
         self.assertEqual(2, len(result_pdf))
-        assert_frame_equal(self.pdf, result_pdf)
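+        # Compare record by record via assert_equal_field below: with the
+        # pandas>=1.0 pin that accompanies Beam 2.27.0, frames holding nested
+        # dict/ndarray fields are not reliably comparable with
+        # assert_frame_equal.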
+        expected_arrow = self.pdf.to_records(index=False)
+        result_arrow = result_pdf.to_records(index=False)
+        for r in range(len(expected_arrow)):
+            for e in range(len(expected_arrow[r])):
+                self.assert_equal_field(expected_arrow[r][e], result_arrow[r][e])
 
     def test_empty_to_pandas(self):
         table = self.t_env.from_pandas(self.pdf, self.data_type)
@@ -155,6 +159,18 @@
         result_pdf = table.group_by("f2").select("max(f1) as f2").to_pandas()
         assert_frame_equal(result_pdf, pd.DataFrame(data={'f2': np.int8([1, 1])}))
 
+    def assert_equal_field(self, expected_field, result_field):
+        import numpy as np
+        result_type = type(result_field)
+        if result_type == dict:
+            self.assertEqual(expected_field.keys(), result_field.keys())
+            for key in expected_field:
+                self.assert_equal_field(expected_field[key], result_field[key])
+        elif result_type == np.ndarray:
+            self.assertTrue((expected_field == result_field).all())
+        else:
+            self.assertTrue(expected_field == result_field)
+
 
 class StreamPandasConversionTests(PandasConversionITTests,
                                   PyFlinkOldStreamTableTestCase):
diff --git a/flink-python/setup.py b/flink-python/setup.py
index df97d88..1fc80f1 100644
--- a/flink-python/setup.py
+++ b/flink-python/setup.py
@@ -28,8 +28,8 @@
 
 from setuptools import setup, Extension
 
-if sys.version_info < (3, 5):
-    print("Python versions prior to 3.5 are not supported for PyFlink.",
+if sys.version_info < (3, 6):
+    print("Python versions prior to 3.6 are not supported for PyFlink.",
           file=sys.stderr)
     sys.exit(-1)
 
@@ -321,12 +321,11 @@
         license='https://www.apache.org/licenses/LICENSE-2.0',
         author='Apache Software Foundation',
         author_email='dev@flink.apache.org',
-        python_requires='>=3.5',
-        install_requires=['py4j==0.10.8.1', 'python-dateutil==2.8.0', 'apache-beam==2.23.0',
-                          'cloudpickle==1.2.2', 'avro-python3>=1.8.1,<=1.9.1', 'jsonpickle==1.2',
-                          'pandas>=0.24.2,<1; python_full_version < "3.5.3"',
-                          'pandas>=0.25.2,<1; python_full_version >= "3.5.3"',
-                          'pyarrow>=0.15.1,<0.18.0', 'pytz>=2018.3', 'numpy>=1.14.3,<1.20'],
+        python_requires='>=3.6',
+        install_requires=['py4j==0.10.8.1', 'python-dateutil==2.8.0', 'apache-beam==2.27.0',
+                          'cloudpickle==1.2.2', 'avro-python3>=1.8.1,!=1.9.2,<1.10.0',
+                          'jsonpickle==1.2', 'pandas>=1.0,<1.2.0', 'pyarrow>=0.15.1,<3.0.0',
+                          'pytz>=2018.3', 'numpy>=1.14.3,<1.20', 'fastavro>=0.21.4,<0.24'],
         cmdclass={'build_ext': build_ext},
         tests_require=['pytest==4.4.1'],
         description='Apache Flink Python API',
@@ -336,7 +335,6 @@
         classifiers=[
             'Development Status :: 5 - Production/Stable',
             'License :: OSI Approved :: Apache Software License',
-            'Programming Language :: Python :: 3.5',
             'Programming Language :: Python :: 3.6',
             'Programming Language :: Python :: 3.7',
             'Programming Language :: Python :: 3.8'],
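A minimal sketch for checking an existing environment against the new pins, using `pkg_resources` from setuptools (the list below copies the ranges from `install_requires`):

```python
import pkg_resources

# Tightened/widened pins from install_requires above.
for requirement in ["apache-beam==2.27.0",
                    "pandas>=1.0,<1.2.0",
                    "pyarrow>=0.15.1,<3.0.0"]:
    pkg_resources.require(requirement)  # raises VersionConflict if unmet
```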
diff --git a/flink-python/src/main/java/org/apache/flink/python/PythonOptions.java b/flink-python/src/main/java/org/apache/flink/python/PythonOptions.java
index bcde0ab..098a6b6 100644
--- a/flink-python/src/main/java/org/apache/flink/python/PythonOptions.java
+++ b/flink-python/src/main/java/org/apache/flink/python/PythonOptions.java
@@ -111,8 +111,8 @@
                     .defaultValue("python")
                     .withDescription(
                             "Specify the path of the python interpreter used to execute the python "
-                                    + "UDF worker. The python UDF worker depends on Python 3.5+, Apache Beam "
-                                    + "(version == 2.23.0), Pip (version >= 7.1.0) and SetupTools (version >= 37.0.0). "
+                                    + "UDF worker. The python UDF worker depends on Python 3.6+, Apache Beam "
+                                    + "(version == 2.27.0), Pip (version >= 7.1.0) and SetupTools (version >= 37.0.0). "
                                     + "Please ensure that the specified environment meets the above requirements. The "
                                     + "option is equivalent to the command line option \"-pyexec\".");
 
diff --git a/flink-python/src/main/resources/META-INF/NOTICE b/flink-python/src/main/resources/META-INF/NOTICE
index d97c286..895559f 100644
--- a/flink-python/src/main/resources/META-INF/NOTICE
+++ b/flink-python/src/main/resources/META-INF/NOTICE
@@ -16,15 +16,15 @@
 - org.apache.arrow:arrow-format:0.16.0
 - org.apache.arrow:arrow-memory:0.16.0
 - org.apache.arrow:arrow-vector:0.16.0
-- org.apache.beam:beam-model-fn-execution:2.23.0
-- org.apache.beam:beam-model-job-management:2.23.0
-- org.apache.beam:beam-model-pipeline:2.23.0
-- org.apache.beam:beam-runners-core-construction-java:2.23.0
-- org.apache.beam:beam-runners-core-java:2.23.0
-- org.apache.beam:beam-runners-java-fn-execution:2.23.0
-- org.apache.beam:beam-sdks-java-core:2.23.0
-- org.apache.beam:beam-sdks-java-fn-execution:2.23.0
-- org.apache.beam:beam-vendor-sdks-java-extensions-protobuf:2.23.0
+- org.apache.beam:beam-model-fn-execution:2.27.0
+- org.apache.beam:beam-model-job-management:2.27.0
+- org.apache.beam:beam-model-pipeline:2.27.0
+- org.apache.beam:beam-runners-core-construction-java:2.27.0
+- org.apache.beam:beam-runners-core-java:2.27.0
+- org.apache.beam:beam-runners-java-fn-execution:2.27.0
+- org.apache.beam:beam-sdks-java-core:2.27.0
+- org.apache.beam:beam-sdks-java-fn-execution:2.27.0
+- org.apache.beam:beam-vendor-sdks-java-extensions-protobuf:2.27.0
 - org.apache.beam:beam-vendor-guava-26_0-jre:0.1
 - org.apache.beam:beam-vendor-grpc-1_26_0:0.3
 
@@ -53,19 +53,19 @@
 - io.grpc:grpc-protobuf:1.26.0
 - io.grpc:grpc-stub:1.26.0
 - io.grpc:grpc-testing:1.26.0
-- io.netty:netty-buffer:4.1.42.Final
-- io.netty:netty-codec:4.1.42.Final
-- io.netty:netty-codec-http:4.1.42.Final
-- io.netty:netty-codec-http2:4.1.42.Final
-- io.netty:netty-codec-socks:4.1.42.Final
-- io.netty:netty-common:4.1.42.Final
-- io.netty:netty-handler:4.1.42.Final
-- io.netty:netty-handler-proxy:4.1.42.Final
-- io.netty:netty-resolver:4.1.42.Final
-- io.netty:netty-transport:4.1.42.Final
-- io.netty:netty-transport-native-epoll:4.1.42.Final
-- io.netty:netty-transport-native-unix-common:4.1.42.Final
-- io.netty:netty-tcnative-boringssl-static:2.0.26.Final
+- io.netty:netty-buffer:4.1.51.Final
+- io.netty:netty-codec:4.1.51.Final
+- io.netty:netty-codec-http:4.1.51.Final
+- io.netty:netty-codec-http2:4.1.51.Final
+- io.netty:netty-codec-socks:4.1.51.Final
+- io.netty:netty-common:4.1.51.Final
+- io.netty:netty-handler:4.1.51.Final
+- io.netty:netty-handler-proxy:4.1.51.Final
+- io.netty:netty-resolver:4.1.51.Final
+- io.netty:netty-transport:4.1.51.Final
+- io.netty:netty-transport-native-epoll:4.1.51.Final
+- io.netty:netty-transport-native-unix-common:4.1.51.Final
+- io.netty:netty-tcnative-boringssl-static:2.0.33.Final
 - io.opencensus:opencensus-api:0.24.0
 - io.opencensus:opencensus-contrib-grpc-metrics:0.24.0
 - io.perfmark:perfmark-api:0.19.0
diff --git a/flink-python/tox.ini b/flink-python/tox.ini
index c1eb6b2..e2acf2a 100644
--- a/flink-python/tox.ini
+++ b/flink-python/tox.ini
@@ -21,14 +21,14 @@
 # in multiple virtualenvs. This configuration file will run the
 # test suite on all supported python versions.
 # new environments will be excluded by default unless explicitly added to envlist.
-envlist = {py35, py36, py37, py38}-cython
+envlist = {py36, py37, py38}-cython
 
 [testenv]
 whitelist_externals=
     /bin/bash
 deps =
     pytest
-    apache-beam==2.23.0
+    apache-beam==2.27.0
     cython==0.29.16
     grpcio>=1.17.0,<=1.26.0
     grpcio-tools>=1.3.5,<=1.14.2
diff --git a/pom.xml b/pom.xml
index f9ee93b..89eae9a 100644
--- a/pom.xml
+++ b/pom.xml
@@ -133,7 +133,7 @@
 		<powermock.version>2.0.4</powermock.version>
 		<hamcrest.version>1.3</hamcrest.version>
 		<py4j.version>0.10.8.1</py4j.version>
-		<beam.version>2.23.0</beam.version>
+		<beam.version>2.27.0</beam.version>
 		<protoc.version>3.11.1</protoc.version>
 		<arrow.version>0.16.0</arrow.version>
 		<japicmp.skip>false</japicmp.skip>
diff --git a/tools/releasing/create_binary_release.sh b/tools/releasing/create_binary_release.sh
index f035329..28a59b1 100755
--- a/tools/releasing/create_binary_release.sh
+++ b/tools/releasing/create_binary_release.sh
@@ -112,8 +112,8 @@
   cp ${pyflink_actual_name} "${PYTHON_RELEASE_DIR}/${pyflink_release_name}"
 
   wheel_packages_num=0
-  # py35,py36,py37,py38 for mac and linux (8 wheel packages)
-  EXPECTED_WHEEL_PACKAGES_NUM=8
+  # py36,py37,py38 for mac and linux (6 wheel packages)
+  EXPECTED_WHEEL_PACKAGES_NUM=6
   # Need to move the downloaded wheel packages from Azure CI to the directory flink-python/dist manually.
   for wheel_file in *.whl; do
     if [[ ! ${wheel_file} =~ ^apache_flink-$PYFLINK_VERSION- ]]; then