Python 3 upgrade (#3522)

* Initial Python 3 upgrade effort

* Fixes towards python3 support

 * update heronpy release scripts for python3
 * update dist Dockerfiles to only use python3
 * remove python2 from docker images
 * upgrade pylint for python3 support
 * upgrade PEX so transitive dependencies are captured

Additionally:
 * fix Ubuntu 16.04 images
 * fix linting issues found by newer pylint

There is an encapsulation issue in the builds where the global python3 environment is used
while PEX installs a nested transitive dependency of pylint (`pylint>astroid>wrapt`). This seems
to be caused by logic in wrapt's setup.py, which can be disabled with `WRAPT_INSTALL_EXTENSIONS=false`.
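
A minimal sketch of the workaround, assuming the pylint pex is built by a small
Python wrapper (the helper and output names below are hypothetical, not the
actual Bazel integration in this repo):

```python
import os
import subprocess

def build_pylint_pex(output_path):
    """Build a pylint pex with wrapt's setup.py extension logic disabled.

    WRAPT_INSTALL_EXTENSIONS=false tells wrapt's setup.py to skip building
    its optional C extension, the setup.py logic referred to above.
    """
    env = dict(os.environ, WRAPT_INSTALL_EXTENSIONS="false")
    subprocess.run(["pex", "pylint", "-o", output_path], env=env, check=True)

if __name__ == "__main__":
    build_pylint_pex("pylint.pex")
```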

* Fix new pylint issues

* update setuptools

* Make pex_pytest non-zip-safe

* Rough proto_library fix

The issue encountered was https://github.com/protocolbuffers/protobuf/issues/1491, which
may be fixed by a pending PR to protoc, or by switching to the official protobuf rules
and the `import_prefix` parameter of `proto_library`.
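
A minimal sketch of the second alternative, assuming a switch to the official
rules_proto rules (a Starlark BUILD file in Python syntax; the target and prefix
names are hypothetical):

```python
# BUILD.bazel
load("@rules_proto//proto:defs.bzl", "proto_library")

proto_library(
    name = "topology_proto",
    srcs = ["topology.proto"],
    # import_prefix places the .proto (and its generated code) under a
    # package path, the alternative mentioned above for protobuf issue 1491.
    import_prefix = "heron/proto",
)
```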

* WIP: Fix python3 incompatibilities

 * bytes vs str issues
 * update kazoo
 * order of processes in executor test changed due to dict ordering?
 * some places needed `/` switched to `//`; there may be more not caught by tests (see the sketch below)
 * add travis_wait as some stages were going over 10 minutes without output in CI

TODO:
 * make sure the kazoo upgrade is correct; it was done only by updating the package version
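
A minimal, illustrative sketch of the bytes-vs-str and `/` vs `//` changes (not
code from this patch):

```python
import subprocess

# bytes vs str: output read from sockets and subprocess pipes is bytes in
# Python 3 and has to be decoded before being treated as text.
raw = subprocess.check_output(["echo", "hello"])  # bytes
text = raw.decode()                               # str

# integer division: "/" always returns a float in Python 3, so code that
# relied on truncating division needs "//".
assert 7 / 2 == 3.5
assert 7 // 2 == 3
```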

* Try fixing build time issue in travis

* Upgrade docker rules

* Upgrade to python3 in CI

* Fix python integration tests

* Fix more bytes vs str errors + update vagrant

* Update Travis to Python 3.7 + fix Vagrant on Mac

* Reduce requirement to python3.6 + py3 fixes

 * use `universal_newlines` instead of `text` in `Popen` for py3.6
 * fix bytes/str issues in deserialisation
 * fix file open modes
 * use set instead of sets.Set
 * fix the `__import__` `level` default (sketch of these fixes below)
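
A minimal, illustrative sketch of these 3.6-compatible fixes (not the actual
patch code):

```python
import subprocess
import tempfile

# Popen: the text= keyword only exists from Python 3.7, so on 3.6 the code
# passes universal_newlines=True to get str instead of bytes from the pipes.
proc = subprocess.Popen(["python3", "-V"], stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT, universal_newlines=True)
out, _ = proc.communicate()
print(out.strip())

# File open modes: binary payloads need an explicit "wb"/"rb" mode, and
# reading them back yields bytes rather than str.
with tempfile.NamedTemporaryFile(suffix=".defn", delete=False) as f:
    f.write(b"\x00\x01payload")
with open(f.name, "rb") as f:
    assert isinstance(f.read(), bytes)

# sets.Set was removed in Python 3; the built-in set replaces it.
seen = set()
seen.add("word")

# __import__: the Python 2 default of level=-1 is gone; level=0 means an
# absolute import in Python 3.
mod = __import__("json", fromlist=["loads"], level=0)
print(mod.loads('{"ok": true}'))
```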

* Update cloudpickle

* Fix python addressing of release.yaml

* Additions to get docker image builds working and tested

 * use new external pkg_* rules
 * add python to compile docker images until pkg_*
 * add --host_force_python=PY3 to other bazel.rc files

* WIP: Add CI for docker images/releases

 * use kind to create ephemeral clusters
 * start consolidating scripts with python

* Fix helm chart

* bytes vs str fix

* Mention Python 3.6 requirement in README.md

* update dockerfile

Co-authored-by: Neng Lu <nlu@twitter.com>
Co-authored-by: Nicholas Nezis <nicholas.nezis@gmail.com>
Co-authored-by: bed debug <huijunwu@users.noreply.github.com>
Co-authored-by: huijunwu <huijun.wu.2010@gmail.com>
diff --git a/.travis.yml b/.travis.yml
index 9819a77..caa4046 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -16,9 +16,10 @@
     packages:
       - libtool-bin
       - libcppunit-dev
+      - python3
       - pkg-config
-      - python-dev
-      - python-wheel
+      - python3-dev
+      - python3-wheel
       - wget
       - zip
       - zlib1g-dev
@@ -34,6 +35,10 @@
   - chmod +x bazel-${BAZEL_VERSION}-installer-linux-x86_64.sh
   - ./bazel-${BAZEL_VERSION}-installer-linux-x86_64.sh --user
 
+install:
+  - sudo apt-get install python3-pip python3-setuptools
+  - pip3 install travis-wait-improved
+
 script:
   - which gcc
   - gcc --version
@@ -41,6 +46,6 @@
   - g++ --version
   - which python
   - python -V
-  - which python2.7
-  - python2.7 -V
-  - scripts/travis/ci.sh
\ No newline at end of file
+  - which python3
+  - python3 -V
+  - travis-wait-improved --timeout=180m scripts/travis/ci.sh
diff --git a/README.md b/README.md
index 77ab0de..77f949d 100644
--- a/README.md
+++ b/README.md
@@ -31,7 +31,7 @@
 
 #### Heron Requirements:
  * Java 11
- * Python 2.7
+ * Python 3.6
  * Bazel 3.0.0
 
 ## Contact
diff --git a/WORKSPACE b/WORKSPACE
index 41cb1db..83b6702 100644
--- a/WORKSPACE
+++ b/WORKSPACE
@@ -171,7 +171,7 @@
 # pip_repositories()
 
 # for pex repos
-PEX_SRC = "https://pypi.python.org/packages/3a/1d/cd41cd3765b78a4353bbf27d18b099f7afbcd13e7f2dc9520f304ec8981c/pex-1.2.15.tar.gz"
+PEX_WHEEL = "https://pypi.python.org/packages/18/92/99270775cfc5ddb60c19588de1c475f9ff2837a6e0bbd5eaa5286a6a472b/pex-2.1.9-py2.py3-none-any.whl"
 
 PY_WHEEL = "https://pypi.python.org/packages/53/67/9620edf7803ab867b175e4fd23c7b8bd8eba11cb761514dcd2e726ef07da/py-1.4.34-py2.py3-none-any.whl"
 
@@ -179,7 +179,7 @@
 
 REQUESTS_SRC = "https://pypi.python.org/packages/d9/03/155b3e67fe35fe5b6f4227a8d9e96a14fda828b18199800d161bcefc1359/requests-2.12.3.tar.gz"
 
-SETUPTOOLS_SRC = "https://pypi.python.org/packages/68/13/1bfbfbd86560e61fa9803d241084fff41a775bf56ee8b3ad72fc9e550dad/setuptools-31.0.0.tar.gz"
+SETUPTOOLS_WHEEL = "https://pypi.python.org/packages/a0/df/635cdb901ee4a8a42ec68e480c49f85f4c59e8816effbf57d9e6ee8b3588/setuptools-46.1.3-py3-none-any.whl"
 
 VIRTUALENV_SRC = "https://pypi.python.org/packages/d4/0c/9840c08189e030873387a73b90ada981885010dd9aea134d6de30cd24cb8/virtualenv-15.1.0.tar.gz"
 
@@ -210,9 +210,9 @@
 
 http_file(
     name = "pex_src",
-    downloaded_file_path = "pex-1.2.15.tar.gz",
-    sha256 = "0147d19123340677b9793b00ec86fe65b6697db3ec99afb796da2300ae5fec14",
-    urls = [PEX_SRC],
+    downloaded_file_path = "pex-2.1.9-py2.py3-none-any.whl",
+    sha256 = "5cad8d960c187541f71682fc938a843ef9092aab46f27b33ace7e570325e2626",
+    urls = [PEX_WHEEL],
 )
 
 http_file(
@@ -223,10 +223,10 @@
 )
 
 http_file(
-    name = "setuptools_src",
-    downloaded_file_path = "setuptools-31.0.0.tar.gz",
-    sha256 = "0818cc0de692c3a5c83ca83aa7ec7ba6bc206f278735f1e0267b8d0e095cfe7a",
-    urls = [SETUPTOOLS_SRC],
+    name = "setuptools_wheel",
+    downloaded_file_path = "setuptools-46.1.3-py3-none-any.whl",
+    sha256 = "4fe404eec2738c20ab5841fa2d791902d2a645f32318a7850ef26f8d7215a8ee",
+    urls = [SETUPTOOLS_WHEEL],
 )
 
 http_archive(
@@ -367,11 +367,12 @@
 # end helm
 
 # for docker image building
+DOCKER_RULES_VERSION = "0.14.1"
 http_archive(
     name = "io_bazel_rules_docker",
-    sha256 = "aed1c249d4ec8f703edddf35cbe9dfaca0b5f5ea6e4cd9e83e99f3b0d1136c3d",
-    strip_prefix = "rules_docker-0.7.0",
-    urls = ["https://github.com/bazelbuild/rules_docker/archive/v0.7.0.tar.gz"],
+    sha256 = "dc97fccceacd4c6be14e800b2a00693d5e8d07f69ee187babfd04a80a9f8e250",
+    strip_prefix = "rules_docker-%s" % DOCKER_RULES_VERSION,
+    urls = ["https://github.com/bazelbuild/rules_docker/archive/v%s.tar.gz" % DOCKER_RULES_VERSION],
 )
 
 load(
@@ -381,6 +382,10 @@
 
 container_repositories()
 
+load("@io_bazel_rules_docker//repositories:deps.bzl", container_deps = "deps")
+
+container_deps()
+
 load(
     "@io_bazel_rules_docker//container:container.bzl",
     "container_pull",
@@ -391,11 +396,19 @@
     digest = "sha256:495800e9eb001dfd2fb41d1941155203bb9be06b716b0f8b1b0133eb12ea813c",
     registry = "index.docker.io",
     repository = "heron/base",
-    tag = "0.4.0",
+    tag = "0.5.0",
 )
 
 # end docker image building
 
+http_archive(
+    name = "rules_pkg",
+    url = "https://github.com/bazelbuild/rules_pkg/releases/download/0.2.5/rules_pkg-0.2.5.tar.gz",
+    sha256 = "352c090cc3d3f9a6b4e676cf42a6047c16824959b438895a76c2989c6d7c246a",
+)
+load("@rules_pkg//:deps.bzl", "rules_pkg_dependencies")
+rules_pkg_dependencies()
+
 # for nomad repear
 http_archive(
     name = "nomad_mac",
diff --git a/bazel_configure.py b/bazel_configure.py
index 02e67f1..63d1248 100755
--- a/bazel_configure.py
+++ b/bazel_configure.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # Licensed to the Apache Software Foundation (ASF) under one
 # or more contributor license agreements.  See the NOTICE file
 # distributed with this work for additional information
@@ -413,7 +413,7 @@
   env_map['AUTOMAKE'] = discover_tool('automake', 'Automake', 'AUTOMAKE', '1.9.6')
   env_map['AUTOCONF'] = discover_tool('autoconf', 'Autoconf', 'AUTOCONF', '2.6.3')
   env_map['MAKE'] = discover_tool('make', 'Make', 'MAKE', '3.81')
-  env_map['PYTHON'] = discover_tool('python', 'Python', 'PYTHON', '2.7')
+  env_map['PYTHON3'] = discover_tool('python3', 'Python3', 'PYTHON3', '3.4')
 
   if platform == 'Darwin':
     env_map['LIBTOOL'] = discover_tool('glibtool', 'Libtool', 'LIBTOOL', '2.4.2')
diff --git a/config/configure.ac b/config/configure.ac
index b7ab890..caba2fa 100644
--- a/config/configure.ac
+++ b/config/configure.ac
@@ -100,8 +100,8 @@
 ACX_PTHREAD
 
 # Check the python version required
-AM_PATH_PYTHON([2.4.3])
-AC_PATH_PROG([PYTHON], [python], [],[])
+AM_PATH_PYTHON([3.4])
+AC_PATH_PROG([PYTHON3], [python3], [],[])
 
 abs_top_builddir=`pwd`
 AC_SUBST(abs_top_builddir)
diff --git a/deploy/kubernetes/minikube/apiserver.yaml b/deploy/kubernetes/minikube/apiserver.yaml
index 502f746..b6e8df7 100644
--- a/deploy/kubernetes/minikube/apiserver.yaml
+++ b/deploy/kubernetes/minikube/apiserver.yaml
@@ -64,7 +64,7 @@
       initContainers:
         - name: init-heron-apiserver
           image: apache/bookkeeper:4.7.3
-          command: ['sh', '-c', '/opt/bookkeeper/bin/dlog admin bind -l /ledgers -s zookeeper:2181 -c distributedlog://zookeeper:2181/heron']
+          command: ['sh', '-c', '/opt/bookkeeper/bin/dlog admin bind -l /ledgers -s zookeeper:2181 -c distributedlog://zookeeper:2181/heronbkdl']
       containers:
         - name: heron-apiserver
           image: heron/heron:latest
@@ -79,9 +79,9 @@
               -D heron.executor.docker.image=heron/heron:latest
               -D heron.class.uploader=org.apache.heron.uploader.dlog.DLUploader
               -D heron.uploader.dlog.topologies.num.replicas=1
-              -D heron.uploader.dlog.topologies.namespace.uri=distributedlog://zookeeper:2181/heron
+              -D heron.uploader.dlog.topologies.namespace.uri=distributedlog://zookeeper:2181/heronbkdl
               -D heron.statefulstorage.classname=org.apache.heron.statefulstorage.dlog.DlogStorage
-              -D heron.statefulstorage.dlog.namespace.uri=distributedlog://zookeeper:2181/heron
+              -D heron.statefulstorage.dlog.namespace.uri=distributedlog://zookeeper:2181/heronbkdl
 
 ---
 apiVersion: v1
diff --git a/deploy/kubernetes/minikube/bookkeeper.yaml b/deploy/kubernetes/minikube/bookkeeper.yaml
index f5778c3..9e1d80f 100644
--- a/deploy/kubernetes/minikube/bookkeeper.yaml
+++ b/deploy/kubernetes/minikube/bookkeeper.yaml
@@ -65,6 +65,11 @@
           envFrom:
             - configMapRef:
                 name: bookie-config
+          volumeMounts:
+            - name: journal-disk
+              mountPath: /bookkeeper/data/journal
+            - name: ledgers-disk
+              mountPath: /bookkeeper/data/ledgers
       containers:
         - name: bookie
           image: apache/bookkeeper:4.7.3
@@ -91,13 +96,11 @@
               valueFrom:
                 fieldRef:
                   fieldPath: status.hostIP
-    
           volumeMounts:
             - name: journal-disk
               mountPath: /bookkeeper/data/journal
             - name: ledgers-disk
               mountPath: /bookkeeper/data/ledgers
-
       volumes:
           # Mount local disks
         - name: journal-disk
diff --git a/docker/base/Dockerfile.base.debian9 b/docker/base/Dockerfile.base.debian9
index 3a4ba7f..652f6e7 100644
--- a/docker/base/Dockerfile.base.debian9
+++ b/docker/base/Dockerfile.base.debian9
@@ -19,7 +19,7 @@
 
 RUN apt-get -y update && apt-get -y install \
     netcat-openbsd \
-    python \
+    python3 \
     unzip \
     curl \
     supervisor && \
diff --git a/docker/compile/Dockerfile.centos7 b/docker/compile/Dockerfile.centos7
index c117ee9..5f39bde 100644
--- a/docker/compile/Dockerfile.centos7
+++ b/docker/compile/Dockerfile.centos7
@@ -35,8 +35,8 @@
       libtool \
       make \
       patch \
-      python-devel \
-      cppunit-devel \
+      python \
+      python3-devel \
       zip \
       unzip \
       wget \
diff --git a/docker/compile/Dockerfile.debian10 b/docker/compile/Dockerfile.debian10
index 2731995..c691831 100644
--- a/docker/compile/Dockerfile.debian10
+++ b/docker/compile/Dockerfile.debian10
@@ -33,9 +33,10 @@
       libcppunit-dev \
       pkg-config \
       python \
-      python-dev \
+      python3 \
+      python3-dev \
       software-properties-common \
-      python-setuptools \
+      python3-setuptools \
       tree \
       zip \
       unzip \
diff --git a/docker/compile/Dockerfile.debian9 b/docker/compile/Dockerfile.debian9
index f5e7b11..ffe96bc 100644
--- a/docker/compile/Dockerfile.debian9
+++ b/docker/compile/Dockerfile.debian9
@@ -32,9 +32,9 @@
       libtool-bin \
       libcppunit-dev \
       pkg-config \
-      python-dev \
-      python3-dev \
       software-properties-common \
+      python \
+      python3-dev \
       python3-setuptools \
       tree \
       zip \
diff --git a/docker/compile/Dockerfile.ubuntu14.04 b/docker/compile/Dockerfile.ubuntu14.04
index 521cce9..2bf2e4d 100644
--- a/docker/compile/Dockerfile.ubuntu14.04
+++ b/docker/compile/Dockerfile.ubuntu14.04
@@ -33,7 +33,7 @@
       libssl-dev \
       git \
       libtool \
-      python-dev \
+      python3-dev \
       pkg-config \
       libcppunit-dev \
       zip \
diff --git a/docker/compile/Dockerfile.ubuntu16.04 b/docker/compile/Dockerfile.ubuntu16.04
index dbf7b70..f3f6fb5 100644
--- a/docker/compile/Dockerfile.ubuntu16.04
+++ b/docker/compile/Dockerfile.ubuntu16.04
@@ -30,17 +30,18 @@
       build-essential \
       cmake \
       curl \
-      libssl-dev \
       git \
+      libssl-dev \
       libtool-bin \
-      libunwind8 \
       libunwind-setjmp0-dev \
+      python \
+      python3-dev \
       pkg-config \
-      python-dev \
       libcppunit-dev \
+      software-properties-common \
       tree \
-      zip \
       unzip \
+      zip \
       wget
 
 RUN apt-get update && apt-get -y install \
diff --git a/docker/compile/Dockerfile.ubuntu18.04 b/docker/compile/Dockerfile.ubuntu18.04
index 2f115cf..b0c71ea 100644
--- a/docker/compile/Dockerfile.ubuntu18.04
+++ b/docker/compile/Dockerfile.ubuntu18.04
@@ -29,7 +29,8 @@
       libunwind8 \
       libcppunit-dev \
       patch \
-      python-dev \
+      python \
+      python3-dev \
       pkg-config \
       wget \
       zip \
diff --git a/docker/compile/Dockerfile.ubuntu20.04 b/docker/compile/Dockerfile.ubuntu20.04
index c963910..a028c4e 100644
--- a/docker/compile/Dockerfile.ubuntu20.04
+++ b/docker/compile/Dockerfile.ubuntu20.04
@@ -31,7 +31,8 @@
       libunwind8 \
       libcppunit-dev \
       patch \
-      python-dev \
+      python3-dev \
+      python \
       pkg-config \
       wget \
       zip \
diff --git a/docker/dist/Dockerfile.dist.centos7 b/docker/dist/Dockerfile.dist.centos7
index 2124a1f..002afd6 100644
--- a/docker/dist/Dockerfile.dist.centos7
+++ b/docker/dist/Dockerfile.dist.centos7
@@ -24,7 +24,8 @@
     java-11-openjdk-headless \
     supervisor \
     nmap-ncat \
-    python \
+    python3 \
+    python3-setuptools \
     unzip \
     which \
     && yum clean all
diff --git a/docker/dist/Dockerfile.dist.debian10 b/docker/dist/Dockerfile.dist.debian10
index 8f1caa3..236fbcc 100644
--- a/docker/dist/Dockerfile.dist.debian10
+++ b/docker/dist/Dockerfile.dist.debian10
@@ -21,7 +21,8 @@
     && apt-get -y install \
     curl \
     netcat-openbsd \
-    python \
+    python3 \
+    python3-dev \
     supervisor \
     unzip \
     && apt-get clean
diff --git a/docker/dist/Dockerfile.dist.debian9 b/docker/dist/Dockerfile.dist.debian9
index bf89741..b525722 100644
--- a/docker/dist/Dockerfile.dist.debian9
+++ b/docker/dist/Dockerfile.dist.debian9
@@ -21,7 +21,8 @@
     && apt-get -y install \
     curl \
     netcat-openbsd \
-    python \
+    python3 \
+    python3-dev \
     supervisor \
     unzip \
     && apt-get clean all \
diff --git a/docker/dist/Dockerfile.dist.ubuntu14.04 b/docker/dist/Dockerfile.dist.ubuntu14.04
index 676efa4..4bc93e0 100644
--- a/docker/dist/Dockerfile.dist.ubuntu14.04
+++ b/docker/dist/Dockerfile.dist.ubuntu14.04
@@ -21,7 +21,8 @@
     && apt-get -y install \
     curl \
     netcat-openbsd \
-    python \
+    python3 \
+    python3-distutils \
     software-properties-common \
     supervisor \
     unzip \
diff --git a/docker/dist/Dockerfile.dist.ubuntu16.04 b/docker/dist/Dockerfile.dist.ubuntu16.04
index 1c5d0ea..deb97d8 100644
--- a/docker/dist/Dockerfile.dist.ubuntu16.04
+++ b/docker/dist/Dockerfile.dist.ubuntu16.04
@@ -21,7 +21,8 @@
     && apt-get install -y \
     curl \
     netcat-openbsd \
-    python \
+    python3 \
+    python3-distutils \
     software-properties-common \
     supervisor \
     unzip \
diff --git a/docker/dist/Dockerfile.dist.ubuntu18.04 b/docker/dist/Dockerfile.dist.ubuntu18.04
index 5d4f664..7eaa7f8 100644
--- a/docker/dist/Dockerfile.dist.ubuntu18.04
+++ b/docker/dist/Dockerfile.dist.ubuntu18.04
@@ -22,7 +22,8 @@
     curl \
     netcat-openbsd \
     openjdk-11-jre-headless \
-    python \
+    python3 \
+    python3-distutils \
     supervisor \
     unzip \
     && apt-get clean
diff --git a/docker/dist/Dockerfile.dist.ubuntu20.04 b/docker/dist/Dockerfile.dist.ubuntu20.04
index 0b43260..8dc6224 100644
--- a/docker/dist/Dockerfile.dist.ubuntu20.04
+++ b/docker/dist/Dockerfile.dist.ubuntu20.04
@@ -24,7 +24,8 @@
     curl \
     openjdk-11-jre-headless \
     netcat-openbsd \
-    python \
+    python3 \
+    python3-distutils \
     supervisor \
     unzip \
     && apt-get clean
diff --git a/docker/test/Dockerfile.centos7 b/docker/test/Dockerfile.centos7
index 3a36273..d2ad420 100644
--- a/docker/test/Dockerfile.centos7
+++ b/docker/test/Dockerfile.centos7
@@ -36,7 +36,7 @@
       cppunit-devel \
       make \
       patch \
-      python-devel \
+      python3-devel \
       python3-devel \
       python3-setuptools \
       zip \
diff --git a/docker/test/Dockerfile.ubuntu18.04 b/docker/test/Dockerfile.ubuntu18.04
index 2586947..a62a77a 100644
--- a/docker/test/Dockerfile.ubuntu18.04
+++ b/docker/test/Dockerfile.ubuntu18.04
@@ -29,7 +29,6 @@
       libunwind8 \
       libcppunit-dev \
       patch \
-      python-dev \
       python3-dev \
       wget \
       zip \
diff --git a/examples/src/python/bolt/consume_bolt.py b/examples/src/python/bolt/consume_bolt.py
index 4a8f70b..55df344 100644
--- a/examples/src/python/bolt/consume_bolt.py
+++ b/examples/src/python/bolt/consume_bolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/examples/src/python/bolt/count_bolt.py b/examples/src/python/bolt/count_bolt.py
index 34ac1e8..0a9a33a 100644
--- a/examples/src/python/bolt/count_bolt.py
+++ b/examples/src/python/bolt/count_bolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/examples/src/python/bolt/half_ack_bolt.py b/examples/src/python/bolt/half_ack_bolt.py
index f7346cb..5b6dab4 100644
--- a/examples/src/python/bolt/half_ack_bolt.py
+++ b/examples/src/python/bolt/half_ack_bolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/examples/src/python/bolt/stateful_count_bolt.py b/examples/src/python/bolt/stateful_count_bolt.py
index 260c0bf..26fd000 100644
--- a/examples/src/python/bolt/stateful_count_bolt.py
+++ b/examples/src/python/bolt/stateful_count_bolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/examples/src/python/bolt/stream_aggregate_bolt.py b/examples/src/python/bolt/stream_aggregate_bolt.py
index e94c3ca..c73af5d 100644
--- a/examples/src/python/bolt/stream_aggregate_bolt.py
+++ b/examples/src/python/bolt/stream_aggregate_bolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/examples/src/python/bolt/window_size_bolt.py b/examples/src/python/bolt/window_size_bolt.py
index 083b793..5604038 100644
--- a/examples/src/python/bolt/window_size_bolt.py
+++ b/examples/src/python/bolt/window_size_bolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/examples/src/python/custom_grouping_topology.py b/examples/src/python/custom_grouping_topology.py
index ec6e0e2..2d0f199 100644
--- a/examples/src/python/custom_grouping_topology.py
+++ b/examples/src/python/custom_grouping_topology.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/examples/src/python/half_acking_topology.py b/examples/src/python/half_acking_topology.py
index 7c89aeb..e4fd3cc 100644
--- a/examples/src/python/half_acking_topology.py
+++ b/examples/src/python/half_acking_topology.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/examples/src/python/join_streamlet_topology.py b/examples/src/python/join_streamlet_topology.py
index 8667315..82c45dd 100644
--- a/examples/src/python/join_streamlet_topology.py
+++ b/examples/src/python/join_streamlet_topology.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/examples/src/python/misc/test_task_hook.py b/examples/src/python/misc/test_task_hook.py
index 8ad2eec..f3eb767 100644
--- a/examples/src/python/misc/test_task_hook.py
+++ b/examples/src/python/misc/test_task_hook.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/examples/src/python/multi_stream_topology.py b/examples/src/python/multi_stream_topology.py
index 7507e55..24aad64 100644
--- a/examples/src/python/multi_stream_topology.py
+++ b/examples/src/python/multi_stream_topology.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/examples/src/python/pulsar_word_count_streamlet.py b/examples/src/python/pulsar_word_count_streamlet.py
index 6d6d7d8..3e1a409 100644
--- a/examples/src/python/pulsar_word_count_streamlet.py
+++ b/examples/src/python/pulsar_word_count_streamlet.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/examples/src/python/spout/multi_stream_spout.py b/examples/src/python/spout/multi_stream_spout.py
index b9e60a4..d2ded5d 100644
--- a/examples/src/python/spout/multi_stream_spout.py
+++ b/examples/src/python/spout/multi_stream_spout.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/examples/src/python/spout/stateful_word_spout.py b/examples/src/python/spout/stateful_word_spout.py
index 3a09777..c976ab8 100644
--- a/examples/src/python/spout/stateful_word_spout.py
+++ b/examples/src/python/spout/stateful_word_spout.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/examples/src/python/spout/word_spout.py b/examples/src/python/spout/word_spout.py
index d8a44c9..25ab969 100644
--- a/examples/src/python/spout/word_spout.py
+++ b/examples/src/python/spout/word_spout.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/examples/src/python/stateful_word_count_topology.py b/examples/src/python/stateful_word_count_topology.py
index 7068aa0..b7bd3f8 100644
--- a/examples/src/python/stateful_word_count_topology.py
+++ b/examples/src/python/stateful_word_count_topology.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/examples/src/python/window_size_topology.py b/examples/src/python/window_size_topology.py
index 3ef9409..9a2b513 100644
--- a/examples/src/python/window_size_topology.py
+++ b/examples/src/python/window_size_topology.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/examples/src/python/word_count_streamlet.py b/examples/src/python/word_count_streamlet.py
index 112749f..4f2ecd6 100644
--- a/examples/src/python/word_count_streamlet.py
+++ b/examples/src/python/word_count_streamlet.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/examples/src/python/word_count_topology.py b/examples/src/python/word_count_topology.py
index c849699..ed01cde 100644
--- a/examples/src/python/word_count_topology.py
+++ b/examples/src/python/word_count_topology.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/common/src/python/pex_loader.py b/heron/common/src/python/pex_loader.py
index 622e357..565dbdf 100644
--- a/heron/common/src/python/pex_loader.py
+++ b/heron/common/src/python/pex_loader.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -36,8 +36,8 @@
   Note that dependencies are located under `.deps` directory
   """
   pex = zipfile.ZipFile(abs_path_to_pex, mode='r')
-  deps = list(set([re.match(egg_regex, i).group(1) for i in pex.namelist()
-                   if re.match(egg_regex, i) is not None]))
+  deps = list({re.match(egg_regex, i).group(1) for i in pex.namelist()
+               if re.match(egg_regex, i) is not None})
   return deps
 
 def load_pex(path_to_pex, include_deps=True):
@@ -81,6 +81,7 @@
   tests have to have a pex with its top level package name of ``heron``.
   """
   # import top-level package named `heron` of a given pex file
+  # pylint: disable=no-member
   importer = zipimport.zipimporter(abs_pex_path)
   importer.load_module("heron")
 
@@ -89,6 +90,7 @@
   loaded = ['heron']
   loaded_mod = None
   for to_load in to_load_lst:
+    # pylint: disable=no-member
     sub_importer = zipimport.zipimporter(os.path.join(abs_pex_path, '/'.join(loaded)))
     loaded_mod = sub_importer.load_module(to_load)
     loaded.append(to_load)
@@ -122,6 +124,6 @@
     except:
       Log.error("Could not resolve class %s with special handling" % python_class_name)
 
-  mod = __import__(from_path, fromlist=[import_name], level=-1)
+  mod = __import__(from_path, fromlist=[import_name], level=0)
   Log.debug("Imported module: %s" % str(mod))
   return getattr(mod, import_name)
diff --git a/heron/common/src/python/utils/log.py b/heron/common/src/python/utils/log.py
index f3e4b01..b7fb209 100644
--- a/heron/common/src/python/utils/log.py
+++ b/heron/common/src/python/utils/log.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -61,7 +61,6 @@
   # otherwise, use StreamHandler to output to stream (stdout, stderr...)
   else:
     log_format = "[%(asctime)s] %(log_color)s[%(levelname)s]%(reset)s: %(message)s"
-    # pylint: disable=redefined-variable-type
     formatter = colorlog.ColoredFormatter(fmt=log_format, datefmt=date_format)
     stream_handler = logging.StreamHandler()
     stream_handler.setFormatter(formatter)
@@ -85,9 +84,9 @@
   root_logger.addHandler(handler)
 
   for handler in root_logger.handlers:
-    root_logger.debug("Associated handlers - " + str(handler))
+    root_logger.debug("Associated handlers - %s", str(handler))
     if isinstance(handler, logging.StreamHandler):
-      root_logger.debug("Removing StreamHandler: " + str(handler))
+      root_logger.debug("Removing StreamHandler: %s", str(handler))
       root_logger.handlers.remove(handler)
 
 def set_logging_level(cl_args):
diff --git a/heron/common/src/python/utils/proc.py b/heron/common/src/python/utils/proc.py
index 2adcb29..897c681 100644
--- a/heron/common/src/python/utils/proc.py
+++ b/heron/common/src/python/utils/proc.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -81,12 +81,12 @@
   """
   return _async_stream_process_output(process, stream_process_stderr, handler)
 
-class StringBuilder(object):
+class StringBuilder:
   def __init__(self):
     self.end = False
     self.strs = []
 
-  def add(self, line):
+  def add(self, line: bytes):
     if not line:
       self.end = True
     else:
@@ -96,8 +96,6 @@
     while True:
       if self.end:
         return ''.join(self.strs)
-      else:
-        continue
 
 def async_stdout_builder(proc):
   """ Save stdout into string builder
diff --git a/heron/common/tests/python/pex_loader/constants.py b/heron/common/tests/python/pex_loader/constants.py
index 63bdd27..ff99926 100644
--- a/heron/common/tests/python/pex_loader/constants.py
+++ b/heron/common/tests/python/pex_loader/constants.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/common/tests/python/pex_loader/pex_loader_unittest.py b/heron/common/tests/python/pex_loader/pex_loader_unittest.py
index 06b4b64..cf92ee1 100644
--- a/heron/common/tests/python/pex_loader/pex_loader_unittest.py
+++ b/heron/common/tests/python/pex_loader/pex_loader_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/common/tests/python/pex_loader/testdata/src/sample.py b/heron/common/tests/python/pex_loader/testdata/src/sample.py
index 382bd6b..fcc4780 100644
--- a/heron/common/tests/python/pex_loader/testdata/src/sample.py
+++ b/heron/common/tests/python/pex_loader/testdata/src/sample.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -20,7 +20,7 @@
 
 '''sample.py: sample module as testdata for pex_loader unittest'''
 
-class SampleClass(object):
+class SampleClass:
   """Sample class"""
   name = "sample class"
   age = 100
diff --git a/heron/executor/src/python/heron_executor.py b/heron/executor/src/python/heron_executor.py
index 674c3fc..cb063aa 100755
--- a/heron/executor/src/python/heron_executor.py
+++ b/heron/executor/src/python/heron_executor.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python2.7
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -148,7 +148,7 @@
   # Log the messages to stdout and strip off the newline because Log.info adds one automatically
   return lambda line: Log.info("%s stdout: %s", cmd, line.rstrip('\n'))
 
-class Command(object):
+class Command:
   """
   Command to run as a separate process using subprocess.POpen
   :param cmd: command to run (as a list)
@@ -179,15 +179,15 @@
   def __eq__(self, other):
     return self.cmd == other.cmd
 
-class ProcessInfo(object):
+class ProcessInfo:
+  """
+  Container for info related to a running process
+  :param process: the process POpen object
+  :param name: the logical (i.e., unique) name of the process
+  :param command: an array of strings comprising the command and it's args
+  :param attempts: how many times the command has been run (defaults to 1)
+  """
   def __init__(self, process, name, command, attempts=1):
-    """
-    Container for info related to a running process
-    :param process: the process POpen object
-    :param name: the logical (i.e., unique) name of the process
-    :param command: an array of strings comprising the command and it's args
-    :param attempts: how many times the command has been run (defaults to 1)
-    """
     self.process = process
     self.pid = process.pid
     self.name = name
@@ -199,8 +199,14 @@
     self.attempts += 1
     return self
 
+  def __repr__(self):
+    return (
+        "ProcessInfo(pid=%(pid)r, name=%(name)r, command=%(command)r, attempts=%(attempts)r)"
+        % vars(self)
+    )
+
 # pylint: disable=too-many-instance-attributes,too-many-statements
-class HeronExecutor(object):
+class HeronExecutor:
   """ Heron executor is a class that is responsible for running each of the process on a given
   container. Based on the container id and the instance distribution, it determines if the container
   is a master node or a worker node and it starts processes accordingly."""
@@ -223,8 +229,8 @@
     # escaping is still left there for reference and backward compatibility purposes (to be
     # removed after no topology needs it)
     self.instance_jvm_opts =\
-        base64.b64decode(parsed_args.instance_jvm_opts.lstrip('"').
-                         rstrip('"').replace('(61)', '=').replace('&equals;', '='))
+        base64.b64decode(parsed_args.instance_jvm_opts.
+                         strip('"').replace('(61)', '=').replace('&equals;', '=')).decode()
     self.classpath = parsed_args.classpath
     # Needed for Docker environments since the hostname of a docker container is the container's
     # id within docker, rather than the host's hostname. NOTE: this 'HOST' env variable is not
@@ -255,11 +261,11 @@
     # removed after no topology needs it)
     component_jvm_opts_in_json =\
         base64.b64decode(parsed_args.component_jvm_opts.
-                         lstrip('"').rstrip('"').replace('(61)', '=').replace('&equals;', '='))
+                         strip('"').replace('(61)', '=').replace('&equals;', '=')).decode()
     if component_jvm_opts_in_json != "":
       for (k, v) in list(json.loads(component_jvm_opts_in_json).items()):
         # In json, the component name and JVM options are still in base64 encoding
-        self.component_jvm_opts[base64.b64decode(k)] = base64.b64decode(v)
+        self.component_jvm_opts[base64.b64decode(k).decode()] = base64.b64decode(v).decode()
 
     self.pkg_type = parsed_args.pkg_type
     self.topology_binary_file = parsed_args.topology_binary_file
@@ -691,7 +697,8 @@
     if not self.jvm_version:
       cmd = [os.path.join(self.heron_java_home, 'bin/java'),
              '-cp', self.instance_classpath, 'org.apache.heron.instance.util.JvmVersion']
-      process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+      process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE,
+                                 universal_newlines=True)
       (process_stdout, process_stderr) = process.communicate()
       if process.returncode != 0:
         Log.error("Failed to determine JVM version. Exiting. Output of %s: %s",
@@ -905,7 +912,7 @@
       # stderr is redirected to stdout so that it can more easily be logged. stderr has a max buffer
       # size and can cause the child process to deadlock if it fills up
       process = subprocess.Popen(cmd.cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
-                                 env=cmd.env, bufsize=1)
+                                 env=cmd.env, universal_newlines=True, bufsize=1)
       proc.async_stream_process_stdout(process, stdout_log_fn(name))
     except Exception:
       Log.info("Exception running command %s", cmd)
@@ -919,7 +926,7 @@
       # stderr is redirected to stdout so that it can more easily be logged. stderr has a max buffer
       # size and can cause the child process to deadlock if it fills up
       process = subprocess.Popen(cmd.cmd, shell=is_shell, stdout=subprocess.PIPE,
-                                 stderr=subprocess.STDOUT, env=cmd.env)
+                                 stderr=subprocess.STDOUT, universal_newlines=True, env=cmd.env)
 
       # wait for termination
       self._wait_process_std_out_err(cmd.cmd, process)
diff --git a/heron/executor/tests/python/heron_executor_unittest.py b/heron/executor/tests/python/heron_executor_unittest.py
index 255b958..823898c 100644
--- a/heron/executor/tests/python/heron_executor_unittest.py
+++ b/heron/executor/tests/python/heron_executor_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -24,6 +24,8 @@
 import unittest2 as unittest
 import json
 
+from pprint import pprint
+
 from heron.executor.src.python.heron_executor import ProcessInfo
 from heron.executor.src.python.heron_executor import HeronExecutor
 from heron.proto.packing_plan_pb2 import PackingPlan
@@ -52,7 +54,7 @@
   def default(self, o):
     return o.cmd
 
-class MockPOpen(object):
+class MockPOpen:
   """fake subprocess.Popen object that we can use to mock processes and pids"""
   next_pid = 0
 
@@ -180,10 +182,10 @@
                   '--metrics_sinks_yaml=metrics_sinks_config_file '
                   '--metricsmgr_port=metricsmgr_port '
                   '--ckptmgr_port=ckptmgr-port' % (HOSTNAME, INTERNAL_CONF_PATH, OVERRIDE_PATH)),
-      ProcessInfo(MockPOpen(), 'heron-shell-0', get_expected_shell_command(0)),
-      ProcessInfo(MockPOpen(), 'metricsmgr-0', get_expected_metricsmgr_command(0)),
       ProcessInfo(MockPOpen(), 'heron-metricscache', get_expected_metricscachemgr_command()),
       ProcessInfo(MockPOpen(), 'heron-healthmgr', get_expected_healthmgr_command()),
+      ProcessInfo(MockPOpen(), 'metricsmgr-0', get_expected_metricsmgr_command(0)),
+      ProcessInfo(MockPOpen(), 'heron-shell-0', get_expected_shell_command(0)),
   ]
 
   MockPOpen.set_next_pid(37)
@@ -199,20 +201,17 @@
                   '--ckptmgr_port=ckptmgr-port --ckptmgr_id=ckptmgr-1 '
                   '--metricscachemgr_mode=cluster'
                   % (HOSTNAME, INTERNAL_CONF_PATH, OVERRIDE_PATH)),
+      ProcessInfo(MockPOpen(), 'metricsmgr-1', get_expected_metricsmgr_command(1)),
       ProcessInfo(MockPOpen(), 'container_1_word_3', get_expected_instance_command('word', 3, 1)),
-      ProcessInfo(MockPOpen(), 'container_1_exclaim1_1',
-                  get_expected_instance_command('exclaim1', 1, 1)),
       ProcessInfo(MockPOpen(), 'container_1_exclaim1_2',
                   get_expected_instance_command('exclaim1', 2, 1)),
+      ProcessInfo(MockPOpen(), 'container_1_exclaim1_1',
+                  get_expected_instance_command('exclaim1', 1, 1)),
       ProcessInfo(MockPOpen(), 'heron-shell-1', get_expected_shell_command(1)),
-      ProcessInfo(MockPOpen(), 'metricsmgr-1', get_expected_metricsmgr_command(1)),
   ]
 
   MockPOpen.set_next_pid(37)
   expected_processes_container_7 = [
-      ProcessInfo(MockPOpen(), 'container_7_word_11', get_expected_instance_command('word', 11, 7)),
-      ProcessInfo(MockPOpen(), 'container_7_exclaim1_210',
-                  get_expected_instance_command('exclaim1', 210, 7)),
       ProcessInfo(MockPOpen(), 'stmgr-7',
                   'stmgr_binary --topology_name=topname --topology_id=topid '
                   '--topologydefn_file=topdefnfile --zkhostportlist=zknode --zkroot=zkroot '
@@ -225,6 +224,9 @@
                   '--metricscachemgr_mode=cluster'
                   % (HOSTNAME, INTERNAL_CONF_PATH, OVERRIDE_PATH)),
       ProcessInfo(MockPOpen(), 'metricsmgr-7', get_expected_metricsmgr_command(7)),
+      ProcessInfo(MockPOpen(), 'container_7_word_11', get_expected_instance_command('word', 11, 7)),
+      ProcessInfo(MockPOpen(), 'container_7_exclaim1_210',
+                  get_expected_instance_command('exclaim1', 210, 7)),
       ProcessInfo(MockPOpen(), 'heron-shell-7', get_expected_shell_command(7)),
   ]
 
@@ -330,12 +332,16 @@
     found_monitored = list([(pinfo[0], pinfo[1].name, pinfo[1].command_str) for pinfo in list(monitored_processes.items())])
     found_processes.sort(key=lambda tuple: tuple[0])
     found_monitored.sort(key=lambda tuple: tuple[0])
-    print("do_test_commands - found_processes: %s found_monitored: %s" \
-          % (found_processes, found_monitored))
+    print("found_processes:")
+    pprint(found_processes)
+    print("found_monitored:")
+    pprint(found_monitored)
     self.assertEqual(found_processes, found_monitored)
 
-    print("do_test_commands - expected_processes: %s monitored_processes: %s" \
-          % (expected_processes, monitored_processes))
+    print("expected_processes:")
+    pprint(expected_processes)
+    print("monitored_processes:")
+    pprint(monitored_processes)
     self.assert_processes(expected_processes, monitored_processes)
 
   def test_change_instance_dist_container_1(self):
diff --git a/heron/instance/src/python/basics/base_instance.py b/heron/instance/src/python/basics/base_instance.py
index 0a17520..ba114e5 100644
--- a/heron/instance/src/python/basics/base_instance.py
+++ b/heron/instance/src/python/basics/base_instance.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -38,7 +38,7 @@
 from heronpy.api.state.stateful_component import StatefulComponent
 
 # pylint: disable=too-many-instance-attributes
-class BaseInstance(object):
+class BaseInstance:
   """The base class for heron bolt/spout instance
 
   Implements the following functionality:
@@ -83,7 +83,7 @@
     if level is None:
       _log_level = logging.INFO
     else:
-      if level == "trace" or level == "debug":
+      if level in ("trace", "debug"):
         _log_level = logging.DEBUG
       elif level == "info":
         _log_level = logging.INFO
diff --git a/heron/instance/src/python/basics/bolt_instance.py b/heron/instance/src/python/basics/bolt_instance.py
index 3391513..ad18bcd 100644
--- a/heron/instance/src/python/basics/bolt_instance.py
+++ b/heron/instance/src/python/basics/bolt_instance.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -128,8 +128,8 @@
     # Set the anchors for a tuple
     if anchors is not None:
       merged_roots = set()
-      for tup in [t for t in anchors if isinstance(t, HeronTuple) and t.roots is not None]:
-        merged_roots.update(tup.roots)
+      for tuple_ in [t for t in anchors if isinstance(t, HeronTuple) and t.roots is not None]:
+        merged_roots.update(tuple_.roots)
       for rt in merged_roots:
         to_add = data_tuple.roots.add()
         to_add.CopyFrom(rt)
@@ -154,6 +154,7 @@
       if direct_task is not None:
         sent_task_ids.append(direct_task)
       return sent_task_ids
+    return None
 
   def process_incoming_tuples(self):
     """Should be called when tuple was buffered into in_stream
@@ -184,7 +185,7 @@
       if isinstance(tuples, tuple_pb2.HeronTupleSet):
         if tuples.HasField("control"):
           raise RuntimeError("Bolt cannot get acks/fails from other components")
-        elif tuples.HasField("data"):
+        if tuples.HasField("data"):
           stream = tuples.data.stream
 
           for data_tuple in tuples.data.tuples:
diff --git a/heron/instance/src/python/basics/spout_instance.py b/heron/instance/src/python/basics/spout_instance.py
index 3b4dccf..206dac5 100644
--- a/heron/instance/src/python/basics/spout_instance.py
+++ b/heron/instance/src/python/basics/spout_instance.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -173,6 +173,7 @@
       if direct_task is not None:
         sent_task_ids.append(direct_task)
       return sent_task_ids
+    return None
 
   # pylint: disable=no-self-use
   def process_incoming_tuples(self):
@@ -191,7 +192,7 @@
       if isinstance(tuples, tuple_pb2.HeronTupleSet):
         if tuples.HasField("data"):
           raise RuntimeError("Spout cannot get incoming data tuples from other components")
-        elif tuples.HasField("control"):
+        if tuples.HasField("control"):
           for ack_tuple in tuples.control.acks:
             self._handle_ack_tuple(ack_tuple, True)
           for fail_tuple in tuples.control.fails:
@@ -291,13 +292,12 @@
 
     if not self.acking_enabled and self.output_helper.is_out_queue_available():
       return True
-    elif self.acking_enabled and self.output_helper.is_out_queue_available() and \
+    if self.acking_enabled and self.output_helper.is_out_queue_available() and \
         len(self.in_flight_tuples) < max_spout_pending:
       return True
-    elif self.acking_enabled and not self.in_stream.is_empty():
+    if self.acking_enabled and not self.in_stream.is_empty():
       return True
-    else:
-      return False
+    return False
 
   def _look_for_timeouts(self):
     spout_config = self.pplan_helper.context.get_cluster_config()
diff --git a/heron/instance/src/python/instance/st_heron_instance.py b/heron/instance/src/python/instance/st_heron_instance.py
index eeed39e..1c91c49 100644
--- a/heron/instance/src/python/instance/st_heron_instance.py
+++ b/heron/instance/src/python/instance/st_heron_instance.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -26,11 +26,10 @@
 import resource
 import signal
 import traceback
-from heron.common.src.python.utils import log
 import yaml
 
+from heron.common.src.python.utils import log
 from heron.proto import physical_plan_pb2, tuple_pb2, ckptmgr_pb2, common_pb2
-
 from heron.instance.src.python.utils.misc import HeronCommunicator
 from heron.instance.src.python.utils.misc import SerializerHelper
 from heron.instance.src.python.utils.misc import PhysicalPlanHelper
@@ -52,7 +51,7 @@
   resource.setrlimit(resource.RLIMIT_RSS, (max_ram, max_ram))
 
 # pylint: disable=too-many-instance-attributes
-class SingleThreadHeronInstance(object):
+class SingleThreadHeronInstance:
   """SingleThreadHeronInstance is an implementation of Heron Instance in python"""
   STREAM_MGR_HOST = "127.0.0.1"
   METRICS_MGR_HOST = "127.0.0.1"
diff --git a/heron/instance/src/python/network/event_looper.py b/heron/instance/src/python/network/event_looper.py
index 607de7e..4d66cec 100644
--- a/heron/instance/src/python/network/event_looper.py
+++ b/heron/instance/src/python/network/event_looper.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -28,7 +28,7 @@
 
 from heron.common.src.python.utils.log import Log
 
-class EventLooper(object):
+class EventLooper:
   """EventLooper is a Python implementation of WakeableLooper.java
 
   EventLooper is a class for scheduling recurring tasks that could:
@@ -85,7 +85,6 @@
   @abstractmethod
   def do_wait(self):
     """Blocking operation, should be implemented by a subclass"""
-    pass
 
   @abstractmethod
   def wake_up(self):
@@ -93,7 +92,6 @@
 
     Note that this method should be implemented in a thread-safe way.
     """
-    pass
 
   def add_wakeup_task(self, task):
     """Add a wakeup task
@@ -135,9 +133,8 @@
     """
     if len(self.timer_tasks) == 0:
       return sys.maxsize
-    else:
-      next_timeout_interval = self.timer_tasks[0][0] - time.time()
-      return next_timeout_interval
+    next_timeout_interval = self.timer_tasks[0][0] - time.time()
+    return next_timeout_interval
 
   def _execute_wakeup_tasks(self):
     """Executes wakeup tasks, should only be called from loop()"""
@@ -149,6 +146,7 @@
   def _trigger_timers(self):
     """Triggers expired timers"""
     current = time.time()
-    while len(self.timer_tasks) > 0 and (self.timer_tasks[0][0] - current <= 0):
+    # pylint: disable=chained-comparison
+    while self.timer_tasks and (self.timer_tasks[0][0] - current <= 0):
       task = heappop(self.timer_tasks)[1]
       task()
diff --git a/heron/instance/src/python/network/gateway_looper.py b/heron/instance/src/python/network/gateway_looper.py
index 62c8e8c..4bd7b44 100644
--- a/heron/instance/src/python/network/gateway_looper.py
+++ b/heron/instance/src/python/network/gateway_looper.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -66,7 +66,7 @@
       self.poll(timeout=0.0)
 
   def wake_up(self):
-    os.write(self.pipe_w, "\n")
+    os.write(self.pipe_w, b"\n")
     Log.debug("Wake up called")
 
   def on_exit(self):
@@ -105,8 +105,7 @@
       Log.debug("Trivial error: " + str(err))
       if err.args[0] != errno.EINTR:
         raise
-      else:
-        return
+      return
     Log.debug("Selected [r]: " + str(readable_lst) +
               " [w]: " + str(writable_lst) + " [e]: " + str(error_lst))
 
diff --git a/heron/instance/src/python/network/heron_client.py b/heron/instance/src/python/network/heron_client.py
index 61eb99e..5e63af0 100644
--- a/heron/instance/src/python/network/heron_client.py
+++ b/heron/instance/src/python/network/heron_client.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -158,6 +158,7 @@
     write_batch_time_sec = self.socket_options.nw_write_batch_time_ms * constants.MS_TO_SEC
     write_batch_size_bytes = self.socket_options.nw_write_batch_size_bytes
 
+    # pylint: disable=chained-comparison
     while (time.time() - start_cycle_time - write_batch_time_sec) < 0 and \
             bytes_written < write_batch_size_bytes and len(self.out_buffer) > 0:
       outgoing_pkt = self.out_buffer[0]
@@ -320,7 +321,6 @@
 
     Should be implemented by a subclass.
     """
-    pass
 
   @abstractmethod
   def on_response(self, status, context, response):
@@ -328,7 +328,6 @@
 
     Should be implemented by a subclass.
     """
-    pass
 
   @abstractmethod
   def on_incoming_message(self, message):
@@ -336,7 +335,6 @@
 
     Should be implemented by a subclass.
     """
-    pass
 
   @abstractmethod
   def on_error(self):
@@ -345,4 +343,3 @@
     Note that this method is not called when a connection is not yet established.
     In such a case, ``on_connect()`` with status == StatusCode.CONNECT_ERROR is called.
     """
-    pass
diff --git a/heron/instance/src/python/network/metricsmgr_client.py b/heron/instance/src/python/network/metricsmgr_client.py
index 1f63e26..22c38d1 100644
--- a/heron/instance/src/python/network/metricsmgr_client.py
+++ b/heron/instance/src/python/network/metricsmgr_client.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/src/python/network/protocol.py b/heron/instance/src/python/network/protocol.py
index e15a85f..2e35692 100644
--- a/heron/instance/src/python/network/protocol.py
+++ b/heron/instance/src/python/network/protocol.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -25,7 +25,7 @@
 
 from heron.common.src.python.utils.log import Log
 
-class HeronProtocol(object):
+class HeronProtocol:
   """Heron's application level network protocol"""
   INT_PACK_FMT = ">I"
   HEADER_SIZE = 4
@@ -71,7 +71,7 @@
     len_typename = HeronProtocol.unpack_int(data[:4])
     data = data[4:]
 
-    typename = data[:len_typename]
+    typename = data[:len_typename].decode()
     data = data[len_typename:]
 
     reqid = REQID.unpack(data[:REQID.REQID_SIZE])
@@ -84,11 +84,11 @@
 
     return typename, reqid, serialized_msg
 
-class OutgoingPacket(object):
+class OutgoingPacket:
   """Wrapper class for outgoing packet"""
   def __init__(self, raw_data):
-    self.raw = str(raw_data)
-    self.to_send = str(raw_data)
+    self.raw = bytes(raw_data)
+    self.to_send = bytes(raw_data)
 
   def __len__(self):
     return len(self.raw)
@@ -101,7 +101,7 @@
     :param message: protocol buffer object
     """
     assert message.IsInitialized()
-    packet = ''
+    packet = b''
 
     # calculate the totla size of the packet incl. header
     typename = message.DESCRIPTOR.full_name
@@ -114,7 +114,7 @@
 
     # next write the type string
     packet += HeronProtocol.pack_int(len(typename))
-    packet += typename
+    packet += typename.encode()
 
     # reqid
     packet += reqid.pack()
@@ -137,12 +137,12 @@
     sent = dispatcher.send(self.to_send)
     self.to_send = self.to_send[sent:]
 
-class IncomingPacket(object):
+class IncomingPacket:
   """Helper class for incoming packet"""
   def __init__(self):
     """Initializes IncomingPacket object"""
-    self.header = ''
-    self.data = ''
+    self.header = b''
+    self.data = b''
     self.is_header_read = False
     self.is_complete = False
     # for debugging identification purposes
@@ -218,7 +218,7 @@
            (str(self.id), self.is_header_read, self.is_complete)
 
 
-class REQID(object):
+class REQID:
   """Helper class for REQID"""
   REQID_SIZE = 32
 
@@ -259,10 +259,9 @@
   def __str__(self):
     if self.is_zero():
       return "ZERO"
-    else:
-      return ''.join([str(i) for i in list(self.bytes)])
+    return ''.join([str(i) for i in list(self.bytes)])
 
-class StatusCode(object):
+class StatusCode:
   """StatusCode for Response"""
   OK = 0
   WRITE_ERROR = 1
diff --git a/heron/instance/src/python/network/socket_options.py b/heron/instance/src/python/network/socket_options.py
index 7d93219..d23520a 100644
--- a/heron/instance/src/python/network/socket_options.py
+++ b/heron/instance/src/python/network/socket_options.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/src/python/network/st_stmgr_client.py b/heron/instance/src/python/network/st_stmgr_client.py
index e2317e4..b993314 100644
--- a/heron/instance/src/python/network/st_stmgr_client.py
+++ b/heron/instance/src/python/network/st_stmgr_client.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/src/python/utils/metrics/metrics_helper.py b/heron/instance/src/python/utils/metrics/metrics_helper.py
index 39555ba..8e6c0c8 100644
--- a/heron/instance/src/python/utils/metrics/metrics_helper.py
+++ b/heron/instance/src/python/utils/metrics/metrics_helper.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -29,7 +29,7 @@
 from heronpy.api.metrics import (CountMetric, MultiCountMetric, MeanReducedMetric,
                                  ReducedMetric, MultiMeanReducedMetric, MultiReducedMetric)
 
-class BaseMetricsHelper(object):
+class BaseMetricsHelper:
   """Helper class for metrics management
 
   It registers metrics to the metrics collector and provides methods for
@@ -325,7 +325,7 @@
     self.update_count(self.FAIL_COUNT, key=global_stream_id)
     self.update_reduced_metric(self.FAIL_LATENCY, latency_in_ns, global_stream_id)
 
-class MetricsCollector(object):
+class MetricsCollector:
   """Helper class for pushing metrics to Out-Metrics queue"""
   def __init__(self, looper, out_metrics):
     self.looper = looper
@@ -378,7 +378,7 @@
 
     if metric_value is None:
       return
-    elif isinstance(metric_value, dict):
+    if isinstance(metric_value, dict):
       for key, value in list(metric_value.items()):
         if key is not None and value is not None:
           self._add_data_to_message(message, name + "/" + str(key), value)
diff --git a/heron/instance/src/python/utils/metrics/py_metrics.py b/heron/instance/src/python/utils/metrics/py_metrics.py
index 96b095f..26ad2b6 100644
--- a/heron/instance/src/python/utils/metrics/py_metrics.py
+++ b/heron/instance/src/python/utils/metrics/py_metrics.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/src/python/utils/misc/communicator.py b/heron/instance/src/python/utils/misc/communicator.py
index f43d4d8..5f11cbb 100644
--- a/heron/instance/src/python/utils/misc/communicator.py
+++ b/heron/instance/src/python/utils/misc/communicator.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -24,7 +24,7 @@
 
 from heron.common.src.python.utils.log import Log
 
-class HeronCommunicator(object):
+class HeronCommunicator:
   """HeronCommunicator: a wrapper class for non-blocking queue in Heron.
 
   Note that this class does not yet implement the dynamic tuning of expected available capacity,
diff --git a/heron/instance/src/python/utils/misc/custom_grouping_helper.py b/heron/instance/src/python/utils/misc/custom_grouping_helper.py
index 18819d4..b149ceb 100644
--- a/heron/instance/src/python/utils/misc/custom_grouping_helper.py
+++ b/heron/instance/src/python/utils/misc/custom_grouping_helper.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -21,7 +21,7 @@
 '''custom_grouping_helper.py'''
 from collections import namedtuple
 
-class CustomGroupingHelper(object):
+class CustomGroupingHelper:
   """Helper class for managing custom grouping"""
   def __init__(self):
     # map <stream_id -> list(targets)>
@@ -72,12 +72,11 @@
     if not isinstance(ret, list):
       raise TypeError("Returned object after custom grouping's choose_tasks() "
                       "needs to be a list, given: %s" % str(type(ret)))
-    else:
-      for i in ret:
-        if not isinstance(i, int):
-          raise TypeError("Returned object after custom grouping's choose_tasks() "
-                          "contained non-integer: %s" % str(i))
-        if i not in self.task_ids:
-          raise ValueError("Returned object after custom grouping's choose_tasks() contained "
-                           "a task id that is not registered: %d" % i)
-      return ret
+    for i in ret:
+      if not isinstance(i, int):
+        raise TypeError("Returned object after custom grouping's choose_tasks() "
+                        "contained non-integer: %s" % str(i))
+      if i not in self.task_ids:
+        raise ValueError("Returned object after custom grouping's choose_tasks() contained "
+                         "a task id that is not registered: %d" % i)
+    return ret
diff --git a/heron/instance/src/python/utils/misc/outgoing_tuple_helper.py b/heron/instance/src/python/utils/misc/outgoing_tuple_helper.py
index 71a7f81..53c3556 100644
--- a/heron/instance/src/python/utils/misc/outgoing_tuple_helper.py
+++ b/heron/instance/src/python/utils/misc/outgoing_tuple_helper.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -29,7 +29,7 @@
 
 # pylint: disable=too-many-instance-attributes
 # pylint: disable=no-value-for-parameter
-class OutgoingTupleHelper(object):
+class OutgoingTupleHelper:
   """Helper class for preparing and pushing tuples to Out-Stream
 
   This is a Python implementation of OutgoingTupleCollection.java
diff --git a/heron/instance/src/python/utils/misc/pplan_helper.py b/heron/instance/src/python/utils/misc/pplan_helper.py
index 477aaa3..6298969 100644
--- a/heron/instance/src/python/utils/misc/pplan_helper.py
+++ b/heron/instance/src/python/utils/misc/pplan_helper.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -32,7 +32,7 @@
 from .custom_grouping_helper import CustomGroupingHelper
 
 # pylint: disable=too-many-instance-attributes
-class PhysicalPlanHelper(object):
+class PhysicalPlanHelper:
   """Helper class for accessing Physical Plan
 
   :ivar pplan: Physical Plan protobuf message
@@ -121,7 +121,7 @@
     if size is None:
       raise RuntimeError("%s emitting to stream %s but was not declared in output fields"
                          % (self.my_component_name, stream_id))
-    elif size != len(tup):
+    if size != len(tup):
       raise RuntimeError("Number of fields emitted in stream %s does not match what's expected. "
                          "Expected: %s, Observed: %s" % (stream_id, size, len(tup)))
 
@@ -129,15 +129,13 @@
     """Returns spout instance, or ``None`` if bolt is assigned"""
     if self.is_spout:
       return self._my_spbl
-    else:
-      return None
+    return None
 
   def get_my_bolt(self):
     """Returns bolt instance, or ``None`` if spout is assigned"""
     if self.is_spout:
       return None
-    else:
-      return self._my_spbl
+    return self._my_spbl
 
   def get_topology_state(self):
     """Returns the current topology state"""
@@ -159,8 +157,7 @@
     """Returns the topology config"""
     if self.pplan.topology.HasField("topology_config"):
       return self._get_dict_from_config(self.pplan.topology.topology_config)
-    else:
-      return {}
+    return {}
 
   def set_topology_context(self, metrics_collector):
     """Sets a new topology context"""
@@ -192,9 +189,10 @@
         if PhysicalPlanHelper._is_number(kv.value):
           config[kv.key] = PhysicalPlanHelper._get_number(kv.value)
         elif kv.value.lower() in ("true", "false"):
-          config[kv.key] = True if kv.value.lower() == "true" else False
+          config[kv.key] = kv.value.lower() == "true"
         else:
           config[kv.key] = kv.value
+      # pylint: disable=simplifiable-if-expression
       elif kv.HasField("serialized_value") and \
         kv.type == topology_pb2.ConfigValueType.Value("PYTHON_SERIALIZED_VALUE"):
         # deserialize that
diff --git a/heron/instance/src/python/utils/misc/serializer_helper.py b/heron/instance/src/python/utils/misc/serializer_helper.py
index f81d1f0..3004c35 100644
--- a/heron/instance/src/python/utils/misc/serializer_helper.py
+++ b/heron/instance/src/python/utils/misc/serializer_helper.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -25,7 +25,7 @@
 from heronpy.api.serializer import PythonSerializer
 import heronpy.api.api_constants as constants
 
-class SerializerHelper(object):
+class SerializerHelper:
   """Helper class for getting serializer for component"""
   @staticmethod
   def get_serializer(context):
@@ -34,13 +34,12 @@
     serializer_clsname = cluster_config.get(constants.TOPOLOGY_SERIALIZER_CLASSNAME, None)
     if serializer_clsname is None:
       return PythonSerializer()
-    else:
-      try:
-        topo_pex_path = context.get_topology_pex_path()
-        pex_loader.load_pex(topo_pex_path)
-        serializer_cls = pex_loader.import_and_get_class(topo_pex_path, serializer_clsname)
-        serializer = serializer_cls()
-        return serializer
-      except Exception as e:
-        raise RuntimeError("Error with loading custom serializer class: %s, with error message: %s"
-                           % (serializer_clsname, str(e)))
+    try:
+      topo_pex_path = context.get_topology_pex_path()
+      pex_loader.load_pex(topo_pex_path)
+      serializer_cls = pex_loader.import_and_get_class(topo_pex_path, serializer_clsname)
+      serializer = serializer_cls()
+      return serializer
+    except Exception as e:
+      raise RuntimeError("Error with loading custom serializer class: %s, with error message: %s"
+                         % (serializer_clsname, str(e)))
diff --git a/heron/instance/src/python/utils/system_config.py b/heron/instance/src/python/utils/system_config.py
index 6327578..6801fc9 100644
--- a/heron/instance/src/python/utils/system_config.py
+++ b/heron/instance/src/python/utils/system_config.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/src/python/utils/system_constants.py b/heron/instance/src/python/utils/system_constants.py
index aa080b6..fd75423 100644
--- a/heron/instance/src/python/utils/system_constants.py
+++ b/heron/instance/src/python/utils/system_constants.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/src/python/utils/topology/topology_context_impl.py b/heron/instance/src/python/utils/topology/topology_context_impl.py
index 579aea9..8d3c4c1 100644
--- a/heron/instance/src/python/utils/topology/topology_context_impl.py
+++ b/heron/instance/src/python/utils/topology/topology_context_impl.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -100,8 +100,7 @@
         key = StreamId(id=istream.stream.id, component_name=istream.stream.component_name)
         ret[key] = istream.gtype
       return ret
-    else:
-      return None
+    return None
 
   def get_this_sources(self):
     return self.get_sources(self.get_component_id())
diff --git a/heron/instance/src/python/utils/tuple.py b/heron/instance/src/python/utils/tuple.py
index 129c16d..bd608d9 100644
--- a/heron/instance/src/python/utils/tuple.py
+++ b/heron/instance/src/python/utils/tuple.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -51,7 +51,7 @@
   def is_expired(self, current_time, timeout_sec):
     return self.insertion_time + timeout_sec - current_time <= 0
 
-class TupleHelper(object):
+class TupleHelper:
   """Tuple Helper, returns Heron Tuple compatible tuple"""
   TICK_TUPLE_ID = "__tick"
   TICK_SOURCE_COMPONENT = "__system"
diff --git a/heron/instance/tests/python/mock_protobuf.py b/heron/instance/tests/python/mock_protobuf.py
index 13ed8e3..fb5a147 100644
--- a/heron/instance/tests/python/mock_protobuf.py
+++ b/heron/instance/tests/python/mock_protobuf.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/tests/python/network/event_looper_unittest.py b/heron/instance/tests/python/network/event_looper_unittest.py
index 69f4249..0fe3411 100644
--- a/heron/instance/tests/python/network/event_looper_unittest.py
+++ b/heron/instance/tests/python/network/event_looper_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/tests/python/network/gateway_looper_unittest.py b/heron/instance/tests/python/network/gateway_looper_unittest.py
index 8f8b348..de8ecc7 100644
--- a/heron/instance/tests/python/network/gateway_looper_unittest.py
+++ b/heron/instance/tests/python/network/gateway_looper_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/tests/python/network/heron_client_unittest.py b/heron/instance/tests/python/network/heron_client_unittest.py
index ead3a36..68e4711 100644
--- a/heron/instance/tests/python/network/heron_client_unittest.py
+++ b/heron/instance/tests/python/network/heron_client_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/tests/python/network/metricsmgr_client_unittest.py b/heron/instance/tests/python/network/metricsmgr_client_unittest.py
index 605d836..39cbe39 100644
--- a/heron/instance/tests/python/network/metricsmgr_client_unittest.py
+++ b/heron/instance/tests/python/network/metricsmgr_client_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/tests/python/network/mock_generator.py b/heron/instance/tests/python/network/mock_generator.py
index 708471b..e589328 100644
--- a/heron/instance/tests/python/network/mock_generator.py
+++ b/heron/instance/tests/python/network/mock_generator.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/tests/python/network/mock_generator_client.py b/heron/instance/tests/python/network/mock_generator_client.py
index 7e61cb0..dad89ce 100644
--- a/heron/instance/tests/python/network/mock_generator_client.py
+++ b/heron/instance/tests/python/network/mock_generator_client.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -95,14 +95,14 @@
   packet.is_complete = False
   return packet
 
-class MockDispatcher(object):
+class MockDispatcher:
   """Mock asyncore.dispatcher class, supporting only recv() and send() method
 
   This dispatcher provides several options as to how to prepare its receive buffer.
   """
   PARTIAL_DATA_SIZE = 4
   def __init__(self):
-    self.to_be_received = ""
+    self.to_be_received = b""
     self.eagain_test = False
     self.fatal_error_test = False
 
diff --git a/heron/instance/tests/python/network/protocol_unittest.py b/heron/instance/tests/python/network/protocol_unittest.py
index e76b0c3..d42ddb8 100644
--- a/heron/instance/tests/python/network/protocol_unittest.py
+++ b/heron/instance/tests/python/network/protocol_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -73,7 +73,7 @@
     pkt.read(header_dispatcher)
     self.assertTrue(pkt.is_header_read)
     self.assertFalse(pkt.is_complete)
-    self.assertEqual(pkt.data, "")
+    self.assertEqual(pkt.data, b"")
 
     # an incomplete data packet is prepared
     partial_data_dispatcher = mock_generator.MockDispatcher()
diff --git a/heron/instance/tests/python/network/st_stmgr_client_unittest.py b/heron/instance/tests/python/network/st_stmgr_client_unittest.py
index 1bc63b7..9bc9855 100644
--- a/heron/instance/tests/python/network/st_stmgr_client_unittest.py
+++ b/heron/instance/tests/python/network/st_stmgr_client_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/tests/python/utils/communicator_unittest.py b/heron/instance/tests/python/utils/communicator_unittest.py
index 31f3a48..caf1c21 100644
--- a/heron/instance/tests/python/utils/communicator_unittest.py
+++ b/heron/instance/tests/python/utils/communicator_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/tests/python/utils/custom_grouping_unittest.py b/heron/instance/tests/python/utils/custom_grouping_unittest.py
index 8ba69d1..00cc8ab 100644
--- a/heron/instance/tests/python/utils/custom_grouping_unittest.py
+++ b/heron/instance/tests/python/utils/custom_grouping_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/tests/python/utils/global_metrics_unittest.py b/heron/instance/tests/python/utils/global_metrics_unittest.py
index 2216d76..90df974 100644
--- a/heron/instance/tests/python/utils/global_metrics_unittest.py
+++ b/heron/instance/tests/python/utils/global_metrics_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/tests/python/utils/log_unittest.py b/heron/instance/tests/python/utils/log_unittest.py
index daf6ccf..2fde7dd 100644
--- a/heron/instance/tests/python/utils/log_unittest.py
+++ b/heron/instance/tests/python/utils/log_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/tests/python/utils/metrics_helper_unittest.py b/heron/instance/tests/python/utils/metrics_helper_unittest.py
index ad22024..6e23eb2 100644
--- a/heron/instance/tests/python/utils/metrics_helper_unittest.py
+++ b/heron/instance/tests/python/utils/metrics_helper_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/tests/python/utils/mock_generator.py b/heron/instance/tests/python/utils/mock_generator.py
index 0a4060b..79c9a5e 100644
--- a/heron/instance/tests/python/utils/mock_generator.py
+++ b/heron/instance/tests/python/utils/mock_generator.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/tests/python/utils/outgoing_tuple_helper_unittest.py b/heron/instance/tests/python/utils/outgoing_tuple_helper_unittest.py
index 62ff330..c81bedc 100644
--- a/heron/instance/tests/python/utils/outgoing_tuple_helper_unittest.py
+++ b/heron/instance/tests/python/utils/outgoing_tuple_helper_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/tests/python/utils/pplan_helper_unittest.py b/heron/instance/tests/python/utils/pplan_helper_unittest.py
index 7de571f..427f28c 100644
--- a/heron/instance/tests/python/utils/pplan_helper_unittest.py
+++ b/heron/instance/tests/python/utils/pplan_helper_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/tests/python/utils/py_metrics_unittest.py b/heron/instance/tests/python/utils/py_metrics_unittest.py
index 33ab25c..11bc4ae 100644
--- a/heron/instance/tests/python/utils/py_metrics_unittest.py
+++ b/heron/instance/tests/python/utils/py_metrics_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/tests/python/utils/topology_context_impl_unittest.py b/heron/instance/tests/python/utils/topology_context_impl_unittest.py
index ed02412..409dbb4 100644
--- a/heron/instance/tests/python/utils/topology_context_impl_unittest.py
+++ b/heron/instance/tests/python/utils/topology_context_impl_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/instance/tests/python/utils/tuple_helper_unittest.py b/heron/instance/tests/python/utils/tuple_helper_unittest.py
index a169907..04c2864 100644
--- a/heron/instance/tests/python/utils/tuple_helper_unittest.py
+++ b/heron/instance/tests/python/utils/tuple_helper_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/proto/BUILD b/heron/proto/BUILD
index 5da2908..7bab2b7 100644
--- a/heron/proto/BUILD
+++ b/heron/proto/BUILD
@@ -182,7 +182,7 @@
     name = "proto-py",
     reqs = [
         "protobuf==3.8.0",
-        "setuptools==18.8.1",
+        "setuptools==46.1.3",
     ],
     deps = [
         ":proto_ckptmgr_py",
diff --git a/heron/shell/src/python/handlers/browsehandler.py b/heron/shell/src/python/handlers/browsehandler.py
index c5a35ab..760764c 100644
--- a/heron/shell/src/python/handlers/browsehandler.py
+++ b/heron/shell/src/python/handlers/browsehandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/shell/src/python/handlers/downloadhandler.py b/heron/shell/src/python/handlers/downloadhandler.py
index 9629e3f..20e6adf 100644
--- a/heron/shell/src/python/handlers/downloadhandler.py
+++ b/heron/shell/src/python/handlers/downloadhandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/shell/src/python/handlers/filedatahandler.py b/heron/shell/src/python/handlers/filedatahandler.py
index 7518560..d9531c0 100644
--- a/heron/shell/src/python/handlers/filedatahandler.py
+++ b/heron/shell/src/python/handlers/filedatahandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -41,7 +41,7 @@
       self.write("Only relative paths are allowed")
       self.set_status(403)
       self.finish()
-      return
+      return None
 
     offset = self.get_argument("offset", default=-1)
     length = self.get_argument("length", default=-1)
@@ -50,3 +50,4 @@
     data = utils.read_chunk(path, offset=offset, length=length, escape_data=True)
     self.write(json.dumps(data))
     self.finish()
+    return None
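
The trailing `return None` added here (and in several files below) is presumably to satisfy the upgraded pylint's `inconsistent-return-statements` check: once one branch of a function returns a value, every exit path should return explicitly. Illustration (not Heron code; `get_offset` is hypothetical):

```python
def get_offset(args: dict):
    """Return an int offset if present, otherwise None - explicitly."""
    if "offset" in args:
        return int(args["offset"])
    return None   # explicit, rather than silently falling off the end
```
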
diff --git a/heron/shell/src/python/handlers/filehandler.py b/heron/shell/src/python/handlers/filehandler.py
index 55fbf2c..1648c81 100644
--- a/heron/shell/src/python/handlers/filehandler.py
+++ b/heron/shell/src/python/handlers/filehandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -54,3 +54,4 @@
     )
     self.write(t.generate(**args))
     self.finish()
+    return
diff --git a/heron/shell/src/python/handlers/filestatshandler.py b/heron/shell/src/python/handlers/filestatshandler.py
index a3774ca..0c32af9 100644
--- a/heron/shell/src/python/handlers/filestatshandler.py
+++ b/heron/shell/src/python/handlers/filestatshandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/shell/src/python/handlers/healthhandler.py b/heron/shell/src/python/handlers/healthhandler.py
index 2455ad5..19c8585 100644
--- a/heron/shell/src/python/handlers/healthhandler.py
+++ b/heron/shell/src/python/handlers/healthhandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/shell/src/python/handlers/jmaphandler.py b/heron/shell/src/python/handlers/jmaphandler.py
index d147e5c..0352add 100644
--- a/heron/shell/src/python/handlers/jmaphandler.py
+++ b/heron/shell/src/python/handlers/jmaphandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/shell/src/python/handlers/jstackhandler.py b/heron/shell/src/python/handlers/jstackhandler.py
index 3e5181c..dd1c6ab 100644
--- a/heron/shell/src/python/handlers/jstackhandler.py
+++ b/heron/shell/src/python/handlers/jstackhandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/shell/src/python/handlers/killexecutorhandler.py b/heron/shell/src/python/handlers/killexecutorhandler.py
index 6a04161..1dadff0 100644
--- a/heron/shell/src/python/handlers/killexecutorhandler.py
+++ b/heron/shell/src/python/handlers/killexecutorhandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -20,6 +20,7 @@
 
 
 ''' killexecutorhandler.py '''
+# pylint: disable=wrong-import-order
 from future.standard_library import install_aliases
 install_aliases()
 
@@ -67,11 +68,11 @@
           fh = open(filepath)
           firstLine = int(fh.readline())
           fh.close()
-          logger.info("Killing process " + instanceId + " " + str(firstLine))
+          logger.info("Killing process %s %s", instanceId, firstLine)
           os.kill(firstLine, signal.SIGTERM)
           status_finish(200)
       else: # instance_id not found
-        logger.info(filepath + " not found")
+        logger.info("%s not found", filepath)
         status_finish(422)
     else: # instance_id not given, which means kill the container
       kill_parent()
diff --git a/heron/shell/src/python/handlers/memoryhistogramhandler.py b/heron/shell/src/python/handlers/memoryhistogramhandler.py
index 7b355cf..5ea48c7 100644
--- a/heron/shell/src/python/handlers/memoryhistogramhandler.py
+++ b/heron/shell/src/python/handlers/memoryhistogramhandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/shell/src/python/handlers/pidhandler.py b/heron/shell/src/python/handlers/pidhandler.py
index 7d8b2a8..b10eb03 100644
--- a/heron/shell/src/python/handlers/pidhandler.py
+++ b/heron/shell/src/python/handlers/pidhandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/shell/src/python/handlers/pmaphandler.py b/heron/shell/src/python/handlers/pmaphandler.py
index acef828..fb507ce 100644
--- a/heron/shell/src/python/handlers/pmaphandler.py
+++ b/heron/shell/src/python/handlers/pmaphandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/shell/src/python/main.py b/heron/shell/src/python/main.py
index 0f375c8..c476eca 100644
--- a/heron/shell/src/python/main.py
+++ b/heron/shell/src/python/main.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python2.7
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/shell/src/python/utils.py b/heron/shell/src/python/utils.py
index ffbe2ce..e5e1cd8 100644
--- a/heron/shell/src/python/utils.py
+++ b/heron/shell/src/python/utils.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -46,10 +46,9 @@
     ''' stat type'''
     if stat.S_ISDIR(md):
       return 'd'
-    elif stat.S_ISSOCK(md):
+    if stat.S_ISSOCK(md):
       return 's'
-    else:
-      return '-'
+    return '-'
 
   def triple(md):
     ''' triple '''
@@ -170,7 +169,8 @@
   Runs the command and returns its stdout and stderr.
   """
   process = subprocess.Popen(cmd, stdout=subprocess.PIPE,
-                             stderr=subprocess.PIPE, cwd=cwd, env=env)
+                             stderr=subprocess.PIPE, cwd=cwd,
+                             env=env, universal_newlines=True)
   stdout_builder, stderr_builder = proc.async_stdout_stderr_builder(process)
   process.wait()
   stdout, stderr = stdout_builder.result(), stderr_builder.result()
@@ -197,7 +197,7 @@
 
 def get_asset(asset_name):
   ''' get assset '''
-  return pkgutil.get_data("heron.shell", os.path.join("assets", asset_name))
+  return pkgutil.get_data("heron.shell", os.path.join("assets", asset_name)).decode()
 
 def check_path(path):
   """
diff --git a/heron/statemgrs/src/python/BUILD b/heron/statemgrs/src/python/BUILD
index e2421df..6c0a382 100644
--- a/heron/statemgrs/src/python/BUILD
+++ b/heron/statemgrs/src/python/BUILD
@@ -5,7 +5,7 @@
     srcs = glob(["**/*.py"]),
     reqs = [
         "PyYAML==3.13",
-        "kazoo==1.3.1",
+        "kazoo==2.7.0",
         "zope.interface==4.0.5",
     ],
     deps = [
diff --git a/heron/statemgrs/src/python/config.py b/heron/statemgrs/src/python/config.py
index 14975b3..d6a282b 100644
--- a/heron/statemgrs/src/python/config.py
+++ b/heron/statemgrs/src/python/config.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -20,7 +20,7 @@
 
 ''' config.py '''
 
-class Config(object):
+class Config:
   """
   Responsible for reading the yaml config files and
   exposing state locations through python methods.
diff --git a/heron/statemgrs/src/python/configloader.py b/heron/statemgrs/src/python/configloader.py
index 384a5c7..9fe98f2 100644
--- a/heron/statemgrs/src/python/configloader.py
+++ b/heron/statemgrs/src/python/configloader.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python2.7
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/statemgrs/src/python/filestatemanager.py b/heron/statemgrs/src/python/filestatemanager.py
index ca56e56..3ece907 100644
--- a/heron/statemgrs/src/python/filestatemanager.py
+++ b/heron/statemgrs/src/python/filestatemanager.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -101,9 +101,9 @@
       """
       for topology, callbacks in list(watchers.items()):
         file_path = os.path.join(path, topology)
-        data = ""
+        data = b""
         if os.path.exists(file_path):
-          with open(os.path.join(path, topology)) as f:
+          with open(os.path.join(path, topology), "rb") as f:
             data = f.read()
         if topology not in directory or data != directory[topology]:
           proto_object = ProtoClass()
@@ -167,6 +167,7 @@
       topologies_path = self.get_topologies_path()
       return [f for f in os.listdir(topologies_path)
               if os.path.isfile(os.path.join(topologies_path, f))]
+    return None
 
   def get_topology(self, topologyName, callback=None):
     """get topology"""
@@ -174,23 +175,22 @@
       self.topology_watchers[topologyName].append(callback)
     else:
       topology_path = self.get_topology_path(topologyName)
-      with open(topology_path) as f:
+      with open(topology_path, "rb") as f:
         data = f.read()
         topology = Topology()
         topology.ParseFromString(data)
         return topology
+    return None
 
   def create_topology(self, topologyName, topology):
     """
     Create path is currently not supported in file based state manager.
     """
-    pass
 
   def delete_topology(self, topologyName):
     """
     Delete path is currently not supported in file based state manager.
     """
-    pass
 
   def get_packing_plan(self, topologyName, callback=None):
     """ get packing plan """
@@ -198,7 +198,7 @@
       self.packing_plan_watchers[topologyName].append(callback)
     else:
       packing_plan_path = self.get_packing_plan_path(topologyName)
-      with open(packing_plan_path) as f:
+      with open(packing_plan_path, "rb") as f:
         data = f.read()
         packing_plan = PackingPlan()
         packing_plan.ParseFromString(data)
@@ -211,23 +211,22 @@
       self.pplan_watchers[topologyName].append(callback)
     else:
       pplan_path = self.get_pplan_path(topologyName)
-      with open(pplan_path) as f:
+      with open(pplan_path, "rb") as f:
         data = f.read()
         pplan = PhysicalPlan()
         pplan.ParseFromString(data)
         return pplan
+    return None
 
   def create_pplan(self, topologyName, pplan):
     """
     Create path is currently not supported in file based state manager.
     """
-    pass
 
   def delete_pplan(self, topologyName):
     """
     Delete path is currently not supported in file based state manager.
     """
-    pass
 
   def get_execution_state(self, topologyName, callback=None):
     """
@@ -237,23 +236,22 @@
       self.execution_state_watchers[topologyName].append(callback)
     else:
       execution_state_path = self.get_execution_state_path(topologyName)
-      with open(execution_state_path) as f:
+      with open(execution_state_path, "rb") as f:
         data = f.read()
         executionState = ExecutionState()
         executionState.ParseFromString(data)
         return executionState
+    return None
 
   def create_execution_state(self, topologyName, executionState):
     """
     Create path is currently not supported in file based state manager.
     """
-    pass
 
   def delete_execution_state(self, topologyName):
     """
     Delete path is currently not supported in file based state manager.
     """
-    pass
 
   def get_tmaster(self, topologyName, callback=None):
     """
@@ -263,11 +261,12 @@
       self.tmaster_watchers[topologyName].append(callback)
     else:
       tmaster_path = self.get_tmaster_path(topologyName)
-      with open(tmaster_path) as f:
+      with open(tmaster_path, "rb") as f:
         data = f.read()
         tmaster = TMasterLocation()
         tmaster.ParseFromString(data)
         return tmaster
+    return None
 
   def get_scheduler_location(self, topologyName, callback=None):
     """
@@ -277,8 +276,9 @@
       self.scheduler_location_watchers[topologyName].append(callback)
     else:
       scheduler_location_path = self.get_scheduler_location_path(topologyName)
-      with open(scheduler_location_path) as f:
+      with open(scheduler_location_path, "rb") as f:
         data = f.read()
         scheduler_location = SchedulerLocation()
         scheduler_location.ParseFromString(data)
         return scheduler_location
+    return None
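
Beyond the shebang, the substantive change in this file is reading topology and plan files in binary mode: protobuf's `ParseFromString()` expects `bytes`, and a text-mode `read()` returns `str` under Python 3. A minimal sketch of the pattern (not Heron code; `read_serialized` is a hypothetical helper):

```python
def read_serialized(path: str) -> bytes:
    """Read a serialized protobuf payload; 'rb' keeps it as bytes."""
    with open(path, "rb") as f:
        return f.read()

# usage, assuming a generated protobuf class such as PhysicalPlan:
#   pplan = PhysicalPlan()
#   pplan.ParseFromString(read_serialized(pplan_path))
```
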
diff --git a/heron/statemgrs/src/python/log.py b/heron/statemgrs/src/python/log.py
index cf23b29..3c3f800 100644
--- a/heron/statemgrs/src/python/log.py
+++ b/heron/statemgrs/src/python/log.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/statemgrs/src/python/stateexceptions.py b/heron/statemgrs/src/python/stateexceptions.py
index 69380d9..639da76 100644
--- a/heron/statemgrs/src/python/stateexceptions.py
+++ b/heron/statemgrs/src/python/stateexceptions.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/statemgrs/src/python/statemanager.py b/heron/statemgrs/src/python/statemanager.py
index a4affff..655cf25 100644
--- a/heron/statemgrs/src/python/statemanager.py
+++ b/heron/statemgrs/src/python/statemanager.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -127,12 +127,10 @@
   @abc.abstractmethod
   def start(self):
     """ If the state manager needs to connect to a remote host. """
-    pass
 
   @abc.abstractmethod
   def stop(self):
     """ If the state manager had connected to a remote server, it would need to stop as well. """
-    pass
 
   def get_topologies_path(self):
     return HERON_TOPOLOGIES_KEY.format(self.rootpath)
@@ -179,7 +177,6 @@
     sets watch on the path and calls the callback
     with the new packing_plan.
     """
-    pass
 
   @abc.abstractmethod
   def get_pplan(self, topologyName, callback=None):
diff --git a/heron/statemgrs/src/python/statemanagerfactory.py b/heron/statemgrs/src/python/statemanagerfactory.py
index 86b51f7..a0d03b5 100644
--- a/heron/statemgrs/src/python/statemanagerfactory.py
+++ b/heron/statemgrs/src/python/statemanagerfactory.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/statemgrs/src/python/zkstatemanager.py b/heron/statemgrs/src/python/zkstatemanager.py
index 438a019..9f29390 100644
--- a/heron/statemgrs/src/python/zkstatemanager.py
+++ b/heron/statemgrs/src/python/zkstatemanager.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -206,9 +206,6 @@
     except ZookeeperError:
       raise_(StateException("Zookeeper while creating topology",
                             StateException.EX_TYPE_ZOOKEEPER_ERROR), sys.exc_info()[2])
-    except Exception:
-      # Just re raise the exception.
-      raise
 
   def delete_topology(self, topologyName):
     """ delete topology """
@@ -227,9 +224,6 @@
     except ZookeeperError:
       raise_(StateException("Zookeeper while deleting topology",
                             StateException.EX_TYPE_ZOOKEEPER_ERROR), sys.exc_info()[2])
-    except Exception:
-      # Just re raise the exception.
-      raise
 
   def get_packing_plan(self, topologyName, callback=None):
     """ get packing plan """
@@ -349,9 +343,6 @@
     except ZookeeperError:
       raise_(StateException("Zookeeper while creating pplan",
                             StateException.EX_TYPE_ZOOKEEPER_ERROR), sys.exc_info()[2])
-    except Exception:
-      # Just re raise the exception.
-      raise
 
   def delete_pplan(self, topologyName):
     """ delete physical plan info """
@@ -370,9 +361,6 @@
     except ZookeeperError:
       raise_(StateException("Zookeeper while deleting pplan",
                             StateException.EX_TYPE_ZOOKEEPER_ERROR), sys.exc_info()[2])
-    except Exception:
-      # Just re raise the exception.
-      raise
 
   def get_execution_state(self, topologyName, callback=None):
     """ get execution state """
@@ -445,9 +433,6 @@
     except ZookeeperError:
       raise_(StateException("Zookeeper while creating execution state",
                             StateException.EX_TYPE_ZOOKEEPER_ERROR), sys.exc_info()[2])
-    except Exception:
-      # Just re raise the exception.
-      raise
 
   def delete_execution_state(self, topologyName):
     """ delete execution state """
@@ -466,9 +451,6 @@
     except ZookeeperError:
       raise_(StateException("Zookeeper while deleting execution state",
                             StateException.EX_TYPE_ZOOKEEPER_ERROR), sys.exc_info()[2])
-    except Exception:
-      # Just re raise the exception.
-      raise
 
   def get_tmaster(self, topologyName, callback=None):
     """ get tmaster """
diff --git a/heron/statemgrs/tests/python/configloader_unittest.py b/heron/statemgrs/tests/python/configloader_unittest.py
index 3549e66..7018cad 100644
--- a/heron/statemgrs/tests/python/configloader_unittest.py
+++ b/heron/statemgrs/tests/python/configloader_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/statemgrs/tests/python/statemanagerfactory_unittest.py b/heron/statemgrs/tests/python/statemanagerfactory_unittest.py
index dea8cfe..0dcfcda 100644
--- a/heron/statemgrs/tests/python/statemanagerfactory_unittest.py
+++ b/heron/statemgrs/tests/python/statemanagerfactory_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/statemgrs/tests/python/zkstatemanager_unittest.py b/heron/statemgrs/tests/python/zkstatemanager_unittest.py
index 8decf63..ee6da0e 100644
--- a/heron/statemgrs/tests/python/zkstatemanager_unittest.py
+++ b/heron/statemgrs/tests/python/zkstatemanager_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/admin/src/python/main.py b/heron/tools/admin/src/python/main.py
index 9d01881..de9597e 100644
--- a/heron/tools/admin/src/python/main.py
+++ b/heron/tools/admin/src/python/main.py
@@ -16,7 +16,7 @@
 #  under the License.
 
 
-# !/usr/bin/env python2.7
+# !/usr/bin/env python3
 ''' main.py '''
 import argparse
 import os
@@ -100,9 +100,8 @@
 
   if command in handlers:
     return handlers[command].run(command, parser, command_args, unknown_args)
-  else:
-    err_context = 'Unknown subcommand: %s' % command
-    return result.SimpleResult(result.Status.InvocationError, err_context)
+  err_context = 'Unknown subcommand: %s' % command
+  return result.SimpleResult(result.Status.InvocationError, err_context)
 
 def cleanup(files):
   '''
diff --git a/heron/tools/admin/src/python/standalone.py b/heron/tools/admin/src/python/standalone.py
index b7afdff..656f720 100644
--- a/heron/tools/admin/src/python/standalone.py
+++ b/heron/tools/admin/src/python/standalone.py
@@ -41,7 +41,7 @@
 # pylint: disable=unused-argument
 # pylint: disable=too-many-branches
 
-class Action(object):
+class Action:
   SET = "set"
   CLUSTER = "cluster"
   TEMPLATE = "template"
@@ -50,17 +50,17 @@
 
 TYPE = "type"
 
-class Role(object):
+class Role:
   ZOOKEEPERS = "zookeepers"
   MASTERS = "masters"
   SLAVES = "slaves"
   CLUSTER = "cluster"
 
-class Cluster(object):
+class Cluster:
   START = "start"
   STOP = "stop"
 
-class Get(object):
+class Get:
   SERVICE_URL = "service-url"
   HERON_TRACKER_URL = "heron-tracker-url"
   HERON_UI_URL = "heron-ui-url"
@@ -423,6 +423,7 @@
     Log.debug(cmd)
     pid = subprocess.Popen(cmd,
                            shell=True,
+                           universal_newlines=True,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
 
@@ -437,6 +438,7 @@
     Log.debug(cmd)
     pid = subprocess.Popen(cmd,
                            shell=True,
+                           universal_newlines=True,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
 
@@ -495,6 +497,7 @@
   Log.debug(cmd)
   pid = subprocess.Popen(cmd,
                          shell=True,
+                         universal_newlines=True,
                          stdout=subprocess.PIPE,
                          stderr=subprocess.PIPE)
 
@@ -524,6 +527,7 @@
   Log.debug(cmd)
   pid = subprocess.Popen(cmd,
                          shell=True,
+                         universal_newlines=True,
                          stdout=subprocess.PIPE,
                          stderr=subprocess.PIPE)
 
@@ -581,8 +585,7 @@
       r = requests.get("http://%s:4646/v1/job/%s" % (single_master, job))
       if r.status_code == 200 and r.json()["Status"] == "running":
         break
-      else:
-        raise RuntimeError()
+      raise RuntimeError()
     except:
       Log.debug(sys.exc_info()[0])
       Log.info("Waiting for %s to come up... %s" % (job, i))
@@ -612,6 +615,7 @@
     Log.debug(cmd)
     pid = subprocess.Popen(cmd,
                            shell=True,
+                           universal_newlines=True,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
     pids.append({"pid": pid, "dest": dest})
@@ -653,6 +657,7 @@
     Log.debug(cmd)
     pid = subprocess.Popen(cmd,
                            shell=True,
+                           universal_newlines=True,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
     pids.append({"pid": pid, "dest": master})
@@ -687,6 +692,7 @@
     Log.debug(cmd)
     pid = subprocess.Popen(cmd,
                            shell=True,
+                           universal_newlines=True,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
     pids.append({"pid": pid, "dest": slave})
@@ -860,12 +866,12 @@
 
 def check_sure(cl_args, prompt):
   yes = input("%s" % prompt + ' (yes/no): ')
-  if yes == "y" or yes == "yes":
+  if yes in ("y", "yes"):
     return True
-  elif yes == "n" or yes == "no":
+  if yes in ("n", "no"):
     return False
-  else:
-    print('Invalid input.  Please input "yes" or "no"')
+  print('Invalid input.  Please input "yes" or "no"')
+  return None
 
 def get_jobs(cl_args, nomad_addr):
   r = requests.get("http://%s:4646/v1/jobs" % nomad_addr)
diff --git a/heron/tools/cli/src/python/activate.py b/heron/tools/cli/src/python/activate.py
index daabcb5..3567484 100644
--- a/heron/tools/cli/src/python/activate.py
+++ b/heron/tools/cli/src/python/activate.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/cli/src/python/args.py b/heron/tools/cli/src/python/args.py
index 3d280ce..60879d9 100644
--- a/heron/tools/cli/src/python/args.py
+++ b/heron/tools/cli/src/python/args.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/cli/src/python/cdefs.py b/heron/tools/cli/src/python/cdefs.py
index a8ca744..d89cb25 100644
--- a/heron/tools/cli/src/python/cdefs.py
+++ b/heron/tools/cli/src/python/cdefs.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/cli/src/python/cli_helper.py b/heron/tools/cli/src/python/cli_helper.py
index 4b73661..da228aa 100644
--- a/heron/tools/cli/src/python/cli_helper.py
+++ b/heron/tools/cli/src/python/cli_helper.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -152,5 +152,4 @@
 def run(command, cl_args, action, extra_lib_jars=[]):
   if cl_args['deploy_mode'] == config.SERVER_MODE:
     return run_server(command, cl_args, action, extra_args=dict())
-  else:
-    return run_direct(command, cl_args, action, extra_args=[], extra_lib_jars=extra_lib_jars)
+  return run_direct(command, cl_args, action, extra_args=[], extra_lib_jars=extra_lib_jars)
diff --git a/heron/tools/cli/src/python/cliconfig.py b/heron/tools/cli/src/python/cliconfig.py
index 2daf6f0..f88fd70 100644
--- a/heron/tools/cli/src/python/cliconfig.py
+++ b/heron/tools/cli/src/python/cliconfig.py
@@ -79,7 +79,7 @@
     config_directory = get_config_directory(cluster)
     if not os.path.isdir(config_directory):
       os.makedirs(config_directory)
-    with open(cluster_config_file, 'wb') as cf:
+    with open(cluster_config_file, 'w') as cf:
       yaml.dump(config, cf, default_flow_style=False)
   else:
     if os.path.isfile(cluster_config_file):
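
The `'wb'` to `'w'` switch matters because PyYAML's `dump()` writes `str` to the stream unless an explicit `encoding` is passed, and writing `str` to a binary-mode file raises `TypeError` under Python 3. Sketch (file name and value are illustrative):

```python
import yaml

config = {"service_url": "http://localhost:9000"}   # illustrative content
with open("cluster.yaml", "w") as cf:                # text mode for str output
    yaml.dump(config, cf, default_flow_style=False)
```
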
diff --git a/heron/tools/cli/src/python/config.py b/heron/tools/cli/src/python/config.py
index a48a4f0..137a969 100644
--- a/heron/tools/cli/src/python/config.py
+++ b/heron/tools/cli/src/python/config.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -134,7 +134,6 @@
   configcommand = cl_args.get('configcommand', None)
   if configcommand == 'set':
     return _set(cl_args)
-  elif configcommand == 'unset':
+  if configcommand == 'unset':
     return _unset(cl_args)
-  else:
-    return _list(cl_args)
+  return _list(cl_args)
diff --git a/heron/tools/cli/src/python/deactivate.py b/heron/tools/cli/src/python/deactivate.py
index 9534de8..7cef5ab 100644
--- a/heron/tools/cli/src/python/deactivate.py
+++ b/heron/tools/cli/src/python/deactivate.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/cli/src/python/execute.py b/heron/tools/cli/src/python/execute.py
index 9534d1c..bff33b5 100644
--- a/heron/tools/cli/src/python/execute.py
+++ b/heron/tools/cli/src/python/execute.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -22,6 +22,7 @@
 import contextlib
 import os
 import subprocess
+import shlex
 import tarfile
 import tempfile
 import traceback
@@ -61,7 +62,7 @@
 
   java_path = config.get_java_path()
   if java_path is None:
-    err_context = "Neither JAVA_BIN or JAVA_HOME are set"
+    err_context = "Unable to find java command"
     return SimpleResult(Status.InvocationError, err_context)
 
   # Construct the command line for the sub process to run
@@ -78,12 +79,12 @@
   heron_env['HERON_OPTIONS'] = opts.get_heron_config()
 
   # print the verbose message
-  Log.debug("Invoking class using command: ``%s''", ' '.join(all_args))
+  Log.debug("Invoking class using command: `%s`", ' '.join(shlex.quote(a) for a in all_args))
   Log.debug("Heron options: {%s}", str(heron_env["HERON_OPTIONS"]))
 
   # invoke the command with subprocess and print error message, if any
   process = subprocess.Popen(all_args, env=heron_env, stdout=subprocess.PIPE,
-                             stderr=subprocess.PIPE, bufsize=1)
+                             stderr=subprocess.PIPE, universal_newlines=True, bufsize=1)
   # stdout message has the information Java program sends back
   # stderr message has extra information, such as debugging message
   return ProcessResult(process)
@@ -121,6 +122,7 @@
   return heron_class(class_name, lib_jars, extra_jars, arguments, java_defines)
 
 def heron_pex(topology_pex, topology_class_name, args=None):
+  """Use a topology defined in a PEX."""
   Log.debug("Importing %s from %s", topology_class_name, topology_pex)
   if topology_class_name == '-':
     # loading topology by running its main method (if __name__ == "__main__")
@@ -133,27 +135,28 @@
     Log.debug('Heron options: {%s}', str(heron_env['HERON_OPTIONS']))
     # invoke the command with subprocess and print error message, if any
     process = subprocess.Popen(cmd, env=heron_env, stdout=subprocess.PIPE,
-                               stderr=subprocess.PIPE, bufsize=1)
+                               stderr=subprocess.PIPE, universal_newlines=True, bufsize=1)
+    # pylint: disable=fixme
     # todo(rli): improve python topology submission workflow
     return ProcessResult(process)
-  else:
-    try:
-      # loading topology from Topology's subclass (no main method)
-      # to support specifying the name of topology
-      Log.debug("args: %s", args)
-      if args is not None and isinstance(args, (list, tuple)) and len(args) > 0:
-        opts.set_config('cmdline.topology.name', args[0])
-      os.environ["HERON_OPTIONS"] = opts.get_heron_config()
-      Log.debug("Heron options: {%s}", os.environ["HERON_OPTIONS"])
-      pex_loader.load_pex(topology_pex)
-      topology_class = pex_loader.import_and_get_class(topology_pex, topology_class_name)
-      topology_class.write()
-      return SimpleResult(Status.Ok)
-    except Exception as ex:
-      Log.debug(traceback.format_exc())
-      err_context = "Topology %s failed to be loaded from the given pex: %s" %\
-                (topology_class_name, ex)
-      return SimpleResult(Status.HeronError, err_context)
+  try:
+    # loading topology from Topology's subclass (no main method)
+    # to support specifying the name of topology
+    Log.debug("args: %s", args)
+    if args is not None and isinstance(args, (list, tuple)) and len(args) > 0:
+      opts.set_config('cmdline.topology.name', args[0])
+    os.environ["HERON_OPTIONS"] = opts.get_heron_config()
+    Log.debug("Heron options: {%s}", os.environ["HERON_OPTIONS"])
+    pex_loader.load_pex(topology_pex)
+    topology_class = pex_loader.import_and_get_class(topology_pex, topology_class_name)
+    topology_class.write()
+    return SimpleResult(Status.Ok)
+  except Exception as ex:
+    Log.debug(traceback.format_exc())
+    err_context = "Topology %s failed to be loaded from the given pex: %s" %\
+              (topology_class_name, ex)
+    return SimpleResult(Status.HeronError, err_context)
+  return None
 
 # pylint: disable=superfluous-parens
 def heron_cpp(topology_binary, args=None):
@@ -169,5 +172,5 @@
   print('Heron options: {%s}' % str(heron_env['HERON_OPTIONS']))
   # invoke the command with subprocess and print error message, if any
   proc = subprocess.Popen(cmd, env=heron_env, stdout=subprocess.PIPE,
-                          stderr=subprocess.PIPE, bufsize=1)
+                          stderr=subprocess.PIPE, universal_newlines=True, bufsize=1)
   return ProcessResult(proc)
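
The debug log now quotes each argument with `shlex.quote()`, so a logged command containing spaces or shell metacharacters can be copied and re-run verbatim. Small example (the argument values are made up):

```python
import shlex

all_args = ["java", "-cp", "heron libs/heron-cli.jar", "org.example.Main"]
print(" ".join(shlex.quote(a) for a in all_args))
# java -cp 'heron libs/heron-cli.jar' org.example.Main
```

The generator form is used rather than `shlex.join()`, which only exists from Python 3.8 onward.
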
diff --git a/heron/tools/cli/src/python/help.py b/heron/tools/cli/src/python/help.py
index 6edd8a0..6c9d35a 100644
--- a/heron/tools/cli/src/python/help.py
+++ b/heron/tools/cli/src/python/help.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -69,6 +69,5 @@
   if subparser:
     print(subparser.format_help())
     return SimpleResult(Status.Ok)
-  else:
-    Log.error("Unknown subcommand \'%s\'", command_help)
-    return SimpleResult(Status.InvocationError)
+  Log.error("Unknown subcommand \'%s\'", command_help)
+  return SimpleResult(Status.InvocationError)
diff --git a/heron/tools/cli/src/python/jars.py b/heron/tools/cli/src/python/jars.py
index 611de05..3215294 100644
--- a/heron/tools/cli/src/python/jars.py
+++ b/heron/tools/cli/src/python/jars.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/cli/src/python/kill.py b/heron/tools/cli/src/python/kill.py
index d290269..2b85fcd 100644
--- a/heron/tools/cli/src/python/kill.py
+++ b/heron/tools/cli/src/python/kill.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/cli/src/python/main.py b/heron/tools/cli/src/python/main.py
index ac43e8d..a4a060e 100644
--- a/heron/tools/cli/src/python/main.py
+++ b/heron/tools/cli/src/python/main.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python2.7
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -127,9 +127,8 @@
 
   if command in handlers:
     return handlers[command].run(command, parser, command_args, unknown_args)
-  else:
-    err_context = 'Unknown subcommand: %s' % command
-    return result.SimpleResult(result.Status.InvocationError, err_context)
+  err_context = 'Unknown subcommand: %s' % command
+  return result.SimpleResult(result.Status.InvocationError, err_context)
 
 def cleanup(files):
   '''
diff --git a/heron/tools/cli/src/python/opts.py b/heron/tools/cli/src/python/opts.py
index be29bf2..160be97 100644
--- a/heron/tools/cli/src/python/opts.py
+++ b/heron/tools/cli/src/python/opts.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/cli/src/python/rest.py b/heron/tools/cli/src/python/rest.py
index cc95a4f..ec6b554 100644
--- a/heron/tools/cli/src/python/rest.py
+++ b/heron/tools/cli/src/python/rest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/cli/src/python/restart.py b/heron/tools/cli/src/python/restart.py
index 9b731fa..06f42d3 100644
--- a/heron/tools/cli/src/python/restart.py
+++ b/heron/tools/cli/src/python/restart.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -73,6 +73,5 @@
   if cl_args['deploy_mode'] == config.SERVER_MODE:
     dict_extra_args = {"container_id": str(container_id)}
     return cli_helper.run_server(command, cl_args, message, extra_args=dict_extra_args)
-  else:
-    list_extra_args = ["--container_id", str(container_id)]
-    return cli_helper.run_direct(command, cl_args, message, extra_args=list_extra_args)
+  list_extra_args = ["--container_id", str(container_id)]
+  return cli_helper.run_direct(command, cl_args, message, extra_args=list_extra_args)
diff --git a/heron/tools/cli/src/python/result.py b/heron/tools/cli/src/python/result.py
index e8b556d..df65293 100644
--- a/heron/tools/cli/src/python/result.py
+++ b/heron/tools/cli/src/python/result.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -52,14 +52,13 @@
 def status_type(status_code):
   if status_code == 0:
     return Status.Ok
-  elif status_code < 100:
+  if status_code < 100:
     return Status.InvocationError
-  elif status_code == 200:
+  if status_code == 200:
     return Status.DryRun
-  else:
-    return Status.HeronError
+  return Status.HeronError
 
-class Result(object):
+class Result:
   """Result class"""
   def __init__(self, status=None, err_context=None, succ_context=None):
     self.status = status
@@ -89,7 +88,7 @@
       self._do_log(Log.error, self.err_context)
     else:
       raise RuntimeError(
-          "Unknown status type of value %d. Expected value: %s", self.status.value, list(Status))
+          "Unknown status type of value %d. Expected value: %s" % (self.status.value, list(Status)))
 
   def add_context(self, err_context, succ_context=None):
     """ Prepend msg to add some context information
@@ -181,13 +180,12 @@
     for r in results:
       r.render()
   else:
-    raise RuntimeError("Unknown result instance: %s", str(results.__class__))
+    raise RuntimeError("Unknown result instance: %s" % (str(results.__class__),))
 
 # check if all results are successful
 def is_successful(results):
   if isinstance(results, list):
     return all([is_successful(result) for result in results])
-  elif isinstance(results, Result):
+  if isinstance(results, Result):
     return results.status == Status.Ok or results.status == Status.DryRun
-  else:
-    raise RuntimeError("Unknown result instance: %s", str(results.__class__))
+  raise RuntimeError("Unknown result instance: %s" % (str(results.__class__),))
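Editor's note, not part of the diff: a minimal sketch of why the `RuntimeError` calls in result.py switch from logging-style comma arguments to eager `%` formatting. Exception constructors never interpolate extra arguments, so the old form raised a tuple-shaped message. `FakeStatus` below is a hypothetical stand-in, not Heron code.

```python
# Minimal sketch (illustration only): exception constructors, unlike the
# logging API, do not lazily interpolate "%s"-style arguments.

class FakeStatus:
    """Hypothetical stand-in for the Status enum member used in result.py."""
    value = 250

status = FakeStatus()

# Comma form: the format string is never applied; str(err) shows a tuple.
err = RuntimeError("Unknown status type of value %d.", status.value)
print(str(err))   # ('Unknown status type of value %d.', 250)

# Eager % formatting, as in the patched code: a readable message.
err = RuntimeError("Unknown status type of value %d." % status.value)
print(str(err))   # Unknown status type of value 250.
```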
diff --git a/heron/tools/cli/src/python/submit.py b/heron/tools/cli/src/python/submit.py
index 2a9c229..5e6be8d 100644
--- a/heron/tools/cli/src/python/submit.py
+++ b/heron/tools/cli/src/python/submit.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -19,6 +19,7 @@
 #  under the License.
 
 ''' submit.py '''
+# pylint: disable=wrong-import-order
 from future.standard_library import install_aliases
 install_aliases()
 
@@ -399,6 +400,7 @@
   for f in os.listdir(tmp_dir):
     if f.endswith(suffix):
       return os.path.join(tmp_dir, f)
+  return None
 
 ################################################################################
 # pylint: disable=unused-argument
@@ -478,9 +480,8 @@
   # check the extension of the file name to see if it is tar/jar file.
   if jar_type:
     return submit_fatjar(cl_args, unknown_args, tmp_dir)
-  elif tar_type:
+  if tar_type:
     return submit_tar(cl_args, unknown_args, tmp_dir)
-  elif cpp_type:
+  if cpp_type:
     return submit_cpp(cl_args, unknown_args, tmp_dir)
-  else:
-    return submit_pex(cl_args, unknown_args, tmp_dir)
+  return submit_pex(cl_args, unknown_args, tmp_dir)
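Editor's note, not part of the diff: a small sketch of the explicit `return None` added to the tmp-dir lookup in submit.py. The behaviour is unchanged (a function that falls off the end already returns `None`); the newer pylint used in this PR flags the mix of explicit and implicit returns, so the return is spelled out. `find_file_with_suffix` below is a hypothetical stand-in for the real helper.

```python
import os

def find_file_with_suffix(tmp_dir: str, suffix: str):
    """Hypothetical stand-in for the helper patched above."""
    for name in os.listdir(tmp_dir):
        if name.endswith(suffix):
            return os.path.join(tmp_dir, name)
    # Explicit None keeps every code path returning something, which avoids
    # pylint's inconsistent-return-statements warning without changing behaviour.
    return None
```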
diff --git a/heron/tools/cli/src/python/update.py b/heron/tools/cli/src/python/update.py
index bc39c7b..8dddb18 100644
--- a/heron/tools/cli/src/python/update.py
+++ b/heron/tools/cli/src/python/update.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -179,7 +179,6 @@
   # Execute
   if cl_args['deploy_mode'] == config.SERVER_MODE:
     return cli_helper.run_server(command, cl_args, action, dict_extra_args)
-  else:
-    # Convert extra argument to commandline format and then execute
-    list_extra_args = convert_args_dict_to_list(dict_extra_args)
-    return cli_helper.run_direct(command, cl_args, action, list_extra_args, extra_lib_jars)
+  # Convert extra argument to commandline format and then execute
+  list_extra_args = convert_args_dict_to_list(dict_extra_args)
+  return cli_helper.run_direct(command, cl_args, action, list_extra_args, extra_lib_jars)
diff --git a/heron/tools/cli/src/python/version.py b/heron/tools/cli/src/python/version.py
index 581a153..b101f94 100644
--- a/heron/tools/cli/src/python/version.py
+++ b/heron/tools/cli/src/python/version.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/cli/tests/python/client_command_unittest.py b/heron/tools/cli/tests/python/client_command_unittest.py
index 00f35a4..891f31a 100644
--- a/heron/tools/cli/tests/python/client_command_unittest.py
+++ b/heron/tools/cli/tests/python/client_command_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/cli/tests/python/opts_unittest.py b/heron/tools/cli/tests/python/opts_unittest.py
index 2f31ff6..49e128b 100644
--- a/heron/tools/cli/tests/python/opts_unittest.py
+++ b/heron/tools/cli/tests/python/opts_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/common/src/python/access/fetch.py b/heron/tools/common/src/python/access/fetch.py
index 19adf2b..bc8b8a3 100644
--- a/heron/tools/common/src/python/access/fetch.py
+++ b/heron/tools/common/src/python/access/fetch.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/common/src/python/access/heron_api.py b/heron/tools/common/src/python/access/heron_api.py
index c5a5203..6e77e51 100644
--- a/heron/tools/common/src/python/access/heron_api.py
+++ b/heron/tools/common/src/python/access/heron_api.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/common/src/python/access/query.py b/heron/tools/common/src/python/access/query.py
index b65dff6..ba874bc 100644
--- a/heron/tools/common/src/python/access/query.py
+++ b/heron/tools/common/src/python/access/query.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -20,10 +20,10 @@
 
 ''' query.py '''
 
-class QueryHandler(object):
+class QueryHandler:
   ''' QueryHandler '''
 
-  def fetch(self, cluster, metric, topology, component, instance, timerange, envirn=None):
+  def fetch(self, cluster, metric, topology, component, instance, timerange, environ=None):
     '''
     :param cluster:
     :param metric:
@@ -31,12 +31,11 @@
     :param component:
     :param instance:
     :param timerange:
-    :param envirn:
+    :param environ:
     :return:
     '''
-    pass
 
-  def fetch_max(self, cluster, metric, topology, component, instance, timerange, envirn=None):
+  def fetch_max(self, cluster, metric, topology, component, instance, timerange, environ=None):
     '''
     :param cluster:
     :param metric:
@@ -44,10 +43,9 @@
     :param component:
     :param instance:
     :param timerange:
-    :param envirn:
+    :param environ:
     :return:
     '''
-    pass
 
   def fetch_backpressure(self, cluster, metric, topology, component, instance, \
     timerange, is_max, environ=None):
@@ -62,4 +60,3 @@
     :param environ:
     :return:
     '''
-    pass
diff --git a/heron/tools/common/src/python/access/tracker_access.py b/heron/tools/common/src/python/access/tracker_access.py
index 8b49b97..c0700f5 100644
--- a/heron/tools/common/src/python/access/tracker_access.py
+++ b/heron/tools/common/src/python/access/tracker_access.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/common/src/python/utils/classpath.py b/heron/tools/common/src/python/utils/classpath.py
index a0293af..712ae1f 100644
--- a/heron/tools/common/src/python/utils/classpath.py
+++ b/heron/tools/common/src/python/utils/classpath.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -39,11 +39,10 @@
   Log.debug('Checking classpath entry as directory: %s', path)
   if os.path.isdir(path):
     return True
-  else:
-    # check if the classpath entry is a file
-    Log.debug('Checking classpath entry as file: %s', path)
-    if os.path.isfile(path):
-      return True
+  # check if the classpath entry is a file
+  Log.debug('Checking classpath entry as file: %s', path)
+  if os.path.isfile(path):
+    return True
 
   return False
 
diff --git a/heron/tools/common/src/python/utils/config.py b/heron/tools/common/src/python/utils/config.py
index 83afb24..9747021 100644
--- a/heron/tools/common/src/python/utils/config.py
+++ b/heron/tools/common/src/python/utils/config.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -25,9 +25,11 @@
 import getpass
 import os
 import sys
+import shutil
 import subprocess
 import tarfile
 import tempfile
+from pathlib import Path
 import yaml
 
 from heron.common.src.python.utils.log import Log
@@ -44,7 +46,6 @@
 LIB_DIR = "lib"
 CLI_DIR = ".heron"
 RELEASE_YAML = "release.yaml"
-ZIPPED_RELEASE_YAML = "scripts/packages/release.yaml"
 OVERRIDE_YAML = "override.yaml"
 
 # mode of deployment
@@ -112,7 +113,7 @@
   normalized class path on cygwin
   '''
   command = ['cygpath', '-wp', x]
-  p = subprocess.Popen(command, stdout=subprocess.PIPE)
+  p = subprocess.Popen(command, stdout=subprocess.PIPE, universal_newlines=True)
   result = p.communicate()
   output = result[0]
   lines = output.split("\n")
@@ -141,6 +142,9 @@
   '''
   return ':'.join(map(normalized_class_path, jars))
 
+def _get_heron_dir():
+  # assuming the tool runs from $HERON_ROOT/bin/<binary>
+  return normalized_class_path(str(Path(sys.argv[0]).resolve(strict=True).parent.parent))
 
 def get_heron_dir():
   """
@@ -155,9 +159,7 @@
 
   :return: root location of the .pex file
   """
-  go_above_dirs = 9
-  path = "/".join(os.path.realpath(__file__).split('/')[:-go_above_dirs])
-  return normalized_class_path(path)
+  return _get_heron_dir()
 
 def get_zipped_heron_dir():
   """
@@ -174,9 +176,7 @@
 
   :return: root location of the .pex file.
   """
-  go_above_dirs = 7
-  path = "/".join(os.path.realpath(__file__).split('/')[:-go_above_dirs])
-  return normalized_class_path(path)
+  return _get_heron_dir()
 
 ################################################################################
 # Get the root of heron dir and various sub directories depending on platform
@@ -216,17 +216,6 @@
   return os.path.join(get_heron_dir(), RELEASE_YAML)
 
 
-def get_zipped_heron_release_file():
-  """
-  This will provide the path to heron release.yaml file.
-  To be used for .pex file built with `zip_safe = False` flag.
-  For example, `heron-ui'.
-
-  :return: absolute path of heron release.yaml file
-  """
-  return os.path.join(get_zipped_heron_dir(), ZIPPED_RELEASE_YAML)
-
-
 def get_heron_cluster_conf_dir(cluster, default_config_path):
   """
   This will provide heron cluster config directory, if config path is default
@@ -289,16 +278,14 @@
         if (ROLE_REQUIRED in cli_confs) and (cli_confs[ROLE_REQUIRED] is True):
           raise Exception("role required but not provided (cluster/role/env = %s). See %s in %s"
                           % (cluster_role_env, ROLE_REQUIRED, cli_conf_file))
-        else:
-          parts.append(getpass.getuser())
+        parts.append(getpass.getuser())
 
       # if environ is required but not provided, raise exception
       if len(parts) == 2:
         if (ENV_REQUIRED in cli_confs) and (cli_confs[ENV_REQUIRED] is True):
           raise Exception("environ required but not provided (cluster/role/env = %s). See %s in %s"
                           % (cluster_role_env, ENV_REQUIRED, cli_conf_file))
-        else:
-          parts.append(ENVIRON)
+        parts.append(ENVIRON)
 
   # if cluster or role or environ is empty, print
   if len(parts[0]) == 0 or len(parts[1]) == 0 or len(parts[2]) == 0:
@@ -342,12 +329,14 @@
       return True
 
     # if role is required but not provided, raise exception
-    role_present = True if len(cluster_role_env[1]) > 0 else False
+    role_present = bool(cluster_role_env[1])
+    # pylint: disable=simplifiable-if-expression
     if ROLE_REQUIRED in client_confs and client_confs[ROLE_REQUIRED] and not role_present:
       raise Exception("role required but not provided (cluster/role/env = %s). See %s in %s"
                       % (cluster_role_env, ROLE_REQUIRED, cli_conf_file))
 
     # if environ is required but not provided, raise exception
+    # pylint: disable=simplifiable-if-expression
     environ_present = True if len(cluster_role_env[2]) > 0 else False
     if ENV_REQUIRED in client_confs and client_confs[ENV_REQUIRED] and not environ_present:
       raise Exception("environ required but not provided (cluster/role/env = %s). See %s in %s"
@@ -362,13 +351,15 @@
   cmap = config_map[cluster_role_env[0]]
 
   # if role is required but not provided, raise exception
-  role_present = True if len(cluster_role_env[1]) > 0 else False
+  role_present = bool(cluster_role_env[1])
+  # pylint: disable=simplifiable-if-expression
   if ROLE_KEY in cmap and cmap[ROLE_KEY] and not role_present:
     raise Exception("role required but not provided (cluster/role/env = %s)."\
         % (cluster_role_env))
 
   # if environ is required but not provided, raise exception
   environ_present = True if len(cluster_role_env[2]) > 0 else False
+  # pylint: disable=simplifiable-if-expression
   if ENVIRON_KEY in cmap and cmap[ENVIRON_KEY] and not environ_present:
     raise Exception("environ required but not provided (cluster/role/env = %s)."\
         % (cluster_role_env))
@@ -430,8 +421,8 @@
   java_home = os.environ.get("JAVA_HOME")
   if java_home:
     return os.path.join(java_home, BIN_DIR, "java")
-  # this could use shutil.which("java") when python2 support is dropped
-  return None
+
+  return shutil.which("java")
 
 
 def check_release_file_exists():
@@ -445,15 +436,9 @@
 
   return True
 
-def print_build_info(zipped_pex=False):
-  """Print build_info from release.yaml
-
-  :param zipped_pex: True if the PEX file is built with flag `zip_safe=False'.
-  """
-  if zipped_pex:
-    release_file = get_zipped_heron_release_file()
-  else:
-    release_file = get_heron_release_file()
+def print_build_info():
+  """Print build_info from release.yaml"""
+  release_file = get_heron_release_file()
 
   with open(release_file) as release_info:
     release_map = yaml.load(release_info)
@@ -461,15 +446,9 @@
     for key, value in release_items:
       print("%s : %s" % (key, value))
 
-def get_version_number(zipped_pex=False):
-  """Print version from release.yaml
-
-  :param zipped_pex: True if the PEX file is built with flag `zip_safe=False'.
-  """
-  if zipped_pex:
-    release_file = get_zipped_heron_release_file()
-  else:
-    release_file = get_heron_release_file()
+def get_version_number():
+  """Print version from release.yaml"""
+  release_file = get_heron_release_file()
   with open(release_file) as release_info:
     for line in release_info:
       trunks = line[:-1].split(' ')
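Editor's note, not part of the diff: a sketch of the three Python 3 idioms the config.py hunks above rely on (pathlib-based root resolution, `universal_newlines` in `Popen`, and `shutil.which`). The paths and command used here are illustrative assumptions only.

```python
import shutil
import subprocess
import sys
from pathlib import Path

# 1. Resolve the install root from the invoked binary instead of counting a
#    fixed number of directories above __file__ (which breaks when the PEX
#    layout changes). Assumes the tool runs as $HERON_ROOT/bin/<binary>,
#    as the comment in the hunk states.
heron_root = Path(sys.argv[0]).resolve(strict=True).parent.parent

# 2. universal_newlines=True makes Popen hand back str instead of bytes on
#    Python 3; the `text=` alias only arrived in 3.7, and the PR targets 3.6.
proc = subprocess.Popen(["echo", "hello"], stdout=subprocess.PIPE,
                        universal_newlines=True)
output, _ = proc.communicate()
first_line = output.split("\n")[0]   # 'hello', already decoded

# 3. shutil.which replaces the old "return None, fix when python2 is dropped"
#    placeholder when JAVA_HOME is not set.
java_path = shutil.which("java")     # absolute path, or None if not installed
print(heron_root, first_line, java_path)
```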
diff --git a/heron/tools/common/src/python/utils/heronparser.py b/heron/tools/common/src/python/utils/heronparser.py
index 142e7e9..ee5a3e9 100755
--- a/heron/tools/common/src/python/utils/heronparser.py
+++ b/heron/tools/common/src/python/utils/heronparser.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -78,8 +78,8 @@
       # it means we have captured a non-quoted (real) comment string.
       if match.group(1) is not None:
         return ""  # so we will return empty to remove the comment
-      else:  # otherwise, we will return the 1st group
-        return match.group(1)  # captured quoted-string
+      # otherwise, we will return the 1st group
+      return match.group(1)  # captured quoted-string
     return regex.sub(_replacer, string)
 
   @classmethod
@@ -101,6 +101,7 @@
 
   @classmethod
   def initializeFromRC(cls, rcfile):
+    """Initialise."""
     if len(cls.cmdmap) > 0:
       return
     effective_rc = (rcfile, HERON_RC)[rcfile is None]
diff --git a/heron/tools/explorer/src/python/args.py b/heron/tools/explorer/src/python/args.py
index 2d5aca4..c312c35 100644
--- a/heron/tools/explorer/src/python/args.py
+++ b/heron/tools/explorer/src/python/args.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/explorer/src/python/clusters.py b/heron/tools/explorer/src/python/clusters.py
index 4efb25a..bdd208f 100644
--- a/heron/tools/explorer/src/python/clusters.py
+++ b/heron/tools/explorer/src/python/clusters.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/explorer/src/python/help.py b/heron/tools/explorer/src/python/help.py
index 37edee9..6995849 100644
--- a/heron/tools/explorer/src/python/help.py
+++ b/heron/tools/explorer/src/python/help.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -60,6 +60,5 @@
   if subparser:
     print(subparser.format_help())
     return True
-  else:
-    Log.error("Unknown subcommand \'%s\'" % command_help)
-    return False
+  Log.error("Unknown subcommand \'%s\'" % command_help)
+  return False
diff --git a/heron/tools/explorer/src/python/logicalplan.py b/heron/tools/explorer/src/python/logicalplan.py
index f6eb245..3b6fa48 100644
--- a/heron/tools/explorer/src/python/logicalplan.py
+++ b/heron/tools/explorer/src/python/logicalplan.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/explorer/src/python/main.py b/heron/tools/explorer/src/python/main.py
index 3093ac9..a1c8224 100644
--- a/heron/tools/explorer/src/python/main.py
+++ b/heron/tools/explorer/src/python/main.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python2.7
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -100,29 +100,29 @@
     return clusters.run(command, *args)
 
   # show topologies
-  elif command == 'topologies':
+  if command == 'topologies':
     return topologies.run(command, *args)
 
   # physical plan
-  elif command == 'containers':
+  if command == 'containers':
     return physicalplan.run_containers(command, *args)
-  elif command == 'metrics':
+  if command == 'metrics':
     return physicalplan.run_metrics(command, *args)
 
   # logical plan
-  elif command == 'components':
+  if command == 'components':
     return logicalplan.run_components(command, *args)
-  elif command == 'spouts':
+  if command == 'spouts':
     return logicalplan.run_spouts(command, *args)
-  elif command == 'bolts':
+  if command == 'bolts':
     return logicalplan.run_bolts(command, *args)
 
   # help
-  elif command == 'help':
+  if command == 'help':
     return help.run(command, *args)
 
   # version
-  elif command == 'version':
+  if command == 'version':
     return version.run(command, *args)
 
   return 1
diff --git a/heron/tools/explorer/src/python/opts.py b/heron/tools/explorer/src/python/opts.py
index 8b3e069..47c91f0 100644
--- a/heron/tools/explorer/src/python/opts.py
+++ b/heron/tools/explorer/src/python/opts.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/explorer/src/python/physicalplan.py b/heron/tools/explorer/src/python/physicalplan.py
index 50ad7c3..a577ee2 100644
--- a/heron/tools/explorer/src/python/physicalplan.py
+++ b/heron/tools/explorer/src/python/physicalplan.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/explorer/src/python/topologies.py b/heron/tools/explorer/src/python/topologies.py
index dab24a7..ecb0dfc 100644
--- a/heron/tools/explorer/src/python/topologies.py
+++ b/heron/tools/explorer/src/python/topologies.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -50,8 +50,7 @@
         count += 1
         if count > max_count:
           continue
-        else:
-          table.append([role, env, topo])
+        table.append([role, env, topo])
   header = ['role', 'env', 'topology']
   rest_count = 0 if count <= max_count else count - max_count
   return table, header, rest_count
@@ -123,10 +122,9 @@
   location = cl_args['cluster/[role]/[env]'].split('/')
   if len(location) == 1:
     return show_cluster(cl_args, *location)
-  elif len(location) == 2:
+  if len(location) == 2:
     return show_cluster_role(cl_args, *location)
-  elif len(location) == 3:
+  if len(location) == 3:
     return show_cluster_role_env(cl_args, *location)
-  else:
-    Log.error('Invalid topologies selection')
-    return False
+  Log.error('Invalid topologies selection')
+  return False
diff --git a/heron/tools/explorer/src/python/version.py b/heron/tools/explorer/src/python/version.py
index 00f7e51..84773f5 100644
--- a/heron/tools/explorer/src/python/version.py
+++ b/heron/tools/explorer/src/python/version.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/explorer/tests/python/explorer_unittest.py b/heron/tools/explorer/tests/python/explorer_unittest.py
index 4070337..ab942fa 100644
--- a/heron/tools/explorer/tests/python/explorer_unittest.py
+++ b/heron/tools/explorer/tests/python/explorer_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/config.py b/heron/tools/tracker/src/python/config.py
index 089b03e..b449cec 100644
--- a/heron/tools/tracker/src/python/config.py
+++ b/heron/tools/tracker/src/python/config.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -28,7 +28,7 @@
 EXTRA_LINK_FORMATTER_KEY = "formatter"
 EXTRA_LINK_URL_KEY = "url"
 
-class Config(object):
+class Config:
   """
   Responsible for reading the yaml config file and
   exposing various tracker configs.
diff --git a/heron/tools/tracker/src/python/constants.py b/heron/tools/tracker/src/python/constants.py
index 695a608..2ae59ba 100644
--- a/heron/tools/tracker/src/python/constants.py
+++ b/heron/tools/tracker/src/python/constants.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/graph.py b/heron/tools/tracker/src/python/graph.py
index 747fd5d..b73f084 100644
--- a/heron/tools/tracker/src/python/graph.py
+++ b/heron/tools/tracker/src/python/graph.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -22,7 +22,7 @@
 
 
 ################################################################################
-class Graph(object):
+class Graph:
   '''
   Adjacency list of edges in graph. This will correspond to the streams in a topology DAG.
   '''
diff --git a/heron/tools/tracker/src/python/handlers/basehandler.py b/heron/tools/tracker/src/python/handlers/basehandler.py
index 300fbf3..49c2892 100644
--- a/heron/tools/tracker/src/python/handlers/basehandler.py
+++ b/heron/tools/tracker/src/python/handlers/basehandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/clustershandler.py b/heron/tools/tracker/src/python/handlers/clustershandler.py
index d8aeede..73af255 100644
--- a/heron/tools/tracker/src/python/handlers/clustershandler.py
+++ b/heron/tools/tracker/src/python/handlers/clustershandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/containerfilehandler.py b/heron/tools/tracker/src/python/handlers/containerfilehandler.py
index 234ae20..749d0d5 100644
--- a/heron/tools/tracker/src/python/handlers/containerfilehandler.py
+++ b/heron/tools/tracker/src/python/handlers/containerfilehandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -103,6 +103,7 @@
 
   @tornado.gen.coroutine
   def get(self):
+    """Serve a GET request."""
     try:
       cluster = self.get_argument_cluster()
       role = self.get_argument_role()
diff --git a/heron/tools/tracker/src/python/handlers/defaulthandler.py b/heron/tools/tracker/src/python/handlers/defaulthandler.py
index ff8093e..a30443b 100644
--- a/heron/tools/tracker/src/python/handlers/defaulthandler.py
+++ b/heron/tools/tracker/src/python/handlers/defaulthandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/exceptionhandler.py b/heron/tools/tracker/src/python/handlers/exceptionhandler.py
index 2e67e79..c43f285 100644
--- a/heron/tools/tracker/src/python/handlers/exceptionhandler.py
+++ b/heron/tools/tracker/src/python/handlers/exceptionhandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/exceptionsummaryhandler.py b/heron/tools/tracker/src/python/handlers/exceptionsummaryhandler.py
index b943e5f..b613807 100644
--- a/heron/tools/tracker/src/python/handlers/exceptionsummaryhandler.py
+++ b/heron/tools/tracker/src/python/handlers/exceptionsummaryhandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/executionstatehandler.py b/heron/tools/tracker/src/python/handlers/executionstatehandler.py
index 4d1793b..b97c217 100644
--- a/heron/tools/tracker/src/python/handlers/executionstatehandler.py
+++ b/heron/tools/tracker/src/python/handlers/executionstatehandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/jmaphandler.py b/heron/tools/tracker/src/python/handlers/jmaphandler.py
index 2e102cd..6d41c4c 100644
--- a/heron/tools/tracker/src/python/handlers/jmaphandler.py
+++ b/heron/tools/tracker/src/python/handlers/jmaphandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/jstackhandler.py b/heron/tools/tracker/src/python/handlers/jstackhandler.py
index 48c583f..d5f4a64 100644
--- a/heron/tools/tracker/src/python/handlers/jstackhandler.py
+++ b/heron/tools/tracker/src/python/handlers/jstackhandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/logicalplanhandler.py b/heron/tools/tracker/src/python/handlers/logicalplanhandler.py
index 2da1cd2..25c2acf 100644
--- a/heron/tools/tracker/src/python/handlers/logicalplanhandler.py
+++ b/heron/tools/tracker/src/python/handlers/logicalplanhandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/machineshandler.py b/heron/tools/tracker/src/python/handlers/machineshandler.py
index bef23d9..6dfa838 100644
--- a/heron/tools/tracker/src/python/handlers/machineshandler.py
+++ b/heron/tools/tracker/src/python/handlers/machineshandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/mainhandler.py b/heron/tools/tracker/src/python/handlers/mainhandler.py
index 90e9347..1b09e08 100644
--- a/heron/tools/tracker/src/python/handlers/mainhandler.py
+++ b/heron/tools/tracker/src/python/handlers/mainhandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/memoryhistogramhandler.py b/heron/tools/tracker/src/python/handlers/memoryhistogramhandler.py
index 10c32b9..3ac3293 100644
--- a/heron/tools/tracker/src/python/handlers/memoryhistogramhandler.py
+++ b/heron/tools/tracker/src/python/handlers/memoryhistogramhandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/metadatahandler.py b/heron/tools/tracker/src/python/handlers/metadatahandler.py
index f143258..c581ff8 100644
--- a/heron/tools/tracker/src/python/handlers/metadatahandler.py
+++ b/heron/tools/tracker/src/python/handlers/metadatahandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/metricshandler.py b/heron/tools/tracker/src/python/handlers/metricshandler.py
index 5047116..5f79f62 100644
--- a/heron/tools/tracker/src/python/handlers/metricshandler.py
+++ b/heron/tools/tracker/src/python/handlers/metricshandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/metricsqueryhandler.py b/heron/tools/tracker/src/python/handlers/metricsqueryhandler.py
index 2ea9183..a0e367a 100644
--- a/heron/tools/tracker/src/python/handlers/metricsqueryhandler.py
+++ b/heron/tools/tracker/src/python/handlers/metricsqueryhandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/metricstimelinehandler.py b/heron/tools/tracker/src/python/handlers/metricstimelinehandler.py
index fb41385..267734e 100644
--- a/heron/tools/tracker/src/python/handlers/metricstimelinehandler.py
+++ b/heron/tools/tracker/src/python/handlers/metricstimelinehandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/packingplanhandler.py b/heron/tools/tracker/src/python/handlers/packingplanhandler.py
index ced6eb1..387b488 100644
--- a/heron/tools/tracker/src/python/handlers/packingplanhandler.py
+++ b/heron/tools/tracker/src/python/handlers/packingplanhandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/physicalplanhandler.py b/heron/tools/tracker/src/python/handlers/physicalplanhandler.py
index ff98e5f..2fd6223 100644
--- a/heron/tools/tracker/src/python/handlers/physicalplanhandler.py
+++ b/heron/tools/tracker/src/python/handlers/physicalplanhandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/pidhandler.py b/heron/tools/tracker/src/python/handlers/pidhandler.py
index 39d3e9b..0eb5491 100644
--- a/heron/tools/tracker/src/python/handlers/pidhandler.py
+++ b/heron/tools/tracker/src/python/handlers/pidhandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/runtimestatehandler.py b/heron/tools/tracker/src/python/handlers/runtimestatehandler.py
index b776517..44e2fc1 100644
--- a/heron/tools/tracker/src/python/handlers/runtimestatehandler.py
+++ b/heron/tools/tracker/src/python/handlers/runtimestatehandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/schedulerlocationhandler.py b/heron/tools/tracker/src/python/handlers/schedulerlocationhandler.py
index 502e922..670e1a6 100644
--- a/heron/tools/tracker/src/python/handlers/schedulerlocationhandler.py
+++ b/heron/tools/tracker/src/python/handlers/schedulerlocationhandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/stateshandler.py b/heron/tools/tracker/src/python/handlers/stateshandler.py
index c2f398f..2cf80cb 100644
--- a/heron/tools/tracker/src/python/handlers/stateshandler.py
+++ b/heron/tools/tracker/src/python/handlers/stateshandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/topologieshandler.py b/heron/tools/tracker/src/python/handlers/topologieshandler.py
index 1beb567..cbb9e1b 100644
--- a/heron/tools/tracker/src/python/handlers/topologieshandler.py
+++ b/heron/tools/tracker/src/python/handlers/topologieshandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/topologyconfighandler.py b/heron/tools/tracker/src/python/handlers/topologyconfighandler.py
index c16a969..14cbf42 100644
--- a/heron/tools/tracker/src/python/handlers/topologyconfighandler.py
+++ b/heron/tools/tracker/src/python/handlers/topologyconfighandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/handlers/topologyhandler.py b/heron/tools/tracker/src/python/handlers/topologyhandler.py
index 242cfc1..0a1b417 100644
--- a/heron/tools/tracker/src/python/handlers/topologyhandler.py
+++ b/heron/tools/tracker/src/python/handlers/topologyhandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/javaobj.py b/heron/tools/tracker/src/python/javaobj.py
index d5978ba..2bf1b23 100644
--- a/heron/tools/tracker/src/python/javaobj.py
+++ b/heron/tools/tracker/src/python/javaobj.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -28,8 +28,8 @@
 See: http://download.oracle.com/javase/6/docs/platform/serialization/spec/protocol.html
 """
 
+import io
 import struct
-import six
 
 from heron.common.src.python.utils.log import Log
 
@@ -54,12 +54,12 @@
 
 
 # pylint: disable=undefined-variable
-def loads(value):
+def loads(value: bytes):
   """
   Deserializes Java objects and primitive data serialized by ObjectOutputStream
   from a string.
   """
-  f = six.StringIO(value)
+  f = io.BytesIO(value)
   marshaller = JavaObjectUnmarshaller(f)
   marshaller.add_transformer(DefaultObjectTransformer())
   return marshaller.readObject()
@@ -78,7 +78,7 @@
     "java.lang.Integer",
     "java.lang.Long"])
 
-class JavaClass(object):
+class JavaClass:
   """Java class representation"""
   def __init__(self):
     self.name = None
@@ -95,7 +95,7 @@
     return "[%s:0x%X]" % (self.name, self.serialVersionUID)
 
 
-class JavaObject(object):
+class JavaObject:
   """Java object representation"""
   def __init__(self):
     self.classdesc = None
@@ -132,7 +132,7 @@
     for name in self.classdesc.fields_names:
       new_object.__setattr__(name, getattr(self, name))
 
-class JavaObjectConstants(object):
+class JavaObjectConstants:
   """class about Java object constants"""
 
   STREAM_MAGIC = 0xaced
@@ -222,7 +222,7 @@
 
       position_bak = self.object_stream.tell()
       the_rest = self.object_stream.read()
-      if len(the_rest):
+      if the_rest:
         log_error("Warning!!!!: Stream still has %s bytes left.\
 Enable debug mode of logging to see the hexdump." % len(the_rest))
         log_debug(self._create_hexdump(the_rest))
@@ -457,7 +457,7 @@
     assert type_char == self.TYPE_ARRAY
     type_char = classdesc.name[1]
 
-    if type_char == self.TYPE_OBJECT or type_char == self.TYPE_ARRAY:
+    if type_char in (self.TYPE_OBJECT, self.TYPE_ARRAY):
       for _ in range(size):
         _, res = self._read_and_exec_opcode(ident=ident+1)
         log_debug("Object value: %s" % str(res), ident)
@@ -519,7 +519,7 @@
       (res, ) = self._readStruct(">f")
     elif field_type == self.TYPE_DOUBLE:
       (res, ) = self._readStruct(">d")
-    elif field_type == self.TYPE_OBJECT or field_type == self.TYPE_ARRAY:
+    elif field_type in (self.TYPE_OBJECT, self.TYPE_ARRAY):
       _, res = self._read_and_exec_opcode(ident=ident+1)
     else:
       raise RuntimeError("Unknown typecode: %s" % field_type)
@@ -533,8 +533,7 @@
 
     if typecode in self.TYPECODES_LIST:
       return typecode
-    else:
-      raise RuntimeError("Typecode %s (%s) isn't supported." % (type_char, typecode))
+    raise RuntimeError("Typecode %s (%s) isn't supported." % (type_char, typecode))
 
   def _add_reference(self, obj):
     self.references.append(obj)
@@ -545,7 +544,7 @@
     log_error("Stream seeking back at -16 byte (2nd line is an actual position!):")
     self.object_stream.seek(-16, mode=1)
     the_rest = self.object_stream.read()
-    if len(the_rest):
+    if the_rest:
       log_error("Warning!!!!: Stream still has %s bytes left." % len(the_rest))
       log_error(self._create_hexdump(the_rest))
     log_error("=" * 30)
@@ -559,7 +558,7 @@
   # pylint: disable=attribute-defined-outside-init
   def dump(self, obj):
     self.object_obj = obj
-    self.object_stream = six.StringIO()
+    self.object_stream = io.BytesIO()
     self._writeStreamHeader()
     self.writeObject(obj)
     return self.object_stream.getvalue()
@@ -597,7 +596,7 @@
     self._writeStruct(">B", 1, (self.TC_OBJECT, ))
     self._writeStruct(">B", 1, (self.TC_CLASSDESC, ))
 
-class DefaultObjectTransformer(object):
+class DefaultObjectTransformer:
 
   class JavaList(list, JavaObject):
     pass
@@ -619,7 +618,6 @@
       obj.copy(new_object)
       new_object.extend(obj.annotations[1:])
       return new_object
-    # pylint: disable=redefined-variable-type
     if obj.get_class().name == "java.util.HashMap":
       new_object = self.JavaMap()
       obj.copy(new_object)
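Editor's note, not part of the diff: a minimal sketch of why the unmarshaller above now wraps its input in `io.BytesIO`. Java serialization is a byte-oriented protocol, and on Python 3 `struct` only unpacks from bytes, so the old `six.StringIO` text buffer no longer works.

```python
import io
import struct

# Fabricate the first four bytes of a Java serialization stream:
# STREAM_MAGIC (0xACED) followed by STREAM_VERSION (5).
payload = struct.pack(">HH", 0xACED, 5)

stream = io.BytesIO(payload)           # bytes in, bytes out
magic, version = struct.unpack(">HH", stream.read(4))
assert (magic, version) == (0xACED, 5)
```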
diff --git a/heron/tools/tracker/src/python/main.py b/heron/tools/tracker/src/python/main.py
index 581894d..689d032 100644
--- a/heron/tools/tracker/src/python/main.py
+++ b/heron/tools/tracker/src/python/main.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python2.7
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/metricstimeline.py b/heron/tools/tracker/src/python/metricstimeline.py
index 5cc0cf1..29fcc2a 100644
--- a/heron/tools/tracker/src/python/metricstimeline.py
+++ b/heron/tools/tracker/src/python/metricstimeline.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/tracker/src/python/pyutils.py b/heron/tools/tracker/src/python/pyutils.py
index 8bdff1a..c7cb277 100644
--- a/heron/tools/tracker/src/python/pyutils.py
+++ b/heron/tools/tracker/src/python/pyutils.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -27,5 +27,4 @@
 def is_str_instance(obj):
   if isPY3:
     return isinstance(obj, str)
-  else:
-    return str(type(obj)) == "<type 'unicode'>" or str(type(obj)) == "<type 'str'>"
+  return str(type(obj)) == "<type 'unicode'>" or str(type(obj)) == "<type 'str'>"
diff --git a/heron/tools/tracker/src/python/query.py b/heron/tools/tracker/src/python/query.py
index 67599a6..5d89a52 100644
--- a/heron/tools/tracker/src/python/query.py
+++ b/heron/tools/tracker/src/python/query.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -30,7 +30,7 @@
 ####################################################################
 
 # pylint: disable=no-self-use
-class Query(object):
+class Query:
   """Execute the query for metrics. Uses Tracker to get
      individual metrics that are part of the query.
      Example usage:
@@ -109,8 +109,7 @@
       # This must be the last index, since this was an NOP starting brace
       if index != len(query) - 1:
         raise Exception("Invalid syntax")
-      else:
-        return self.parse_query_string(query[1:-1])
+      return self.parse_query_string(query[1:-1])
     start_index = query.find("(")
     # There must be a ( in the query
     if start_index < 0:
diff --git a/heron/tools/tracker/src/python/query_operators.py b/heron/tools/tracker/src/python/query_operators.py
index dd348a7..dd460c5 100644
--- a/heron/tools/tracker/src/python/query_operators.py
+++ b/heron/tools/tracker/src/python/query_operators.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -32,13 +32,12 @@
 def is_str_instance(obj):
   if isPY3:
     return isinstance(obj, str)
-  else:
-    return str(type(obj)) == "<type 'unicode'>" or str(type(obj)) == "<type 'str'>"
+  return str(type(obj)) == "<type 'unicode'>" or str(type(obj)) == "<type 'str'>"
 
 #####################################################################
 # Data Structure for fetched Metrics
 #####################################################################
-class Metrics(object):
+class Metrics:
   """Represents a univariate timeseries.
   Multivariate timeseries is simply a list of this."""
   def __init__(self, componentName, metricName, instance, start, end, timeline):
@@ -55,17 +54,17 @@
     """ floor timestamp """
     ret = {}
     for timestamp, value in list(timeline.items()):
-      ts = timestamp / 60 * 60
+      ts = timestamp // 60 * 60
       if start <= ts <= end:
         ret[ts] = value
     return ret
 
   def setDefault(self, constant, start, end):
     """ set default time """
-    starttime = start / 60 * 60
+    starttime = start // 60 * 60
     if starttime < start:
       starttime += 60
-    endtime = end / 60 * 60
+    endtime = end // 60 * 60
     while starttime <= endtime:
       # STREAMCOMP-1559
       # Second check is a work around, because the response from tmaster
@@ -81,7 +80,7 @@
 ################################################################
 
 # pylint: disable=no-self-use
-class Operator(object):
+class Operator:
   """Base class for all operators"""
   def __init__(self, _):
     raise Exception("Not implemented exception")
@@ -438,7 +437,7 @@
         allMetrics.append(met)
       raise tornado.gen.Return(allMetrics)
     # If first is univariate
-    elif len(metrics) == 1 and "" in metrics:
+    if len(metrics) == 1 and "" in metrics:
       allMetrics = []
       for key, metric in list(metrics2.items()):
         # Initialize with first metrics timeline, but second metric's instance
@@ -452,19 +451,17 @@
         allMetrics.append(met)
       raise tornado.gen.Return(allMetrics)
     # If second is univariate
-    else:
-      allMetrics = []
-      for key, metric in list(metrics.items()):
-        # Initialize with first metrics timeline and its instance
-        met = Metrics(None, None, metric.instance, start, end, dict(metric.timeline))
-        for timestamp in list(met.timeline.keys()):
-          if timestamp not in metrics2[""].timeline or metrics2[""].timeline[timestamp] == 0:
-            met.timeline.pop(timestamp)
-          else:
-            met.timeline[timestamp] /= metrics2[""].timeline[timestamp]
-        allMetrics.append(met)
-      raise tornado.gen.Return(allMetrics)
-    raise Exception("This should not be generated.")
+    allMetrics = []
+    for key, metric in list(metrics.items()):
+      # Initialize with first metrics timeline and its instance
+      met = Metrics(None, None, metric.instance, start, end, dict(metric.timeline))
+      for timestamp in list(met.timeline.keys()):
+        if timestamp not in metrics2[""].timeline or metrics2[""].timeline[timestamp] == 0:
+          met.timeline.pop(timestamp)
+        else:
+          met.timeline[timestamp] /= metrics2[""].timeline[timestamp]
+      allMetrics.append(met)
+    raise tornado.gen.Return(allMetrics)
 
 class Multiply(Operator):
   """Multiply Operator. Has same conditions as division operator.
@@ -563,7 +560,7 @@
         allMetrics.append(met)
       raise tornado.gen.Return(allMetrics)
     # If first is univariate
-    elif len(metrics) == 1 and "" in metrics:
+    if len(metrics) == 1 and "" in metrics:
       allMetrics = []
       for key, metric in list(metrics2.items()):
         # Initialize with first metrics timeline, but second metric's instance
@@ -577,19 +574,17 @@
         allMetrics.append(met)
       raise tornado.gen.Return(allMetrics)
     # If second is univariate
-    else:
-      allMetrics = []
-      for key, metric in list(metrics.items()):
-        # Initialize with first metrics timeline and its instance
-        met = Metrics(None, None, metric.instance, start, end, dict(metric.timeline))
-        for timestamp in list(met.timeline.keys()):
-          if timestamp not in metrics2[""].timeline:
-            met.timeline.pop(timestamp)
-          else:
-            met.timeline[timestamp] *= metrics2[""].timeline[timestamp]
-        allMetrics.append(met)
-      raise tornado.gen.Return(allMetrics)
-    raise Exception("This should not be generated.")
+    allMetrics = []
+    for key, metric in list(metrics.items()):
+      # Initialize with first metrics timeline and its instance
+      met = Metrics(None, None, metric.instance, start, end, dict(metric.timeline))
+      for timestamp in list(met.timeline.keys()):
+        if timestamp not in metrics2[""].timeline:
+          met.timeline.pop(timestamp)
+        else:
+          met.timeline[timestamp] *= metrics2[""].timeline[timestamp]
+      allMetrics.append(met)
+    raise tornado.gen.Return(allMetrics)
 
 class Subtract(Operator):
   """Subtract Operator. Has same conditions as division operator.
@@ -686,7 +681,7 @@
         allMetrics.append(met)
       raise tornado.gen.Return(allMetrics)
     # If first is univariate
-    elif len(metrics) == 1 and "" in metrics:
+    if len(metrics) == 1 and "" in metrics:
       allMetrics = []
       for key, metric in list(metrics2.items()):
         # Initialize with first metrics timeline, but second metric's instance
@@ -700,19 +695,17 @@
         allMetrics.append(met)
       raise tornado.gen.Return(allMetrics)
     # If second is univariate
-    else:
-      allMetrics = []
-      for key, metric in list(metrics.items()):
-        # Initialize with first metrics timeline and its instance
-        met = Metrics(None, None, metric.instance, start, end, dict(metric.timeline))
-        for timestamp in list(met.timeline.keys()):
-          if timestamp not in metrics2[""].timeline:
-            met.timeline.pop(timestamp)
-          else:
-            met.timeline[timestamp] -= metrics2[""].timeline[timestamp]
-        allMetrics.append(met)
-      raise tornado.gen.Return(allMetrics)
-    raise Exception("This should not be generated.")
+    allMetrics = []
+    for key, metric in list(metrics.items()):
+      # Initialize with first metrics timeline and its instance
+      met = Metrics(None, None, metric.instance, start, end, dict(metric.timeline))
+      for timestamp in list(met.timeline.keys()):
+        if timestamp not in metrics2[""].timeline:
+          met.timeline.pop(timestamp)
+        else:
+          met.timeline[timestamp] -= metrics2[""].timeline[timestamp]
+      allMetrics.append(met)
+    raise tornado.gen.Return(allMetrics)
 
 class Rate(Operator):
   """Rate Operator. This operator is used to find rate of change for all timeseries.
diff --git a/heron/tools/tracker/src/python/topology.py b/heron/tools/tracker/src/python/topology.py
index 0c55856..ab51374 100644
--- a/heron/tools/tracker/src/python/topology.py
+++ b/heron/tools/tracker/src/python/topology.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -27,7 +27,7 @@
 from heronpy.api import api_constants
 
 # pylint: disable=too-many-instance-attributes
-class Topology(object):
+class Topology:
   """
   Class Topology
     Contains all the relevant information about
@@ -242,9 +242,8 @@
 
     if status == 1:
       return "Running"
-    elif status == 2:
+    if status == 2:
       return "Paused"
-    elif status == 3:
+    if status == 3:
       return "Killed"
-    else:
-      return "Unknown"
+    return "Unknown"
diff --git a/heron/tools/tracker/src/python/tracker.py b/heron/tools/tracker/src/python/tracker.py
index e2d16fc..81cc06b 100644
--- a/heron/tools/tracker/src/python/tracker.py
+++ b/heron/tools/tracker/src/python/tracker.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -81,8 +81,7 @@
     Log.exception("Failed to parse data as java object")
     if include_non_primitives:
       return _raw_value(kv)
-    else:
-      return None
+    return None
 
 def _raw_value(kv):
   return {
@@ -91,7 +90,7 @@
       'raw' : utils.hex_escape(kv.serialized_value)}
 
 
-class Tracker(object):
+class Tracker:
   """
   Tracker is a stateless cache of all the topologies
   for the given state managers. It watches for
@@ -129,7 +128,6 @@
       traceback.print_exc()
       sys.exit(1)
 
-    # pylint: disable=deprecated-lambda
     def on_topologies_watch(state_manager, topologies):
       """watch topologies"""
       Log.info("State watch triggered for topologies.")
@@ -157,7 +155,6 @@
     for state_manager in self.state_managers:
       state_manager.stop()
 
-  # pylint: disable=deprecated-lambda
   def getTopologyByClusterRoleEnvironAndName(self, cluster, role, environ, topologyName):
     """
     Find and return the topology given its cluster, environ, topology name, and
@@ -172,9 +169,8 @@
       if role is not None:
         raise Exception("Topology not found for {0}, {1}, {2}, {3}".format(
             cluster, role, environ, topologyName))
-      else:
-        raise Exception("Topology not found for {0}, {1}, {2}".format(
-            cluster, environ, topologyName))
+      raise Exception("Topology not found for {0}, {1}, {2}".format(
+          cluster, environ, topologyName))
 
     # There is only one topology which is returned.
     return topologies[0]
@@ -316,14 +312,10 @@
   @staticmethod
   def extract_runtime_state(topology):
     runtime_state = {}
-    runtime_state["has_physical_plan"] = \
-      True if topology.physical_plan else False
-    runtime_state["has_packing_plan"] = \
-      True if topology.packing_plan else False
-    runtime_state["has_tmaster_location"] = \
-      True if topology.tmaster else False
-    runtime_state["has_scheduler_location"] = \
-      True if topology.scheduler_location else False
+    runtime_state["has_physical_plan"] = bool(topology.physical_plan)
+    runtime_state["has_packing_plan"] = bool(topology.packing_plan)
+    runtime_state["has_tmaster_location"] = bool(topology.tmaster)
+    runtime_state["has_scheduler_location"] = bool(topology.scheduler_location)
     # "stmgrs" listed runtime state for each stream manager
     # however it is possible that physical plan is not complete
     # yet and we do not know how many stmgrs there are. That said,
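Editor's note, not part of the diff: the `bool(...)` spelling used in `extract_runtime_state` above is behaviourally identical to the `True if x else False` it replaces; newer pylint simply prefers the direct truthiness conversion. A quick illustration with placeholder values:

```python
# Placeholder values standing in for topology.physical_plan and friends.
physical_plan = None
packing_plan = {"containers": 3}

runtime_state = {
    "has_physical_plan": bool(physical_plan),   # False: nothing set yet
    "has_packing_plan": bool(packing_plan),     # True: any non-empty value
}
assert runtime_state == {"has_physical_plan": False, "has_packing_plan": True}
```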
diff --git a/heron/tools/tracker/src/python/utils.py b/heron/tools/tracker/src/python/utils.py
index 6e7bed0..c78f1b4 100644
--- a/heron/tools/tracker/src/python/utils.py
+++ b/heron/tools/tracker/src/python/utils.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -27,8 +27,10 @@
 import string
 import sys
 import subprocess
+from pathlib import Path
 import yaml
 
+
 # directories for heron tools distribution
 BIN_DIR = "bin"
 CONF_DIR = "conf"
@@ -76,8 +78,7 @@
     return None
   if not instance_id:
     return "http://%s:%d/browse/log-files" % (host, shell_port)
-  else:
-    return "http://%s:%d/file/log-files/%s.log.0" % (host, shell_port, instance_id)
+  return "http://%s:%d/file/log-files/%s.log.0" % (host, shell_port, instance_id)
 
 def make_shell_logfile_data_url(host, shell_port, instance_id, offset, length):
   """
@@ -117,7 +118,7 @@
   :return: the path in windows
   """
   command = ['cygpath', '-wp', x]
-  p = subprocess.Popen(command, stdout=subprocess.PIPE)
+  p = subprocess.Popen(command, stdout=subprocess.PIPE, universal_newlines=True)
   output, _ = p.communicate()
   lines = output.split("\n")
   return lines[0]
@@ -139,8 +140,9 @@
   This will extract heron tracker directory from .pex file.
   :return: root location for heron-tools.
   """
-  path = "/".join(os.path.realpath(__file__).split('/')[:-8])
-  return normalized_class_path(path)
+  # assuming the tracker runs from $HERON_ROOT/bin/heron-tracker
+  root = Path(sys.argv[0]).resolve(strict=True).parent.parent
+  return normalized_class_path(str(root))
 
 def get_heron_tracker_bin_dir():
   """
diff --git a/heron/tools/tracker/tests/python/mock_proto.py b/heron/tools/tracker/tests/python/mock_proto.py
index b92662b..82cd4de 100644
--- a/heron/tools/tracker/tests/python/mock_proto.py
+++ b/heron/tools/tracker/tests/python/mock_proto.py
@@ -23,7 +23,7 @@
 import heron.proto.topology_pb2 as protoTopology
 
 # pylint: disable=no-self-use, missing-docstring
-class MockProto(object):
+class MockProto:
   ''' Mocking Proto'''
   topology_name = "mock_topology_name"
   topology_id = "mock_topology_id"
diff --git a/heron/tools/ui/src/python/args.py b/heron/tools/ui/src/python/args.py
index 3fbf806..54022f4 100644
--- a/heron/tools/ui/src/python/args.py
+++ b/heron/tools/ui/src/python/args.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/ui/src/python/consts.py b/heron/tools/ui/src/python/consts.py
index 7c643a7..7116a53 100644
--- a/heron/tools/ui/src/python/consts.py
+++ b/heron/tools/ui/src/python/consts.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -32,4 +32,4 @@
 
 DEFAULT_BASE_URL = ""
 
-VERSION = common_config.get_version_number(zipped_pex=True)
+VERSION = common_config.get_version_number()
diff --git a/heron/tools/ui/src/python/handlers/api/metrics.py b/heron/tools/ui/src/python/handlers/api/metrics.py
index 770d270..d70b1fd 100644
--- a/heron/tools/ui/src/python/handlers/api/metrics.py
+++ b/heron/tools/ui/src/python/handlers/api/metrics.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/ui/src/python/handlers/api/topology.py b/heron/tools/ui/src/python/handlers/api/topology.py
index f7c9100..c519bc5 100644
--- a/heron/tools/ui/src/python/handlers/api/topology.py
+++ b/heron/tools/ui/src/python/handlers/api/topology.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -58,13 +58,13 @@
     else:
       comp_names = [comp_name]
     exception_infos = dict()
-    for comp_name in comp_names:
-      exception_infos[comp_name] = yield access.get_component_exceptionsummary(
-          cluster, environ, topology, comp_name)
+    for comp_name_ in comp_names:
+      exception_infos[comp_name_] = yield access.get_component_exceptionsummary(
+          cluster, environ, topology, comp_name_)
 
     # Combine exceptions from multiple component
     aggregate_exceptions = dict()
-    for comp_name, exception_logs in list(exception_infos.items()):
+    for comp_name_, exception_logs in list(exception_infos.items()):
       for exception_log in exception_logs:
         class_name = exception_log['class_name']
         if class_name != '':
diff --git a/heron/tools/ui/src/python/handlers/base.py b/heron/tools/ui/src/python/handlers/base.py
index 1d0ee7d..059f3f6 100644
--- a/heron/tools/ui/src/python/handlers/base.py
+++ b/heron/tools/ui/src/python/handlers/base.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/ui/src/python/handlers/common/consts.py b/heron/tools/ui/src/python/handlers/common/consts.py
index b9a3421..d033153 100644
--- a/heron/tools/ui/src/python/handlers/common/consts.py
+++ b/heron/tools/ui/src/python/handlers/common/consts.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/ui/src/python/handlers/common/utils.py b/heron/tools/ui/src/python/handlers/common/utils.py
index 5b49236..ccb7b85 100644
--- a/heron/tools/ui/src/python/handlers/common/utils.py
+++ b/heron/tools/ui/src/python/handlers/common/utils.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/ui/src/python/handlers/mainhandler.py b/heron/tools/ui/src/python/handlers/mainhandler.py
index a3d82f2..59b5b5d 100644
--- a/heron/tools/ui/src/python/handlers/mainhandler.py
+++ b/heron/tools/ui/src/python/handlers/mainhandler.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/ui/src/python/handlers/notfound.py b/heron/tools/ui/src/python/handlers/notfound.py
index 57547dc..57c62a0 100644
--- a/heron/tools/ui/src/python/handlers/notfound.py
+++ b/heron/tools/ui/src/python/handlers/notfound.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/ui/src/python/handlers/ranges.py b/heron/tools/ui/src/python/handlers/ranges.py
index be1f5ec..eca3d42 100644
--- a/heron/tools/ui/src/python/handlers/ranges.py
+++ b/heron/tools/ui/src/python/handlers/ranges.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/ui/src/python/handlers/topology.py b/heron/tools/ui/src/python/handlers/topology.py
index 441a174..98f502e 100644
--- a/heron/tools/ui/src/python/handlers/topology.py
+++ b/heron/tools/ui/src/python/handlers/topology.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heron/tools/ui/src/python/main.py b/heron/tools/ui/src/python/main.py
index ee96365..224937a 100644
--- a/heron/tools/ui/src/python/main.py
+++ b/heron/tools/ui/src/python/main.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python2.7
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -163,7 +163,7 @@
     r = child_parser.parse_args(args=remaining, namespace=parsed_args)
     namespace = vars(r)
     if 'version' in namespace:
-      common_config.print_build_info(zipped_pex=True)
+      common_config.print_build_info()
     else:
       parser.print_help()
     parser.exit()
diff --git a/heronpy/api/api_constants.py b/heronpy/api/api_constants.py
index 77867f1..0460e24 100644
--- a/heronpy/api/api_constants.py
+++ b/heronpy/api/api_constants.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -24,7 +24,7 @@
 ###########################  Constants for topology configuration ##################################
 ####################################################################################################
 
-class TopologyReliabilityMode(object):
+class TopologyReliabilityMode:
   ATMOST_ONCE = "ATMOST_ONCE"
   ATLEAST_ONCE = "ATLEAST_ONCE"
   EFFECTIVELY_ONCE = "EFFECTIVELY_ONCE"
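
These reliability modes are normally passed through the topology config. A hedged sketch of that usage; the `TOPOLOGY_RELIABILITY_MODE` key name is an assumption based on the module's naming convention and does not appear in this diff:

from heronpy.api import api_constants

# Hypothetical config snippet: only the TopologyReliabilityMode values are
# taken from the code above, the key name is an assumption.
topology_config = {
  api_constants.TOPOLOGY_RELIABILITY_MODE:
      api_constants.TopologyReliabilityMode.ATLEAST_ONCE,
}
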
diff --git a/heronpy/api/bolt/bolt.py b/heronpy/api/bolt/bolt.py
index 89bef6a..0f522ed 100644
--- a/heronpy/api/bolt/bolt.py
+++ b/heronpy/api/bolt/bolt.py
@@ -42,7 +42,6 @@
                     topology, including the task id and component id of this task, input and output
                     information, etc.
     """
-    pass
 
   @abstractmethod
   def process(self, tup):
@@ -77,4 +76,3 @@
     :type tup: :class:`Tuple`
     :param tup: the tick tuple to be processed
     """
-    pass
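
The `initialize`/`process`/`process_tick` hooks above are what a user bolt overrides; dropping the redundant `pass` after the docstrings does not change that contract. A minimal sketch of a subclass, assuming the abstract class here is named `Bolt` and that `emit()` and `tup.values` are provided by the surrounding heronpy API (neither is shown verbatim in this diff):

from heronpy.api.bolt.bolt import Bolt

class PassThroughBolt(Bolt):
  """Toy bolt that forwards every tuple it receives."""

  def initialize(self, config, context):
    # config holds topology settings; context describes this task.
    self.seen = 0

  def process(self, tup):
    self.seen += 1
    self.emit(tup.values)  # emit() assumed from the bolt base class

  def process_tick(self, tup):
    # Called with tick tuples when the topology configures them; no-op here.
    pass
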
diff --git a/heronpy/api/bolt/window_bolt.py b/heronpy/api/bolt/window_bolt.py
index 901e08e..6e29eab 100644
--- a/heronpy/api/bolt/window_bolt.py
+++ b/heronpy/api/bolt/window_bolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -59,7 +59,6 @@
     :type tuples: :class:`list of Tuples`
     :param tuples: The list of tuples in this window
     """
-    pass
 
   # pylint: disable=unused-argument
   def initialize(self, config, context):
@@ -109,8 +108,8 @@
     curtime = int(time.time())
     window_info = WindowContext(curtime - self.window_duration, curtime)
     tuple_batch = []
-    for (tup, tm) in self.current_tuples:
-      tuple_batch.append(tup)
+    for (tuple_, tm) in self.current_tuples:
+      tuple_batch.append(tuple_)
     self.processWindow(window_info, tuple_batch)
     self._expire(curtime)
 
@@ -145,7 +144,6 @@
     :type tuples: :class:`list of Tuples`
     :param tuples: The list of tuples in this window
     """
-    pass
 
   # pylint: disable=unused-argument
   def initialize(self, config, context):
@@ -178,6 +176,6 @@
     curtime = int(time.time())
     window_info = WindowContext(curtime - self.window_duration, curtime)
     self.processWindow(window_info, list(self.current_tuples))
-    for tup in self.current_tuples:
-      self.ack(tup)
+    for tuple_ in self.current_tuples:
+      self.ack(tuple_)
     self.current_tuples.clear()
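
Both window bolts above collect `(tuple, timestamp)` pairs and hand each window's tuples to `processWindow(window_info, tuples)`. A sketch of a user-side subclass under that contract; `SlidingWindowBolt` as the base class name, `self.emit`, and `tup.values` are assumptions drawn from context rather than shown in the diff:

from collections import Counter
from heronpy.api.bolt.window_bolt import SlidingWindowBolt

class WordCountWindowBolt(SlidingWindowBolt):
  """Counts the first value of every tuple seen in each window."""

  def processWindow(self, window_info, tuples):
    # window_info is the WindowContext built in the base class above.
    counts = Counter(tup.values[0] for tup in tuples)
    for word, count in counts.items():
      self.emit([word, count])
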
diff --git a/heronpy/api/cloudpickle.py b/heronpy/api/cloudpickle.py
index eb3bfef..27a2180 100644
--- a/heronpy/api/cloudpickle.py
+++ b/heronpy/api/cloudpickle.py
@@ -6,22 +6,22 @@
 -Deal with other non-serializable objects
 It does not include an unpickler, as standard python unpickling suffices.
 This module was extracted from the `cloud` package, developed by `PiCloud, Inc.
-<http://www.picloud.com>`_.
+<https://web.archive.org/web/20140626004012/http://www.picloud.com/>`_.
 Copyright (c) 2012, Regents of the University of California.
-Copyright (c) 2009 `PiCloud, Inc. <http://www.picloud.com>`_.
+Copyright (c) 2009 `PiCloud, Inc. <https://web.archive.org/web/20140626004012/http://www.picloud.com/>`_.
 All rights reserved.
 Redistribution and use in source and binary forms, with or without
 modification, are permitted provided that the following conditions
 are met:
-  * Redistributions of source code must retain the above copyright
-    notice, this list of conditions and the following disclaimer.
-  * Redistributions in binary form must reproduce the above copyright
-    notice, this list of conditions and the following disclaimer in the
-    documentation and/or other materials provided with the distribution.
-  * Neither the name of the University of California, Berkeley nor the
-    names of its contributors may be used to endorse or promote
-    products derived from this software without specific prior written
-    permission.
+    * Redistributions of source code must retain the above copyright
+      notice, this list of conditions and the following disclaimer.
+    * Redistributions in binary form must reproduce the above copyright
+      notice, this list of conditions and the following disclaimer in the
+      documentation and/or other materials provided with the distribution.
+    * Neither the name of the University of California, Berkeley nor the
+      names of its contributors may be used to endorse or promote
+      products derived from this software without specific prior written
+      permission.
 THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
 "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
 LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
@@ -34,38 +34,331 @@
 NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
 SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
 """
+# pylint: skip-file
 from __future__ import print_function
 
-import operator
-import opcode
-import os
+import abc
+import builtins
+import dis
 import io
+import itertools
+import logging
+import opcode
+import operator
 import pickle
+import platform
 import struct
 import sys
 import types
-from functools import partial
-import itertools
-import dis
-import traceback
 import weakref
+import uuid
+import threading
+import typing
+from enum import Enum
 
-# pylint: disable-all
+from typing import Generic, Union, Tuple, Callable
+from pickle import _Pickler as Pickler
+from pickle import _getattribute
+from io import BytesIO
+from importlib._bootstrap import _find_spec
 
-if sys.version < '3':
-  from pickle import Pickler # pylint: disable=ungrouped-imports
-  try:
-    from cStringIO import StringIO
-  except ImportError:
-    from StringIO import StringIO
-  PY3 = False
-else:
-  types.ClassType = type
-  from pickle import _Pickler as Pickler # pylint: disable=ungrouped-imports
-  from io import BytesIO as StringIO # pylint: disable=ungrouped-imports
-  PY3 = True
+try:  # pragma: no branch
+    import typing_extensions as _typing_extensions
+    from typing_extensions import Literal, Final
+except ImportError:
+    _typing_extensions = Literal = Final = None
 
-#relevant opcodes
+if sys.version_info >= (3, 5, 3):
+    from typing import ClassVar
+else:  # pragma: no cover
+    ClassVar = None
+
+
+# cloudpickle is meant for inter process communication: we expect all
+# communicating processes to run the same Python version hence we favor
+# communication speed over compatibility:
+DEFAULT_PROTOCOL = pickle.HIGHEST_PROTOCOL
+
+# Track the provenance of reconstructed dynamic classes to make it possible to
+# recontruct instances from the matching singleton class definition when
+# appropriate and preserve the usual "isinstance" semantics of Python objects.
+_DYNAMIC_CLASS_TRACKER_BY_CLASS = weakref.WeakKeyDictionary()
+_DYNAMIC_CLASS_TRACKER_BY_ID = weakref.WeakValueDictionary()
+_DYNAMIC_CLASS_TRACKER_LOCK = threading.Lock()
+
+PYPY = platform.python_implementation() == "PyPy"
+
+builtin_code_type = None
+if PYPY:
+    # builtin-code objects only exist in pypy
+    builtin_code_type = type(float.__new__.__code__)
+
+_extract_code_globals_cache = weakref.WeakKeyDictionary()
+
+
+def _get_or_create_tracker_id(class_def):
+    with _DYNAMIC_CLASS_TRACKER_LOCK:
+        class_tracker_id = _DYNAMIC_CLASS_TRACKER_BY_CLASS.get(class_def)
+        if class_tracker_id is None:
+            class_tracker_id = uuid.uuid4().hex
+            _DYNAMIC_CLASS_TRACKER_BY_CLASS[class_def] = class_tracker_id
+            _DYNAMIC_CLASS_TRACKER_BY_ID[class_tracker_id] = class_def
+    return class_tracker_id
+
+
+def _lookup_class_or_track(class_tracker_id, class_def):
+    if class_tracker_id is not None:
+        with _DYNAMIC_CLASS_TRACKER_LOCK:
+            class_def = _DYNAMIC_CLASS_TRACKER_BY_ID.setdefault(
+                class_tracker_id, class_def)
+            _DYNAMIC_CLASS_TRACKER_BY_CLASS[class_def] = class_tracker_id
+    return class_def
+
+
+def _whichmodule(obj, name):
+    """Find the module an object belongs to.
+    This function differs from ``pickle.whichmodule`` in two ways:
+    - it does not mangle the cases where obj's module is __main__ and obj was
+      not found in any module.
+    - Errors arising during module introspection are ignored, as those errors
+      are considered unwanted side effects.
+    """
+    if sys.version_info[:2] < (3, 7) and isinstance(obj, typing.TypeVar):  # pragma: no branch  # noqa
+        # Workaround bug in old Python versions: prior to Python 3.7,
+        # T.__module__ would always be set to "typing" even when the TypeVar T
+        # would be defined in a different module.
+        #
+        # For such older Python versions, we ignore the __module__ attribute of
+        # TypeVar instances and instead exhaustively lookup those instances in
+        # all currently imported modules.
+        module_name = None
+    else:
+        module_name = getattr(obj, '__module__', None)
+
+    if module_name is not None:
+        return module_name
+    # Protect the iteration by using a copy of sys.modules against dynamic
+    # modules that trigger imports of other modules upon calls to getattr or
+    # other threads importing at the same time.
+    for module_name, module in sys.modules.copy().items():
+        # Some modules such as coverage can inject non-module objects inside
+        # sys.modules
+        if (
+                module_name == '__main__' or
+                module is None or
+                not isinstance(module, types.ModuleType)
+        ):
+            continue
+        try:
+            if _getattribute(module, name)[0] is obj:
+                return module_name
+        except Exception:
+            pass
+    return None
+
+
+def _is_importable_by_name(obj, name=None):
+    """Determine if obj can be pickled as attribute of a file-backed module"""
+    return _lookup_module_and_qualname(obj, name=name) is not None
+
+
+def _lookup_module_and_qualname(obj, name=None):
+    if name is None:
+        name = getattr(obj, '__qualname__', None)
+    if name is None:  # pragma: no cover
+        # This used to be needed for Python 2.7 support but is probably not
+        # needed anymore. However we keep the __name__ introspection in case
+        # users of cloudpickle rely on this old behavior for unknown reasons.
+        name = getattr(obj, '__name__', None)
+
+    module_name = _whichmodule(obj, name)
+
+    if module_name is None:
+        # In this case, obj.__module__ is None AND obj was not found in any
+        # imported module. obj is thus treated as dynamic.
+        return None
+
+    if module_name == "__main__":
+        return None
+
+    module = sys.modules.get(module_name, None)
+    if module is None:
+        # The main reason why obj's module would not be imported is that this
+        # module has been dynamically created, using for example
+        # types.ModuleType. The other possibility is that module was removed
+        # from sys.modules after obj was created/imported. But this case is not
+        # supported, as the standard pickle does not support it either.
+        return None
+
+    # module has been added to sys.modules, but it can still be dynamic.
+    if _is_dynamic(module):
+        return None
+
+    try:
+        obj2, parent = _getattribute(module, name)
+    except AttributeError:
+        # obj was not found inside the module it points to
+        return None
+    if obj2 is not obj:
+        return None
+    return module, name
+
+
+def _extract_code_globals(co):
+    """
+    Find all globals names read or written to by codeblock co
+    """
+    out_names = _extract_code_globals_cache.get(co)
+    if out_names is None:
+        names = co.co_names
+        out_names = {names[oparg] for _, oparg in _walk_global_ops(co)}
+
+        # Declaring a function inside another one using the "def ..."
+        # syntax generates a constant code object corresponding to the one
+        # of the nested function's. As the nested function may itself need
+        # global variables, we need to introspect its code, extract its
+        # globals (looking for code objects in its co_consts attribute) and
+        # add the result to code_globals
+        if co.co_consts:
+            for const in co.co_consts:
+                if isinstance(const, types.CodeType):
+                    out_names |= _extract_code_globals(const)
+
+        _extract_code_globals_cache[co] = out_names
+
+    return out_names
+
+
+def _find_imported_submodules(code, top_level_dependencies):
+    """
+    Find currently imported submodules used by a function.
+    Submodules used by a function need to be detected and referenced for the
+    function to work correctly at depickling time. Because submodules can be
+    referenced as attribute of their parent package (``package.submodule``), we
+    need a special introspection technique that does not rely on GLOBAL-related
+    opcodes to find references of them in a code object.
+    Example:
+    ```
+    import concurrent.futures
+    import cloudpickle
+    def func():
+        x = concurrent.futures.ThreadPoolExecutor
+    if __name__ == '__main__':
+        cloudpickle.dumps(func)
+    ```
+    The globals extracted by cloudpickle in the function's state include the
+    concurrent package, but not its submodule (here, concurrent.futures), which
+    is the module used by func. _find_imported_submodules will detect the usage
+    of concurrent.futures. Saving this module alongside func will ensure
+    that calling func once depickled does not fail due to concurrent.futures
+    not being imported.
+    """
+
+    subimports = []
+    # check if any known dependency is an imported package
+    for x in top_level_dependencies:
+        if (isinstance(x, types.ModuleType) and
+                hasattr(x, '__package__') and x.__package__):
+            # check if the package has any currently loaded sub-imports
+            prefix = x.__name__ + '.'
+            # A concurrent thread could mutate sys.modules,
+            # make sure we iterate over a copy to avoid exceptions
+            for name in list(sys.modules):
+                # Older versions of pytest will add a "None" module to
+                # sys.modules.
+                if name is not None and name.startswith(prefix):
+                    # check whether the function can address the sub-module
+                    tokens = set(name[len(prefix):].split('.'))
+                    if not tokens - set(code.co_names):
+                        subimports.append(sys.modules[name])
+    return subimports
+
+
+def cell_set(cell, value):
+    """Set the value of a closure cell.
+    The point of this function is to set the cell_contents attribute of a cell
+    after its creation. This operation is necessary in case the cell contains a
+    reference to the function the cell belongs to, as when calling the
+    function's constructor
+    ``f = types.FunctionType(code, globals, name, argdefs, closure)``,
+    closure will not be able to contain the yet-to-be-created f.
+    In Python3.7, cell_contents is writeable, so setting the contents of a cell
+    can be done simply using
+    >>> cell.cell_contents = value
+    In earlier Python3 versions, the cell_contents attribute of a cell is read
+    only, but this limitation can be worked around by leveraging the Python 3
+    ``nonlocal`` keyword.
+    In Python2 however, this attribute is read only, and there is no
+    ``nonlocal`` keyword. For this reason, we need to come up with more
+    complicated hacks to set this attribute.
+    The chosen approach is to create a function with a STORE_DEREF opcode,
+    which sets the content of a closure variable. Typically:
+    >>> def inner(value):
+    ...     lambda: cell  # the lambda makes cell a closure
+    ...     cell = value  # cell is a closure, so this triggers a STORE_DEREF
+    (Note that in Python2, A STORE_DEREF can never be triggered from an inner
+    function. The function g for example here
+    >>> def f(var):
+    ...     def g():
+    ...         var += 1
+    ...     return g
+    will not modify the closure variable ``var`` in place, but instead try to
+    load a local variable var and increment it. As g does not assign the local
+    variable ``var`` any initial value, calling f(1)() will fail at runtime.)
+    Our objective is to set the value of a given cell ``cell``. So we need to
+    somewhat reference our ``cell`` object into the ``inner`` function so that
+    this object (and not the smoke cell of the lambda function) gets affected
+    by the STORE_DEREF operation.
+    In inner, ``cell`` is referenced as a cell variable (an enclosing variable
+    that is referenced by the inner function). If we create a new function
+    cell_set with the exact same code as ``inner``, but with ``cell`` marked as
+    a free variable instead, the STORE_DEREF will be applied on its closure -
+    ``cell``, which we can specify explicitly during construction! The new
+    cell_set variable thus actually sets the contents of a specified cell!
+    Note: we do not make use of the ``nonlocal`` keyword to set the contents of
+    a cell in early python3 versions to limit possible syntax errors in case
+    test and checker libraries decide to parse the whole file.
+    """
+
+    if sys.version_info[:2] >= (3, 7):  # pragma: no branch
+        cell.cell_contents = value
+    else:
+        _cell_set = types.FunctionType(
+            _cell_set_template_code, {}, '_cell_set', (), (cell,),)
+        _cell_set(value)
+
+
+def _make_cell_set_template_code():
+    def _cell_set_factory(value):
+        lambda: cell
+        cell = value
+
+    co = _cell_set_factory.__code__
+
+    _cell_set_template_code = types.CodeType(
+        co.co_argcount,
+        co.co_kwonlyargcount,   # Python 3 only argument
+        co.co_nlocals,
+        co.co_stacksize,
+        co.co_flags,
+        co.co_code,
+        co.co_consts,
+        co.co_names,
+        co.co_varnames,
+        co.co_filename,
+        co.co_name,
+        co.co_firstlineno,
+        co.co_lnotab,
+        co.co_cellvars,  # co_freevars is initialized with co_cellvars
+        (),  # co_cellvars is made empty
+    )
+    return _cell_set_template_code
+
+
+if sys.version_info[:2] < (3, 7):
+    _cell_set_template_code = _make_cell_set_template_code()
+
+# relevant opcodes
 STORE_GLOBAL = opcode.opmap['STORE_GLOBAL']
 DELETE_GLOBAL = opcode.opmap['DELETE_GLOBAL']
 LOAD_GLOBAL = opcode.opmap['LOAD_GLOBAL']
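+
+# All of this tracking and cell machinery exists so the CloudPickler defined
+# below can serialise dynamically created functions and classes that the stock
+# pickler rejects; as the module docstring notes, plain pickle.loads is enough
+# on the receiving end. A minimal round-trip sketch under that assumption
+# (kept as a comment here so the patch body stays valid):
+#
+#   import io
+#   import pickle
+#   from heronpy.api.cloudpickle import CloudPickler
+#
+#   def make_adder(n):
+#       # A nested closure: the standard pickler refuses local functions,
+#       # but CloudPickler serialises the code object and closure cells.
+#       def add(x):
+#           return x + n
+#       return add
+#
+#   buf = io.BytesIO()
+#   CloudPickler(buf).dump(make_adder(5))
+#   restored = pickle.loads(buf.getvalue())  # standard unpickling suffices
+#   assert restored(2) == 7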
@@ -74,745 +367,979 @@
 EXTENDED_ARG = dis.EXTENDED_ARG
 
 
-def islambda(func):
-  return getattr(func, '__name__') == '<lambda>'
-
-
 _BUILTIN_TYPE_NAMES = {}
-for k1, v1 in list(types.__dict__.items()):
-  if type(v1) is type: # pylint: disable=unidiomatic-typecheck
-    _BUILTIN_TYPE_NAMES[v1] = k1
+for k, v in types.__dict__.items():
+    if type(v) is type:
+        _BUILTIN_TYPE_NAMES[v] = k
 
 
 def _builtin_type(name):
-  return getattr(types, name)
+    if name == "ClassType":  # pragma: no cover
+        # Backward compat to load pickle files generated with cloudpickle
+        # < 1.3 even if loading pickle files from older versions is not
+        # officially supported.
+        return type
+    return getattr(types, name)
 
 
-if sys.version_info < (3, 4):
-  def _walk_global_ops(code):
+def _walk_global_ops(code):
     """
     Yield (opcode, argument number) tuples for all
     global-referencing instructions in *code*.
     """
-    code = getattr(code, 'co_code', b'')
-    if not PY3:
-      code = list(map(ord, code))
-
-    n = len(code)
-    i = 0
-    extended_arg = 0
-    while i < n:
-      op = code[i]
-      i += 1
-      if op >= HAVE_ARGUMENT:
-        oparg = code[i] + code[i + 1] * 256 + extended_arg
-        extended_arg = 0
-        i += 2
-        if op == EXTENDED_ARG:
-          extended_arg = oparg * 65536
+    for instr in dis.get_instructions(code):
+        op = instr.opcode
         if op in GLOBAL_OPS:
-          yield op, oparg
-
-else:
-  def _walk_global_ops(code):
-    """
-    Yield (opcode, argument number) tuples for all
-    global-referencing instructions in *code*.
-    """
-    for instr in dis.get_instructions(code): # pylint: disable=no-member
-      op = instr.opcode
-      if op in GLOBAL_OPS:
-        yield op, instr.arg
+            yield op, instr.arg
 
 
-class CloudPickler(Pickler): # pylint: disable=too-many-public-methods
-  """
-  CloudPickler class
-  """
-  dispatch = Pickler.dispatch.copy()
+def _extract_class_dict(cls):
+    """Retrieve a copy of the dict of a class without the inherited methods"""
+    clsdict = dict(cls.__dict__)  # copy dict proxy to a dict
+    if len(cls.__bases__) == 1:
+        inherited_dict = cls.__bases__[0].__dict__
+    else:
+        inherited_dict = {}
+        for base in reversed(cls.__bases__):
+            inherited_dict.update(base.__dict__)
+    to_remove = []
+    for name, value in clsdict.items():
+        try:
+            base_value = inherited_dict[name]
+            if value is base_value:
+                to_remove.append(name)
+        except KeyError:
+            pass
+    for name in to_remove:
+        clsdict.pop(name)
+    return clsdict
 
-  def __init__(self, filen, protocol=None):
-    Pickler.__init__(self, filen, protocol)
-    # set of modules to unpickle
-    self.modules = set()
-    # map ids to dictionary. used to ensure that functions can share global env
-    self.globals_ref = {}
 
-  def dump(self, obj):
-    self.inject_addons()
-    try:
-      return Pickler.dump(self, obj)
-    except RuntimeError as e:
-      if 'recursion' in e.args[0]:
-        msg = """Could not pickle object as excessively deep recursion required."""
-        raise pickle.PicklingError(msg)
-    except pickle.PickleError:
-      raise
-    except Exception as e:
-      print_exec(sys.stderr)
-      raise pickle.PicklingError(str(e))
+if sys.version_info[:2] < (3, 7):  # pragma: no branch
+    def _is_parametrized_type_hint(obj):
+        # This is very cheap but might generate false positives.
+        # general typing Constructs
+        is_typing = getattr(obj, '__origin__', None) is not None
 
-  def save_memoryview(self, obj):
-    """Fallback to save_string"""
-    Pickler.save_string(self, str(obj))
+        # typing_extensions.Literal
+        is_litteral = getattr(obj, '__values__', None) is not None
 
-  def save_buffer(self, obj):
-    """Fallback to save_string"""
-    Pickler.save_string(self, str(obj))
-  if PY3:
+        # typing_extensions.Final
+        is_final = getattr(obj, '__type__', None) is not None
+
+        # typing.Union/Tuple for old Python 3.5
+        is_union = getattr(obj, '__union_params__', None) is not None
+        is_tuple = getattr(obj, '__tuple_params__', None) is not None
+        is_callable = (
+            getattr(obj, '__result__', None) is not None and
+            getattr(obj, '__args__', None) is not None
+        )
+        return any((is_typing, is_litteral, is_final, is_union, is_tuple,
+                    is_callable))
+
+    def _create_parametrized_type_hint(origin, args):
+        return origin[args]
+
+
+class CloudPickler(Pickler):
+
+    dispatch = Pickler.dispatch.copy()
+
+    def __init__(self, file, protocol=None):
+        if protocol is None:
+            protocol = DEFAULT_PROTOCOL
+        Pickler.__init__(self, file, protocol=protocol)
+        # map ids to dictionary. used to ensure that functions can share global env
+        self.globals_ref = {}
+
+    def dump(self, obj):
+        self.inject_addons()
+        try:
+            return Pickler.dump(self, obj)
+        except RuntimeError as e:
+            if 'recursion' in e.args[0]:
+                msg = """Could not pickle object as excessively deep recursion required."""
+                raise pickle.PicklingError(msg)
+            else:
+                raise
+
+    def save_typevar(self, obj):
+        self.save_reduce(*_typevar_reduce(obj), obj=obj)
+
+    dispatch[typing.TypeVar] = save_typevar
+
+    def save_memoryview(self, obj):
+        self.save(obj.tobytes())
+
     dispatch[memoryview] = save_memoryview
-  else:
-    dispatch[buffer] = save_buffer
 
-  def save_unsupported(self, obj): # pylint: disable=no-self-use
-    raise pickle.PicklingError("Cannot pickle objects of type %s" % type(obj))
-  dispatch[types.GeneratorType] = save_unsupported
-
-  # itertools objects do not pickle!
-  for v in list(itertools.__dict__.values()):
-    if type(v) is type: # pylint: disable=unidiomatic-typecheck
-      dispatch[v] = save_unsupported
-
-  def save_module(self, obj):
-    """
-    Save a module as an import
-    """
-    self.modules.add(obj)
-    self.save_reduce(subimport, (obj.__name__,), obj=obj)
-  dispatch[types.ModuleType] = save_module
-
-  def save_codeobject(self, obj):
-    """
-    Save a code object
-    """
-    if PY3:
-      args = (
-          obj.co_argcount, obj.co_kwonlyargcount, obj.co_nlocals, obj.co_stacksize,
-          obj.co_flags, obj.co_code, obj.co_consts, obj.co_names, obj.co_varnames,
-          obj.co_filename, obj.co_name, obj.co_firstlineno, obj.co_lnotab, obj.co_freevars,
-          obj.co_cellvars
-      )
-    else:
-      args = (
-          obj.co_argcount, obj.co_nlocals, obj.co_stacksize, obj.co_flags, obj.co_code,
-          obj.co_consts, obj.co_names, obj.co_varnames, obj.co_filename, obj.co_name,
-          obj.co_firstlineno, obj.co_lnotab, obj.co_freevars, obj.co_cellvars
-      )
-    self.save_reduce(types.CodeType, args, obj=obj)
-  dispatch[types.CodeType] = save_codeobject
-
-  def save_function(self, obj, name=None):
-    """ Registered with the dispatch to handle all function types.
-    Determines what kind of function obj is (e.g. lambda, defined at
-    interactive prompt, etc) and handles the pickling appropriately.
-    """
-    write = self.write
-
-    if name is None:
-      name = obj.__name__
-    try:
-      # whichmodule() could fail, see
-      # https://bitbucket.org/gutworth/six/issues/63/importing-six-breaks-pickling
-      modname = pickle.whichmodule(obj, name)
-    except Exception:
-      modname = None
-    # print('which gives %s %s %s' % (modname, obj, name))
-    try:
-      themodule = sys.modules[modname]
-    except KeyError:
-      # eval'd items such as namedtuple give invalid items for their function __module__
-      modname = '__main__'
-
-    if modname == '__main__':
-      themodule = None
-
-    if themodule:
-      self.modules.add(themodule)
-      if getattr(themodule, name, None) is obj:
-        return self.save_global(obj, name)
-
-    # if func is lambda, def'ed at prompt, is in main, or is nested, then
-    # we'll pickle the actual function object rather than simply saving a
-    # reference (as is done in default pickler), via save_function_tuple.
-    if islambda(obj) or obj.__code__.co_filename == '<stdin>' or themodule is None:
-      #print("save global", islambda(obj), obj.__code__.co_filename, modname, themodule)
-      self.save_function_tuple(obj)
-      return
-    else:
-      # func is nested
-      klass = getattr(themodule, name, None)
-      if klass is None or klass is not obj:
-        self.save_function_tuple(obj)
-        return
-
-    if obj.__dict__:
-      # essentially save_reduce, but workaround needed to avoid recursion
-      self.save(_restore_attr)
-      write(pickle.MARK + pickle.GLOBAL + modname + '\n' + name + '\n')
-      self.memoize(obj)
-      self.save(obj.__dict__)
-      write(pickle.TUPLE + pickle.REDUCE)
-    else:
-      write(pickle.GLOBAL + modname + '\n' + name + '\n')
-      self.memoize(obj)
-  dispatch[types.FunctionType] = save_function
-
-  def save_function_tuple(self, func):
-    """  Pickles an actual func object.
-    A func comprises: code, globals, defaults, closure, and dict.  We
-    extract and save these, injecting reducing functions at certain points
-    to recreate the func object.  Keep in mind that some of these pieces
-    can contain a ref to the func itself.  Thus, a naive save on these
-    pieces could trigger an infinite loop of save's.  To get around that,
-    we first create a skeleton func object using just the code (this is
-    safe, since this won't contain a ref to the func), and memoize it as
-    soon as it's created.  The other stuff can then be filled in later.
-    """
-    save = self.save
-    write = self.write
-
-    code, f_globals, defaults, closure, dct, base_globals = self.extract_func_data(func)
-
-    save(_fill_function)  # skeleton function updater
-    write(pickle.MARK)    # beginning of tuple that _fill_function expects
-
-    # create a skeleton function object and memoize it
-    save(_make_skel_func)
-    save((code, closure, base_globals))
-    write(pickle.REDUCE)
-    self.memoize(func)
-
-    # save the rest of the func data needed by _fill_function
-    save(f_globals)
-    save(defaults)
-    save(dct)
-    save(func.__module__)
-    write(pickle.TUPLE)
-    write(pickle.REDUCE)  # applies _fill_function on the tuple
-
-  _extract_code_globals_cache = (
-      weakref.WeakKeyDictionary()
-      if sys.version_info >= (2, 7) and not hasattr(sys, "pypy_version_info")
-      else {}
-  )
-
-  @classmethod
-  def extract_code_globals(cls, co):
-    """
-    Find all globals names read or written to by codeblock co
-    """
-    out_names = cls._extract_code_globals_cache.get(co)
-    if out_names is None:
-      try:
-        names = co.co_names
-      except AttributeError:
-        # PyPy "builtin-code" object
-        out_names = set()
-      else:
-        out_names = set(names[oparg]
-                        for op, oparg in _walk_global_ops(co))
-
-        # see if nested function have any global refs
-        if co.co_consts:
-          for const in co.co_consts:
-            if type(const) is types.CodeType: # pylint: disable=unidiomatic-typecheck
-              out_names |= cls.extract_code_globals(const)
-
-      cls._extract_code_globals_cache[co] = out_names
-
-    return out_names
-
-  def extract_func_data(self, func):
-    """
-    Turn the function into a tuple of data necessary to recreate it:
-      code, globals, defaults, closure, dict
-    """
-    code = func.__code__
-
-    # extract all global ref's
-    func_global_refs = self.extract_code_globals(code)
-
-    # process all variables referenced by global environment
-    f_globals = {}
-    for var in func_global_refs:
-      if var in func.__globals__:
-        f_globals[var] = func.__globals__[var]
-
-    # defaults requires no processing
-    defaults = func.__defaults__
-
-    # process closure
-    closure = [c.cell_contents for c in func.__closure__] if func.__closure__ else []
-
-    # save the dict
-    dct = func.__dict__
-
-    base_globals = self.globals_ref.get(id(func.__globals__), {})
-    self.globals_ref[id(func.__globals__)] = base_globals
-
-    return (code, f_globals, defaults, closure, dct, base_globals)
-
-  def save_builtin_function(self, obj):
-    if obj.__module__ is "__builtin__":
-      return self.save_global(obj)
-    return self.save_function(obj)
-  dispatch[types.BuiltinFunctionType] = save_builtin_function
-
-  def save_global(self, obj, name=None, pack=struct.pack): # pylint: disable=too-many-branches
-    if obj.__module__ == "__builtin__" or obj.__module__ == "builtins":
-      if obj in _BUILTIN_TYPE_NAMES:
-        return self.save_reduce(_builtin_type, (_BUILTIN_TYPE_NAMES[obj],), obj=obj)
-
-    if name is None:
-      name = obj.__name__
-
-    modname = getattr(obj, "__module__", None)
-    if modname is None:
-      try:
-        # whichmodule() could fail, see
-        # https://bitbucket.org/gutworth/six/issues/63/importing-six-breaks-pickling
-        modname = pickle.whichmodule(obj, name)
-      except Exception:
-        modname = '__main__'
-
-    if modname == '__main__':
-      themodule = None
-    else:
-      __import__(modname)
-      themodule = sys.modules[modname]
-      self.modules.add(themodule)
-
-    if hasattr(themodule, name) and getattr(themodule, name) is obj:
-      return Pickler.save_global(self, obj, name)
-
-    typ = type(obj)
-    if typ is not obj and isinstance(obj, (type, types.ClassType)):
-      d = dict(obj.__dict__)  # copy dict proxy to a dict
-      if not isinstance(d.get('__dict__', None), property):
-        # don't extract dict that are properties
-        d.pop('__dict__', None)
-      d.pop('__weakref__', None)
-
-      # hack as __new__ is stored differently in the __dict__
-      new_override = d.get('__new__', None)
-      if new_override:
-        d['__new__'] = obj.__new__
-
-      # workaround for namedtuple (hijacked by PySpark)
-      if getattr(obj, '_is_namedtuple_', False):
-        self.save_reduce(_load_namedtuple, (obj.__name__, obj._fields))
-        return
-
-      self.save(_load_class)
-      self.save_reduce(typ, (obj.__name__, obj.__bases__, {"__doc__": obj.__doc__}), obj=obj)
-      d.pop('__doc__', None)
-      # handle property and staticmethod
-      dd = {}
-      for k, v in list(d.items()):
-        if isinstance(v, property):
-          k = ('property', k)
-          v = (v.fget, v.fset, v.fdel, v.__doc__)
-        elif isinstance(v, staticmethod) and hasattr(v, '__func__'):
-          k = ('staticmethod', k)
-          v = v.__func__
-        elif isinstance(v, classmethod) and hasattr(v, '__func__'):
-          k = ('classmethod', k)
-          v = v.__func__
-        dd[k] = v
-      self.save(dd)
-      self.write(pickle.TUPLE2)
-      self.write(pickle.REDUCE)
-
-    else:
-      raise pickle.PicklingError("Can't pickle %r" % obj)
-
-  dispatch[type] = save_global
-  dispatch[types.ClassType] = save_global
-
-  def save_instancemethod(self, obj):
-    # Memoization rarely is ever useful due to python bounding
-    if PY3:
-      self.save_reduce(types.MethodType, (obj.__func__, obj.__self__), obj=obj)
-    else:
-      self.save_reduce(
-          types.MethodType, (obj.__func__, obj.__self__, obj.__self__.__class__),
-          obj=obj)
-  dispatch[types.MethodType] = save_instancemethod
-
-  def save_inst(self, obj):
-    """Inner logic to save instance. Based off pickle.save_inst
-    Supports __transient__"""
-    cls = obj.__class__
-
-    memo = self.memo
-    write = self.write
-    save = self.save
-
-    if hasattr(obj, '__getinitargs__'):
-      args = obj.__getinitargs__()
-      len(args)  # assert it's a sequence
-      pickle._keep_alive(args, memo) # pylint: disable=protected-access
-    else:
-      args = ()
-
-    write(pickle.MARK)
-
-    if self.bin:
-      save(cls)
-      for arg in args:
-        save(arg)
-      write(pickle.OBJ)
-    else:
-      for arg in args:
-        save(arg)
-      write(pickle.INST + cls.__module__ + '\n' + cls.__name__ + '\n')
-
-    self.memoize(obj)
-
-    try:
-      getstate = obj.__getstate__
-    except AttributeError:
-      stuff = obj.__dict__
-      #remove items if transient
-      if hasattr(obj, '__transient__'):
-        transient = obj.__transient__
-        stuff = stuff.copy()
-        for k in list(stuff.keys()):
-          if k in transient:
-            del stuff[k]
-    else:
-      stuff = getstate()
-      pickle._keep_alive(stuff, memo) # pylint: disable=protected-access
-    save(stuff)
-    write(pickle.BUILD)
-
-  if not PY3:
-    dispatch[types.InstanceType] = save_inst
-
-  def save_property(self, obj):
-    # properties not correctly saved in python
-    self.save_reduce(property, (obj.fget, obj.fset, obj.fdel, obj.__doc__), obj=obj)
-  dispatch[property] = save_property
-
-  def save_itemgetter(self, obj):
-    """itemgetter serializer (needed for namedtuple support)"""
-    class Dummy: # pylint: disable=old-style-class
-      def __init__(self):
-        pass
-      def __getitem__(self, item):
-        return item
-    items = obj(Dummy())
-    if not isinstance(items, tuple):
-      items = (items, )
-    return self.save_reduce(operator.itemgetter, items)
-
-  if type(operator.itemgetter) is type: # pylint: disable=unidiomatic-typecheck
-    dispatch[operator.itemgetter] = save_itemgetter
-
-  def save_attrgetter(self, obj):
-    """attrgetter serializer"""
-    class Dummy(object):
-      def __init__(self, attrs, index=None):
-        self.attrs = attrs
-        self.index = index
-      def __getattribute__(self, item):
-        attrs = object.__getattribute__(self, "attrs")
-        index = object.__getattribute__(self, "index")
-        if index is None:
-          index = len(attrs)
-          attrs.append(item)
+    def save_module(self, obj):
+        """
+        Save a module as an import
+        """
+        if _is_dynamic(obj):
+            obj.__dict__.pop('__builtins__', None)
+            self.save_reduce(dynamic_subimport, (obj.__name__, vars(obj)),
+                             obj=obj)
         else:
-          attrs[index] = ".".join([attrs[index], item])
-        return type(self)(attrs, index)
-    attrs = []
-    obj(Dummy(attrs))
-    return self.save_reduce(operator.attrgetter, tuple(attrs))
+            self.save_reduce(subimport, (obj.__name__,), obj=obj)
 
-  if type(operator.attrgetter) is type: # pylint: disable=unidiomatic-typecheck
-    dispatch[operator.attrgetter] = save_attrgetter
+    dispatch[types.ModuleType] = save_module
 
-  def save_reduce(self, func, args, state=None, # pylint: disable=too-many-branches
-                  listitems=None, dictitems=None, obj=None):
-    """Modified to support __transient__ on new objects
-    Change only affects protocol level 2 (which is always used by PiCloud"""
-    # Assert that args is a tuple or None
-    if not isinstance(args, tuple):
-      raise pickle.PicklingError("args from reduce() should be a tuple")
+    def save_codeobject(self, obj):
+        """
+        Save a code object
+        """
+        if hasattr(obj, "co_posonlyargcount"):  # pragma: no branch
+            args = (
+                obj.co_argcount, obj.co_posonlyargcount,
+                obj.co_kwonlyargcount, obj.co_nlocals, obj.co_stacksize,
+                obj.co_flags, obj.co_code, obj.co_consts, obj.co_names,
+                obj.co_varnames, obj.co_filename, obj.co_name,
+                obj.co_firstlineno, obj.co_lnotab, obj.co_freevars,
+                obj.co_cellvars
+            )
+        else:
+            args = (
+                obj.co_argcount, obj.co_kwonlyargcount, obj.co_nlocals,
+                obj.co_stacksize, obj.co_flags, obj.co_code, obj.co_consts,
+                obj.co_names, obj.co_varnames, obj.co_filename,
+                obj.co_name, obj.co_firstlineno, obj.co_lnotab,
+                obj.co_freevars, obj.co_cellvars
+            )
+        self.save_reduce(types.CodeType, args, obj=obj)
 
-    # Assert that func is callable
-    if not hasattr(func, '__call__'):
-      raise pickle.PicklingError("func from reduce should be callable")
+    dispatch[types.CodeType] = save_codeobject
 
-    save = self.save
-    write = self.write
+    def save_function(self, obj, name=None):
+        """ Registered with the dispatch to handle all function types.
+        Determines what kind of function obj is (e.g. lambda, defined at
+        interactive prompt, etc) and handles the pickling appropriately.
+        """
+        if _is_importable_by_name(obj, name=name):
+            return Pickler.save_global(self, obj, name=name)
+        elif PYPY and isinstance(obj.__code__, builtin_code_type):
+            return self.save_pypy_builtin_func(obj)
+        else:
+            return self.save_function_tuple(obj)
 
-    # Protocol 2 special case: if func's name is __newobj__, use NEWOBJ
-    if self.proto >= 2 and getattr(func, "__name__", "") == "__newobj__":
-      #Added fix to allow transient
-      cls = args[0]
-      if not hasattr(cls, "__new__"):
-        raise pickle.PicklingError(
-            "args[0] from __newobj__ args has no __new__")
-      if obj is not None and cls is not obj.__class__:
-        raise pickle.PicklingError(
-            "args[0] from __newobj__ args has the wrong class")
-      args = args[1:]
-      save(cls)
+    dispatch[types.FunctionType] = save_function
 
-      #Don't pickle transient entries
-      if hasattr(obj, '__transient__'):
-        transient = obj.__transient__
-        state = state.copy()
+    def save_pypy_builtin_func(self, obj):
+        """Save pypy equivalent of builtin functions.
+        PyPy does not have the concept of builtin-functions. Instead,
+        builtin-functions are simple function instances, but with a
+        builtin-code attribute.
+        Most of the time, builtin functions should be pickled by attribute. But
+        PyPy has flaky support for __qualname__, so some builtin functions such
+        as float.__new__ will be classified as dynamic. For this reason only,
+        we created this special routine. Because builtin-functions are not
+        expected to have closure or globals, there is no additional hack
+        (compared the one already implemented in pickle) to protect ourselves
+        from reference cycles. A simple (reconstructor, newargs, obj.__dict__)
+        tuple is save_reduced.
+        Note also that PyPy improved their support for __qualname__ in v3.6, so
+        this routine should be removed when cloudpickle supports only PyPy 3.6
+        and later.
+        """
+        rv = (types.FunctionType, (obj.__code__, {}, obj.__name__,
+                                   obj.__defaults__, obj.__closure__),
+              obj.__dict__)
+        self.save_reduce(*rv, obj=obj)
 
-        for k in list(state.keys()):
-          if k in transient:
-            del state[k]
+    def _save_dynamic_enum(self, obj, clsdict):
+        """Special handling for dynamic Enum subclasses
+        Use a dedicated Enum constructor (inspired by EnumMeta.__call__) as the
+        EnumMeta metaclass has complex initialization that makes the Enum
+        subclasses hold references to their own instances.
+        """
+        members = dict((e.name, e.value) for e in obj)
 
-      save(args)
-      write(pickle.NEWOBJ)
-    else:
-      save(func)
-      save(args)
-      write(pickle.REDUCE)
+        self.save_reduce(
+                _make_skeleton_enum,
+                (obj.__bases__, obj.__name__, obj.__qualname__,
+                 members, obj.__module__, _get_or_create_tracker_id(obj), None),
+                obj=obj
+         )
 
-    if obj is not None:
-      self.memoize(obj)
+        # Cleanup the clsdict that will be passed to _rehydrate_skeleton_class:
+        # Those attributes are already handled by the metaclass.
+        for attrname in ["_generate_next_value_", "_member_names_",
+                         "_member_map_", "_member_type_",
+                         "_value2member_map_"]:
+            clsdict.pop(attrname, None)
+        for member in members:
+            clsdict.pop(member)
 
-    # More new special cases (that work with older protocols as
-    # well): when __reduce__ returns a tuple with 4 or 5 items,
-    # the 4th and 5th item should be iterators that provide list
-    # items and dict items (as (key, value) tuples), or None.
+    def save_dynamic_class(self, obj):
+        """Save a class that can't be stored as module global.
+        This method is used to serialize classes that are defined inside
+        functions, or that otherwise can't be serialized as attribute lookups
+        from global modules.
+        """
+        clsdict = _extract_class_dict(obj)
+        clsdict.pop('__weakref__', None)
 
-    if listitems is not None:
-      self._batch_appends(listitems)
+        if issubclass(type(obj), abc.ABCMeta):
+            # If obj is an instance of an ABCMeta subclass, dont pickle the
+            # cache/negative caches populated during isinstance/issubclass
+            # checks, but pickle the list of registered subclasses of obj.
+            clsdict.pop('_abc_cache', None)
+            clsdict.pop('_abc_negative_cache', None)
+            clsdict.pop('_abc_negative_cache_version', None)
+            registry = clsdict.pop('_abc_registry', None)
+            if registry is None:
+                # in Python3.7+, the abc caches and registered subclasses of a
+                # class are bundled into the single _abc_impl attribute
+                clsdict.pop('_abc_impl', None)
+                (registry, _, _, _) = abc._get_dump(obj)
 
-    if dictitems is not None:
-      self._batch_setitems(dictitems)
+                clsdict["_abc_impl"] = [subclass_weakref()
+                                        for subclass_weakref in registry]
+            else:
+                # In the above if clause, registry is a set of weakrefs -- in
+                # this case, registry is a WeakSet
+                clsdict["_abc_impl"] = [type_ for type_ in registry]
 
-    if state is not None:
-      save(state)
-      write(pickle.BUILD)
+        # On PyPy, __doc__ is a readonly attribute, so we need to include it in
+        # the initial skeleton class.  This is safe because we know that the
+        # doc can't participate in a cycle with the original class.
+        type_kwargs = {'__doc__': clsdict.pop('__doc__', None)}
 
-  def save_partial(self, obj):
-    """Partial objects do not serialize correctly in python2.x -- this fixes the bugs"""
-    self.save_reduce(_genpartial, (obj.func, obj.args, obj.keywords))
+        if "__slots__" in clsdict:
+            type_kwargs['__slots__'] = obj.__slots__
+            # pickle string length optimization: member descriptors of obj are
+            # created automatically from obj's __slots__ attribute, no need to
+            # save them in obj's state
+            if isinstance(obj.__slots__, str):
+                clsdict.pop(obj.__slots__)
+            else:
+                for k in obj.__slots__:
+                    clsdict.pop(k, None)
 
-  if sys.version_info < (2, 7):  # 2.7 supports partial pickling
-    dispatch[partial] = save_partial
+        # If type overrides __dict__ as a property, include it in the type
+        # kwargs. In Python 2, we can't set this attribute after construction.
+        # XXX: can this ever happen in Python 3? If so add a test.
+        __dict__ = clsdict.pop('__dict__', None)
+        if isinstance(__dict__, property):
+            type_kwargs['__dict__'] = __dict__
 
+        save = self.save
+        write = self.write
 
-  def save_file(self, obj): # pylint: disable=too-many-branches
-    """Save a file"""
-    try:
-      import StringIO as pystringIO #we can't use cStringIO as it lacks the name attribute
-    except ImportError:
-      import io as pystringIO # pylint: disable=reimported
+        # We write pickle instructions explicitly here to handle the
+        # possibility that the type object participates in a cycle with its own
+        # __dict__. We first write an empty "skeleton" version of the class and
+        # memoize it before writing the class' __dict__ itself. We then write
+        # instructions to "rehydrate" the skeleton class by restoring the
+        # attributes from the __dict__.
+        #
+        # A type can appear in a cycle with its __dict__ if an instance of the
+        # type appears in the type's __dict__ (which happens for the stdlib
+        # Enum class), or if the type defines methods that close over the name
+        # of the type, (which is common for Python 2-style super() calls).
 
-    if not hasattr(obj, 'name') or  not hasattr(obj, 'mode'):
-      raise pickle.PicklingError("Cannot pickle files that do not map to an actual file")
-    if obj is sys.stdout:
-      return self.save_reduce(getattr, (sys, 'stdout'), obj=obj)
-    if obj is sys.stderr:
-      return self.save_reduce(getattr, (sys, 'stderr'), obj=obj)
-    if obj is sys.stdin:
-      raise pickle.PicklingError("Cannot pickle standard input")
-    if  hasattr(obj, 'isatty') and obj.isatty():
-      raise pickle.PicklingError("Cannot pickle files that map to tty objects")
-    if 'r' not in obj.mode:
-      raise pickle.PicklingError("Cannot pickle files that are not opened for reading")
-    name = obj.name
-    try:
-      fsize = os.stat(name).st_size
-    except OSError:
-      raise pickle.PicklingError("Cannot pickle file %s as it cannot be stat" % name)
+        # Push the rehydration function.
+        save(_rehydrate_skeleton_class)
 
-    if obj.closed:
-      #create an empty closed string io
-      retval = pystringIO.StringIO("")
-      retval.close()
-    elif not fsize: #empty file
-      retval = pystringIO.StringIO("")
-      try:
-        tmpfile = file(name)
-        tst = tmpfile.read(1)
-      except IOError:
-        raise pickle.PicklingError("Cannot pickle file %s as it cannot be read" % name)
-      tmpfile.close()
-      if tst != '':
-        raise pickle.PicklingError(
-            "Cannot pickle file %s as it does not appear to map to a physical, real file" % name)
-    else:
-      try:
-        tmpfile = file(name)
-        contents = tmpfile.read()
-        tmpfile.close()
-      except IOError:
-        raise pickle.PicklingError("Cannot pickle file %s as it cannot be read" % name)
-      retval = pystringIO.StringIO(contents)
-      curloc = obj.tell()
-      retval.seek(curloc)
+        # Mark the start of the args tuple for the rehydration function.
+        write(pickle.MARK)
 
-    retval.name = name
-    self.save(retval)
-    self.memoize(obj)
+        # Create and memoize a skeleton class with obj's name and bases.
+        if Enum is not None and issubclass(obj, Enum):
+            # Special handling of Enum subclasses
+            self._save_dynamic_enum(obj, clsdict)
+        else:
+            # "Regular" class definition:
+            tp = type(obj)
+            self.save_reduce(_make_skeleton_class,
+                             (tp, obj.__name__, _get_bases(obj), type_kwargs,
+                              _get_or_create_tracker_id(obj), None),
+                             obj=obj)
 
-  if PY3:
+        # Now save the rest of obj's __dict__. Any references to obj
+        # encountered while saving will point to the skeleton class.
+        save(clsdict)
+
+        # Write a tuple of (skeleton_class, clsdict).
+        write(pickle.TUPLE)
+
+        # Call _rehydrate_skeleton_class(skeleton_class, clsdict)
+        write(pickle.REDUCE)
+
+    def save_function_tuple(self, func):
+        """  Pickles an actual func object.
+        A func comprises: code, globals, defaults, closure, and dict.  We
+        extract and save these, injecting reducing functions at certain points
+        to recreate the func object.  Keep in mind that some of these pieces
+        can contain a ref to the func itself.  Thus, a naive save on these
+        pieces could trigger an infinite loop of save's.  To get around that,
+        we first create a skeleton func object using just the code (this is
+        safe, since this won't contain a ref to the func), and memoize it as
+        soon as it's created.  The other stuff can then be filled in later.
+        """
+        if is_tornado_coroutine(func):
+            self.save_reduce(_rebuild_tornado_coroutine, (func.__wrapped__,),
+                             obj=func)
+            return
+
+        save = self.save
+        write = self.write
+
+        code, f_globals, defaults, closure_values, dct, base_globals = self.extract_func_data(func)
+
+        save(_fill_function)  # skeleton function updater
+        write(pickle.MARK)    # beginning of tuple that _fill_function expects
+
+        # Extract currently-imported submodules used by func. Storing these
+        # modules in the '_cloudpickle_submodules' entry of the object's
+        # state will trigger the side effect of importing these modules at
+        # unpickling time (which is necessary for func to work correctly once
+        # depickled)
+        submodules = _find_imported_submodules(
+            code,
+            itertools.chain(f_globals.values(), closure_values or ()),
+        )
+
+        # create a skeleton function object and memoize it
+        save(_make_skel_func)
+        save((
+            code,
+            len(closure_values) if closure_values is not None else -1,
+            base_globals,
+        ))
+        write(pickle.REDUCE)
+        self.memoize(func)
+
+        # save the rest of the func data needed by _fill_function
+        state = {
+            'globals': f_globals,
+            'defaults': defaults,
+            'dict': dct,
+            'closure_values': closure_values,
+            'module': func.__module__,
+            'name': func.__name__,
+            'doc': func.__doc__,
+            '_cloudpickle_submodules': submodules
+        }
+        if hasattr(func, '__annotations__'):
+            state['annotations'] = func.__annotations__
+        if hasattr(func, '__qualname__'):
+            state['qualname'] = func.__qualname__
+        if hasattr(func, '__kwdefaults__'):
+            state['kwdefaults'] = func.__kwdefaults__
+        save(state)
+        write(pickle.TUPLE)
+        write(pickle.REDUCE)  # applies _fill_function on the tuple
+
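# --- editor's note: illustrative sketch, not part of the patch ---
# Why the skeleton/fill split above matters: a function that refers to itself
# through its own globals would otherwise recurse forever while being saved.
# Assumes this runs in a __main__ script, so `countdown` is pickled by value
# via save_function_tuple rather than by reference.
import pickle
from heronpy.api import cloudpickle

def countdown(n):
    # the body references the global name `countdown`, i.e. the function itself
    return n if n == 0 else countdown(n - 1)

payload = cloudpickle.dumps(countdown)  # the skeleton is memoized before its globals are saved
restored = pickle.loads(payload)
assert restored(3) == 0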
+    def extract_func_data(self, func):
+        """
+        Turn the function into a tuple of data necessary to recreate it:
+            code, globals, defaults, closure_values, dict, base_globals
+        """
+        code = func.__code__
+
+        # extract all global refs
+        func_global_refs = _extract_code_globals(code)
+
+        # process all variables referenced by global environment
+        f_globals = {}
+        for var in func_global_refs:
+            if var in func.__globals__:
+                f_globals[var] = func.__globals__[var]
+
+        # defaults requires no processing
+        defaults = func.__defaults__
+
+        # process closure
+        closure = (
+            list(map(_get_cell_contents, func.__closure__))
+            if func.__closure__ is not None
+            else None
+        )
+
+        # save the dict
+        dct = func.__dict__
+
+        # base_globals represents the future global namespace of func at
+        # unpickling time. Looking it up and storing it in globals_ref allows
+        # functions sharing the same globals at pickling time to also
+        # share them once unpickled, on one condition: since globals_ref is
+        # an attribute of a CloudPickler instance, and a new CloudPickler is
+        # created each time pickle.dump or pickle.dumps is called, the functions
+        # also need to be saved within the same invocation of
+        # cloudpickle.dump/cloudpickle.dumps (for example: cloudpickle.dumps([f1, f2])). There
+        # is no such limitation when using CloudPickler.dump, as long as the
+        # multiple invocations are bound to the same CloudPickler instance.
+        base_globals = self.globals_ref.setdefault(id(func.__globals__), {})
+
+        if base_globals == {}:
+            # Add module attributes used to resolve relative imports
+            # instructions inside func.
+            for k in ["__package__", "__name__", "__path__", "__file__"]:
+                # Some built-in functions/methods such as object.__new__  have
+                # their __globals__ set to None in PyPy
+                if func.__globals__ is not None and k in func.__globals__:
+                    base_globals[k] = func.__globals__[k]
+
+        return (code, f_globals, defaults, closure, dct, base_globals)
+
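# --- editor's note: illustrative sketch, not part of the patch ---
# The base_globals comment above in practice: functions pickled in the same
# cloudpickle.dumps call share one rehydrated globals namespace, so a change
# made through one is visible to the other after unpickling. Assumes both
# functions are defined in __main__ so they are pickled by value.
import pickle
from heronpy.api import cloudpickle

counter = 0

def bump():
    global counter
    counter += 1

def read():
    return counter

restored_bump, restored_read = pickle.loads(cloudpickle.dumps([bump, read]))
restored_bump()
print(restored_read())  # 1 -- both rebuilt functions see the same shared namespace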
+    def save_getset_descriptor(self, obj):
+        return self.save_reduce(getattr, (obj.__objclass__, obj.__name__))
+
+    dispatch[types.GetSetDescriptorType] = save_getset_descriptor
+
+    def save_global(self, obj, name=None, pack=struct.pack):
+        """
+        Save a "global".
+        The name of this method is somewhat misleading: all types get
+        dispatched here.
+        """
+        if obj is type(None):
+            return self.save_reduce(type, (None,), obj=obj)
+        elif obj is type(Ellipsis):
+            return self.save_reduce(type, (Ellipsis,), obj=obj)
+        elif obj is type(NotImplemented):
+            return self.save_reduce(type, (NotImplemented,), obj=obj)
+        elif obj in _BUILTIN_TYPE_NAMES:
+            return self.save_reduce(
+                _builtin_type, (_BUILTIN_TYPE_NAMES[obj],), obj=obj)
+
+        if sys.version_info[:2] < (3, 7) and _is_parametrized_type_hint(obj):  # noqa  # pragma: no branch
+            # Parametrized typing constructs in Python < 3.7 are not compatible
+            # with type checks and ``isinstance`` semantics. For this reason,
+            # it is easier to detect them using a duck-typing-based check
+            # (``_is_parametrized_type_hint``) than to populate the Pickler's
+            # dispatch with type-specific savers.
+            self._save_parametrized_type_hint(obj)
+        elif name is not None:
+            Pickler.save_global(self, obj, name=name)
+        elif not _is_importable_by_name(obj, name=name):
+            self.save_dynamic_class(obj)
+        else:
+            Pickler.save_global(self, obj, name=name)
+
+    dispatch[type] = save_global
+
+    def save_instancemethod(self, obj):
+        # Memoization is rarely useful here because Python creates a new bound
+        # method object on each attribute access.
+        if obj.__self__ is None:
+            self.save_reduce(getattr, (obj.im_class, obj.__name__))
+        else:
+            self.save_reduce(types.MethodType, (obj.__func__, obj.__self__), obj=obj)
+
+    dispatch[types.MethodType] = save_instancemethod
+
+    def save_property(self, obj):
+        # properties cannot be pickled directly, so rebuild them from fget/fset/fdel and the doc
+        self.save_reduce(property, (obj.fget, obj.fset, obj.fdel, obj.__doc__),
+                         obj=obj)
+
+    dispatch[property] = save_property
+
+    def save_classmethod(self, obj):
+        orig_func = obj.__func__
+        self.save_reduce(type(obj), (orig_func,), obj=obj)
+
+    dispatch[classmethod] = save_classmethod
+    dispatch[staticmethod] = save_classmethod
+
+    def save_itemgetter(self, obj):
+        """itemgetter serializer (needed for namedtuple support)"""
+        class Dummy:
+            def __getitem__(self, item):
+                return item
+        items = obj(Dummy())
+        if not isinstance(items, tuple):
+            items = (items,)
+        return self.save_reduce(operator.itemgetter, items)
+
+    if type(operator.itemgetter) is type:
+        dispatch[operator.itemgetter] = save_itemgetter
+
+    def save_attrgetter(self, obj):
+        """attrgetter serializer"""
+        class Dummy(object):
+            def __init__(self, attrs, index=None):
+                self.attrs = attrs
+                self.index = index
+            def __getattribute__(self, item):
+                attrs = object.__getattribute__(self, "attrs")
+                index = object.__getattribute__(self, "index")
+                if index is None:
+                    index = len(attrs)
+                    attrs.append(item)
+                else:
+                    attrs[index] = ".".join([attrs[index], item])
+                return type(self)(attrs, index)
+        attrs = []
+        obj(Dummy(attrs))
+        return self.save_reduce(operator.attrgetter, tuple(attrs))
+
+    if type(operator.attrgetter) is type:
+        dispatch[operator.attrgetter] = save_attrgetter
+
+    def save_file(self, obj):
+        """Save a file"""
+
+        if not hasattr(obj, 'name') or not hasattr(obj, 'mode'):
+            raise pickle.PicklingError("Cannot pickle files that do not map to an actual file")
+        if obj is sys.stdout:
+            return self.save_reduce(getattr, (sys, 'stdout'), obj=obj)
+        if obj is sys.stderr:
+            return self.save_reduce(getattr, (sys, 'stderr'), obj=obj)
+        if obj is sys.stdin:
+            raise pickle.PicklingError("Cannot pickle standard input")
+        if obj.closed:
+            raise pickle.PicklingError("Cannot pickle closed files")
+        if hasattr(obj, 'isatty') and obj.isatty():
+            raise pickle.PicklingError("Cannot pickle files that map to tty objects")
+        if 'r' not in obj.mode and '+' not in obj.mode:
+            raise pickle.PicklingError("Cannot pickle files that are not opened for reading: %s" % obj.mode)
+
+        name = obj.name
+
+        # TODO: also support binary mode files with io.BytesIO
+        retval = io.StringIO()
+
+        try:
+            # Read the whole file
+            curloc = obj.tell()
+            obj.seek(0)
+            contents = obj.read()
+            obj.seek(curloc)
+        except IOError:
+            raise pickle.PicklingError("Cannot pickle file %s as it cannot be read" % name)
+        retval.write(contents)
+        retval.seek(curloc)
+
+        retval.name = name
+        self.save(retval)
+        self.memoize(obj)
+
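# --- editor's note: illustrative sketch, not part of the patch ---
# What save_file above does with an open, readable text handle: its contents
# and current offset are captured into an io.StringIO, so the unpickled object
# is a StringIO stand-in rather than a real file. The path below is hypothetical.
import pickle
from heronpy.api import cloudpickle

with open("/tmp/heron_save_file_demo.txt", "w+") as handle:
    handle.write("hello heron")
    handle.seek(5)
    clone = pickle.loads(cloudpickle.dumps(handle))

print(type(clone).__name__)  # StringIO
print(clone.read())          # " heron" -- resumes from the captured offset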
+    def save_ellipsis(self, obj):
+        self.save_reduce(_gen_ellipsis, ())
+
+    def save_not_implemented(self, obj):
+        self.save_reduce(_gen_not_implemented, ())
+
     dispatch[io.TextIOWrapper] = save_file
-  else:
-    dispatch[file] = save_file
+    dispatch[type(Ellipsis)] = save_ellipsis
+    dispatch[type(NotImplemented)] = save_not_implemented
 
-  # Special functions for Add-on libraries
+    def save_weakset(self, obj):
+        self.save_reduce(weakref.WeakSet, (list(obj),))
 
-  def inject_numpy(self):
-    numpy = sys.modules.get('numpy')
-    if not numpy or not hasattr(numpy, 'ufunc'):
-      return
-    self.dispatch[numpy.ufunc] = self.__class__.save_ufunc
+    dispatch[weakref.WeakSet] = save_weakset
 
-  def save_ufunc(self, obj):
-    """Hack function for saving numpy ufunc objects"""
-    name = obj.__name__
-    numpy_tst_mods = ['numpy', 'scipy.special']
-    for tst_mod_name in numpy_tst_mods:
-      tst_mod = sys.modules.get(tst_mod_name, None)
-      if tst_mod and name in tst_mod.__dict__:
-        return self.save_reduce(_getobject, (tst_mod_name, name))
-    raise pickle.PicklingError(
-        'cannot save %s. Cannot resolve what module it is defined in' % str(obj))
+    def save_logger(self, obj):
+        self.save_reduce(logging.getLogger, (obj.name,), obj=obj)
 
-  def inject_addons(self):
-    """Plug in system. Register additional pickling functions if modules already loaded"""
-    self.inject_numpy()
+    dispatch[logging.Logger] = save_logger
+
+    def save_root_logger(self, obj):
+        self.save_reduce(logging.getLogger, (), obj=obj)
+
+    dispatch[logging.RootLogger] = save_root_logger
+
+    if hasattr(types, "MappingProxyType"):  # pragma: no branch
+        def save_mappingproxy(self, obj):
+            self.save_reduce(types.MappingProxyType, (dict(obj),), obj=obj)
+
+        dispatch[types.MappingProxyType] = save_mappingproxy
+
+    """Special functions for Add-on libraries"""
+    def inject_addons(self):
+        """Plug in system. Register additional pickling functions if modules already loaded"""
+        pass
+
+    if sys.version_info < (3, 7):  # pragma: no branch
+        def _save_parametrized_type_hint(self, obj):
+            # The distorted type-check semantics for typing constructs become:
+            # ``type(obj) is type(TypeHint)``, which means "obj is a
+            # parametrized TypeHint"
+            if type(obj) is type(Literal):  # pragma: no branch
+                initargs = (Literal, obj.__values__)
+            elif type(obj) is type(Final):  # pragma: no branch
+                initargs = (Final, obj.__type__)
+            elif type(obj) is type(ClassVar):
+                initargs = (ClassVar, obj.__type__)
+            elif type(obj) is type(Generic):
+                parameters = obj.__parameters__
+                if len(obj.__parameters__) > 0:
+                    # in early Python 3.5, __parameters__ was sometimes
+                    # preferred to __args__
+                    initargs = (obj.__origin__, parameters)
+                else:
+                    initargs = (obj.__origin__, obj.__args__)
+            elif type(obj) is type(Union):
+                if sys.version_info < (3, 5, 3):  # pragma: no cover
+                    initargs = (Union, obj.__union_params__)
+                else:
+                    initargs = (Union, obj.__args__)
+            elif type(obj) is type(Tuple):
+                if sys.version_info < (3, 5, 3):  # pragma: no cover
+                    initargs = (Tuple, obj.__tuple_params__)
+                else:
+                    initargs = (Tuple, obj.__args__)
+            elif type(obj) is type(Callable):
+                if sys.version_info < (3, 5, 3):  # pragma: no cover
+                    args = obj.__args__
+                    result = obj.__result__
+                    if args != Ellipsis:
+                        if isinstance(args, tuple):
+                            args = list(args)
+                        else:
+                            args = [args]
+                else:
+                    (*args, result) = obj.__args__
+                    if len(args) == 1 and args[0] is Ellipsis:
+                        args = Ellipsis
+                    else:
+                        args = list(args)
+                initargs = (Callable, (args, result))
+            else:  # pragma: no cover
+                raise pickle.PicklingError(
+                    "Cloudpickle Error: Unknown type {}".format(type(obj))
+                )
+            self.save_reduce(_create_parametrized_type_hint, initargs, obj=obj)
+
+
+# Tornado support
+
+def is_tornado_coroutine(func):
+    """
+    Return whether *func* is a Tornado coroutine function.
+    Running coroutines are not supported.
+    """
+    if 'tornado.gen' not in sys.modules:
+        return False
+    gen = sys.modules['tornado.gen']
+    if not hasattr(gen, "is_coroutine_function"):
+        # Tornado version is too old
+        return False
+    return gen.is_coroutine_function(func)
+
+
+def _rebuild_tornado_coroutine(func):
+    from tornado import gen
+    return gen.coroutine(func)
 
 
 # Shorthands for legacy support
 
-def dump(obj, filen, protocol=2):
-  CloudPickler(filen, protocol).dump(obj)
+def dump(obj, file, protocol=None):
+    """Serialize obj as bytes streamed into file
+    protocol defaults to cloudpickle.DEFAULT_PROTOCOL which is an alias to
+    pickle.HIGHEST_PROTOCOL. This setting favors maximum communication speed
+    between processes running the same Python version.
+    Set protocol=pickle.DEFAULT_PROTOCOL instead if you need to ensure
+    compatibility with older versions of Python.
+    """
+    CloudPickler(file, protocol=protocol).dump(obj)
 
 
-def dumps(obj, protocol=2):
-  filen = StringIO()
-
-  cp = CloudPickler(filen, protocol)
-  cp.dump(obj)
-
-  return filen.getvalue()
+def dumps(obj, protocol=None):
+    """Serialize obj as a string of bytes allocated in memory
+    protocol defaults to cloudpickle.DEFAULT_PROTOCOL which is an alias to
+    pickle.HIGHEST_PROTOCOL. This setting favors maximum communication speed
+    between processes running the same Python version.
+    Set protocol=pickle.DEFAULT_PROTOCOL instead if you need to ensure
+    compatibility with older versions of Python.
+    """
+    file = BytesIO()
+    try:
+        cp = CloudPickler(file, protocol=protocol)
+        cp.dump(obj)
+        return file.getvalue()
+    finally:
+        file.close()
 
 
-#hack for __import__ not working as desired
+# expose pickle's unpickling functions in this namespace
+load = pickle.load
+loads = pickle.loads
+
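# --- editor's note: illustrative sketch, not part of the patch ---
# Usage of the updated protocol handling: leaving protocol=None picks
# cloudpickle.DEFAULT_PROTOCOL (an alias for pickle.HIGHEST_PROTOCOL); pass
# pickle.DEFAULT_PROTOCOL explicitly when the payload must also be readable
# by older Python versions.
import pickle
from heronpy.api import cloudpickle

square = lambda x: x * x
fast_payload = cloudpickle.dumps(square)
portable_payload = cloudpickle.dumps(square, protocol=pickle.DEFAULT_PROTOCOL)
assert pickle.loads(fast_payload)(4) == 16
assert cloudpickle.loads(portable_payload)(4) == 16  # loads is just pickle.loads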
+
+# hack for __import__ not working as desired
 def subimport(name):
-  __import__(name)
-  return sys.modules[name]
+    __import__(name)
+    return sys.modules[name]
 
 
-# restores function attributes
-def _restore_attr(obj, attr):
-  for key, val in list(attr.items()):
-    setattr(obj, key, val)
-  return obj
+def dynamic_subimport(name, vars):
+    mod = types.ModuleType(name)
+    mod.__dict__.update(vars)
+    mod.__dict__['__builtins__'] = builtins.__dict__
+    return mod
 
 
-def _get_module_builtins():
-  return pickle.__builtins__ # pylint: disable=no-member
+def _gen_ellipsis():
+    return Ellipsis
 
 
-def print_exec(stream):
-  ei = sys.exc_info()
-  traceback.print_exception(ei[0], ei[1], ei[2], None, stream)
+def _gen_not_implemented():
+    return NotImplemented
 
 
-def _modules_to_main(modList):
-  """Force every module in modList to be placed into main"""
-  if not modList:
-    return
-
-  main = sys.modules['__main__']
-  for modname in modList:
-    if isinstance(modname, str):
-      try:
-        mod = __import__(modname)
-      except Exception:
-        sys.stderr.write(
-            'warning: could not import %s\n.  '
-            'Your function may unexpectedly error due to this import failing;'
-            'A version mismatch is likely.  Specific error was:\n' % modname)
-        print_exec(sys.stderr)
-      else:
-        setattr(main, mod.__name__, mod)
+def _get_cell_contents(cell):
+    try:
+        return cell.cell_contents
+    except ValueError:
+        # sentinel used by ``_fill_function`` which will leave the cell empty
+        return _empty_cell_value
 
 
-#object generators:
-def _genpartial(func, args, kwds):
-  if not args:
-    args = ()
-  if not kwds:
-    kwds = {}
-  return partial(func, *args, **kwds)
+def instance(cls):
+    """Create a new instance of a class.
+    Parameters
+    ----------
+    cls : type
+        The class to create an instance of.
+    Returns
+    -------
+    instance : cls
+        A new instance of ``cls``.
+    """
+    return cls()
 
 
-def _fill_function(func, globalsn, defaults, dictn, module):
-  """ Fills in the rest of function data into the skeleton function object
-    that were created via _make_skel_func().
-     """
-  func.__globals__.update(globalsn)
-  func.__defaults__ = defaults
-  func.__dict__ = dictn
-  func.__module__ = module
-
-  return func
+@instance
+class _empty_cell_value(object):
+    """sentinel for empty closures
+    """
+    @classmethod
+    def __reduce__(cls):
+        return cls.__name__
 
 
-def _make_cell(value):
-  return (lambda: value).__closure__[0]
+def _fill_function(*args):
+    """Fills in the rest of function data into the skeleton function object
+    The skeleton itself is created by _make_skel_func().
+    """
+    if len(args) == 2:
+        func = args[0]
+        state = args[1]
+    elif len(args) == 5:
+        # Backwards compat for cloudpickle v0.4.0, after which the `module`
+        # argument was introduced
+        func = args[0]
+        keys = ['globals', 'defaults', 'dict', 'closure_values']
+        state = dict(zip(keys, args[1:]))
+    elif len(args) == 6:
+        # Backwards compat for cloudpickle v0.4.1, after which the function
+        # state was passed as a dict to _fill_function itself.
+        func = args[0]
+        keys = ['globals', 'defaults', 'dict', 'module', 'closure_values']
+        state = dict(zip(keys, args[1:]))
+    else:
+        raise ValueError('Unexpected _fill_function arguments: %r' % (args,))
+
+    # - At pickling time, any dynamic global variable used by func is
+    #   serialized by value (in state['globals']).
+    # - At unpickling time, func's __globals__ attribute is initialized by
+    #   first retrieving an empty isolated namespace that will be shared
+    #   with other functions pickled from the same original module
+    #   by the same CloudPickler instance and then updated with the
+    #   content of state['globals'] to populate the shared isolated
+    #   namespace with all the global variables that are specifically
+    #   referenced for this function.
+    func.__globals__.update(state['globals'])
+
+    func.__defaults__ = state['defaults']
+    func.__dict__ = state['dict']
+    if 'annotations' in state:
+        func.__annotations__ = state['annotations']
+    if 'doc' in state:
+        func.__doc__ = state['doc']
+    if 'name' in state:
+        func.__name__ = state['name']
+    if 'module' in state:
+        func.__module__ = state['module']
+    if 'qualname' in state:
+        func.__qualname__ = state['qualname']
+    if 'kwdefaults' in state:
+        func.__kwdefaults__ = state['kwdefaults']
+    # _cloudpickle_submodules is a set of submodules that must be loaded for
+    # the pickled function to work correctly at unpickling time. Now that these
+    # submodules are depickled (hence imported), they can be removed from the
+    # object's state (the object state only served as a reference holder to
+    # these submodules)
+    if '_cloudpickle_submodules' in state:
+        state.pop('_cloudpickle_submodules')
+
+    cells = func.__closure__
+    if cells is not None:
+        for cell, value in zip(cells, state['closure_values']):
+            if value is not _empty_cell_value:
+                cell_set(cell, value)
+
+    return func
 
 
-def _reconstruct_closure(values):
-  return tuple([_make_cell(v) for v in values])
+def _make_empty_cell():
+    if False:
+        # trick the compiler into creating an empty cell in our lambda
+        cell = None
+        raise AssertionError('this route should not be executed')
+
+    return (lambda: cell).__closure__[0]
 
 
-def _make_skel_func(code, closures, base_globals=None):
-  """ Creates a skeleton function object that contains just the provided
-    code and the correct number of cells in func_closure.  All other
-    func attributes (e.g. func_globals) are empty.
-  """
-  closure = _reconstruct_closure(closures) if closures else None
+def _make_skel_func(code, cell_count, base_globals=None):
+    """ Creates a skeleton function object that contains just the provided
+        code and the correct number of cells in func_closure.  All other
+        func attributes (e.g. func_globals) are empty.
+    """
+    # This is backward-compatibility code: for cloudpickle versions between
+    # 0.5.4 and 0.7, base_globals could be a string or None. base_globals
+    # should now always be a dictionary.
+    if base_globals is None or isinstance(base_globals, str):
+        base_globals = {}
 
-  if base_globals is None:
-    base_globals = {}
-  base_globals['__builtins__'] = __builtins__
+    base_globals['__builtins__'] = __builtins__
 
-  return types.FunctionType(code, base_globals, None, None, closure)
+    closure = (
+        tuple(_make_empty_cell() for _ in range(cell_count))
+        if cell_count >= 0 else
+        None
+    )
+    return types.FunctionType(code, base_globals, None, None, closure)
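# --- editor's note: illustrative sketch, not part of the patch ---
# _make_skel_func in action: the returned function has the right number of
# closure cells, but they start out empty and are only populated afterwards
# (cloudpickle does that with cell_set, used elsewhere in this module). This
# snippet assumes it runs in this module's namespace, or that the helpers are
# imported from heronpy.api.cloudpickle.
def outer():
    free = "set later"
    return lambda: free

lambda_code = outer().__code__             # one free variable: `free`
skeleton = _make_skel_func(lambda_code, cell_count=1)
cell_set(skeleton.__closure__[0], "hello")  # fill the empty cell
print(skeleton())                           # "hello"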
 
 
-def _load_class(cls, d):
-  """
-  Loads additional properties into class `cls`.
-  """
-  for k, v in list(d.items()):
-    if isinstance(k, tuple):
-      typ, k = k
-      if typ == 'property':
-        v = property(*v)
-      elif typ == 'staticmethod':
-        v = staticmethod(v) # pylint: disable=redefined-variable-type
-      elif typ == 'classmethod':
-        v = classmethod(v)
-    setattr(cls, k, v)
-  return cls
+def _make_skeleton_class(type_constructor, name, bases, type_kwargs,
+                         class_tracker_id, extra):
+    """Build dynamic class with an empty __dict__ to be filled once memoized
+    If class_tracker_id is not None, try to lookup an existing class definition
+    matching that id. If none is found, track a newly reconstructed class
+    definition under that id so that other instances stemming from the same
+    class id will also reuse this class definition.
+    The "extra" variable is meant to be a dict (or None) that can be used for
+    forward compatibility shall the need arise.
+    """
+    skeleton_class = types.new_class(
+        name, bases, {'metaclass': type_constructor},
+        lambda ns: ns.update(type_kwargs)
+    )
+    return _lookup_class_or_track(class_tracker_id, skeleton_class)
 
 
-def _load_namedtuple(name, fields):
-  """
-  Loads a class generated by namedtuple
-  """
-  from collections import namedtuple
-  return namedtuple(name, fields)
+def _rehydrate_skeleton_class(skeleton_class, class_dict):
+    """Put attributes from `class_dict` back on `skeleton_class`.
+    See CloudPickler.save_dynamic_class for more info.
+    """
+    registry = None
+    for attrname, attr in class_dict.items():
+        if attrname == "_abc_impl":
+            registry = attr
+        else:
+            setattr(skeleton_class, attrname, attr)
+    if registry is not None:
+        for subclass in registry:
+            skeleton_class.register(subclass)
+
+    return skeleton_class
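# --- editor's note: illustrative sketch, not part of the patch ---
# The skeleton/rehydrate pair above is what makes classes defined outside any
# importable module (e.g. in a __main__ script or a REPL) picklable by value:
# an empty skeleton class is rebuilt first, then its __dict__ is restored.
import pickle
from heronpy.api import cloudpickle

class Counter:                 # assumed to live in __main__, not an importable module
    start = 10
    def bump(self):
        return self.start + 1

RestoredCounter = pickle.loads(cloudpickle.dumps(Counter))
assert RestoredCounter().bump() == 11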
 
 
-# Constructors for 3rd party libraries
-# Note: These can never be renamed due to client compatibility issues
+def _make_skeleton_enum(bases, name, qualname, members, module,
+                        class_tracker_id, extra):
+    """Build dynamic enum with an empty __dict__ to be filled once memoized
+    The creation of the enum class is inspired by the code of
+    EnumMeta._create_.
+    If class_tracker_id is not None, try to lookup an existing enum definition
+    matching that id. If none is found, track a newly reconstructed enum
+    definition under that id so that other instances stemming from the same
+    class id will also reuse this enum definition.
+    The "extra" variable is meant to be a dict (or None) that can be used for
+    forward compatibility shall the need arise.
+    """
+    # enums always inherit from their base Enum class at the last position in
+    # the list of base classes:
+    enum_base = bases[-1]
+    metacls = enum_base.__class__
+    classdict = metacls.__prepare__(name, bases)
 
-def _getobject(modname, attribute):
-  mod = __import__(modname, fromlist=[attribute])
-  return mod.__dict__[attribute]
+    for member_name, member_value in members.items():
+        classdict[member_name] = member_value
+    enum_class = metacls.__new__(metacls, name, bases, classdict)
+    enum_class.__module__ = module
+    enum_class.__qualname__ = qualname
+
+    return _lookup_class_or_track(class_tracker_id, enum_class)
+
+
+def _is_dynamic(module):
+    """
+    Return True if the module is a special module that cannot be imported by
+    its name.
+    """
+    # Quick check: modules that have a __file__ attribute are not dynamic modules.
+    if hasattr(module, '__file__'):
+        return False
+
+    if module.__spec__ is not None:
+        return False
+
+    # In PyPy, some built-in modules such as _codecs can have their
+    # __spec__ attribute set to None despite being imported. For such
+    # modules, the ``_find_spec`` utility of the standard library is used.
+    parent_name = module.__name__.rpartition('.')[0]
+    if parent_name:  # pragma: no cover
+        # This code handles the case where an imported package (and not
+        # module) remains with __spec__ set to None. It is however untested
+        # as no package in the PyPy stdlib has __spec__ set to None after
+        # it is imported.
+        try:
+            parent = sys.modules[parent_name]
+        except KeyError:
+            msg = "parent {!r} not in sys.modules"
+            raise ImportError(msg.format(parent_name))
+        else:
+            pkgpath = parent.__path__
+    else:
+        pkgpath = None
+    return _find_spec(module.__name__, pkgpath, module) is None
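# --- editor's note: illustrative sketch, not part of the patch ---
# _is_dynamic distinguishes modules built at runtime from importable ones:
# a bare types.ModuleType has no __file__ and no resolvable spec. The module
# name below is hypothetical.
import types
from heronpy.api import cloudpickle

scratch = types.ModuleType("heron_scratch_module")
print(cloudpickle._is_dynamic(scratch))  # True
print(cloudpickle._is_dynamic(types))    # False -- the stdlib module has __file__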
+
+
+def _make_typevar(name, bound, constraints, covariant, contravariant,
+                  class_tracker_id):
+    tv = typing.TypeVar(
+        name, *constraints, bound=bound,
+        covariant=covariant, contravariant=contravariant
+    )
+    if class_tracker_id is not None:
+        return _lookup_class_or_track(class_tracker_id, tv)
+    else:  # pragma: nocover
+        # Only for Python 3.5.3 compat.
+        return tv
+
+
+def _decompose_typevar(obj):
+    try:
+        class_tracker_id = _get_or_create_tracker_id(obj)
+    except TypeError:  # pragma: nocover
+        # TypeVar instances are not weakref-able in Python 3.5.3
+        class_tracker_id = None
+    return (
+        obj.__name__, obj.__bound__, obj.__constraints__,
+        obj.__covariant__, obj.__contravariant__,
+        class_tracker_id,
+    )
+
+
+def _typevar_reduce(obj):
+    # TypeVar instances have no __qualname__ hence we pass the name explicitly.
+    module_and_name = _lookup_module_and_qualname(obj, name=obj.__name__)
+    if module_and_name is None:
+        return (_make_typevar, _decompose_typevar(obj))
+    return (getattr, module_and_name)
+
+
+def _get_bases(typ):
+    if hasattr(typ, '__orig_bases__'):
+        # For generic types (see PEP 560)
+        bases_attr = '__orig_bases__'
+    else:
+        # For regular class objects
+        bases_attr = '__bases__'
+    return getattr(typ, bases_attr)
diff --git a/heronpy/api/component/base_component.py b/heronpy/api/component/base_component.py
index 47810d8..48a14b9 100644
--- a/heronpy/api/component/base_component.py
+++ b/heronpy/api/component/base_component.py
@@ -13,7 +13,7 @@
 # limitations under the License.
 '''base_component.py'''
 
-class BaseComponent(object):
+class BaseComponent:
   """Base component for Heron spout and bolt"""
   def __init__(self, delegate):
     """Initializes BaseComponent
diff --git a/heronpy/api/component/component_spec.py b/heronpy/api/component/component_spec.py
index bf134d7..29269ad 100644
--- a/heronpy/api/component/component_spec.py
+++ b/heronpy/api/component/component_spec.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -28,7 +28,7 @@
 from heronpy.api.stream import Stream, Grouping
 
 # pylint: disable=too-many-instance-attributes
-class HeronComponentSpec(object):
+class HeronComponentSpec:
   """Class to specify the information and location of components in a topology
 
   This class is generated by the ``spec()`` method of Spout and Bolt class and
@@ -67,8 +67,7 @@
     """Returns protobuf message (Spout or Bolt) of this component"""
     if self.is_spout:
       return self._get_spout()
-    else:
-      return self._get_bolt()
+    return self._get_bolt()
 
   def _get_spout(self):
     """Returns Spout protobuf message"""
@@ -191,7 +190,7 @@
     """Sanitizes input fields and returns a map <GlobalStreamId -> Grouping>"""
     ret = {}
     if self.inputs is None:
-      return
+      return None
 
     if isinstance(self.inputs, dict):
       # inputs are dictionary, must be either <HeronComponentSpec -> Grouping> or
@@ -248,7 +247,7 @@
     """Sanitizes output fields and returns a map <stream_id -> list of output fields>"""
     ret = {}
     if self.outputs is None:
-      return
+      return None
 
     if not isinstance(self.outputs, (list, tuple)):
       raise TypeError("Argument to outputs must be either list or tuple, given: %s"
@@ -314,7 +313,7 @@
 
     return stream_schema
 
-class GlobalStreamId(object):
+class GlobalStreamId:
   """Wrapper class to define stream_id and its component name
 
   Constructor method is compatible with StreamParse's GlobalStreamId class, although
@@ -358,11 +357,10 @@
         # appropriate this case.
         return "<No name available for HeronComponentSpec yet, uuid: %s>" % self._component_id.uuid
       return self._component_id.name
-    elif isinstance(self._component_id, str):
+    if isinstance(self._component_id, str):
       return self._component_id
-    else:
-      raise ValueError("Component Id for this GlobalStreamId is not properly set: <%s:%s>"
-                       % (str(type(self._component_id)), str(self._component_id)))
+    raise ValueError("Component Id for this GlobalStreamId is not properly set: <%s:%s>"
+                     % (str(type(self._component_id)), str(self._component_id)))
 
   def __eq__(self, other):
     return hasattr(other, 'component_id') and self.component_id == other.component_id \
diff --git a/heronpy/api/custom_grouping.py b/heronpy/api/custom_grouping.py
index 6c72a30..2d21ed9 100644
--- a/heronpy/api/custom_grouping.py
+++ b/heronpy/api/custom_grouping.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -21,7 +21,7 @@
 '''custom_grouping.py: interface module for custom grouping'''
 from abc import abstractmethod
 
-class ICustomGrouping(object):
+class ICustomGrouping:
   '''Interface for custom grouping class'''
 
   @abstractmethod
@@ -36,7 +36,6 @@
     :type target_tasks: list of int
     :param target_tasks: list of target task ids
     """
-    pass
 
   @abstractmethod
   def choose_tasks(self, values):
@@ -46,4 +45,3 @@
     :rtype: list of int
     :return: list of task ids to which these values are emitted
     """
-    pass
diff --git a/heronpy/api/global_metrics.py b/heronpy/api/global_metrics.py
index cf052b1..a9bf599 100644
--- a/heronpy/api/global_metrics.py
+++ b/heronpy/api/global_metrics.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/api/metrics.py b/heronpy/api/metrics.py
index b944d58..6eabf47 100644
--- a/heronpy/api/metrics.py
+++ b/heronpy/api/metrics.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -23,12 +23,11 @@
 
 # pylint: disable=attribute-defined-outside-init
 
-class IMetric(object):
+class IMetric:
   """Interface for Heron Metric"""
   @abstractmethod
   def get_value_and_reset(self):
     """Returns the current value and reset"""
-    pass
 
 class CountMetric(IMetric):
   """Counter for a single value"""
@@ -68,22 +67,19 @@
     return ret
 
 # Reducer metric
-class IReducer(object):
+class IReducer:
   """Interface for Reducer"""
   @abstractmethod
   def init(self):
     """Called when this reducer is initialized/reinitialized"""
-    pass
 
   @abstractmethod
   def reduce(self, value):
     """Called to reduce the value"""
-    pass
 
   @abstractmethod
   def extract(self):
     """Called to extract the current value"""
-    pass
 
 class MeanReducer(IReducer):
   """Mean Reducer"""
@@ -98,8 +94,7 @@
   def extract(self):
     if self.count > 0:
       return float(self.sum)/self.count
-    else:
-      return None
+    return None
 
 class ReducedMetric(IMetric):
   """Reduced Metric"""
diff --git a/heronpy/api/serializer.py b/heronpy/api/serializer.py
index 6a922f6..dc55456 100644
--- a/heronpy/api/serializer.py
+++ b/heronpy/api/serializer.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -28,40 +28,37 @@
 
 import heronpy.api.cloudpickle as cloudpickle
 
-class IHeronSerializer(object):
+class IHeronSerializer:
   """Serializer interface for Heron"""
   @abstractmethod
   def initialize(self, config):
     """Initializes the serializer"""
-    pass
 
   @abstractmethod
-  def serialize(self, obj):
+  def serialize(self, obj) -> bytes:
     """Serialize an object
 
     :param obj: The object to be serialized
     :returns: Serialized object as byte string
     """
-    pass
 
   @abstractmethod
-  def deserialize(self, input_str):
+  def deserialize(self, input_str: bytes):
     """Deserialize an object
 
     :param input_str: Serialized object as byte string
     :returns: Deserialized object
     """
-    pass
 
 class PythonSerializer(IHeronSerializer):
   """Default serializer"""
   def initialize(self, config=None):
     pass
 
-  def serialize(self, obj):
+  def serialize(self, obj) -> bytes:
     return cloudpickle.dumps(obj)
 
-  def deserialize(self, input_str):
+  def deserialize(self, input_str: bytes):
     return pickle.loads(input_str)
 
 default_serializer = PythonSerializer()
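# --- editor's note: illustrative usage sketch, not part of the patch ---
# The annotated contract above in practice: serialize() returns the bytes
# produced by cloudpickle, and deserialize() rebuilds the object with
# pickle.loads, so plain picklable values round-trip unchanged.
from heronpy.api.serializer import default_serializer

blob = default_serializer.serialize({"word": "heron", "count": 3})
assert isinstance(blob, bytes)
assert default_serializer.deserialize(blob) == {"word": "heron", "count": 3}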
diff --git a/heronpy/api/spout/spout.py b/heronpy/api/spout/spout.py
index 81ff5d7..6e86182 100644
--- a/heronpy/api/spout/spout.py
+++ b/heronpy/api/spout/spout.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -52,7 +52,6 @@
                     topology, including the task id and component id of this task, input and output
                     information, etc.
     """
-    pass
 
   @abstractmethod
   def close(self):
@@ -60,7 +59,6 @@
 
     There is no guarantee that close() will be called.
     """
-    pass
 
   @abstractmethod
   def next_tuple(self):
@@ -92,7 +90,6 @@
 
     :param tup_id: the ID of the HeronTuple that has been fully acknowledged.
     """
-    pass
 
   @abstractmethod
   def fail(self, tup_id):
@@ -109,7 +106,6 @@
     :param tup_id: the ID of the HeronTuple that has failed either due to a bolt calling ``fail()``
                    or timeout
     """
-    pass
 
   @abstractmethod
   def activate(self):
@@ -119,7 +115,6 @@
     after having been deactivated when the topology is manipulated using the
     `heron` client.
     """
-    pass
 
   @abstractmethod
   def deactivate(self):
@@ -128,4 +123,3 @@
     next_tuple() will not be called while a spout is deactivated.
     The spout may or may not be reactivated in the future.
     """
-    pass
diff --git a/heronpy/api/state/state.py b/heronpy/api/state/state.py
index 7b15481..bea9a90 100644
--- a/heronpy/api/state/state.py
+++ b/heronpy/api/state/state.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -21,7 +21,7 @@
 '''state.py'''
 from abc import abstractmethod
 
-class State(object):
+class State:
   """State represents the state interface as seen by stateful bolts and spouts.
   In Heron, state gives a notional Key/Value interface along with the
   ability to iterate over the key/values
@@ -32,7 +32,6 @@
     :param key: The key to get back the value
     :param value: The value associated with the key
     """
-    pass
 
   @abstractmethod
   def get(self, key):
@@ -40,20 +39,17 @@
     :param key: The key whose value we want back
     :return: The value associated with the key
     """
-    pass
 
   @abstractmethod
   def enumerate(self):
     """Allows one to enumerate over the state.
     :return: The enumerate object
     """
-    pass
 
   @abstractmethod
   def clear(self):
     """Clears the state to empty state
     """
-    pass
 
 class HashMapState(State):
   """HashMapState represents default implementation of the State interface
@@ -61,14 +57,11 @@
   def __init__(self):
     self._dict = {}
 
-  def put(self, k, v):
-    self._dict[k] = v
+  def put(self, key, value):
+    self._dict[key] = value
 
-  def get(self, k):
-    if k in self._dict:
-      return self._dict[k]
-    else:
-      return None
+  def get(self, key):
+    return self._dict.get(key)
 
   def enumerate(self):
     return enumerate(self._dict)
diff --git a/heronpy/api/state/stateful_component.py b/heronpy/api/state/stateful_component.py
index 33f32af..d61ce6e 100644
--- a/heronpy/api/state/stateful_component.py
+++ b/heronpy/api/state/stateful_component.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -21,7 +21,7 @@
 '''stateful_component.py'''
 from abc import abstractmethod
 
-class StatefulComponent(object):
+class StatefulComponent:
   """Defines a component that saves its internal state using the State interface
   When running under effectively once semantics, the state is periodically checkpointed
   and is replayed when errors occur to a globally consistent checkpoint.
@@ -36,11 +36,9 @@
     Note that init_state() is called before initialize()
     :param state: the previously saved state of the component.
     """
-    pass
 
   @abstractmethod
   def pre_save(self):
     """This is a hook for the component to perform some actions just before the
     framework saves its state.
     """
-    pass
diff --git a/heronpy/api/stream.py b/heronpy/api/stream.py
index b959d03..828f553 100644
--- a/heronpy/api/stream.py
+++ b/heronpy/api/stream.py
@@ -19,7 +19,7 @@
 from heronpy.api.custom_grouping import ICustomGrouping
 from heronpy.proto import topology_pb2
 
-class Stream(object):
+class Stream:
   """Heron output stream
 
   It is compatible with StreamParse API.
@@ -50,7 +50,7 @@
 
     if name is None:
       raise TypeError("Stream's name cannot be None")
-    elif isinstance(name, str):
+    if isinstance(name, str):
       self.stream_id = name
     else:
       raise TypeError("Stream name must be a string, given: %s" % str(name))
@@ -62,7 +62,7 @@
     else:
       raise TypeError("'direct' must be either True or False, given: %s" % str(direct))
 
-class Grouping(object):
+class Grouping:
   """Helper class for defining Grouping for Python topology"""
   SHUFFLE = topology_pb2.Grouping.Value("SHUFFLE")
   ALL = topology_pb2.Grouping.Value("ALL")
@@ -83,18 +83,17 @@
   @classmethod
   def is_grouping_sane(cls, gtype):
     """Checks if a given gtype is sane"""
-    if gtype == cls.SHUFFLE or gtype == cls.ALL or gtype == cls.LOWEST or gtype == cls.NONE:
+    if gtype in (cls.SHUFFLE, cls.ALL, cls.LOWEST, cls.NONE):
       return True
-    elif isinstance(gtype, cls.FIELDS):
+    if isinstance(gtype, cls.FIELDS):
       return gtype.gtype == topology_pb2.Grouping.Value("FIELDS") and \
              gtype.fields is not None
-    elif isinstance(gtype, cls.CUSTOM):
+    if isinstance(gtype, cls.CUSTOM):
       return gtype.gtype == topology_pb2.Grouping.Value("CUSTOM") and \
              gtype.python_serialized is not None
-    else:
-      #pylint: disable=fixme
-      #TODO: DIRECT are not supported yet
-      return False
+    #pylint: disable=fixme
+    #TODO: DIRECT are not supported yet
+    return False
 
   @classmethod
   def fields(cls, *fields):
@@ -148,9 +147,8 @@
     if not is_java:
       return cls.CUSTOM(gtype=topology_pb2.Grouping.Value("CUSTOM"),
                         python_serialized=serialized)
-    else:
-      raise NotImplementedError("Custom grouping implemented in Java for Python topology"
-                                "is not yet supported.")
+    raise NotImplementedError("Custom grouping implemented in Java for Python topology"
+                              "is not yet supported.")
 
   @classmethod
   def custom_object(cls, java_class_name, arg_list):
diff --git a/heronpy/api/task_hook.py b/heronpy/api/task_hook.py
index c95a1d8..19fedcb 100644
--- a/heronpy/api/task_hook.py
+++ b/heronpy/api/task_hook.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -22,7 +22,7 @@
 from collections import namedtuple
 from abc import abstractmethod
 
-class ITaskHook(object):
+class ITaskHook:
   """ITaskHook is an interface for defining task hooks for a topology"""
 
   @abstractmethod
@@ -32,12 +32,10 @@
     :param conf: component-specific configuration passed to the topology
     :param context: topology context
     """
-    pass
 
   @abstractmethod
   def clean_up(self):
     """Called just before the spout/bolt's cleanup method is called"""
-    pass
 
   @abstractmethod
   def emit(self, emit_info):
@@ -45,7 +43,6 @@
 
     :param emit_info: EmitInfo object
     """
-    pass
 
   @abstractmethod
   def spout_ack(self, spout_ack_info):
@@ -53,7 +50,6 @@
 
     :param spout_ack_info: SpoutAckInfo object
     """
-    pass
 
   @abstractmethod
   def spout_fail(self, spout_fail_info):
@@ -61,7 +57,6 @@
 
     :param spout_fail_info: SpoutFailInfo object
     """
-    pass
 
   @abstractmethod
   def bolt_execute(self, bolt_execute_info):
@@ -69,7 +64,6 @@
 
     :param bolt_execute_info: BoltExecuteInfo object
     """
-    pass
 
   @abstractmethod
   def bolt_ack(self, bolt_ack_info):
@@ -77,7 +71,6 @@
 
     :param bolt_ack_info: BoltAckInfo object
     """
-    pass
 
   @abstractmethod
   def bolt_fail(self, bolt_fail_info):
@@ -85,7 +78,6 @@
 
     :param bolt_fail_info: BoltFailInfo object
     """
-    pass
 
 
 ##################################################################################
diff --git a/heronpy/api/tests/python/component_unittest.py b/heronpy/api/tests/python/component_unittest.py
index 9da4a8d..a93baac 100644
--- a/heronpy/api/tests/python/component_unittest.py
+++ b/heronpy/api/tests/python/component_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -175,7 +175,6 @@
     with self.assertRaises(ValueError):
       spec._sanitize_inputs()
 
-  # pylint: disable=redefined-variable-type
   # pylint: disable=pointless-statement
   def test_sanitize_outputs(self):
     # outputs is None (no argument to outputs)
diff --git a/heronpy/api/tests/python/metrics_unittest.py b/heronpy/api/tests/python/metrics_unittest.py
index 063651a..23f35a9 100644
--- a/heronpy/api/tests/python/metrics_unittest.py
+++ b/heronpy/api/tests/python/metrics_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/api/tests/python/serializer_unittest.py b/heronpy/api/tests/python/serializer_unittest.py
index 65db24a..8b8bf8b 100644
--- a/heronpy/api/tests/python/serializer_unittest.py
+++ b/heronpy/api/tests/python/serializer_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/api/tests/python/stream_unittest.py b/heronpy/api/tests/python/stream_unittest.py
index 3869bcb..013c1f0 100644
--- a/heronpy/api/tests/python/stream_unittest.py
+++ b/heronpy/api/tests/python/stream_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/api/tests/python/topology_unittest.py b/heronpy/api/tests/python/topology_unittest.py
index 80e27aa..bdd172b 100644
--- a/heronpy/api/tests/python/topology_unittest.py
+++ b/heronpy/api/tests/python/topology_unittest.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/api/topology.py b/heronpy/api/topology.py
index c62a6bd..a628224 100644
--- a/heronpy/api/topology.py
+++ b/heronpy/api/topology.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -80,8 +80,7 @@
           spec.name = name
         if spec.name in specs:
           raise ValueError("Duplicate component name: %s" % spec.name)
-        else:
-          specs[spec.name] = spec
+        specs[spec.name] = spec
     return specs
 
   @classmethod
@@ -258,7 +257,7 @@
     return sanitized
 
 @six.add_metaclass(TopologyType)
-class Topology(object):
+class Topology:
   """Topology is an abstract class for defining a topology
 
   Topology writers can define their custom topology by inheriting this class.
@@ -304,7 +303,7 @@
     with open(path, 'wb') as f:
       f.write(cls.protobuf_topology.SerializeToString())
 
-class TopologyBuilder(object):
+class TopologyBuilder:
   """Builder for heronpy.api.src.python topology
 
   This class dynamically creates a subclass of `Topology` with given spouts and
diff --git a/heronpy/api/topology_context.py b/heronpy/api/topology_context.py
index e3bb263..4f6083e 100644
--- a/heronpy/api/topology_context.py
+++ b/heronpy/api/topology_context.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -21,7 +21,7 @@
 '''topology_context.py'''
 from abc import abstractmethod
 
-class TopologyContext(object):
+class TopologyContext:
   """Topology Context is the means for spouts/bolts to get information about
      the running topology. This file just is the interface to be used by spouts/bolts
 
@@ -33,14 +33,12 @@
     """Gets the task id of this component
     :return: the task_id of this component
     """
-    pass
 
   @abstractmethod
   def get_component_id(self):
     """Gets the component id of this component
     :return: the component_id of this component
     """
-    pass
 
   @abstractmethod
   def get_cluster_config(self):
@@ -48,14 +46,12 @@
     Note that the returned config is auto-typed map: <str -> any Python object>.
     :return: the dict of key -> value
     """
-    pass
 
   @abstractmethod
   def get_topology_name(self):
     """Returns the name of the topology
     :return: the name of the topology
     """
-    pass
 
   @abstractmethod
   def register_metric(self, name, metric, time_bucket_in_sec):
@@ -64,7 +60,6 @@
     :param metric: The IMetric that needs to be registered
     :param time_bucket_in_sec: The interval in seconds to do getValueAndReset
     """
-    pass
 
   @abstractmethod
   def get_sources(self, component_id):
@@ -74,7 +69,6 @@
     :return: map <streamId namedtuple (same structure as protobuf msg) -> gtype>, or
              None if not found
     """
-    pass
 
   def get_this_sources(self):
     """Returns the declared inputs to this component
@@ -90,7 +84,6 @@
 
     :return: list of task_ids or None if not found
     """
-    pass
 
   @abstractmethod
   def add_task_hook(self, task_hook):
@@ -99,4 +92,3 @@
     :type task_hook: ITaskHook
     :param task_hook: Implementation of ITaskHook
     """
-    pass
diff --git a/heronpy/api/tuple.py b/heronpy/api/tuple.py
index e21ff5d..6841135 100644
--- a/heronpy/api/tuple.py
+++ b/heronpy/api/tuple.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -37,7 +37,7 @@
 :type values: tuple
 """
 
-class TupleHelper(object):
+class TupleHelper:
   """Tuple generator, returns StreamParse compatible tuple"""
   TICK_TUPLE_ID = "__tick"
   TICK_SOURCE_COMPONENT = "__system"
diff --git a/heronpy/connectors/mock/arraylooper.py b/heronpy/connectors/mock/arraylooper.py
index 3824a2e..b225696 100644
--- a/heronpy/connectors/mock/arraylooper.py
+++ b/heronpy/connectors/mock/arraylooper.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/connectors/pulsar/pulsarspout.py b/heronpy/connectors/pulsar/pulsarspout.py
index 83bc055..e81bfb6 100644
--- a/heronpy/connectors/pulsar/pulsarspout.py
+++ b/heronpy/connectors/pulsar/pulsarspout.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/connectors/pulsar/pulsarstreamlet.py b/heronpy/connectors/pulsar/pulsarstreamlet.py
index 319692f..2b7665d 100644
--- a/heronpy/connectors/pulsar/pulsarstreamlet.py
+++ b/heronpy/connectors/pulsar/pulsarstreamlet.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/connectors/textfiles/textfilesgenerator.py b/heronpy/connectors/textfiles/textfilesgenerator.py
index 832ada6..129e046 100644
--- a/heronpy/connectors/textfiles/textfilesgenerator.py
+++ b/heronpy/connectors/textfiles/textfilesgenerator.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/proto/BUILD b/heronpy/proto/BUILD
index b0f5246..5eb5695 100644
--- a/heronpy/proto/BUILD
+++ b/heronpy/proto/BUILD
@@ -28,7 +28,7 @@
     srcs = glob(["**/*.py"]),
     reqs = [
         "protobuf==3.8.0",
-        "setuptools==18.8.1",
+        "setuptools==46.1.3",
     ],
     deps = [
         ":proto_ckptmgr_py",
diff --git a/heronpy/streamlet/builder.py b/heronpy/streamlet/builder.py
index aa2bd22..b46c0dd 100644
--- a/heronpy/streamlet/builder.py
+++ b/heronpy/streamlet/builder.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -20,13 +20,11 @@
 
 '''builder.py: module for creating streamlets'''
 
-import sets
-
 from heronpy.streamlet.generator import Generator
 from heronpy.streamlet.impl.supplierspout import SupplierStreamlet
 from heronpy.streamlet.impl.generatorspout import GeneratorStreamlet
 
-class Builder(object):
+class Builder:
   """A Builder object is used to build the functional API DAG in Heron."""
   def __init__(self):
     """
@@ -50,7 +48,7 @@
   # pylint: disable=protected-access
   def build(self, bldr):
     """Builds the topology and returns the builder"""
-    stage_names = sets.Set()
+    stage_names = set()
     for source in self._sources:
       source._build(bldr, stage_names)
     for source in self._sources:
diff --git a/heronpy/streamlet/config.py b/heronpy/streamlet/config.py
index d141ccc..7432735 100644
--- a/heronpy/streamlet/config.py
+++ b/heronpy/streamlet/config.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -23,7 +23,7 @@
 import heronpy.api.api_constants as api_constants
 from heronpy.streamlet.resources import Resources
 
-class Config(object):
+class Config:
   """Config is the way users configure the execution of the topology.
      Things like tuple delivery semantics, resources used, as well as
      user defined key/value pairs are passed on to the runner via
diff --git a/heronpy/streamlet/context.py b/heronpy/streamlet/context.py
index f1fe62e..c8967f4 100644
--- a/heronpy/streamlet/context.py
+++ b/heronpy/streamlet/context.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -22,7 +22,7 @@
 
 from abc import abstractmethod
 
-class Context(object):
+class Context:
   """Context is the information available at runtime for operators like transform.
      It contains basic things like config, runtime information like task,
      the stream that it is operating on, ProcessState, etc.
@@ -32,39 +32,32 @@
   def get_task_id(self):
     """Fetches the task id of the current instance of the operator
     """
-    pass
 
   @abstractmethod
   def get_config(self):
     """Fetches the config of the computation
     """
-    pass
 
   @abstractmethod
   def get_stream_name(self):
     """Fetches the stream name that we are operating on
     """
-    pass
 
   @abstractmethod
   def get_num_partitions(self):
     """Fetches the number of partitions of the stream we are operating on
     """
-    pass
 
   def get_partition_index(self):
     """Fetches the partition of the stream that we are operating on
     """
-    pass
 
   @abstractmethod
   def get_state(self):
     """The state where components can store any of their local state
     """
-    pass
 
   @abstractmethod
   def emit(self, values):
     """Emits the values in the output stream
     """
-    pass
diff --git a/heronpy/streamlet/generator.py b/heronpy/streamlet/generator.py
index 194bc2d..2eed5c9 100644
--- a/heronpy/streamlet/generator.py
+++ b/heronpy/streamlet/generator.py
@@ -14,7 +14,7 @@
 '''generator.py: API for defining generic sources in python'''
 from abc import abstractmethod
 
-class Generator(object):
+class Generator:
   """API for defining a generic source for Heron in the Python Streamlet API
   """
 
diff --git a/heronpy/streamlet/impl/consumebolt.py b/heronpy/streamlet/impl/consumebolt.py
index ae63e56..3e84d23 100644
--- a/heronpy/streamlet/impl/consumebolt.py
+++ b/heronpy/streamlet/impl/consumebolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/streamlet/impl/contextimpl.py b/heronpy/streamlet/impl/contextimpl.py
index d944ec7..d74542d 100644
--- a/heronpy/streamlet/impl/contextimpl.py
+++ b/heronpy/streamlet/impl/contextimpl.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/streamlet/impl/filterbolt.py b/heronpy/streamlet/impl/filterbolt.py
index 820a56e..30ffd8a 100644
--- a/heronpy/streamlet/impl/filterbolt.py
+++ b/heronpy/streamlet/impl/filterbolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/streamlet/impl/flatmapbolt.py b/heronpy/streamlet/impl/flatmapbolt.py
index 19cd07b..3af381d 100644
--- a/heronpy/streamlet/impl/flatmapbolt.py
+++ b/heronpy/streamlet/impl/flatmapbolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/streamlet/impl/generatorspout.py b/heronpy/streamlet/impl/generatorspout.py
index 4b250ce..265b71f 100644
--- a/heronpy/streamlet/impl/generatorspout.py
+++ b/heronpy/streamlet/impl/generatorspout.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/streamlet/impl/joinbolt.py b/heronpy/streamlet/impl/joinbolt.py
index 199e33b..1dde527 100644
--- a/heronpy/streamlet/impl/joinbolt.py
+++ b/heronpy/streamlet/impl/joinbolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -73,6 +73,7 @@
     self._join_type = config[JoinBolt.JOINTYPE]
 
   def processWindow(self, window_config, tuples):
+    """Process a window"""
     # our temporary map
     mymap = {}
     for tup in tuples:
diff --git a/heronpy/streamlet/impl/logbolt.py b/heronpy/streamlet/impl/logbolt.py
index 1740b01..c18ada4 100644
--- a/heronpy/streamlet/impl/logbolt.py
+++ b/heronpy/streamlet/impl/logbolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/streamlet/impl/mapbolt.py b/heronpy/streamlet/impl/mapbolt.py
index 34d82bf..4445852 100644
--- a/heronpy/streamlet/impl/mapbolt.py
+++ b/heronpy/streamlet/impl/mapbolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/streamlet/impl/reducebykeyandwindowbolt.py b/heronpy/streamlet/impl/reducebykeyandwindowbolt.py
index 7cd4a93..977ece1 100644
--- a/heronpy/streamlet/impl/reducebykeyandwindowbolt.py
+++ b/heronpy/streamlet/impl/reducebykeyandwindowbolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/streamlet/impl/reducebywindowbolt.py b/heronpy/streamlet/impl/reducebywindowbolt.py
index 6b09ef7..4a76cd2 100644
--- a/heronpy/streamlet/impl/reducebywindowbolt.py
+++ b/heronpy/streamlet/impl/reducebywindowbolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/streamlet/impl/repartitionbolt.py b/heronpy/streamlet/impl/repartitionbolt.py
index c4a44a1..1e70c91 100644
--- a/heronpy/streamlet/impl/repartitionbolt.py
+++ b/heronpy/streamlet/impl/repartitionbolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -33,6 +33,8 @@
 
 # pylint: disable=unused-argument
 class RepartitionCustomGrouping(ICustomGrouping):
+  """Implementation of repartitioning grouping"""
+
   def __init__(self, repartition_function):
     self._repartition_function = repartition_function
 
diff --git a/heronpy/streamlet/impl/streamletboltbase.py b/heronpy/streamlet/impl/streamletboltbase.py
index 0d6af23..99dea5a 100644
--- a/heronpy/streamlet/impl/streamletboltbase.py
+++ b/heronpy/streamlet/impl/streamletboltbase.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -21,7 +21,7 @@
 """module for base streamlet bolt: StreamletBoltBase"""
 from heronpy.api.stream import Stream
 
-class StreamletBoltBase(object):
+class StreamletBoltBase:
   """StreamletBoltBase"""
   # output declarer
   outputs = [Stream(fields=['_output_'], name='output')]
diff --git a/heronpy/streamlet/impl/streamletspoutbase.py b/heronpy/streamlet/impl/streamletspoutbase.py
index 46b6944..6822717 100644
--- a/heronpy/streamlet/impl/streamletspoutbase.py
+++ b/heronpy/streamlet/impl/streamletspoutbase.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -21,7 +21,7 @@
 """module for base streamlet API spout: StreamletSpoutBase"""
 from heronpy.api.stream import Stream
 
-class StreamletSpoutBase(object):
+class StreamletSpoutBase:
   """StreamletSpoutBase"""
   # output declarer
   outputs = [Stream(fields=['_output_'], name='output')]
diff --git a/heronpy/streamlet/impl/supplierspout.py b/heronpy/streamlet/impl/supplierspout.py
index af656c3..46e1833 100644
--- a/heronpy/streamlet/impl/supplierspout.py
+++ b/heronpy/streamlet/impl/supplierspout.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/streamlet/impl/transformbolt.py b/heronpy/streamlet/impl/transformbolt.py
index 4bf9fee..a0906a9 100644
--- a/heronpy/streamlet/impl/transformbolt.py
+++ b/heronpy/streamlet/impl/transformbolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/streamlet/impl/unionbolt.py b/heronpy/streamlet/impl/unionbolt.py
index 76ed957..fb9b3c5 100644
--- a/heronpy/streamlet/impl/unionbolt.py
+++ b/heronpy/streamlet/impl/unionbolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/heronpy/streamlet/keyedwindow.py b/heronpy/streamlet/keyedwindow.py
index b5255dc..495dfd2 100644
--- a/heronpy/streamlet/keyedwindow.py
+++ b/heronpy/streamlet/keyedwindow.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -22,7 +22,7 @@
 
 from heronpy.streamlet.window import Window
 
-class KeyedWindow(object):
+class KeyedWindow:
   """Transformation depending on Windowing pass on the window/key information
      using this class
   """
diff --git a/heronpy/streamlet/resources.py b/heronpy/streamlet/resources.py
index 75782c1..7052995 100644
--- a/heronpy/streamlet/resources.py
+++ b/heronpy/streamlet/resources.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -20,7 +20,7 @@
 
 '''resources.py: module for defining resources'''
 
-class Resources(object):
+class Resources:
   """Resources needed by the topology are encapsulated in this class.
      Currently we deal with CPU and RAM. Others can be added later.
   """
diff --git a/heronpy/streamlet/runner.py b/heronpy/streamlet/runner.py
index 2d3d73d..c0b9e40 100644
--- a/heronpy/streamlet/runner.py
+++ b/heronpy/streamlet/runner.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -24,13 +24,12 @@
 from heronpy.streamlet.builder import Builder
 from heronpy.streamlet.config import Config
 
-class Runner(object):
+class Runner:
   """Runner is used to run a topology that is built by the builder.
      It exports a sole function called run that takes care of constructing the topology
   """
   def __init__(self):
     """Nothing really"""
-    pass
 
   # pylint: disable=protected-access, no-self-use
   def run(self, name, config, builder):
diff --git a/heronpy/streamlet/streamlet.py b/heronpy/streamlet/streamlet.py
index cad88dc..ef6f932 100644
--- a/heronpy/streamlet/streamlet.py
+++ b/heronpy/streamlet/streamlet.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -24,7 +24,7 @@
 from heronpy.streamlet.impl.streamletboltbase import StreamletBoltBase
 
 # pylint: disable=too-many-instance-attributes, protected-access
-class Streamlet(object):
+class Streamlet:
   """A Streamlet is a (potentially unbounded) ordered collection of tuples
      Streamlets originate from pub/sub systems(such Pulsar/Kafka), or from static data(such as
      csv files, HDFS files), or for that matter any other source. They are also created by
@@ -59,6 +59,7 @@
   def map(self, map_function):
     """Return a new Streamlet by applying map_function to each element of this Streamlet.
     """
+    # pylint: disable=import-outside-toplevel
     from heronpy.streamlet.impl.mapbolt import MapStreamlet
     map_streamlet = MapStreamlet(map_function, self)
     self._add_child(map_streamlet)
@@ -68,6 +69,7 @@
     """Return a new Streamlet by applying map_function to each element of this Streamlet
        and flattening the result
     """
+    # pylint: disable=import-outside-toplevel
     from heronpy.streamlet.impl.flatmapbolt import FlatMapStreamlet
     fm_streamlet = FlatMapStreamlet(flatmap_function, self)
     self._add_child(fm_streamlet)
@@ -76,6 +78,7 @@
   def filter(self, filter_function):
     """Return a new Streamlet containing only the elements that satisfy filter_function
     """
+    # pylint: disable=import-outside-toplevel
     from heronpy.streamlet.impl.filterbolt import FilterStreamlet
     filter_streamlet = FilterStreamlet(filter_function, self)
     self._add_child(filter_streamlet)
@@ -90,6 +93,7 @@
     It could also return a list of partitions if it wants to send it to multiple
     partitions.
     """
+    # pylint: disable=import-outside-toplevel
     from heronpy.streamlet.impl.repartitionbolt import RepartitionStreamlet
     if repartition_function is None:
       repartition_function = lambda x: x
@@ -113,6 +117,7 @@
       reduce_function takes two element at one time and reduces them to one element that
       is used in the subsequent operations.
     """
+    # pylint: disable=import-outside-toplevel
     from heronpy.streamlet.impl.reducebywindowbolt import ReduceByWindowStreamlet
     reduce_streamlet = ReduceByWindowStreamlet(window_config, reduce_function, self)
     self._add_child(reduce_streamlet)
@@ -122,6 +127,7 @@
   def union(self, other_streamlet):
     """Returns a new Streamlet that consists of elements of both this and other_streamlet
     """
+    # pylint: disable=import-outside-toplevel
     from heronpy.streamlet.impl.unionbolt import UnionStreamlet
     union_streamlet = UnionStreamlet(self, other_streamlet)
     self._add_child(union_streamlet)
@@ -134,6 +140,7 @@
     Before starting to cycle over the Streamlet, the open function of the transform_operator is
     called. This allows the transform_operator to do any kind of initialization/loading, etc.
     """
+    # pylint: disable=import-outside-toplevel
     from heronpy.streamlet.impl.transformbolt import TransformStreamlet
     transform_streamlet = TransformStreamlet(transform_operator, self)
     self._add_child(transform_streamlet)
@@ -142,22 +149,23 @@
   def log(self):
     """Logs all elements of this streamlet. This returns nothing
     """
+    # pylint: disable=import-outside-toplevel
     from heronpy.streamlet.impl.logbolt import LogStreamlet
     log_streamlet = LogStreamlet(self)
     self._add_child(log_streamlet)
-    return
 
   def consume(self, consume_function):
     """Calls consume_function for each element of this streamlet. This function returns nothing
     """
+    # pylint: disable=import-outside-toplevel
     from heronpy.streamlet.impl.consumebolt import ConsumeStreamlet
     consume_streamlet = ConsumeStreamlet(consume_function, self)
     self._add_child(consume_streamlet)
-    return
 
   def join(self, join_streamlet, window_config, join_function):
     """Return a new Streamlet by joining join_streamlet with this streamlet
     """
+    # pylint: disable=import-outside-toplevel
     from heronpy.streamlet.impl.joinbolt import JoinStreamlet, JoinBolt
     join_streamlet_result = JoinStreamlet(JoinBolt.INNER, window_config,
                                           join_function, self, join_streamlet)
@@ -168,6 +176,7 @@
   def outer_right_join(self, join_streamlet, window_config, join_function):
     """Return a new Streamlet by outer right join_streamlet with this streamlet
     """
+    # pylint: disable=import-outside-toplevel
     from heronpy.streamlet.impl.joinbolt import JoinStreamlet, JoinBolt
     join_streamlet_result = JoinStreamlet(JoinBolt.OUTER_RIGHT, window_config,
                                           join_function, self, join_streamlet)
@@ -178,6 +187,7 @@
   def outer_left_join(self, join_streamlet, window_config, join_function):
     """Return a new Streamlet by left join_streamlet with this streamlet
     """
+    # pylint: disable=import-outside-toplevel
     from heronpy.streamlet.impl.joinbolt import JoinStreamlet, JoinBolt
     join_streamlet_result = JoinStreamlet(JoinBolt.OUTER_LEFT, window_config,
                                           join_function, self, join_streamlet)
@@ -188,6 +198,7 @@
   def outer_join(self, join_streamlet, window_config, join_function):
     """Return a new Streamlet by outer join_streamlet with this streamlet
     """
+    # pylint: disable=import-outside-toplevel
     from heronpy.streamlet.impl.joinbolt import JoinStreamlet, JoinBolt
 
     join_streamlet_result = JoinStreamlet(JoinBolt.OUTER, window_config,
@@ -200,6 +211,7 @@
     """Return a new Streamlet in which each (key, value) pair of this Streamlet are collected
        over the time_window and then reduced using the reduce_function
     """
+    # pylint: disable=import-outside-toplevel
     from heronpy.streamlet.impl.reducebykeyandwindowbolt import ReduceByKeyAndWindowStreamlet
     reduce_streamlet = ReduceByKeyAndWindowStreamlet(window_config, reduce_function, self)
     self._add_child(reduce_streamlet)
diff --git a/heronpy/streamlet/transformoperator.py b/heronpy/streamlet/transformoperator.py
index 66b9965..ba7a651 100644
--- a/heronpy/streamlet/transformoperator.py
+++ b/heronpy/streamlet/transformoperator.py
@@ -14,7 +14,7 @@
 '''transformoperator.py: API for defining generic transformer in python'''
 from abc import abstractmethod
 
-class TransformOperator(object):
+class TransformOperator:
   """API for defining a generic transformer for Heron in the Python Streamlet API
   """
 
diff --git a/heronpy/streamlet/window.py b/heronpy/streamlet/window.py
index 04b895d..8938ba2 100644
--- a/heronpy/streamlet/window.py
+++ b/heronpy/streamlet/window.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -20,7 +20,7 @@
 
 '''window.py: module for defining Window'''
 
-class Window(object):
+class Window:
   """Window is a container containing information about a particular window.
      Transformations that depend on Windowing, pass the window information
      inside their streamlets using this container.
diff --git a/heronpy/streamlet/windowconfig.py b/heronpy/streamlet/windowconfig.py
index 1fa0fb5..ef0a75d 100644
--- a/heronpy/streamlet/windowconfig.py
+++ b/heronpy/streamlet/windowconfig.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -22,7 +22,7 @@
 
 import datetime
 
-class WindowConfig(object):
+class WindowConfig:
   """WindowConfig allows streamlet API users to program window configuration for operations
      that rely on windowing. Currently we only support time/count based
      sliding/tumbling windows.
diff --git a/integration_test/src/python/common/status.py b/integration_test/src/python/common/status.py
index d2f69e4..a1f3277 100644
--- a/integration_test/src/python/common/status.py
+++ b/integration_test/src/python/common/status.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python2.7
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -27,11 +27,11 @@
   def __init__(self, message, error=None):
     Exception.__init__(self, message, error)
     if error:
-      logging.error("%s :: %s", message, traceback.format_exc(error))
+      logging.error("%s :: %s", message, error, exc_info=True)
     else:
       logging.error(message)
 
-class TestSuccess(object):
+class TestSuccess:
   def __init__(self, message=None):
     if message:
       logging.info(message)
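In Python 3, traceback.format_exc() takes an optional limit argument rather than an exception object, so the replaced format_exc(error) call was passing the wrong thing; exc_info=True lets the logging module attach the active traceback itself. A small illustrative sketch:

    import logging

    try:
        raise ValueError("boom")
    except ValueError as error:
        # exc_info=True appends the current traceback to the log record
        logging.error("%s :: %s", "operation failed", error, exc_info=True)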
diff --git a/integration_test/src/python/integration_test/common/bolt/count_aggregator_bolt.py b/integration_test/src/python/integration_test/common/bolt/count_aggregator_bolt.py
index d18ec24..6bfe7c3 100644
--- a/integration_test/src/python/integration_test/common/bolt/count_aggregator_bolt.py
+++ b/integration_test/src/python/integration_test/common/bolt/count_aggregator_bolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/common/bolt/double_tuples_bolt.py b/integration_test/src/python/integration_test/common/bolt/double_tuples_bolt.py
index 561db78..68341b4 100644
--- a/integration_test/src/python/integration_test/common/bolt/double_tuples_bolt.py
+++ b/integration_test/src/python/integration_test/common/bolt/double_tuples_bolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/common/bolt/identity_bolt.py b/integration_test/src/python/integration_test/common/bolt/identity_bolt.py
index 4740f38..1680ce3 100644
--- a/integration_test/src/python/integration_test/common/bolt/identity_bolt.py
+++ b/integration_test/src/python/integration_test/common/bolt/identity_bolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/common/bolt/word_count_bolt.py b/integration_test/src/python/integration_test/common/bolt/word_count_bolt.py
index 6824309..823aac3 100644
--- a/integration_test/src/python/integration_test/common/bolt/word_count_bolt.py
+++ b/integration_test/src/python/integration_test/common/bolt/word_count_bolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/common/spout/ab_spout.py b/integration_test/src/python/integration_test/common/spout/ab_spout.py
index 828420c..58d1c59 100644
--- a/integration_test/src/python/integration_test/common/spout/ab_spout.py
+++ b/integration_test/src/python/integration_test/common/spout/ab_spout.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/core/aggregator_bolt.py b/integration_test/src/python/integration_test/core/aggregator_bolt.py
index 4fd6cf1..d2047c5 100644
--- a/integration_test/src/python/integration_test/core/aggregator_bolt.py
+++ b/integration_test/src/python/integration_test/core/aggregator_bolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/core/batch_bolt.py b/integration_test/src/python/integration_test/core/batch_bolt.py
index d35f09e..ecaa4c6 100644
--- a/integration_test/src/python/integration_test/core/batch_bolt.py
+++ b/integration_test/src/python/integration_test/core/batch_bolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/core/constants.py b/integration_test/src/python/integration_test/core/constants.py
index 5b72cd3..0530928 100644
--- a/integration_test/src/python/integration_test/core/constants.py
+++ b/integration_test/src/python/integration_test/core/constants.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/core/integration_test_bolt.py b/integration_test/src/python/integration_test/core/integration_test_bolt.py
index cfb4d7d..ea51c4c 100644
--- a/integration_test/src/python/integration_test/core/integration_test_bolt.py
+++ b/integration_test/src/python/integration_test/core/integration_test_bolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/core/integration_test_spout.py b/integration_test/src/python/integration_test/core/integration_test_spout.py
index 7c7ddcc..4edd807 100644
--- a/integration_test/src/python/integration_test/core/integration_test_spout.py
+++ b/integration_test/src/python/integration_test/core/integration_test_spout.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/core/terminal_bolt.py b/integration_test/src/python/integration_test/core/terminal_bolt.py
index c4f6afa..6a868bb 100644
--- a/integration_test/src/python/integration_test/core/terminal_bolt.py
+++ b/integration_test/src/python/integration_test/core/terminal_bolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/core/test_runner.py b/integration_test/src/python/integration_test/core/test_runner.py
index a4b6a10..eb5285f 100644
--- a/integration_test/src/python/integration_test/core/test_runner.py
+++ b/integration_test/src/python/integration_test/core/test_runner.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/core/test_topology_builder.py b/integration_test/src/python/integration_test/core/test_topology_builder.py
index d64a874..6881644 100644
--- a/integration_test/src/python/integration_test/core/test_topology_builder.py
+++ b/integration_test/src/python/integration_test/core/test_topology_builder.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/topology/all_grouping/all_grouping.py b/integration_test/src/python/integration_test/topology/all_grouping/all_grouping.py
index ba5e671..4098045 100644
--- a/integration_test/src/python/integration_test/topology/all_grouping/all_grouping.py
+++ b/integration_test/src/python/integration_test/topology/all_grouping/all_grouping.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/topology/basic_one_task/basic_one_task.py b/integration_test/src/python/integration_test/topology/basic_one_task/basic_one_task.py
index da1edcb..08e232c 100644
--- a/integration_test/src/python/integration_test/topology/basic_one_task/basic_one_task.py
+++ b/integration_test/src/python/integration_test/topology/basic_one_task/basic_one_task.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/topology/bolt_double_emit_tuples/bolt_double_emit_tuples.py b/integration_test/src/python/integration_test/topology/bolt_double_emit_tuples/bolt_double_emit_tuples.py
index c4ddee1..fa817e7 100644
--- a/integration_test/src/python/integration_test/topology/bolt_double_emit_tuples/bolt_double_emit_tuples.py
+++ b/integration_test/src/python/integration_test/topology/bolt_double_emit_tuples/bolt_double_emit_tuples.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/topology/fields_grouping/fields_grouping.py b/integration_test/src/python/integration_test/topology/fields_grouping/fields_grouping.py
index 8de7167..5aa00f4 100644
--- a/integration_test/src/python/integration_test/topology/fields_grouping/fields_grouping.py
+++ b/integration_test/src/python/integration_test/topology/fields_grouping/fields_grouping.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/topology/global_grouping/global_grouping.py b/integration_test/src/python/integration_test/topology/global_grouping/global_grouping.py
index 898beb6..7a62424 100644
--- a/integration_test/src/python/integration_test/topology/global_grouping/global_grouping.py
+++ b/integration_test/src/python/integration_test/topology/global_grouping/global_grouping.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/topology/multi_spouts_multi_tasks/multi_spouts_multi_tasks.py b/integration_test/src/python/integration_test/topology/multi_spouts_multi_tasks/multi_spouts_multi_tasks.py
index 52af55b..2d56ade 100644
--- a/integration_test/src/python/integration_test/topology/multi_spouts_multi_tasks/multi_spouts_multi_tasks.py
+++ b/integration_test/src/python/integration_test/topology/multi_spouts_multi_tasks/multi_spouts_multi_tasks.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/topology/none_grouping/none_grouping.py b/integration_test/src/python/integration_test/topology/none_grouping/none_grouping.py
index c696094..66dcd36 100644
--- a/integration_test/src/python/integration_test/topology/none_grouping/none_grouping.py
+++ b/integration_test/src/python/integration_test/topology/none_grouping/none_grouping.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/topology/one_bolt_multi_tasks/one_bolt_multi_tasks.py b/integration_test/src/python/integration_test/topology/one_bolt_multi_tasks/one_bolt_multi_tasks.py
index b2d808c..c94b622 100644
--- a/integration_test/src/python/integration_test/topology/one_bolt_multi_tasks/one_bolt_multi_tasks.py
+++ b/integration_test/src/python/integration_test/topology/one_bolt_multi_tasks/one_bolt_multi_tasks.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/topology/one_spout_bolt_multi_tasks/one_spout_bolt_multi_tasks.py b/integration_test/src/python/integration_test/topology/one_spout_bolt_multi_tasks/one_spout_bolt_multi_tasks.py
index f0716cb..6b0ce64 100644
--- a/integration_test/src/python/integration_test/topology/one_spout_bolt_multi_tasks/one_spout_bolt_multi_tasks.py
+++ b/integration_test/src/python/integration_test/topology/one_spout_bolt_multi_tasks/one_spout_bolt_multi_tasks.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/topology/one_spout_multi_tasks/one_spout_multi_tasks.py b/integration_test/src/python/integration_test/topology/one_spout_multi_tasks/one_spout_multi_tasks.py
index 310f15e..0a55597 100644
--- a/integration_test/src/python/integration_test/topology/one_spout_multi_tasks/one_spout_multi_tasks.py
+++ b/integration_test/src/python/integration_test/topology/one_spout_multi_tasks/one_spout_multi_tasks.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/topology/one_spout_two_bolts/one_spout_two_bolts.py b/integration_test/src/python/integration_test/topology/one_spout_two_bolts/one_spout_two_bolts.py
index adf9b5f..83ba5ac 100644
--- a/integration_test/src/python/integration_test/topology/one_spout_two_bolts/one_spout_two_bolts.py
+++ b/integration_test/src/python/integration_test/topology/one_spout_two_bolts/one_spout_two_bolts.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/topology/shuffle_grouping/shuffle_grouping.py b/integration_test/src/python/integration_test/topology/shuffle_grouping/shuffle_grouping.py
index 5bd2bbd..dafec6b 100644
--- a/integration_test/src/python/integration_test/topology/shuffle_grouping/shuffle_grouping.py
+++ b/integration_test/src/python/integration_test/topology/shuffle_grouping/shuffle_grouping.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/topology/streamlet/word_count_streamlet.py b/integration_test/src/python/integration_test/topology/streamlet/word_count_streamlet.py
index 7cdc51f..704bac3 100644
--- a/integration_test/src/python/integration_test/topology/streamlet/word_count_streamlet.py
+++ b/integration_test/src/python/integration_test/topology/streamlet/word_count_streamlet.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/integration_test/topology/test_topology_main.py b/integration_test/src/python/integration_test/topology/test_topology_main.py
index e9b41c1..a83ca59 100644
--- a/integration_test/src/python/integration_test/topology/test_topology_main.py
+++ b/integration_test/src/python/integration_test/topology/test_topology_main.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
diff --git a/integration_test/src/python/local_test_runner/main.py b/integration_test/src/python/local_test_runner/main.py
index 37e59f4..9c4d9c8 100644
--- a/integration_test/src/python/local_test_runner/main.py
+++ b/integration_test/src/python/local_test_runner/main.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python2.7
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -34,12 +34,12 @@
 from heron.common.src.python.utils import log
 
 # import test_kill_bolt
-import test_kill_metricsmgr
-import test_kill_stmgr
-import test_kill_stmgr_metricsmgr
-import test_kill_tmaster
-import test_scale_up
-import test_template
+from . import test_kill_metricsmgr
+from . import test_kill_stmgr
+from . import test_kill_stmgr_metricsmgr
+from . import test_kill_tmaster
+from . import test_scale_up
+from . import test_template
 
 TEST_CLASSES = [
     test_template.TestTemplate,
@@ -82,7 +82,7 @@
         failures += [testname]
 
   except Exception as e:
-    logging.error("Exception thrown while running tests: %s", str(e))
+    logging.error("Exception thrown while running tests: %s", str(e), exc_info=True)
   finally:
     tracker_process.kill()
 
@@ -110,7 +110,7 @@
 
   # Read the configuration file from package
   conf_file = DEFAULT_TEST_CONF_FILE
-  conf_string = pkgutil.get_data(__name__, conf_file)
+  conf_string = pkgutil.get_data(__name__, conf_file).decode()
   decoder = json.JSONDecoder(strict=False)
 
   # Convert the conf file to a json format
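Two Python 3 patterns recur in this file: implicit relative imports are gone (hence the explicit `from . import test_template` form), and pkgutil.get_data() returns bytes that must be decoded before JSON parsing. A minimal sketch of the latter (the resource path is hypothetical):

    import json
    import pkgutil

    # get_data() returns bytes (or None if the resource is missing);
    # decode to str before handing the text to the JSON decoder.
    raw = pkgutil.get_data(__name__, "resources/test.json")
    conf = json.JSONDecoder(strict=False).decode(raw.decode())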
diff --git a/integration_test/src/python/local_test_runner/test_kill_bolt.py b/integration_test/src/python/local_test_runner/test_kill_bolt.py
index 01945e2..7a4eb16 100644
--- a/integration_test/src/python/local_test_runner/test_kill_bolt.py
+++ b/integration_test/src/python/local_test_runner/test_kill_bolt.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -21,7 +21,7 @@
 
 """test_kill_bolt.py"""
 import logging
-import test_template
+from . import test_template
 
 NON_TMASTER_SHARD = 1
 HERON_BOLT = 'identity-bolt_3'
diff --git a/integration_test/src/python/local_test_runner/test_kill_metricsmgr.py b/integration_test/src/python/local_test_runner/test_kill_metricsmgr.py
index dbff9c1..71c06f5 100644
--- a/integration_test/src/python/local_test_runner/test_kill_metricsmgr.py
+++ b/integration_test/src/python/local_test_runner/test_kill_metricsmgr.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -20,7 +20,7 @@
 
 
 """test_kill_metricsmgr.py"""
-import test_template
+from . import test_template
 
 class TestKillMetricsMgr(test_template.TestTemplate):
 
diff --git a/integration_test/src/python/local_test_runner/test_kill_stmgr.py b/integration_test/src/python/local_test_runner/test_kill_stmgr.py
index bdb28f4..f6a7bfa 100644
--- a/integration_test/src/python/local_test_runner/test_kill_stmgr.py
+++ b/integration_test/src/python/local_test_runner/test_kill_stmgr.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -20,7 +20,7 @@
 
 
 """test_kill_stmgr.py"""
-import test_template
+from . import test_template
 
 class TestKillStmgr(test_template.TestTemplate):
 
diff --git a/integration_test/src/python/local_test_runner/test_kill_stmgr_metricsmgr.py b/integration_test/src/python/local_test_runner/test_kill_stmgr_metricsmgr.py
index 5ae9f6a..34f9409 100644
--- a/integration_test/src/python/local_test_runner/test_kill_stmgr_metricsmgr.py
+++ b/integration_test/src/python/local_test_runner/test_kill_stmgr_metricsmgr.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -20,7 +20,7 @@
 
 
 """test_kill_stmgr_metricsmgr.py"""
-import test_template
+from . import test_template
 
 class TestKillStmgrMetricsMgr(test_template.TestTemplate):
 
diff --git a/integration_test/src/python/local_test_runner/test_kill_tmaster.py b/integration_test/src/python/local_test_runner/test_kill_tmaster.py
index ea7f6c6..0e0b49a 100644
--- a/integration_test/src/python/local_test_runner/test_kill_tmaster.py
+++ b/integration_test/src/python/local_test_runner/test_kill_tmaster.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -22,7 +22,7 @@
 """test_kill_tmaster.py"""
 import logging
 import subprocess
-import test_template
+from . import test_template
 
 TMASTER_SHARD = 0
 
diff --git a/integration_test/src/python/local_test_runner/test_scale_up.py b/integration_test/src/python/local_test_runner/test_scale_up.py
index 93d3fae..d798378 100644
--- a/integration_test/src/python/local_test_runner/test_scale_up.py
+++ b/integration_test/src/python/local_test_runner/test_scale_up.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -24,7 +24,7 @@
 import subprocess
 
 from ..common import status
-import test_template
+from . import test_template
 
 class TestScaleUp(test_template.TestTemplate):
 
diff --git a/integration_test/src/python/local_test_runner/test_template.py b/integration_test/src/python/local_test_runner/test_template.py
index c779f01..33219de 100644
--- a/integration_test/src/python/local_test_runner/test_template.py
+++ b/integration_test/src/python/local_test_runner/test_template.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # -*- encoding: utf-8 -*-
 
 #  Licensed to the Apache Software Foundation (ASF) under one
@@ -54,7 +54,7 @@
 HERON_STMGR_CMD = os.path.join(HERON_SANDBOX_HOME, HERON_CORE, HERON_BIN, HERON_STMGR)
 ProcessTuple = namedtuple('ProcessTuple', 'pid cmd')
 
-class TestTemplate(object):
+class TestTemplate:
   """ Class that encapsulates the template used for integration tests. Intended to be abstract and
   subclassed for specific tests. """
 
@@ -92,9 +92,9 @@
       return result
 
     except status.TestFailure as e:
-      raise e
+      raise
     except Exception as e:
-      raise status.TestFailure("Exception thrown during test", e)
+      raise status.TestFailure("Exception thrown during test", e) from e
     finally:
       if topology_submitted:
         self.cleanup_test()
@@ -215,7 +215,7 @@
     try:
       with open(process_pid_file, 'r') as f:
         pid = f.readline()
-        return pid
+        return int(pid)
     except Exception:
       logging.error("Unable to open file %s", process_pid_file)
       return -1
@@ -230,7 +230,7 @@
     logging.info("Killing process number %s", process_number)
 
     try:
-      os.kill(int(process_number), signal.SIGTERM)
+      os.kill(process_number, signal.SIGTERM)
     except OSError as ex:
       if "No such process" in str(ex): # killing a non-existing process condsidered as success
         logging.info(str(ex))
@@ -329,7 +329,7 @@
   """
   # pylint: disable=fixme
   # TODO: if the submit fails before we get here (e.g., Topology already exists), this hangs
-  processes = subprocess.check_output(['ps', '-o', 'pid,args'])
+  processes = subprocess.check_output(['ps', '-o', 'pid,args'], universal_newlines=True)
   processes = processes.split('\n')
   processes = processes[1:] # remove first line, which is name of columns
   process_list = []
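On Python 3, subprocess output is bytes unless universal_newlines=True is passed (text= is the 3.7+ spelling of the same option), which makes check_output return str so the split('\n') parsing keeps working. A short sketch of the pattern:

    import subprocess

    # universal_newlines=True decodes the output, so splitlines() yields str
    listing = subprocess.check_output(["ps", "-o", "pid,args"], universal_newlines=True)
    for row in listing.splitlines()[1:]:          # skip the header row
        pid, _, cmd = row.strip().partition(" ")  # pid and command columns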
diff --git a/integration_test/src/python/test_runner/main.py b/integration_test/src/python/test_runner/main.py
index a09803a..fb44a10 100644
--- a/integration_test/src/python/test_runner/main.py
+++ b/integration_test/src/python/test_runner/main.py
@@ -42,11 +42,11 @@
 successes = []
 failures = []
 
-class FileBasedExpectedResultsHandler(object):
+class FileBasedExpectedResultsHandler:
   def __init__(self, file_path):
     self.file_path = file_path
 
-  def fetch_results(self):
+  def fetch_results(self) -> str:
     # Read expected result from the expected result file
     try:
       if not os.path.exists(self.file_path):
@@ -57,14 +57,14 @@
     except Exception as e:
       raise status.TestFailure("Failed to read expected result file %s" % self.file_path, e)
 
-class HttpBasedExpectedResultsHandler(object):
+class HttpBasedExpectedResultsHandler:
   def __init__(self, server_host_port, topology_name, task_count):
     self.server_host_port = server_host_port
     self.topology_name = topology_name
     self.task_count = task_count
 
   # pylint: disable=unnecessary-lambda
-  def fetch_results(self):
+  def fetch_results(self) -> str:
     try:
       result = []
       decoder = json.JSONDecoder(strict=False)
@@ -87,12 +87,12 @@
       raise status.TestFailure(
           "Fetching expected result failed for %s topology" % self.topology_name, e)
 
-class HttpBasedActualResultsHandler(object):
+class HttpBasedActualResultsHandler:
   def __init__(self, server_host_port, topology_name):
     self.server_host_port = server_host_port
     self.topology_name = topology_name
 
-  def fetch_results(self):
+  def fetch_results(self) -> str:
     try:
       return fetch_from_server(self.server_host_port, self.topology_name,
                                'results', '/results/%s' % self.topology_name)
@@ -100,7 +100,7 @@
       raise status.TestFailure("Fetching result failed for %s topology" % self.topology_name, e)
 
 # pylint: disable=unnecessary-lambda
-class ExactlyOnceResultsChecker(object):
+class ExactlyOnceResultsChecker:
   """Compares what results we found against what was expected. Verifies and exact match"""
 
   def __init__(self, topology_name, expected_results_handler, actual_results_handler):
@@ -221,13 +221,13 @@
   response = connection.getresponse()
   return response.status == 200
 
-def fetch_from_server(server_host_port, topology_name, data_name, path):
+def fetch_from_server(server_host_port, topology_name, data_name, path) -> str:
   ''' Make a http get request to fetch actual results from http server '''
   for i in range(0, RETRY_ATTEMPTS):
     logging.info("Fetching %s for topology %s, retry count: %d", data_name, topology_name, i)
     response = get_http_response(server_host_port, path)
     if response.status == 200:
-      return response.read()
+      return response.read().decode()
     elif i != RETRY_ATTEMPTS:
       logging.info("Fetching %s failed with status: %s; reason: %s; body: %s",
                    data_name, response.status, response.reason, response.read())
@@ -425,7 +425,7 @@
   log.configure(level=logging.DEBUG)
   conf_file = DEFAULT_TEST_CONF_FILE
   # Read the configuration file from package
-  conf_string = pkgutil.get_data(__name__, conf_file)
+  conf_string = pkgutil.get_data(__name__, conf_file).decode()
   decoder = json.JSONDecoder(strict=False)
   # Convert the conf file to a json format
   conf = decoder.decode(conf_string)
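http.client responses return bytes from read() in Python 3, so the body is decoded before being compared with expected results or parsed as JSON. An illustrative sketch (host, port, and path are hypothetical):

    import http.client

    connection = http.client.HTTPConnection("localhost", 8080)
    connection.request("GET", "/results/example-topology")
    response = connection.getresponse()
    if response.status == 200:
        body = response.read().decode()  # bytes -> str before JSON decoding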
diff --git a/integration_test/src/python/topology_test_runner/main.py b/integration_test/src/python/topology_test_runner/main.py
index 2ba8184..12d8685 100644
--- a/integration_test/src/python/topology_test_runner/main.py
+++ b/integration_test/src/python/topology_test_runner/main.py
@@ -44,7 +44,7 @@
 successes = []
 failures = []
 
-class TopologyStructureResultChecker(object):
+class TopologyStructureResultChecker:
   """
   Validate topology graph structure
   """
@@ -224,14 +224,14 @@
     return output
 
 
-class FileBasedExpectedResultsHandler(object):
+class FileBasedExpectedResultsHandler:
   """
   Get expected topology graph structure result from local file
   """
   def __init__(self, file_path):
     self.file_path = file_path
 
-  def fetch_results(self):
+  def fetch_results(self) -> str:
     """
     Read expected result from the expected result file
     """
@@ -245,7 +245,7 @@
       raise status.TestFailure("Failed to read expected result file %s" % self.file_path, e)
 
 
-class ZkFileBasedActualResultsHandler(object):
+class ZkFileBasedActualResultsHandler:
   """
   Get actual topology graph structure result from zk
   """
@@ -294,7 +294,7 @@
     self.state_mgr.stop()
 
 
-class HttpBasedActualResultsHandler(object):
+class HttpBasedActualResultsHandler:
   """
   Get actually loaded instance states
   TODO(yaoli): complete this class when stateful processing is ready
@@ -303,23 +303,23 @@
     self.server_host_port = server_host_port
     self.topology_name = topology_name
 
-  def fetch_results(self):
+  def fetch_results(self) -> str:
     try:
       return self.fetch_from_server(self.server_host_port, self.topology_name,
         'instance_state', '/stateResults/%s' % self.topology_name)
     except Exception as e:
       raise status.TestFailure("Fetching instance state failed for %s topology" % self.topology_name, e)
 
-  def fetch_from_server(self, server_host_port, topology_name, data_name, path):
+  def fetch_from_server(self, server_host_port, topology_name, data_name, path) -> str:
     ''' Make a http get request to fetch actual results from http server '''
     for i in range(0, RETRY_ATTEMPTS):
       logging.info("Fetching %s for topology %s, retry count: %d", data_name, topology_name, i)
       response = self.get_http_response(server_host_port, path)
       if response.status == 200:
-        return response.read()
+        return response.read().decode()
       elif i != RETRY_ATTEMPTS:
         logging.info("Fetching %s failed with status: %s; reason: %s; body: %s",
-          data_name, response.status, response.reason, response.read())
+          data_name, response.status, response.reason, response.read().decode())
         time.sleep(RETRY_INTERVAL)
 
     raise status.TestFailure("Failed to fetch %s after %d attempts" % (data_name, RETRY_ATTEMPTS))
@@ -632,7 +632,7 @@
   log.configure(level=logging.DEBUG)
   conf_file = DEFAULT_TEST_CONF_FILE
   # Read the configuration file from package
-  conf_string = pkgutil.get_data(__name__, conf_file)
+  conf_string = pkgutil.get_data(__name__, conf_file).decode()
   decoder = json.JSONDecoder(strict=False)
   # Convert the conf file to a json format
   conf = decoder.decode(conf_string)
diff --git a/scripts/applatix/build.sh b/scripts/applatix/build.sh
index d7c9900..f39bb3e 100755
--- a/scripts/applatix/build.sh
+++ b/scripts/applatix/build.sh
@@ -79,14 +79,14 @@
 # build heron
 T="heron build"
 start_timer "$T"
-python ${UTILS}/save-logs.py "heron_build.txt" bazel\
+${UTILS}/save-logs.py "heron_build.txt" bazel\
   --bazelrc=tools/applatix/bazel.rc build --config=$PLATFORM heron/...
 end_timer "$T"
 
 # run heron unit tests
 T="heron test non-flaky"
 start_timer "$T"
-python ${UTILS}/save-logs.py "heron_test_non_flaky.txt" bazel\
+${UTILS}/save-logs.py "heron_test_non_flaky.txt" bazel\
   --bazelrc=tools/applatix/bazel.rc test\
   --test_summary=detailed --test_output=errors\
   --config=$PLATFORM --test_tag_filters=-flaky heron/...
@@ -96,7 +96,7 @@
 # which should be fixed. For now, run them serially
 T="heron test flaky"
 start_timer "$T"
-python ${UTILS}/save-logs.py "heron_test_flaky.txt" bazel\
+${UTILS}/save-logs.py "heron_test_flaky.txt" bazel\
   --bazelrc=tools/applatix/bazel.rc test\
   --test_summary=detailed --test_output=errors\
   --config=$PLATFORM --test_tag_filters=flaky --jobs=1 heron/...
@@ -104,14 +104,14 @@
 
 T="heron build binpkgs"
 start_timer "$T"
-python ${UTILS}/save-logs.py "heron_build_binpkgs.txt" bazel\
+${UTILS}/save-logs.py "heron_build_binpkgs.txt" bazel\
   --bazelrc=tools/applatix/bazel.rc build\
   --config=$PLATFORM scripts/packages:binpkgs
 end_timer "$T"
 
 T="heron build testpkgs"
 start_timer "$T"
-python ${UTILS}/save-logs.py "heron_build_binpkgs.txt" bazel\
+${UTILS}/save-logs.py "heron_build_binpkgs.txt" bazel\
   --bazelrc=tools/applatix/bazel.rc build\
   --config=$PLATFORM scripts/packages:testpkgs
 end_timer "$T"
diff --git a/scripts/applatix/test.sh b/scripts/applatix/test.sh
index 78c023a..d0619b9 100755
--- a/scripts/applatix/test.sh
+++ b/scripts/applatix/test.sh
@@ -39,13 +39,13 @@
 # install clients and tools
 T="heron clients/tools install"
 start_timer "$T"
-python ${UTILS}/save-logs.py "heron_install.txt" ./heron-install.sh --user
+${UTILS}/save-logs.py "heron_install.txt" ./heron-install.sh --user
 end_timer "$T"
 
 # install tests
 T="heron tests install"
 start_timer "$T"
-python ${UTILS}/save-logs.py "heron_tests_install.txt" ./heron-tests-install.sh --user
+${UTILS}/save-logs.py "heron_tests_install.txt" ./heron-tests-install.sh --user
 end_timer "$T"
 
 # initialize http-server for integration tests
diff --git a/scripts/applatix/testutils.sh b/scripts/applatix/testutils.sh
index f526f2f..988e984 100755
--- a/scripts/applatix/testutils.sh
+++ b/scripts/applatix/testutils.sh
@@ -34,13 +34,13 @@
 # install clients and tools
 T="heron clients/tools install"
 start_timer "$T"
-python ${UTILS}/save-logs.py "heron_install.txt" ./heron-install.sh --user
+${UTILS}/save-logs.py "heron_install.txt" ./heron-install.sh --user
 end_timer "$T"
 
 # install tests
 T="heron tests install"
 start_timer "$T"
-python ${UTILS}/save-logs.py "heron_tests_install.txt" ./heron-tests-install.sh --user
+${UTILS}/save-logs.py "heron_tests_install.txt" ./heron-tests-install.sh --user
 end_timer "$T"
 
 print_timer_summary
diff --git a/scripts/packages/BUILD b/scripts/packages/BUILD
index 64cfb90..f3b7e1c 100644
--- a/scripts/packages/BUILD
+++ b/scripts/packages/BUILD
@@ -1,5 +1,4 @@
-# load("@bazel_tools//tools/build_defs/pkg:pkg.bzl", "pkg_deb", "pkg_tar")
-load("@bazel_tools//tools/build_defs/pkg:pkg.bzl", "pkg_tar")
+load("@rules_pkg//:pkg.bzl", "pkg_tar")
 load("//scripts/packages:self_extract_binary.bzl", "self_extract_binary")
 
 package(default_visibility = ["//visibility:public"])
@@ -646,12 +645,12 @@
         'find heronpy -type f -name "*.bak" -delete',
         "rm setup.py.template",
         "tree $$HERONPY_DIR",
-        "/usr/bin/env python2.7 setup.py sdist",
-        "/usr/bin/env python2.7 setup.py bdist_wheel --universal",
+        "/usr/bin/env python3 setup.py sdist",
+        "/usr/bin/env python3 setup.py bdist_wheel",
         "cd -",
         "ls -l $$HERONPY_DIR/dist",
-        "cp $$HERONPY_DIR/dist/heronpy-*-py2.py3-*.whl $$OUTPUT_DIR",
-        'cp $$HERONPY_DIR/dist/heronpy-*-py2.py3-*.whl "$@"',
+        "cp $$HERONPY_DIR/dist/heronpy-*-py3-*.whl $$OUTPUT_DIR",
+        'cp $$HERONPY_DIR/dist/heronpy-*-py3-*.whl "$@"',
         "cp $$HERONPY_DIR/dist/heronpy-*.tar.gz $$OUTPUT_DIR",
         "touch $$OUTPUT_DIR/heronpy.whl",
         "rm -rf $$TMP_DIR",
diff --git a/scripts/packages/heronpy/setup.py.template b/scripts/packages/heronpy/setup.py.template
index 6c7f521..9358140 100644
--- a/scripts/packages/heronpy/setup.py.template
+++ b/scripts/packages/heronpy/setup.py.template
@@ -29,7 +29,7 @@
   raw_requirements = f.read().strip()
 
 requirements = raw_requirements.split('\n')
-print "Requirements: %s" % requirements
+print("Requirements: %s" % requirements)
 
 long_description = "The heronpy package enables you to write Heron topologies in Python. " \
                    "Topologies can be run using a variety of schedulers, including Mesos/Aurora," \
@@ -51,14 +51,16 @@
 
     'Intended Audience :: Developers',
 
-    'Programming Language :: Python :: 2.7',
     'Programming Language :: Python :: 3.4',
     'Programming Language :: Python :: 3.5',
     'Programming Language :: Python :: 3.6',
+    'Programming Language :: Python :: 3.7',
+    'Programming Language :: Python :: 3.8',
   ],
 
   keywords='heron topology python',
   packages=find_packages(),
 
-  install_requires=requirements
+  install_requires=requirements,
+  python_requires='~=3.4',
 )
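For reference, `python_requires='~=3.4'` is a PEP 440 compatible-release specifier meaning ">= 3.4, < 4"; pip refuses to install the package on interpreters outside that range. A hypothetical setup() fragment showing the same idea (package name and version are placeholders):

    from setuptools import setup, find_packages

    setup(
        name="example-pkg",        # placeholder metadata
        version="0.0.1",
        packages=find_packages(),
        python_requires="~=3.4",   # PEP 440 compatible release: >=3.4, <4
    )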
diff --git a/scripts/release/docker-images b/scripts/release/docker-images
new file mode 100755
index 0000000..7316abc
--- /dev/null
+++ b/scripts/release/docker-images
@@ -0,0 +1,145 @@
+#!/usr/bin/env python3
+"""
+This script coordinates other scripts to put together a release.
+
+Generated images are tagged heron/heron:<tag>; the exported image archives are placed in the //dist/ directory.
+
+## Examples
+
+List available target distributions on separate stdout lines:
+  ./docker-images list
+
+Build and tag a single distribution image, then print the archive's path:
+  ./docker-images build 0.1.0-debian10 debian10
+
+Build and tag all distribution images, then print each archive's path:
+  ./docker-images build "$(git describe --tags)" --all
+
+"""
+from pathlib import Path
+
+import logging
+import re
+import shutil
+import subprocess
+import sys
+import tempfile
+import typing
+
+ROOT = Path(__file__).resolve().parent.parent.parent
+BUILD_ARTIFACTS = ROOT / "docker/scripts/build-artifacts.sh"
+BUILD_IMAGE = ROOT / "docker/scripts/build-docker.sh"
+
+
+class BuildFailure(Exception):
+    """Raised to indicate a failure buliding."""
+
+
+class BadDistrobutionName(BuildFailure):
+    """Raised when a bad distrobution name is provided."""
+
+
+def configure_logging(debug: bool):
+    """Use standard logging config and write to stdout and a logfile."""
+    logging.basicConfig(
+        format="[%(asctime)s] %(levelname)s: %(message)s",
+        level=(logging.DEBUG if debug else logging.INFO),
+    )
+
+
+def log_run(args: typing.List[str], log: typing.IO[str]) -> subprocess.CompletedProcess:
+    """Run an executable and direct its output to the given log file."""
+    return subprocess.run(
+        args, stdout=log, stderr=log, universal_newlines=True, check=True
+    )
+
+
+def build_dockerfile(
+    scratch: Path, dist: str, tag: str, out_dir: Path, log: typing.IO[str]
+) -> Path:
+    """
+    Raises subprocess.CalledProcessError if either of the external scripts fails.
+    """
+    logging.info("building package for %s", dist)
+    log_run([str(BUILD_ARTIFACTS), dist, tag, scratch], log)
+    logging.info("building docker image for %s", dist)
+    log_run([str(BUILD_IMAGE), dist, tag, scratch], log)
+    tar = Path(scratch) / f"heron-docker-{tag}-{dist}.tar.gz"
+    tar_out = out_dir / tar.name
+    tar.replace(tar_out)
+    logging.info("docker image complete: %s", tar_out)
+    return tar_out
+
+
+def available_distrobutions() -> typing.List[str]:
+    """Return a list of available target distrobutions."""
+    compile_files = (ROOT / "docker/compile").glob("Dockerfile.*")
+    dist_files = (ROOT / "docker/dist").glob("Dockerfile.dist.*")
+    compile_distros = {re.sub(r"^Dockerfile\.", "", f.name) for f in compile_files}
+    dist_distros = {re.sub(r"^Dockerfile\.dist\.", "", f.name) for f in dist_files}
+    distros = compile_distros & dist_distros
+    mismatch = (compile_distros | dist_distros) ^ distros
+    if mismatch:
+        logging.warning(
+            "docker distros found without both compile+dist files: %s", mismatch
+        )
+
+    return sorted(distros)
+
+
+def build_target(tag: str, target: str) -> typing.Iterator[Path]:
+    """Build docker images for the given target distribution(s), yielding each archive path."""
+    debug = True
+
+    distros = available_distrobutions()
+    logging.debug("available distro targets: %s", distros)
+    if target == "--all":
+        targets = distros
+    elif target not in distros:
+        raise BadDistrobutionName(f"distribution {target!r} does not exist")
+    else:
+        targets = [target]
+
+    out_dir = ROOT / "dist"
+    out_dir.mkdir(exist_ok=True)
+
+    for target in targets:
+        scratch = Path(tempfile.mkdtemp(prefix=f"build-{target}-"))
+        log_path = scratch / "log.txt"
+        log = log_path.open("w")
+        logging.debug("building %s", target)
+
+        try:
+            tar = build_dockerfile(scratch, target, tag, out_dir, log)
+        except Exception as e:
+            logging.error(
+                "an error occurred building %s. See log in %s", target, log_path
+            )
+            if isinstance(e, subprocess.CalledProcessError):
+                raise BuildFailure("failure in underlying build scripts") from e
+            raise
+
+        if not debug:
+            shutil.rmtree(scratch)
+        yield tar
+
+
+def cli(args=sys.argv):
+    operation = args[1]
+    if operation == "list":
+        print("\n".join(available_distrobutions()))
+    elif operation == "build":
+        tag, target = args[2:]
+        try:
+            for archive in build_target(tag=tag, target=target):
+                print(archive)
+        except BuildFailure as e:
+            logging.error(e)
+            pass
+    else:
+        logging.error("unknown operation %r", operation)
+
+
+if __name__ == "__main__":
+    configure_logging(debug=True)
+    cli()
diff --git a/scripts/setup-intellij.sh b/scripts/setup-intellij.sh
index 32f4d6d..80cfd22 100755
--- a/scripts/setup-intellij.sh
+++ b/scripts/setup-intellij.sh
@@ -35,7 +35,7 @@
 <module type="JAVA_MODULE" version="4">
   <component name="FacetManager">
     <facet type="Python" name="Python">
-      <configuration sdkName="Python 2.7.10 (/usr/bin/python)" />
+      <configuration sdkName="Python 3 (/usr/bin/python3)" />
     </facet>
   </component>
   <component name="NewModuleRootManager">
@@ -159,7 +159,7 @@
 #write_jar_entry "bazel-bin/heron/metricsmgr/src/thrift"
 
 cat >> $iml_file <<'EOF'
-    <orderEntry type="library" name="Python 2.7.10 (/usr/bin/python) interpreter library" level="application" />
+    <orderEntry type="library" name="Python 3 (/usr/bin/python3) interpreter library" level="application" />
   </component>
 </module>
 EOF
diff --git a/scripts/shutils/common.sh b/scripts/shutils/common.sh
index 6413777..3c4857b 100755
--- a/scripts/shutils/common.sh
+++ b/scripts/shutils/common.sh
@@ -92,7 +92,7 @@
 
 # Discover the platform that we are running on
 function discover_platform {
-  discover=`python -mplatform`
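+  # PLATFORM, when set, overrides discovery, e.g. PLATFORM=Ubuntu scripts/travis/ci.sh (used by vagrant/local-ci.sh)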
+  discover="${PLATFORM-$(python3 -mplatform)}"
   if [[ $discover =~ ^.*centos.*$ ]]; then
     echo "centos"
   elif [[ $discover =~ ^.*Ubuntu.*$ ]]; then
diff --git a/scripts/shutils/save-logs.py b/scripts/shutils/save-logs.py
index c378936..cc50c99 100755
--- a/scripts/shutils/save-logs.py
+++ b/scripts/shutils/save-logs.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 # Licensed to the Apache Software Foundation (ASF) under one
 # or more contributor license agreements.  See the NOTICE file
 # distributed with this work for additional information
@@ -21,6 +21,7 @@
 import os
 import subprocess
 import sys
+import shlex
 from datetime import datetime, timedelta
 
 
@@ -35,12 +36,15 @@
              n -= 1
              if n == -1:
                 break
-      return fm[i + 1 if i else 0:].splitlines()
+      return fm[i + 1 if i else 0:].decode().splitlines()
    finally:
         fm.close()
 
+def shell_cmd(cmd):
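+    """Render a command list as a copy-pasteable string, e.g. ["echo", "a b"] -> echo 'a b'."""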
+    return " ".join(shlex.quote(c) for c in cmd)
+
 def main(file, cmd):
-  print("%s writing to: %s" % (cmd, file))
+  print("%s > %s" % (shell_cmd(cmd),file))
   with open(file, "w") as out:
    count = 0
    process = subprocess.Popen(cmd,
@@ -58,13 +62,13 @@
           sys.stdout.write("\r%d seconds %d log lines"%(diff.seconds, count))
           sys.stdout.flush()
           nextPrint = datetime.now() + timedelta(seconds=10)
-       out.write(line)
+       out.write(line.decode())
        line = pout.readline()
    out.close()
    errcode = process.wait()
    diff = datetime.now() - start
    sys.stdout.write("\r%d seconds %d log lines"%(diff.seconds, count))
-  print("\n %s finished with errcode: %d" % (cmd, errcode))
+  print("\n `%s` finished with errcode: %d" % (shell_cmd(cmd), errcode))
   if errcode != 0:
      lines = tail(file, 1000)
      print('\n'.join(lines))
@@ -72,9 +76,10 @@
   return errcode
 
 if __name__ == "__main__":
-  if sys.argv < 1:
-      print("Usage: %s [file info]" % sys.argv[0])
-      sys.exit(1)
-  file = sys.argv[1]
-  cmd = sys.argv[2:]
+  try:
+    _, file, *cmd = sys.argv
+  except ValueError:
+    print("Usage: %s [file info]" % sys.argv[0])
+    sys.exit(1)
+
   main(file, cmd)
diff --git a/scripts/travis/build.sh b/scripts/travis/build.sh
index a02eddf..55ff069 100755
--- a/scripts/travis/build.sh
+++ b/scripts/travis/build.sh
@@ -19,7 +19,6 @@
 # Script to kick off the travis CI build. We want the build to fail-fast if any
 # of the below commands fail so we need to chain them in this script.
 #
-
 set -e
 
 DIR=`dirname $0`
@@ -36,9 +35,8 @@
 fi
 
 # verify that eggs have not been added to the repo
-# ./third_party/pex/wheel-0.23.0-py2.7.egg should be the only one
 set +e
-EGGS=`find . -name "*.egg" | grep -v "third_party/pex/wheel"`
+#EGGS=`find . -name "*.egg"`
 set -e
 if [ "$EGGS" ]; then
   echo 'ERROR: The following eggs were found in the repo, '\
@@ -49,9 +47,8 @@
 fi
 
 # verify that wheels have not been added to the repo
-# ./third_party/pex/setuptools-18.0.1-py2.py3-none-any.whl should be the only one
 set +e
-WHEELS=`find . -name "*.whl" | grep -v "third_party/pex/setuptools"`
+#WHEELS=`find . -name "*.whl"`
 set -e
 if [ "$WHEELS" ]; then
   echo 'ERROR: The following wheels were found in the repo, '\
@@ -79,7 +76,7 @@
 # build heron
 T="heron build"
 start_timer "$T"
-python ${UTILS}/save-logs.py "heron_build.txt" bazel\
+${UTILS}/save-logs.py "heron_build.txt" bazel\
   --bazelrc=tools/travis/bazel.rc build --config=$PLATFORM heron/... \
   heronpy/... examples/... storm-compatibility-examples/... \
   eco-storm-examples/... eco-heron-examples/... contrib/...
@@ -88,7 +85,7 @@
 # run heron unit tests
 T="heron test non-flaky"
 start_timer "$T"
-python ${UTILS}/save-logs.py "heron_test_non_flaky.txt" bazel\
+${UTILS}/save-logs.py "heron_test_non_flaky.txt" bazel\
   --bazelrc=tools/travis/bazel.rc test\
   --test_summary=detailed --test_output=errors\
   --config=$PLATFORM --test_tag_filters=-flaky heron/... \
@@ -100,7 +97,7 @@
 # which should be fixed. For now, run them serially
 T="heron test flaky"
 start_timer "$T"
-python ${UTILS}/save-logs.py "heron_test_flaky.txt" bazel\
+${UTILS}/save-logs.py "heron_test_flaky.txt" bazel\
   --bazelrc=tools/travis/bazel.rc test\
   --test_summary=detailed --test_output=errors\
   --config=$PLATFORM --test_tag_filters=flaky --jobs=1 heron/... \
@@ -111,21 +108,21 @@
 # build packages
 T="heron build tarpkgs"
 start_timer "$T"
-python ${UTILS}/save-logs.py "heron_build_tarpkgs.txt" bazel\
+${UTILS}/save-logs.py "heron_build_tarpkgs.txt" bazel\
   --bazelrc=tools/travis/bazel.rc build\
   --config=$PLATFORM scripts/packages:tarpkgs
 end_timer "$T"
 
 T="heron build binpkgs"
 start_timer "$T"
-python ${UTILS}/save-logs.py "heron_build_binpkgs.txt" bazel\
+${UTILS}/save-logs.py "heron_build_binpkgs.txt" bazel\
   --bazelrc=tools/travis/bazel.rc build\
   --config=$PLATFORM scripts/packages:binpkgs
 end_timer "$T"
 
 T="heron build docker images"
 start_timer "$T"
-python ${UTILS}/save-logs.py "heron_build_binpkgs.txt" bazel\
+${UTILS}/save-logs.py "heron_build_binpkgs.txt" bazel\
   --bazelrc=tools/travis/bazel.rc build\
   --config=$PLATFORM scripts/images:heron.tar
 end_timer "$T"
diff --git a/scripts/travis/k8s.sh b/scripts/travis/k8s.sh
new file mode 100755
index 0000000..9cb5af7
--- /dev/null
+++ b/scripts/travis/k8s.sh
@@ -0,0 +1,92 @@
+#!/usr/bin/env bash
+:<<'DOC'
+Build a heron docker image, load it into a local kind cluster, and deploy heron there with helm.
+
+Set NO_CACHE=1 to always rebuild images.
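+
+Example invocation (assuming the script is run from the repo root):
+  NO_CACHE=1 scripts/travis/k8s.sh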
+
+DOC
+set -o errexit -o nounset -o pipefail
+TAG=test
+HERE="$(cd "$(dirname "$0")"; pwd -P)"
+ROOT="$(cd "$HERE/../.."; pwd -P)"
+
+function bazel_file {
+    # bazel_file VAR_NAME //some/build:target
+    # this will set VAR_NAME to the path of the build artefact
+    local var="${1:?}"
+    local ref="${2:?}"
+    local path="$(bazel info bazel-genfiles)/$(echo "${ref##//}" | tr ':' '/')"
+    bazel build "$ref"
+    eval "$var=$path"
+}
+
+function kind_images {
+    # list all images in the kind registry
+    docker exec -it kind-control-plane crictl images
+}
+
+function install_helm3 {
+    pushd /tmp
+        curl --location https://get.helm.sh/helm-v3.2.1-linux-amd64.tar.gz --output helm.tar.gz
+        tar --extract --file=helm.tar.gz --strip-components=1 linux-amd64/helm
+        mv helm ~/.local/bin/
+    popd
+}
+
+function action {
+    (
+        tput setaf 4;
+        echo "[$(date --rfc-3339=seconds)] $*";
+        tput sgr0
+    ) > /dev/stderr
+}
+
+
+function create_cluster {
+    # trap "kind delete cluster" EXIT
+    if [ -z "$(kind get clusters)" ]; then
+        action "Creating kind cluster"
+        kind create cluster --config="$0.kind.yaml"
+    fi
+}
+
+function get_image {
+    # cannot use `bazel_file heron_archive //scripts/images:heron.tar` as that is not a distro-specific image
+    local tag="$TAG"
+    local distro="${1:?}"
+    local out
+    local expected="$ROOT/dist/heron-docker-$tag-$distro.tar"
+    if [ -f "$expected" ] && [ -z "${NO_CACHE-}" ]; then
+        action "Using pre-existing heron image"
+        out="$expected"
+    else
+        action "Creating heron image"
+        local gz="$(scripts/release/docker-images build test debian10)"
+        # XXX: must un .gz https://github.com/kubernetes-sigs/kind/issues/1636
+        gzip --decompress "$gz"
+        out="${gz%%.gz}"
+    fi
+    archive="$out"
+}
+
+create_cluster
+
+get_image debian10
+heron_archive="$archive"
+action "Loading heron docker image"
+kind load image-archive "$heron_archive"
+#image_heron="docker.io/bazel/scripts/images:heron"
+#image_heron="$heron_image"
+image_heron="heron/heron:$TAG"
+
+action "Loading bookkeeper image"
+image_bookkeeper="docker.io/apache/bookkeeper:4.7.3"
+docker pull "$image_bookkeeper"
+kind load docker-image "$image_bookkeeper"
+
+action "Deploying heron with helm"
+# install heron in kind using helm
+bazel_file helm_yaml //scripts/packages:index.yaml
+helm install heron "$(dirname "$helm_yaml")/heron-0.0.0.tgz" \
+    --set image="$image_heron" \
+    --set imagePullPolicy=IfNotPresent \
+    --set bookieReplicas=1 \
+    --set zkReplicas=1
diff --git a/scripts/travis/k8s.sh.kind.yaml b/scripts/travis/k8s.sh.kind.yaml
new file mode 100644
index 0000000..f76d19f
--- /dev/null
+++ b/scripts/travis/k8s.sh.kind.yaml
@@ -0,0 +1,4 @@
+kind: Cluster
+apiVersion: kind.x-k8s.io/v1alpha4
+nodes:
+- role: control-plane
diff --git a/scripts/travis/test.sh b/scripts/travis/test.sh
index 142b197..e1915fe 100755
--- a/scripts/travis/test.sh
+++ b/scripts/travis/test.sh
@@ -37,19 +37,19 @@
 # build test related jar
 T="heron build integration_test"
 start_timer "$T"
-python ${UTILS}/save-logs.py "heron_build_integration_test.txt" bazel --bazelrc=tools/travis/bazel.rc build --config=$PLATFORM integration_test/src/...
+${UTILS}/save-logs.py "heron_build_integration_test.txt" bazel --bazelrc=tools/travis/bazel.rc build --config=$PLATFORM integration_test/src/...
 end_timer "$T"
 
 # install heron 
 T="heron install"
 start_timer "$T"
-python ${UTILS}/save-logs.py "heron_install.txt" bazel --bazelrc=tools/travis/bazel.rc run --config=$PLATFORM -- scripts/packages:heron-install.sh --user
+${UTILS}/save-logs.py "heron_install.txt" bazel --bazelrc=tools/travis/bazel.rc run --config=$PLATFORM -- scripts/packages:heron-install.sh --user
 end_timer "$T"
 
 # install tests
 T="heron tests install"
 start_timer "$T"
-python ${UTILS}/save-logs.py "heron_tests_install.txt" bazel --bazelrc=tools/travis/bazel.rc run --config=$PLATFORM -- scripts/packages:heron-tests-install.sh --user
+${UTILS}/save-logs.py "heron_tests_install.txt" bazel --bazelrc=tools/travis/bazel.rc run --config=$PLATFORM -- scripts/packages:heron-tests-install.sh --user
 end_timer "$T"
 
 pathadd ${HOME}/bin/
@@ -57,7 +57,7 @@
 # run local integration test
 T="heron integration_test local"
 start_timer "$T"
-python ./bazel-bin/integration_test/src/python/local_test_runner/local-test-runner
+./bazel-bin/integration_test/src/python/local_test_runner/local-test-runner
 end_timer "$T"
 
 # initialize http-server for integration tests
diff --git a/third_party/python/cpplint/BUILD b/third_party/python/cpplint/BUILD
index b111678..6a67bcc 100644
--- a/third_party/python/cpplint/BUILD
+++ b/third_party/python/cpplint/BUILD
@@ -1,12 +1,9 @@
-load("@rules_python//python:defs.bzl", "py_binary")
-
 licenses(["notice"])
 
 package(default_visibility = ["//visibility:public"])
 
-py_binary(
+pex_binary(
     name = "cpplint",
     srcs = ["cpplint.py"],
     main = "cpplint.py",
-    stamp = 1,
 )
diff --git a/third_party/python/cpplint/cpplint.py b/third_party/python/cpplint/cpplint.py
index c30a3bc..a7be857 100755
--- a/third_party/python/cpplint/cpplint.py
+++ b/third_party/python/cpplint/cpplint.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 #
 # Copyright (c) 2009 Google Inc. All rights reserved.
 #
@@ -794,7 +794,7 @@
   return s in GetNonHeaderExtensions()
 
 
-class _IncludeState(object):
+class _IncludeState:
   """Tracks line numbers for includes, and the order in which includes appear.
 
   include_list contains list of lists of (header, line number) pairs.
@@ -961,7 +961,7 @@
     return ''
 
 
-class _CppLintState(object):
+class _CppLintState:
   """Maintains module-wide state.."""
 
   def __init__(self):
@@ -1183,7 +1183,7 @@
   """ Restores filters previously backed up."""
   _cpplint_state.RestoreFilters()
 
-class _FunctionState(object):
+class _FunctionState:
   """Tracks current function name and the number of lines in its body."""
 
   _NORMAL_TRIGGER = 250  # for --v=0, 500 for --v=1, etc.
@@ -1247,7 +1247,7 @@
   pass
 
 
-class FileInfo(object):
+class FileInfo:
   """Provides utility functions for filenames.
 
   FileInfo provides easy access to the components of a file's path
@@ -1591,7 +1591,7 @@
   return _RE_PATTERN_CLEANSE_LINE_C_COMMENTS.sub('', line)
 
 
-class CleansedLines(object):
+class CleansedLines:
   """Holds 4 copies of all lines with different preprocessing applied to them.
 
   1) elided member contains lines without strings and comments.
@@ -2310,7 +2310,7 @@
   return Match(r'^\s*(\btemplate\b)*.*class\s+\w+;\s*$', clean_lines[linenum])
 
 
-class _BlockInfo(object):
+class _BlockInfo:
   """Stores information about a generic block of code."""
 
   def __init__(self, linenum, seen_open_brace):
@@ -2497,7 +2497,7 @@
                 'Anonymous namespace should be terminated with "// namespace"')
 
 
-class _PreprocessorInfo(object):
+class _PreprocessorInfo:
   """Stores checkpoints of nesting stacks when #if/#else is seen."""
 
   def __init__(self, stack_before_if):
@@ -2511,7 +2511,7 @@
     self.seen_else = False
 
 
-class NestingState(object):
+class NestingState:
   """Holds states related to parsing braces."""
 
   def __init__(self):
diff --git a/third_party/python/pylint/BUILD b/third_party/python/pylint/BUILD
index 40a8b18..040baf6 100644
--- a/third_party/python/pylint/BUILD
+++ b/third_party/python/pylint/BUILD
@@ -4,7 +4,6 @@
 
 pex_binary(
     name = "pylint",
-    srcs = ["main.py"],
-    main = "main.py",
-    reqs = ["pylint==1.5.5"],
+    entrypoint = "pylint",
+    reqs = ["pylint==2.5.0"],
 )
diff --git a/third_party/python/pylint/main.py b/third_party/python/pylint/main.py
deleted file mode 100644
index 7ead471..0000000
--- a/third_party/python/pylint/main.py
+++ /dev/null
@@ -1,21 +0,0 @@
-#  Licensed to the Apache Software Foundation (ASF) under one
-#  or more contributor license agreements.  See the NOTICE file
-#  distributed with this work for additional information
-#  regarding copyright ownership.  The ASF licenses this file
-#  to you under the Apache License, Version 2.0 (the
-#  "License"); you may not use this file except in compliance
-#  with the License.  You may obtain a copy of the License at
-#
-#    http://www.apache.org/licenses/LICENSE-2.0
-#
-#  Unless required by applicable law or agreed to in writing,
-#  software distributed under the License is distributed on an
-#  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-#  KIND, either express or implied.  See the License for the
-#  specific language governing permissions and limitations
-#  under the License.
-
-import pylint
-
-if __name__ == '__main__':
-    pylint.run_pylint()
diff --git a/tools/bazel.rc b/tools/bazel.rc
index 0e35097..6c62547 100644
--- a/tools/bazel.rc
+++ b/tools/bazel.rc
@@ -17,7 +17,7 @@
 
 
 build --genrule_strategy=standalone
-build --host_force_python=PY2
+build --host_force_python=PY3
 build --ignore_unsupported_sandboxing
 build --spawn_strategy=standalone
 build --workspace_status_command scripts/release/status.sh
diff --git a/tools/docker/bazel.rc b/tools/docker/bazel.rc
index 26d0445..05a67d7 100644
--- a/tools/docker/bazel.rc
+++ b/tools/docker/bazel.rc
@@ -17,6 +17,7 @@
 
 # This is so we understand failures better
 build --verbose_failures
+build --host_force_python=PY3
 
 # This is so we don't use sandboxed execution. Sandboxed execution
 # runs stuff in a container, and since Travis already runs its script
@@ -32,4 +33,4 @@
 build --local_cpu_resources=2
 
 # Echo all the configuration settings and their source
- build --announce_rc
\ No newline at end of file
+ build --announce_rc
diff --git a/tools/rules/genproto.bzl b/tools/rules/genproto.bzl
index 6457415..9e4d3fc 100644
--- a/tools/rules/genproto.bzl
+++ b/tools/rules/genproto.bzl
@@ -165,6 +165,11 @@
             proto_cmd = "$(location %s) --python_out=$(@D) %s" % (protoc, proto_path)
         else:
             proto_cmd = "$(location %s) %s --python_out=$(@D) %s" % (protoc, proto_include_paths, proto_path)
+        # Hack to work around not having import_prefix from the official proto rules (needed to sort out imports)
+        # and not having https://github.com/protocolbuffers/protobuf/pull/7470 in protoc.
+        # Rewrites e.g. `import common_pb2 as common__pb2` to `from . import common_pb2 as common__pb2`.
+        proto_cmd += "\nfind $(@D) -ignore_readdir_race -type f -name '*_pb2.py' -exec sed -i.bak -E 's/^(import )([^ .]+_pb2)/from . import \\2/' {} \\;"
+
         py_deps = []
         proto_deps = [src, protoc]
         for dep in deps:
diff --git a/tools/rules/pex/BUILD b/tools/rules/pex/BUILD
index e9d8463..26dca58 100644
--- a/tools/rules/pex/BUILD
+++ b/tools/rules/pex/BUILD
@@ -26,23 +26,27 @@
     'ln -sf "$$OUTDIR" "$$TMPF"',
     'VENV="$${TMPF}/venv"',
     '$(location @virtualenv//:virtualenv) --no-download --quiet --clear "$$VENV"',
-    'PYTHON="$$VENV/bin/python"',
+    '# this is just to keep the activate script happy',
+    'PS1=',
+    'source "$$VENV/bin/activate"',
 
-    '$$VENV/bin/pip install pex \
+    'pip install pex \
             --quiet --no-cache-dir --no-index --build $(@D)/pexbuild \
             --find-links $$(dirname $(location @pex_src//file)) \
             --find-links $$(dirname $(location @wheel_src//file)) \
-            --find-links $$(dirname $(location @setuptools_src//file))',
+            --find-links $$(dirname $(location @setuptools_wheel//file))',
 
     '# Work around setuptools insistance on writing to the source directory,',
     '# which is discouraged by Bazel (and annoying)',
     'cp -r $$(dirname $(location wrapper/setup.py)) $(@D)/.pex_wrapper',
 
     '# Use the bootstrapped pex to build pex_wrapper.pex',
-    '$$VENV/bin/pex $(@D)/.pex_wrapper \
-            --disable-cache --no-index -m pex_wrapper -o $@ \
+    'pex $(@D)/.pex_wrapper \
+            --disable-cache --no-index \
+            --entry-point=pex_wrapper \
+            --output-file=$@ \
             --find-links $$(dirname $(location @pex_src//file)) \
-            --find-links $$(dirname $(location @setuptools_src//file)) \
+            --find-links $$(dirname $(location @setuptools_wheel//file)) \
             --find-links $$(dirname $(location @requests_src//file)) \
             --find-links $$(dirname $(location @wheel_src//file))',
    ]
@@ -53,7 +57,7 @@
         "wrapper/setup.py",
         "wrapper/pex_wrapper.py",
         "wrapper/README",
-        "@setuptools_src//file",
+        "@setuptools_wheel//file",
         "@wheel_src//file",
         "@pex_src//file",
         "@requests_src//file",
diff --git a/tools/rules/pex/pex_rules.bzl b/tools/rules/pex/pex_rules.bzl
index 2dbe333..0ad7f19 100644
--- a/tools/rules/pex/pex_rules.bzl
+++ b/tools/rules/pex/pex_rules.bzl
@@ -195,6 +195,8 @@
         "--output-file",
         deploy_pex.path,
         "--disable-cache",
+        "--python-shebang", "#!/usr/bin/env python3",
+        "--no-compile",
         manifest_file.path,
     ]
     #EXTRA_PEX_ARGS#
@@ -321,7 +323,7 @@
     "pex_verbosity": attr.int(default = 0),
     "resources": attr.label_list(allow_files = True),
     "zip_safe": attr.bool(
-        default = True,
+        default = False,
         mandatory = False,
     ),
 })
diff --git a/tools/rules/pex/wrapper/pex_wrapper.py b/tools/rules/pex/wrapper/pex_wrapper.py
index edd9ca8..6454f75 100644
--- a/tools/rules/pex/wrapper/pex_wrapper.py
+++ b/tools/rules/pex/wrapper/pex_wrapper.py
@@ -1,4 +1,4 @@
-#!/usr/bin/env python2.7
+#!/usr/bin/env python3
 # Copyright 2014 Google Inc. All rights reserved.
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -52,29 +52,19 @@
 
 
 def main():
-    pparser, resolver_options_builder = pexbin.configure_clp()
+    pparser = pexbin.configure_clp()
     poptions, args = pparser.parse_args(sys.argv)
 
     manifest_file = args[1]
     manifest_text = open(manifest_file, 'r').read()
     manifest = parse_manifest(manifest_text)
 
-    if poptions.pex_root:
-        ENV.set('PEX_ROOT', poptions.pex_root)
-    else:
-        poptions.pex_root = ENV.PEX_ROOT
-
-    if poptions.cache_dir:
-        poptions.cache_dir = pexbin.make_relative_to_root(poptions.cache_dir)
-    poptions.interpreter_cache_dir = pexbin.make_relative_to_root(
-        poptions.interpreter_cache_dir)
-
     reqs = manifest.get('requirements', [])
 
-    with ENV.patch(PEX_VERBOSE=str(poptions.verbosity)):
+    with ENV.patch(PEX_VERBOSE=str(poptions.verbosity),
+                   PEX_ROOT=poptions.pex_root or ENV.PEX_ROOT):
         with TRACER.timed('Building pex'):
-            pex_builder = pexbin.build_pex(reqs, poptions,
-                                           resolver_options_builder)
+            pex_builder = pexbin.build_pex(reqs, poptions)
 
         # Add source files from the manifest
         for modmap in manifest.get('modules', []):
@@ -112,7 +102,7 @@
         # TODO(mikekap): Do something about manifest['nativeLibraries'].
 
         pexbin.log('Saving PEX file to %s' % poptions.pex_name,
-                   v=poptions.verbosity)
+                   V=poptions.verbosity)
         tmp_name = poptions.pex_name + '~'
         safe_delete(tmp_name)
         pex_builder.build(tmp_name)
diff --git a/tools/rules/pex/wrapper/setup.py b/tools/rules/pex/wrapper/setup.py
index 6324bef..b8bd36a 100644
--- a/tools/rules/pex/wrapper/setup.py
+++ b/tools/rules/pex/wrapper/setup.py
@@ -27,6 +27,7 @@
     version="0.1",
     install_requires=[
         "pex",
+        "setuptools",
         "wheel",
         # Not strictly required, but requests makes SSL more likely to work
         "requests",
diff --git a/tools/rules/proto.bzl b/tools/rules/proto.bzl
deleted file mode 100644
index 11308bd..0000000
--- a/tools/rules/proto.bzl
+++ /dev/null
@@ -1,183 +0,0 @@
-#  Licensed to the Apache Software Foundation (ASF) under one
-#  or more contributor license agreements.  See the NOTICE file
-#  distributed with this work for additional information
-#  regarding copyright ownership.  The ASF licenses this file
-#  to you under the Apache License, Version 2.0 (the
-#  "License"); you may not use this file except in compliance
-#  with the License.  You may obtain a copy of the License at
-#
-#    http://www.apache.org/licenses/LICENSE-2.0
-#
-#  Unless required by applicable law or agreed to in writing,
-#  software distributed under the License is distributed on an
-#  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-#  KIND, either express or implied.  See the License for the
-#  specific language governing permissions and limitations
-#  under the License.
-
-load("@rules_java//java:defs.bzl", "java_library")
-load("@rules_cc//cc:defs.bzl", "cc_library")
-load("pex_rules", "pex_library")
-
-def proto_package_impl(ctx):
-    return struct(proto_src = ctx.file.src)
-
-genproto_base_attrs = {
-    "src": attr.label(
-        allow_files = [".proto"],
-        allow_single_file = True,
-    ),
-    "deps": attr.label_list(
-        allow_files = False,
-        providers = ["proto_src"],
-    ),
-}
-
-proto_package = rule(
-    proto_package_impl,
-    attrs = genproto_base_attrs,
-)
-
-def genproto_java_impl(ctx):
-    src = ctx.file.src
-    protoc = ctx.file._protoc
-
-    srcjar = ctx.actions.declare_file(ctx.configuration.genfiles_dir, ctx.label.name + ".srcjar")
-    java_srcs = srcjar.path + ".srcs"
-
-    inputs = [src, protoc]
-    java_cmd = "\n".join([
-        "set -e",
-        "rm -rf " + java_srcs,
-        "mkdir " + java_srcs,
-        protoc.path + " -I heron/proto --java_out=" + java_srcs + " " + src.path,
-        "jar cMf " + srcjar.path + " -C " + java_srcs + " .",
-        "rm -rf " + java_srcs,
-    ])
-    ctx.actions.run(
-        inputs = inputs,
-        outputs = [srcjar],
-        mnemonic = "ProtocJava",
-        command = java_cmd,
-        use_default_shell_env = True,
-    )
-
-    return struct(files = set([srcjar]))
-
-genproto_java = rule(
-    genproto_java_impl,
-    attrs = genproto_base_attrs.update({
-        "_protoc": attr.label(
-            default = Label("//third_party/protobuf:protoc"),
-            allow_files = True,
-            allow_single_file = True,
-        ),
-    }),
-)
-
-def proto_library(
-        name,
-        src = None,
-        includes = [],
-        deps = [],
-        visibility = None,
-        gen_java = False,
-        gen_cc = False,
-        gen_py = False):
-    if not src:
-        if name.endswith("_proto"):
-            src = name[:-6] + ".proto"
-        else:
-            src = name + ".proto"
-    proto_package(name = name, src = src, deps = deps)
-
-    if gen_java:
-        genproto_java(
-            name = name + "_java_src",
-            src = src,
-            deps = deps,
-            visibility = ["//visibility:private"],
-        )
-        java_deps = ["@com_google_protobuf//:protobuf_java"]
-        for dep in deps:
-            java_deps.append(dep + "_java")
-        java_library(
-            name = name + "_java",
-            srcs = [name + "_java_src"],
-            deps = java_deps,
-            visibility = visibility,
-        )
-
-    if not includes:
-        proto_include_paths = ""
-    else:
-        proto_include_paths = "".join(["-I " + incl for incl in includes])
-
-    if gen_cc:
-        # We'll guess that the repository is set up such that a .proto in
-        # //foo/bar has the package foo.bar. `location` is substituted with the
-        # relative path to its label from the workspace root.
-        proto_path = "$(location %s)" % src
-        proto_hdr = src[:-6] + ".pb.h"
-        proto_src = src[:-6] + ".pb.cc"
-        proto_srcgen_rule = name + "_cc_src"
-        proto_lib = name + "_cc"
-        protoc = "//third_party/protobuf:protoc"
-        if not includes:
-            proto_cmd = "$(location %s) --cpp_out=$(@D) %s" % (protoc, proto_path)
-        else:
-            proto_cmd = "$(location %s) %s --cpp_out=$(@D) %s" % (protoc, proto_include_paths, proto_path)
-
-        cc_deps = ["//third_party/protobuf:protobuf-cxx"]
-        proto_deps = [src, protoc]
-        for dep in deps:
-            cc_deps.append(dep + "_cc")
-            proto_deps.append(dep)
-        native.genrule(
-            name = proto_srcgen_rule,
-            visibility = visibility,
-            outs = [proto_hdr, proto_src],
-            srcs = proto_deps,
-            cmd = proto_cmd,
-        )
-        cc_library(
-            name = proto_lib,
-            visibility = visibility,
-            hdrs = [proto_hdr],
-            srcs = [":" + proto_srcgen_rule],
-            defines = ["GOOGLE_PROTOBUF_NO_RTTI"],
-            deps = cc_deps,
-            linkstatic = 1,
-        )
-
-    if gen_py:
-        # We'll guess that the repository is set up such that a .proto in
-        # //foo/bar has the package foo.bar. `location` is substituted with the
-        # relative path to its label from the workspace root.
-        proto_path = "$(location %s)" % src
-        proto_src = src[:-6] + "_pb2.py"
-        proto_srcgen_rule = name + "_py_src"
-        proto_lib = name + "_py"
-        protoc = "//third_party/protobuf:protoc"
-        if not includes:
-            proto_cmd = "$(location %s) --python_out=$(@D) %s" % (protoc, proto_path)
-        else:
-            proto_cmd = "$(location %s) %s --python_out=$(@D) %s" % (protoc, proto_include_paths, proto_path)
-        py_deps = []
-        proto_deps = [src, protoc]
-        for dep in deps:
-            py_deps.append(dep + "_py")
-            proto_deps.append(dep)
-        native.genrule(
-            name = proto_srcgen_rule,
-            visibility = visibility,
-            outs = [proto_src],
-            srcs = proto_deps,
-            cmd = proto_cmd,
-        )
-        pex_library(
-            name = proto_lib,
-            visibility = visibility,
-            srcs = [proto_src],
-            deps = py_deps,
-        )
diff --git a/tools/travis/bazel.rc b/tools/travis/bazel.rc
index a96e92f..518f2d1 100644
--- a/tools/travis/bazel.rc
+++ b/tools/travis/bazel.rc
@@ -15,6 +15,8 @@
 #  specific language governing permissions and limitations
 #  under the License.
 
+build --host_force_python=PY3
+
 # This is from Bazel's former travis setup, to avoid blowing up the RAM usage.
 startup --host_jvm_args=-Xmx2500m
 startup --host_jvm_args=-Xms2500m
diff --git a/vagrant/README.md b/vagrant/README.md
index 08a6c48..662362e 100644
--- a/vagrant/README.md
+++ b/vagrant/README.md
@@ -16,6 +16,8 @@
     specific language governing permissions and limitations
     under the License.
 -->
-Vagrant VM to build and run Heron
+Vagrant VM for CI and debugging
 =================================
-vagrant up
\ No newline at end of file
+Running `vagrant up master` will bring up an environment similar to the one used by Travis for CI. If the build fails, it can be inspected by entering the machine with `vagrant ssh master`. When you're done with the VM, you can clean up with `vagrant destroy -f`.
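+
+For example:
+
+```bash
+vagrant up master      # bring up the CI-like VM
+vagrant ssh master     # inspect the environment or a failed build
+vagrant destroy -f     # clean up when finished
+```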
+
+The advantage of this is that you don't need to worry about polluting your local environment, and others can reproduce the results on other platforms.
diff --git a/vagrant/Vagrantfile b/vagrant/Vagrantfile
index 7161cf7..ad39ee4 100644
--- a/vagrant/Vagrantfile
+++ b/vagrant/Vagrantfile
@@ -15,7 +15,7 @@
 # -*- mode: ruby -*-
 # vi: set ft=ruby :
 
-SLAVES=1
+SLAVES=0
 NET_PREFIX="192.168.25."
 
 NODES={"master" => NET_PREFIX + "5"}
@@ -29,12 +29,33 @@
 end
 
 Vagrant.configure(2) do |config|
-  config.vm.box = "ubuntu/trusty64"
+  # config.vm.box = "ubuntu/focal64"
+  config.vm.box = "bento/ubuntu-20.04"
   config.vm.synced_folder "../", "/vagrant"
+  config.vm.boot_timeout = 600
 
   config.vm.define "master" do |master|
     master.vm.provider "virtualbox" do |v|
-      v.memory = 4096
+      host = RbConfig::CONFIG['host_os']
+      mem_ratio = 1.0/2
+      cpu_exec_cap = 75
+      # Give VM 1/2 system memory & access to all cpu cores on the host
+      if host =~ /darwin/
+        cpus = `sysctl -n hw.ncpu`.to_i
+        # sysctl returns Bytes and we need to convert to MB
+        mem = `sysctl -n hw.memsize`.to_f / 1024**2 * mem_ratio
+      elsif host =~ /linux/
+        cpus = `nproc`.to_i
+        # meminfo shows KB and we need to convert to MB
+        mem = `grep 'MemTotal' /proc/meminfo | sed -E -e 's/MemTotal:\\s+//' -e 's/ kB//'`.to_i / 1024 * mem_ratio
+      else # Windows folks
+        cpus = `wmic cpu get NumberOfCores`.split("\n")[2].to_i
+        mem = `wmic OS get TotalVisibleMemorySize`.split("\n")[2].to_i / 1024 * mem_ratio
+      end
+      mem = mem.to_i
+      v.customize ["modifyvm", :id, "--cpuexecutioncap", cpu_exec_cap]
+      v.memory = mem
+      v.cpus = cpus
     end
 
     master.vm.hostname = "master"
diff --git a/vagrant/init.sh b/vagrant/init.sh
index c5186b2..5f4f092 100644
--- a/vagrant/init.sh
+++ b/vagrant/init.sh
@@ -1,4 +1,5 @@
-#!/bin/bash -ex
+#!/bin/bash
+set -o errexit -o nounset -o pipefail
 
 # Licensed to the Apache Software Foundation (ASF) under one or more
 # contributor license agreements.  See the NOTICE file distributed with
@@ -25,7 +26,7 @@
     ip=$(cat /etc/hosts | grep `hostname` | grep -E -o "([0-9]{1,3}[\.]){3}[0-9]{1,3}")
     echo $ip > "/etc/mesos-$mode/ip"
 
-    if [ $mode == "master" ]; then
+    if [ "$mode" == "master" ]; then
         ln -s /lib/init/upstart-job /etc/init.d/mesos-master
         service mesos-master start
     else
@@ -33,6 +34,7 @@
     fi
 
     ln -s /lib/init/upstart-job /etc/init.d/mesos-slave
+    echo 'docker,mesos' > /etc/mesos-slave/containerizers
     service mesos-slave start
 }
 
@@ -41,69 +43,40 @@
     service marathon start
 }
 
-install_docker() {
-    apt-get install -qy lxc-docker
-    echo 'docker,mesos' > /etc/mesos-slave/containerizers
-    service mesos-slave restart
-}
-
-install_jdk8() {
-    apt-get install -y software-properties-common python-software-properties
-    add-apt-repository -y ppa:webupd8team/java
-    apt-get -y update
-    /bin/echo debconf shared/accepted-oracle-license-v1-1 select true | /usr/bin/debconf-set-selections
-    apt-get -y install oracle-java8-installer oracle-java8-set-default vim wget screen git    
-}
-
 bazelVersion=3.0.0
 bazel_install() {
-    install_jdk8
-    apt-get install -y g++ automake cmake gcc-4.8 g++-4.8 zlib1g-dev zip pkg-config wget libssl-dev
+    apt-get install -y automake cmake gcc g++ zlib1g-dev zip pkg-config wget libssl-dev libunwind-dev
     mkdir -p /opt/bazel
     pushd /opt/bazel
-        pushd /tmp
-            wget http://download.savannah.gnu.org/releases/libunwind/libunwind-1.1.tar.gz
-            tar xvfz libunwind-1.1.tar.gz
-            cd libunwind-1.1 && ./configure --prefix=/usr && make install 
-        popd
         wget -O /tmp/bazel.sh https://github.com/bazelbuild/bazel/releases/download/${bazelVersion}/bazel-${bazelVersion}-installer-linux-x86_64.sh
         chmod +x /tmp/bazel.sh
-        /tmp/bazel.sh --user 
+        /tmp/bazel.sh
     popd
 }
 
 build_heron() {
-    pushd /tmp
-        wget http://ftpmirror.gnu.org/libtool/libtool-2.4.6.tar.gz
-        tar xf libtool*
-        cd libtool-2.4.6
-        sh configure --prefix /usr/local
-        make install 
-    popd
     pushd /vagrant
-        export CC=gcc-4.8
-        export CXX=g++-4.8
-        export PATH=/sbin:$PATH
-        ~/bin/bazel clean
+        bazel clean
         ./bazel_configure.py
-        ~/bin/bazel --bazelrc=tools/travis/bazel.rc build --config=ubuntu heron/...
+        bazel --bazelrc=tools/travis/bazel.rc build --config=ubuntu heron/...
     popd
 }
 
-if [[ $1 != "master" && $1 != "slave" ]]; then
+if [[ "$1" != "master" && $1 != "slave" ]]; then
     echo "Usage: $0 master|slave"
     exit 1
 fi
-mode=$1
+mode="$1"
 
 cd /vagrant/vagrant
 
 # name resolution
 cp .vagrant/hosts /etc/hosts
 
+# XXX: not needed?
 # ssh key
 key=".vagrant/ssh_key.pub"
-if [ -f $key ]; then
+if [ -f "$key" ]; then
     cat $key >> /home/vagrant/.ssh/authorized_keys
 fi
 
@@ -118,27 +91,23 @@
     echo "Acquire::http::Proxy \"$apt_proxy\";" > /etc/apt/apt.conf.d/90-apt-proxy.conf
 fi
 
+:<<'REMOVED'
 # add mesosphere repo
 apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv E56151BF
 DISTRO=$(lsb_release -is | tr '[:upper:]' '[:lower:]')
 CODENAME=$(lsb_release -cs)
-echo "deb http://repos.mesosphere.io/${DISTRO} ${CODENAME} main" | tee /etc/apt/sources.list.d/mesosphere.list
-
-# add docker repo
-apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys 36A1D7869245C8950F966E92D8576A8BA88D21E9
-echo "deb http://get.docker.com/ubuntu docker main" > /etc/apt/sources.list.d/docker.list
+echo "deb http://repos.mesosphere.io/${DISTRO} cosmic main" | tee /etc/apt/sources.list.d/mesosphere.list
+REMOVED
 
 apt-get -qy update
 
 # install deps
-apt-get install -qy vim zip mc curl wget openjdk-7-jre scala git python-setuptools python-dev
+apt-get install -qy vim zip mc curl wget openjdk-11-jdk scala git python3-setuptools python3-dev libtool-bin libcppunit-dev python-is-python3
 
-install_mesos $mode
+# install_mesos $mode
 if [ $mode == "master" ]; then 
-    install_marathon
+    # install_marathon
     bazel_install
-    build_heron
+    # switch to non-root so bazel cache can be reused when SSHing in
+    # su --login vagrant /vagrant/scripts/travis/ci.sh
 fi
-
-install_docker
-
diff --git a/vagrant/local-ci.sh b/vagrant/local-ci.sh
new file mode 100755
index 0000000..63617bd
--- /dev/null
+++ b/vagrant/local-ci.sh
@@ -0,0 +1,31 @@
+#!/usr/bin/env bash
+:<<'DOC'
+This script runs tests in a local VM, similar to the environment used in the CI pipeline. If the target script fails, a shell is opened within the VM for debugging.
+
+To only run integration tests:
+  ./local-ci.sh test
+
+To run the full ci pipeline:
+  ./local-ci.sh ci
+
+The VM does not report its platform via python as expected, so PLATFORM=Ubuntu is needed to work around the CI script's platform discovery.
+
+DOC
+
+set -o errexit -o nounset -o pipefail
+HERE="$(cd "$(dirname "$0")" && pwd -P)"
+
+cd "$HERE"
+
+state="$(vagrant status master --machine-readable | grep master,state, | cut -d, -f4)"
+if [ "$state" != "running" ]; then
+    vagrant up master
+fi
+
+
+# allows you to do `$0 test` to run only integration tests
+script="${1-ci}"
+env="PLATFORM=Ubuntu JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64/"
+# run the CI, if it fails drop into a shell
+vagrant ssh master --command "cd /vagrant && $env ./scripts/travis/$script.sh" \
+    || vagrant ssh master --command "cd /vagrant && $env exec bash"
diff --git a/website2/docs/schedulers-nomad.md b/website2/docs/schedulers-nomad.md
index 3c411c6..94025a4 100644
--- a/website2/docs/schedulers-nomad.md
+++ b/website2/docs/schedulers-nomad.md
@@ -39,7 +39,7 @@
 When setting up your Nomad cluster, the following are required:
 
 * The [Heron CLI tool](user-manuals-heron-cli) must be installed on each machine used to deploy Heron topologies
-* Python 2.7, Java 7 or 8, and [curl](https://curl.haxx.se/) must be installed on every machine in the cluster
+* Python 3, Java 7 or 8, and [curl](https://curl.haxx.se/) must be installed on every machine in the cluster
 * A [ZooKeeper cluster](https://zookeeper.apache.org)
 
 ## Configuring Heron settings
diff --git a/website2/website/versioned_docs/version-0.20.0-incubating/compiling-docker.md b/website2/website/versioned_docs/version-0.20.0-incubating/compiling-docker.md
index dc71448..a3eb0b6 100644
--- a/website2/website/versioned_docs/version-0.20.0-incubating/compiling-docker.md
+++ b/website2/website/versioned_docs/version-0.20.0-incubating/compiling-docker.md
@@ -205,7 +205,7 @@
          libunwind8 \
          libunwind-setjmp0-dev \
          python \
-         python2.7-dev \
+         python3-dev \
          python-software-properties \
          software-properties-common \
          python-setuptools \