[LIVY-735][RSC] Fix rpc channel closed when multi clients connect to one driver

## What changes were proposed in this pull request?

Currently, the driver tries to support communication with multiple clients by registering each client at https://github.com/apache/incubator-livy/blob/master/rsc/src/main/java/org/apache/livy/rsc/driver/RSCDriver.java#L220.

However, if multiple clients actually connect to one driver, the rpc channel is closed. The reasons are as follows.

1. In every call, the client sends two messages to the driver: a header `{type, id}` and a payload, at https://github.com/apache/incubator-livy/blob/master/rsc/src/main/java/org/apache/livy/rsc/rpc/RpcDispatcher.java#L144.

2. If client1 sends header1, payload1 and client2 sends header2, payload2 at the same time, the driver may receive the messages in the order header1, header2, payload1, payload2.

3. When the driver receives header1, it stores it in `lastHeader` at https://github.com/apache/incubator-livy/blob/master/rsc/src/main/java/org/apache/livy/rsc/rpc/RpcDispatcher.java#L73.

4. When the driver then receives header2, it processes it as a payload at https://github.com/apache/incubator-livy/blob/master/rsc/src/main/java/org/apache/livy/rsc/rpc/RpcDispatcher.java#L78, which causes an exception and closes the rpc channel (see the sketch below).
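The failure mode can be seen from the shape of the original handler. The sketch below is a simplified, hypothetical version (class and method names are illustrative, not the exact `RpcDispatcher` code): a single handler shared by all channels remembers the last header it saw and assumes the matching payload is the very next message, whichever channel it arrives on.

```java
// Simplified, hypothetical sketch of the single-dispatcher pattern described
// above; it is not the actual RpcDispatcher implementation.
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;

public class SharedDispatcherSketch extends ChannelInboundHandlerAdapter {

  // One field shared by every connected client.
  private Object lastHeader;

  @Override
  public void channelRead(ChannelHandlerContext ctx, Object msg) {
    if (lastHeader == null) {
      // First message of a call: remember the header.
      lastHeader = msg;
    } else {
      // Second message: assumed to be the payload for lastHeader. With two
      // clients interleaving (header1, header2, payload1, payload2), header2
      // lands here, is treated as a payload, and the resulting exception
      // closes the rpc channel.
      handleCall(ctx, lastHeader, msg);
      lastHeader = null;
    }
  }

  private void handleCall(ChannelHandlerContext ctx, Object header, Object payload) {
    // Look up and invoke the handler for this message type...
  }
}
```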

In the multi-active HA mode (design doc: https://docs.google.com/document/d/1bD3qYZpw14_NuCcSGUOfqQ0pqvSbCQsOLFuZp26Ohjc/edit?usp=sharing), sessions are allocated among servers by consistent hashing. When a new Livy server joins, some sessions are migrated from an old server to the new one. If the session client on the new server connects to the driver before the session client on the old server has been stopped, both session clients are connected to the driver at the same time and the rpc channel is closed. In this scenario it is hard to guarantee that only one client is connected to a driver at any given time, so it is better to support multiple clients connecting to one driver, which has no side effects.

How to fix:
1. Move the code that processes client messages from `RpcDispatcher` into each `Rpc`.
2. Each `Rpc` registers itself in `channelRpc` in `RpcDispatcher`.
3. `RpcDispatcher` dispatches each message to the corresponding `Rpc` according to `ctx.channel()` (see the sketch after this list).
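A minimal sketch of that per-channel dispatch, assuming Netty's `ChannelInboundHandlerAdapter` and illustrative names (`DispatcherSketch`, `RpcSketch`, `handleMessage` are not the exact patch); only the idea of keying per-client state on `ctx.channel()` through `channelRpc` mirrors the change described above:

```java
import io.netty.channel.Channel;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

public class DispatcherSketch extends ChannelInboundHandlerAdapter {

  // Each client's Rpc registers itself here, keyed by its channel.
  private final ConcurrentMap<Channel, RpcSketch> channelRpc = new ConcurrentHashMap<>();

  public void registerRpc(Channel channel, RpcSketch rpc) {
    channelRpc.put(channel, rpc);
  }

  public void unregisterRpc(Channel channel) {
    channelRpc.remove(channel);
  }

  @Override
  public void channelRead(ChannelHandlerContext ctx, Object msg) {
    // Route the message to the Rpc that owns this channel, so each client's
    // header/payload pairing is tracked independently.
    RpcSketch rpc = channelRpc.get(ctx.channel());
    if (rpc != null) {
      rpc.handleMessage(msg);
    }
  }

  /** Per-client message processing; the header/payload state lives here now. */
  public static class RpcSketch {
    private Object lastHeader;

    void handleMessage(Object msg) {
      if (lastHeader == null) {
        lastHeader = msg;             // first half of a call: the header
      } else {
        handleCall(lastHeader, msg);  // second half: the payload
        lastHeader = null;
      }
    }

    private void handleCall(Object header, Object payload) {
      // Dispatch to this client's registered handler...
    }
  }
}
```

With the pairing state moved into each `Rpc`, interleaved messages from different channels can no longer corrupt each other's header/payload sequence.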

## How was this patch tested?

Existing unit tests and integration tests.

Author: runzhiwang <runzhiwang@tencent.com>

Closes #268 from runzhiwang/multi-client-one-driver.
README.md

# Apache Livy


Apache Livy is an open source REST interface for interacting with Apache Spark from anywhere. It supports executing snippets of code or programs in a Spark context that runs locally or in Apache Hadoop YARN.

  • Interactive Scala, Python and R shells
  • Batch submissions in Scala, Java, Python
  • Multiple users can share the same server (impersonation support)
  • Can be used for submitting jobs from anywhere with REST
  • Does not require any code change to your programs

Pull requests are welcomed! But before you begin, please check out the Contributing section on the Community page of our website.

## Online Documentation

Guides and documentation on getting started using Livy, example code snippets, and Livy API documentation can be found at livy.incubator.apache.org.

## Before Building Livy

To build Livy, you will need:

Debian/Ubuntu:

  • mvn (from maven package or maven3 tarball)
  • openjdk-8-jdk (or Oracle JDK 8)
  • Python 2.7+
  • R 3.x

Redhat/CentOS:

  • mvn (from maven package or maven3 tarball)
  • java-1.8.0-openjdk (or Oracle JDK 8)
  • Python 2.7+
  • R 3.x

MacOS:

  • Xcode command line tools
  • Oracle's JDK 1.8
  • Maven (Homebrew)
  • Python 2.7+
  • R 3.x

Required python packages for building Livy:

  • cloudpickle
  • requests
  • requests-kerberos
  • flake8
  • flaky
  • pytest

To run Livy, you will also need a Spark installation. You can get Spark releases at https://spark.apache.org/downloads.html.

Livy requires Spark 2.2+. You can switch to a different version of Spark by setting the SPARK_HOME environment variable in the Livy server process, without needing to rebuild Livy.

## Building Livy

Livy is built using Apache Maven. To check out and build Livy, run:

    git clone https://github.com/apache/incubator-livy.git
    cd incubator-livy
    mvn package

By default Livy is built against Apache Spark 2.2.0, but the version of Spark used when running Livy does not need to match the version used to build Livy. Livy internally handles the differences between different Spark versions.

The Livy package itself does not contain a Spark distribution. It will work with any supported version of Spark without needing to rebuild.