| commit | ba12b51ec987e76b6b67e20f60b4326e203b03f6 | [log] [tgz] |
|---|---|---|
| author | runzhiwang <runzhiwang@tencent.com> | Mon Oct 21 16:16:18 2019 +0800 |
| committer | jerryshao <jerryshao@tencent.com> | Mon Oct 21 16:16:18 2019 +0800 |
| tree | b109dbe436fb420be897fc47080b6452db16b5c5 | |
| parent | 85837e381bd9899a2680cfbb30c4f76f3a113c69 | [diff] |
[LIVY-697] RSC client cannot resolve the hostname of the driver in yarn-cluster mode

## What changes were proposed in this pull request?

1. The driver's /etc/hosts contains:
   127.0.0.1 localhost
   10.10.10.10 test_hostname
2. The driver's /etc/hostname contains:
   test_hostname
3. With this configuration, the findLocalAddress method in Livy returns test_hostname instead of 10.10.10.10: test_hostname resolves to 10.10.10.10, which does not pass the address.isLoopbackAddress() check (it is not a loopback address), so findLocalAddress returns test_hostname.
4. The RSC client cannot resolve test_hostname, so it cannot connect to the driver.
5. findLocalAddress would return 10.10.10.10 as expected if the driver's /etc/hosts instead read as follows, but that setup is not correct in our environment:
   127.0.0.1 localhost
   127.0.0.1 test_hostname
6. findLocalAddress could be modified to return 10.10.10.10 directly, but that might cause errors on machines with multiple network cards. Instead, the RSC client now gets the driver IP from the connection itself (a simplified sketch follows this message).

## How was this patch tested?

1. The driver's /etc/hosts contains:
   127.0.0.1 localhost
   127.0.0.1 test_hostname
2. The driver's /etc/hostname contains:
   test_hostname
3. The RSC client obtains the driver IP from the connection and connects to the driver successfully.

Author: runzhiwang <runzhiwang@tencent.com>

Closes #246 from runzhiwang/hostname-2-ip.
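For illustration, here is a minimal, hypothetical Java sketch (not Livy's actual RSC code) of the idea in item 6: instead of trusting a hostname the peer reports about itself, the receiving side reads the peer's numeric IP directly off the accepted connection. The class name and the use of an ephemeral port are placeholders.

```java
import java.net.InetSocketAddress;
import java.net.ServerSocket;
import java.net.Socket;

// Hypothetical sketch, not Livy's actual RSC code: take the peer's numeric IP
// from an accepted connection instead of a self-reported hostname that the
// other side may not be able to resolve.
public class PeerAddressSketch {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(0)) {   // any free port
            System.out.println("Listening on port " + server.getLocalPort());
            try (Socket conn = server.accept()) {           // blocks until a peer connects
                InetSocketAddress remote =
                        (InetSocketAddress) conn.getRemoteSocketAddress();
                // The address actually seen on the wire (e.g. 10.10.10.10),
                // independent of the peer's /etc/hosts or /etc/hostname.
                System.out.println("Peer IP from connection: "
                        + remote.getAddress().getHostAddress());
            }
        }
    }
}
```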
Apache Livy is an open source REST interface for interacting with Apache Spark from anywhere. It supports executing snippets of code or programs in a Spark context that runs locally or in Apache Hadoop YARN.
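To give a concrete feel for that REST interface, the hedged sketch below starts an interactive session using only JDK HTTP calls. The `POST /sessions` endpoint and the `kind` field follow the Livy REST API documentation; the host, the default port 8998, and the omission of error handling are simplifying assumptions.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

// Hedged example: ask a Livy server (assumed at localhost:8998) to start a
// Spark session via its documented POST /sessions endpoint.
public class CreateLivySession {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://localhost:8998/sessions");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);

        byte[] body = "{\"kind\": \"spark\"}".getBytes(StandardCharsets.UTF_8);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body);
        }

        // The JSON response describes the new session (id, state, and so on);
        // code snippets can then be submitted to /sessions/{id}/statements.
        try (Scanner in = new Scanner(conn.getInputStream(), "UTF-8")) {
            System.out.println(in.useDelimiter("\\A").next());
        }
    }
}
```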
Pull requests are welcome! But before you begin, please check out the Contributing section on the Community page of our website.
Guides and documentation on getting started using Livy, example code snippets, and Livy API documentation can be found at livy.incubator.apache.org.
To build Livy, you will need:
Debian/Ubuntu:
  * mvn (from the `maven` package or a maven3 tarball)

Redhat/CentOS:
  * mvn (from the `maven` package or a maven3 tarball)

MacOS:
  * mvn (Apache Maven)
Required python packages for building Livy:
To run Livy, you will also need a Spark installation. You can get Spark releases at https://spark.apache.org/downloads.html.
Livy requires Spark 2.2+. You can switch to a different version of Spark by setting the SPARK_HOME environment variable in the Livy server process, without needing to rebuild Livy.
Livy is built using Apache Maven. To check out and build Livy, run:
    git clone https://github.com/apache/incubator-livy.git
    cd incubator-livy
    mvn package
By default Livy is built against Apache Spark 2.2.0, but the version of Spark used when running Livy does not need to match the version used to build Livy. Livy internally handles the differences between different Spark versions.
The Livy package itself does not contain a Spark distribution. It will work with any supported version of Spark without needing to rebuild.
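As a rough illustration of how a project can support several Spark versions from one build, the sketch below shows a generic version-shim pattern: detect the version at runtime and dispatch to version-specific code behind a common interface. The class and interface names are hypothetical and this is not Livy's actual implementation.

```java
// Hedged sketch of a generic version-shim pattern; names are hypothetical.
public final class SparkShimExample {

    /** Version-specific behaviour hidden behind a common interface. */
    interface SparkShim {
        String describe();
    }

    static final class Spark22Shim implements SparkShim {
        public String describe() { return "using Spark 2.2.x code paths"; }
    }

    static final class Spark23PlusShim implements SparkShim {
        public String describe() { return "using Spark 2.3+ code paths"; }
    }

    /** Choose a shim from a version string such as "2.2.0" or "2.4.5". */
    static SparkShim shimFor(String sparkVersion) {
        return sparkVersion.startsWith("2.2")
                ? new Spark22Shim()
                : new Spark23PlusShim();
    }

    public static void main(String[] args) {
        // In a real deployment the version would come from the Spark
        // installation pointed to by SPARK_HOME, not a hard-coded string.
        System.out.println(shimFor("2.2.0").describe());
        System.out.println(shimFor("2.4.5").describe());
    }
}
```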