{% include JB/setup %}
Returns all the active interactive sessions.
Creates a new interactive Scala, Python, or R shell in the cluster.
1: Starting with version 0.5.0-incubating, this field is not required. To be compatible with previous versions, users can still specify this as spark, pyspark or sparkr, implying that the submitted code snippet is of the corresponding kind.
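As a sketch, a session-creation request could carry a JSON body like the following. The field values (kind, proxyUser, conf) are illustrative assumptions, not values prescribed by this document:

```python
import json

# Hypothetical body for POST /sessions. The "kind" field is optional
# from 0.5.0-incubating onward (see note 1 above).
payload = {
    "kind": "pyspark",                      # optional: spark, pyspark or sparkr
    "proxyUser": "alice",                   # hypothetical user to impersonate
    "conf": {"spark.executor.cores": "2"},  # hypothetical Spark config override
}
body = json.dumps(payload)
# Sent as: POST /sessions  with header  Content-Type: application/json
```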
The created Session.
Returns the session information.
The Session.
Returns the state of the session.
Kills the Session job.
Gets the log lines from this session.
Returns all the statements in a session.
Runs a statement in a session.
2: If the session kind is not specified, or the submitted code is not of the kind specified at session creation, this field should be filled with the correct kind. Otherwise Livy will use the kind specified at session creation as the default code kind.
The statement object.
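A minimal sketch of a statement-submission body, assuming a hypothetical session id in the path; the code snippet itself is only an example:

```python
import json

# Hypothetical body for POST /sessions/{sessionId}/statements.
# From 0.5.0-incubating, "kind" selects the code kind per statement
# (see note 2 above); if omitted, the session's default kind is used.
statement = {
    "code": "1 + 1",
    "kind": "spark",  # spark, pyspark, sparkr or sql
}
body = json.dumps(statement)
```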
Returns a specified statement in a session.
The statement object.
Cancels the specified statement in this session.
Returns code completion candidates for the specified code in the session.
Returns all the active batch sessions.
Creates a new batch session.
The created Batch object.
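A batch-creation request could be sketched as follows; the file path, class name, and arguments are assumptions chosen for illustration:

```python
import json

# Hypothetical body for POST /batches. "file" is the application to
# execute; the path shown here is an assumption, not a real artifact.
batch = {
    "file": "/user/alice/spark-examples.jar",
    "className": "org.apache.spark.examples.SparkPi",
    "args": ["100"],
}
body = json.dumps(batch)
```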
Returns the batch session information.
The Batch.
Returns the state of the batch session.
Kills the Batch job.
Gets the log lines from this batch.
A session represents an interactive shell.
Starting with version 0.5.0-incubating, each session can support all four interpreters: Scala, Python, R, and the newly added SQL interpreter. The kind
field in session creation is no longer required; instead, users should specify the code kind (spark, pyspark, sparkr or sql) during statement submission.
To be compatible with previous versions, users can still specify kind
in session creation and omit kind
in statement submission. Livy will then use the session kind
as the default kind for all submitted statements.
To submit code of a kind other than the default
specified in session creation, users need to specify the code kind (spark, pyspark, sparkr or sql) during statement submission.
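The per-statement kind mechanism described above can be sketched as two statement bodies submitted to the same session; the code snippets are hypothetical:

```python
import json

# Two hypothetical statements for the same session, each declaring its
# own code kind (possible starting with 0.5.0-incubating).
scala_stmt = json.dumps({"code": "spark.range(10).count()", "kind": "spark"})
sql_stmt = json.dumps({"code": "SELECT 1", "kind": "sql"})
```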
To change the Python executable the session uses, Livy reads the path from the environment variable PYSPARK_PYTHON
(same as pyspark).
Starting with version 0.5.0-incubating, the session kind "pyspark3" is removed; instead, users need to set PYSPARK_PYTHON
to a python3 executable.
As with pyspark, if Livy is running in local
mode, just set the environment variable. If the session is running in yarn-cluster
mode, set spark.yarn.appMasterEnv.PYSPARK_PYTHON
in SparkConf so the environment variable is passed to the driver.
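A minimal configuration sketch for the two modes described above; the python3 path is an assumption and should match the executable on your machines:

```
# yarn-cluster mode: in spark-defaults.conf (or SparkConf)
spark.yarn.appMasterEnv.PYSPARK_PYTHON  /usr/bin/python3

# local mode: setting the environment variable is enough
export PYSPARK_PYTHON=/usr/bin/python3
```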
A statement represents the result of a code execution.
doAs support

If superuser support is configured, Livy supports the doAs
query parameter to specify the user to impersonate. The doAs
query parameter can be used on any supported REST endpoint described above to perform the action as the specified user. If both doAs
and proxyUser
are specified during session or batch creation, the doAs
parameter takes precedence.
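Appending the query parameter can be sketched as follows; the endpoint path, session id, and user name are assumptions for illustration:

```python
from urllib.parse import urlencode

# Hypothetical: adding doAs to any supported endpoint.
base = "/sessions/42/statements"     # session id is an assumption
query = urlencode({"doAs": "bob"})   # user to impersonate
url = f"{base}?{query}"
# -> /sessions/42/statements?doAs=bob
```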