:py:mod:`airflow.providers.cncf.kubernetes.sensors.spark_kubernetes`
====================================================================

.. py:module:: airflow.providers.cncf.kubernetes.sensors.spark_kubernetes
Module Contents
---------------

Classes
~~~~~~~

.. autoapisummary::

   airflow.providers.cncf.kubernetes.sensors.spark_kubernetes.SparkKubernetesSensor
.. py:class:: SparkKubernetesSensor(*, application_name, attach_log = False, namespace = None, kubernetes_conn_id = 'kubernetes_default', api_group = 'sparkoperator.k8s.io', api_version = 'v1beta2', **kwargs)

   Bases: :py:obj:`airflow.sensors.base.BaseSensorOperator`
   Checks a sparkApplication object in a Kubernetes cluster.

   .. seealso::
       For more detail about the Spark Application Object, have a look at the reference:
       https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/v1beta2-1.1.0-2.4.5/docs/api-docs.md#sparkapplication

   :param application_name: Spark Application resource name
   :param namespace: the Kubernetes namespace where the sparkApplication resides
   :param kubernetes_conn_id: the :ref:`kubernetes connection<howto/connection:kubernetes>`
       to the Kubernetes cluster
   :param attach_log: determines whether logs for the driver pod should be appended to the sensor log
   :param api_group: Kubernetes API group of the sparkApplication
   :param api_version: Kubernetes API version of the sparkApplication
   .. py:attribute:: template_fields
      :annotation: :Sequence[str] = ['application_name', 'namespace']

   .. py:attribute:: FAILURE_STATES
      :annotation: = ['FAILED', 'UNKNOWN']

   .. py:attribute:: SUCCESS_STATES
      :annotation: = ['COMPLETED']
   .. py:method:: poke(context)

      Function that sensors deriving from this class should override; it is
      called repeatedly and returns ``True`` once the criteria are met.
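The ``FAILURE_STATES`` and ``SUCCESS_STATES`` attributes above suggest how each ``poke`` call maps the sparkApplication's current state to a sensor outcome. The sketch below illustrates that mapping only; ``evaluate_application_state`` is a hypothetical helper, not part of the provider's API, and the real sensor reads the state from the Kubernetes custom resource via the connection.

.. code-block:: python

    # Hypothetical sketch of the per-poke state handling, mirroring the
    # class attributes documented above. Not the provider's actual code.
    FAILURE_STATES = ("FAILED", "UNKNOWN")
    SUCCESS_STATES = ("COMPLETED",)


    def evaluate_application_state(state: str) -> bool:
        """Map a sparkApplication state to a poke result."""
        # A failure state stops the sensor with an error.
        if state in FAILURE_STATES:
            raise RuntimeError(f"Spark application failed with state: {state}")
        # A success state completes the sensor run.
        if state in SUCCESS_STATES:
            return True
        # Any other state (e.g. RUNNING, SUBMITTED): keep poking.
        return False

With this mapping, a sensor in ``poke`` mode keeps rescheduling itself while the application is still running and only finishes (or fails) on a terminal state.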