:py:mod:`airflow.providers.cncf.kubernetes.operators.spark_kubernetes`
======================================================================
.. py:module:: airflow.providers.cncf.kubernetes.operators.spark_kubernetes
Module Contents
---------------
Classes
~~~~~~~
.. autoapisummary::
airflow.providers.cncf.kubernetes.operators.spark_kubernetes.SparkKubernetesOperator
.. py:class:: SparkKubernetesOperator(*, application_file, namespace = None, kubernetes_conn_id = 'kubernetes_default', api_group = 'sparkoperator.k8s.io', api_version = 'v1beta2', **kwargs)
Bases: :py:obj:`airflow.models.BaseOperator`
Creates a sparkApplication object in a Kubernetes cluster.
.. seealso::
For more detail about the SparkApplication object, see the reference:
https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/v1beta2-1.1.0-2.4.5/docs/api-docs.md#sparkapplication
:param application_file: Defines the Kubernetes 'custom_resource_definition' of 'sparkApplication' as either a
    path to a '.yaml' or '.json' file, or a YAML or JSON string.
:param namespace: Kubernetes namespace in which to create the sparkApplication.
:param kubernetes_conn_id: The :ref:`kubernetes connection id <howto/connection:kubernetes>`
    for the Kubernetes cluster.
:param api_group: Kubernetes API group of the sparkApplication.
:param api_version: Kubernetes API version of the sparkApplication.
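A minimal usage sketch (assuming a SparkApplication manifest named
``spark-pi.yaml`` next to the DAG file and an existing ``kubernetes_default``
connection; the DAG id, file name, and namespace below are illustrative):

.. code-block:: python

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.cncf.kubernetes.operators.spark_kubernetes import (
        SparkKubernetesOperator,
    )

    with DAG(
        dag_id="spark_pi_example",  # hypothetical DAG id
        start_date=datetime(2022, 1, 1),
        schedule_interval=None,
        catchup=False,
    ):
        # 'spark-pi.yaml' is a hypothetical manifest; because '.yaml' is in
        # template_ext, the file is read and rendered with Jinja before the
        # sparkApplication object is created in the target namespace.
        submit_job = SparkKubernetesOperator(
            task_id="submit_spark_pi",
            application_file="spark-pi.yaml",
            namespace="spark-jobs",  # hypothetical namespace
            kubernetes_conn_id="kubernetes_default",
        )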
.. py:attribute:: template_fields
:annotation: :Sequence[str] = ['application_file', 'namespace']
.. py:attribute:: template_ext
:annotation: :Sequence[str] = ['.yaml', '.yml', '.json']
.. py:attribute:: ui_color
:annotation: = #f4a460
.. py:method:: execute(context)
This is the main method to derive when creating an operator.
Context is the same dictionary used when rendering Jinja templates.
Refer to get_template_context for more context.