:py:mod:`airflow.providers.amazon.aws.sensors.batch`
====================================================
.. py:module:: airflow.providers.amazon.aws.sensors.batch
Module Contents
---------------
Classes
~~~~~~~
.. autoapisummary::
airflow.providers.amazon.aws.sensors.batch.BatchSensor
airflow.providers.amazon.aws.sensors.batch.BatchComputeEnvironmentSensor
airflow.providers.amazon.aws.sensors.batch.BatchJobQueueSensor
.. py:class:: BatchSensor(*, job_id, aws_conn_id = 'aws_default', region_name = None, **kwargs)
Bases: :py:obj:`airflow.sensors.base.BaseSensorOperator`
Asks for the state of the Batch Job execution until it reaches a failure state or success state.
If the job fails, the task will fail.
.. seealso::
For more information on how to use this sensor, take a look at the guide:
:ref:`howto/sensor:BatchSensor`
:param job_id: Batch job_id to check the state for
:param aws_conn_id: AWS connection to use, defaults to 'aws_default'
:param region_name: AWS region name associated with the client
.. py:attribute:: template_fields
:annotation: :Sequence[str] = ['job_id']
.. py:attribute:: template_ext
:annotation: :Sequence[str] = []
.. py:attribute:: ui_color
:annotation: = #66c3ff
.. py:method:: poke(context)
Check the state of the submitted Batch job.
Return True once the job has succeeded; a failed job causes the task to fail.
.. py:method:: get_hook()
Create and return a BatchClientHook
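A minimal usage sketch, assuming an upstream task named ``submit_batch_job`` pushes the Batch job id to XCom; the DAG id, schedule, and region below are illustrative assumptions.

.. code-block:: python

    import pendulum

    from airflow import DAG
    from airflow.providers.amazon.aws.sensors.batch import BatchSensor

    with DAG(
        dag_id="example_batch_sensor",  # hypothetical DAG id
        schedule_interval=None,
        start_date=pendulum.datetime(2021, 1, 1, tz="UTC"),
        catchup=False,
    ):
        # Wait for the previously submitted Batch job to reach a success state.
        wait_for_batch_job = BatchSensor(
            task_id="wait_for_batch_job",
            job_id="{{ ti.xcom_pull(task_ids='submit_batch_job') }}",
            aws_conn_id="aws_default",
            region_name="us-east-1",  # illustrative region
            poke_interval=60,
        )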
.. py:class:: BatchComputeEnvironmentSensor(compute_environment, aws_conn_id = 'aws_default', region_name = None, **kwargs)
Bases: :py:obj:`airflow.sensors.base.BaseSensorOperator`
Asks for the state of the Batch compute environment until it reaches a failure state or success state.
If the environment fails, the task will fail.
.. seealso::
For more information on how to use this sensor, take a look at the guide:
:ref:`howto/sensor:BatchComputeEnvironmentSensor`
:param compute_environment: Batch compute environment name
:param aws_conn_id: AWS connection to use, defaults to 'aws_default'
:param region_name: AWS region name associated with the client
.. py:attribute:: template_fields
:annotation: :Sequence[str] = ['compute_environment']
.. py:attribute:: template_ext
:annotation: :Sequence[str] = []
.. py:attribute:: ui_color
:annotation: = #66c3ff
.. py:method:: hook()
Create and return a BatchClientHook
.. py:method:: poke(context)
Check the state of the Batch compute environment.
Return True once it reaches a valid terminal state; a failed environment causes the task to fail.
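A minimal sketch (inside a DAG definition, omitted here) of waiting for a compute environment to become usable; the environment name and region are hypothetical.

.. code-block:: python

    from airflow.providers.amazon.aws.sensors.batch import BatchComputeEnvironmentSensor

    # Wait until the named compute environment reaches a valid terminal state.
    wait_for_compute_env = BatchComputeEnvironmentSensor(
        task_id="wait_for_compute_env",
        compute_environment="my-batch-compute-env",  # hypothetical environment name
        aws_conn_id="aws_default",
        region_name="us-east-1",  # illustrative region
        poke_interval=30,
    )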
.. py:class:: BatchJobQueueSensor(job_queue, treat_non_existing_as_deleted = False, aws_conn_id = 'aws_default', region_name = None, **kwargs)
Bases: :py:obj:`airflow.sensors.base.BaseSensorOperator`
Asks for the state of the Batch job queue until it reaches a failure state or success state.
If the queue fails, the task will fail.
.. seealso::
For more information on how to use this sensor, take a look at the guide:
:ref:`howto/sensor:BatchJobQueueSensor`
:param job_queue: Batch job queue name
:param treat_non_existing_as_deleted: If True, a non-existent Batch job queue is treated as a deleted
    queue, and therefore as a valid (success) case.
:param aws_conn_id: AWS connection to use, defaults to 'aws_default'
:param region_name: AWS region name associated with the client
.. py:attribute:: template_fields
:annotation: :Sequence[str] = ['job_queue']
.. py:attribute:: template_ext
:annotation: :Sequence[str] = []
.. py:attribute:: ui_color
:annotation: = #66c3ff
.. py:method:: hook()
Create and return a BatchClientHook
.. py:method:: poke(context)
Check the state of the Batch job queue.
Return True once it reaches a valid terminal state; a failed queue causes the task to fail.
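A minimal sketch (inside a DAG definition, omitted here) of waiting for a job queue, treating a missing queue as already deleted; the queue name is hypothetical.

.. code-block:: python

    from airflow.providers.amazon.aws.sensors.batch import BatchJobQueueSensor

    # Wait until the queue reaches a valid terminal state; a queue that no
    # longer exists is also treated as success because
    # treat_non_existing_as_deleted=True.
    wait_for_job_queue = BatchJobQueueSensor(
        task_id="wait_for_job_queue",
        job_queue="my-batch-job-queue",  # hypothetical queue name
        treat_non_existing_as_deleted=True,
        aws_conn_id="aws_default",
        poke_interval=30,
    )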