:py:mod:`airflow.providers.amazon.aws.sensors.glue_crawler`
===========================================================
.. py:module:: airflow.providers.amazon.aws.sensors.glue_crawler
Module Contents
---------------
Classes
~~~~~~~
.. autoapisummary::

   airflow.providers.amazon.aws.sensors.glue_crawler.GlueCrawlerSensor
.. py:class:: GlueCrawlerSensor(*, crawler_name, aws_conn_id = 'aws_default', **kwargs)
Bases: :py:obj:`airflow.sensors.base.BaseSensorOperator`
Waits for an AWS Glue crawler to reach one of the following terminal statuses:
'FAILED', 'CANCELLED', or 'SUCCEEDED'.
.. seealso::
For more information on how to use this sensor, take a look at the guide:
:ref:`howto/sensor:GlueCrawlerSensor`
:param crawler_name: The unique name of the AWS Glue crawler.
:param aws_conn_id: The Airflow connection to use; defaults to 'aws_default'.
    If this is None or empty, the default boto3 behaviour is used. When
    running Airflow in a distributed manner with aws_conn_id set to None or
    empty, the default boto3 configuration is used and must be maintained on
    each worker node.
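As a usage sketch, the sensor is typically placed downstream of whatever starts the crawl, and blocks until the crawl finishes. The DAG id, crawler name, and ``schedule`` argument below are illustrative assumptions (``schedule`` assumes Airflow 2.4+), not values defined by this module:

.. code-block:: python

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.amazon.aws.sensors.glue_crawler import GlueCrawlerSensor

    # Hypothetical DAG wiring; only GlueCrawlerSensor comes from this module.
    with DAG(
        dag_id="example_glue_crawler_sensor",
        start_date=datetime(2024, 1, 1),
        schedule=None,
        catchup=False,
    ) as dag:
        wait_for_crawl = GlueCrawlerSensor(
            task_id="wait_for_crawl",
            crawler_name="my-glue-crawler",  # name of an existing crawler (assumed)
            aws_conn_id="aws_default",
        )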
.. py:attribute:: template_fields
   :type: Sequence[str]
   :value: ('crawler_name',)
.. py:method:: poke(context)
Override when deriving this class.
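The scheduler calls ``poke`` repeatedly until it returns True. As a schematic sketch of a poke-style check for a crawl (not the provider's actual implementation; the ``fetch_last_crawl_status`` helper and the exact failure handling are assumptions):

.. code-block:: python

    SUCCESS_STATUSES = {"SUCCEEDED"}
    FAILURE_STATUSES = {"FAILED", "CANCELLED"}


    def poke_once(fetch_last_crawl_status) -> bool:
        """Return True when the crawl succeeded; raise if it failed."""
        status = fetch_last_crawl_status()  # e.g. a hook/boto3 lookup (assumed helper)
        if status in SUCCESS_STATUSES:
            return True
        if status in FAILURE_STATUSES:
            raise RuntimeError(f"Glue crawl finished with status {status!r}")
        return False  # still running; the sensor pokes again after its interval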
.. py:method:: get_hook()
Return a new or pre-existing GlueCrawlerHook.
.. py:method:: hook()