:mod:`airflow.executors.base_executor`
======================================
.. py:module:: airflow.executors.base_executor
Module Contents
---------------
.. data:: PARALLELISM
The default executor parallelism, read from the ``[core] parallelism`` configuration option.
.. py:class:: BaseExecutor(parallelism=PARALLELISM)
Bases: :class:`airflow.utils.log.logging_mixin.LoggingMixin`
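As a rough sketch of the lifecycle a caller drives against a concrete subclass (illustrative only; in practice Airflow's scheduler makes these calls, and ``LocalExecutor`` stands in for any subclass):

.. code-block:: python

    from airflow.executors.local_executor import LocalExecutor

    executor = LocalExecutor(parallelism=4)
    executor.start()                      # e.g. LocalExecutor starts its workers
    # ... queue_task_instance(ti) is called as task instances become runnable ...
    executor.heartbeat()                  # triggers queued tasks, then calls sync()
    events = executor.get_event_buffer()  # {task_instance_key: state}
    executor.end()                        # wait for outstanding jobs to finish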
.. method:: start(self)
Executors may need to get things started. For example, the LocalExecutor
starts N workers.
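A minimal sketch of an override, assuming a hypothetical executor backed by a thread pool (this is not how LocalExecutor is implemented):

.. code-block:: python

    from concurrent.futures import ThreadPoolExecutor

    from airflow.executors.base_executor import BaseExecutor


    class PooledExecutor(BaseExecutor):
        """Hypothetical executor that runs commands on a thread pool."""

        def start(self):
            # Create as many workers as the configured parallelism allows.
            self.workers = ThreadPoolExecutor(max_workers=self.parallelism)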
.. method:: queue_command(self, simple_task_instance, command, priority=1, queue=None)
.. method:: queue_task_instance(self, task_instance, mark_success=False, pickle_id=None, ignore_all_deps=False, ignore_depends_on_past=False, ignore_task_deps=False, ignore_ti_state=False, pool=None, cfg_path=None)
.. method:: has_task(self, task_instance)
Checks if a task is either queued or running in this executor.
:param task_instance: the TaskInstance to check for
:return: True if the task is known to this executor
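For example, a caller might use this to avoid queuing the same task instance twice (``ti`` is assumed to be an existing ``TaskInstance``):

.. code-block:: python

    # ``ti`` is an existing TaskInstance; skip it if the executor already tracks it.
    if not executor.has_task(ti):
        executor.queue_task_instance(ti)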
.. method:: sync(self)
Sync will get called periodically by the heartbeat method.
Executors should override this to gather task statuses.
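A hedged sketch of an override, assuming a hypothetical ``self.backend`` client whose ``poll(key)`` returns ``'success'``, ``'failed'``, or ``None`` while the task is still running:

.. code-block:: python

    from airflow.executors.base_executor import BaseExecutor


    class PollingExecutor(BaseExecutor):
        def sync(self):
            # Iterate over a copy of the keys: success()/fail() mutate self.running.
            for key in list(self.running):
                state = self.backend.poll(key)  # hypothetical status call
                if state == 'success':
                    self.success(key)
                elif state == 'failed':
                    self.fail(key)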
.. method:: heartbeat(self)
Triggers queued tasks up to the number of open slots and calls :meth:`sync` to update task states.
.. method:: trigger_tasks(self, open_slots)
Triggers queued tasks, up to the number of open slots.
:param open_slots: number of open executor slots to fill
.. method:: change_state(self, key, state)
.. method:: fail(self, key)
.. method:: success(self, key)
.. method:: get_event_buffer(self, dag_ids=None)
Returns and flushes the event buffer. If dag_ids is specified,
it will only return and flush events for the given dag_ids. Otherwise
it returns and flushes all events.
:param dag_ids: the dag_ids to return events for; if None, returns all
:return: a dict of events
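For example, draining only one DAG's events leaves events for other DAGs buffered (``'example_dag'`` is a placeholder dag_id):

.. code-block:: python

    events = executor.get_event_buffer(dag_ids=['example_dag'])
    for key, state in events.items():
        # ``key`` identifies a task instance; ``state`` is its latest state.
        print(key, state)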
.. method:: execute_async(self, key, command, queue=None, executor_config=None)
This method will execute the command asynchronously.
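Subclasses override this to hand the command to their backend. A hedged sketch, assuming a hypothetical message-queue client ``self.client`` with a ``send(queue, payload)`` method:

.. code-block:: python

    from airflow.executors.base_executor import BaseExecutor


    class QueueingExecutor(BaseExecutor):
        def execute_async(self, key, command, queue=None, executor_config=None):
            # Fire-and-forget: sync() is expected to report the final state
            # via success()/fail() once a worker finishes the command.
            self.client.send(queue or 'default', {'key': key, 'command': command})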
.. method:: end(self)
This method is called when the caller is done submitting jobs and
wants to wait synchronously for all previously submitted jobs to
finish.
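Continuing the hypothetical ``PooledExecutor`` sketch from ``start()`` above, an override might look like:

.. code-block:: python

    from airflow.executors.base_executor import BaseExecutor


    class PooledExecutor(BaseExecutor):
        def end(self):
            self.heartbeat()                  # flush any still-queued commands
            self.workers.shutdown(wait=True)  # block until submitted work finishes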
.. method:: terminate(self)
This method is called when the daemon receives a SIGTERM signal.