:mod:`airflow.executors.debug_executor`
=======================================
.. py:module:: airflow.executors.debug_executor

.. autoapi-nested-parse::

   DebugExecutor

   .. seealso::
       For more information on how the DebugExecutor works, take a look at the guide:
       :ref:`executor:DebugExecutor`
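
The following is a minimal sketch of how this executor is typically
selected for a single in-process run (assuming Airflow 2.x; the DAG,
task, and file name are illustrative, not part of this module):

.. code-block:: python

    # my_dag.py -- run with: AIRFLOW__CORE__EXECUTOR=DebugExecutor python my_dag.py
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG("debug_example", start_date=datetime(2021, 1, 1), schedule_interval=None) as dag:
        BashOperator(task_id="hello", bash_command="echo hello")

    if __name__ == "__main__":
        # Runs the DAG in this process, one task instance at a time.
        dag.run()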

Module Contents
---------------

.. py:class:: DebugExecutor

   Bases: :class:`airflow.executors.base_executor.BaseExecutor`

   This executor is meant for debugging purposes. It can be used with SQLite.

   It executes one task instance at a time. Additionally, to support working
   with sensors, every sensor's ``mode`` is automatically set to "reschedule".
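
   Because only one task instance runs at a time, a sensor left in "poke"
   mode would block the whole run. For example (a hedged sketch; the DAG,
   sensor, and file path are illustrative), the sensor below behaves as if
   it had been declared with ``mode="reschedule"``:

   .. code-block:: python

       from datetime import datetime

       from airflow import DAG
       from airflow.sensors.filesystem import FileSensor

       with DAG("sensor_example", start_date=datetime(2021, 1, 1), schedule_interval=None) as dag:
           wait = FileSensor(
               task_id="wait_for_file",
               filepath="/tmp/ready",  # hypothetical path
               mode="poke",  # treated as "reschedule" under the DebugExecutor
               poke_interval=30,
           )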

   .. attribute:: _terminated

   .. method:: execute_async(self, *args, **kwargs)

      This method is replaced by the custom ``trigger_tasks`` implementation.

   .. method:: sync(self)

   .. method:: _run_task(self, ti)
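
   Together, ``sync`` and ``_run_task`` form the executor's synchronous run
   loop: ``sync`` drains the internal queue and runs each task instance in
   the current process. A simplified, self-contained sketch (illustrative
   only, not the actual Airflow source):

   .. code-block:: python

       class DebugLoopSketch:
           """Mimics how sync() drains tasks_to_run one task at a time."""

           def __init__(self):
               self.tasks_to_run = []  # filled by trigger_tasks()
               self.results = {}  # task key -> terminal state

           def sync(self):
               # Pop and run queued task instances one by one, in-process.
               while self.tasks_to_run:
                   ti = self.tasks_to_run.pop(0)
                   self._run_task(ti)

           def _run_task(self, ti):
               try:
                   ti.run()  # stand-in for in-process task execution
                   self.results[ti.key] = "success"
               except Exception:
                   self.results[ti.key] = "failed"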

   .. method:: queue_task_instance(self, task_instance, mark_success=False, pickle_id=None, ignore_all_deps=False, ignore_depends_on_past=False, ignore_task_deps=False, ignore_ti_state=False, pool=None, cfg_path=None)

      Queues the task instance with an empty command, because this executor
      does not need one.

   .. method:: trigger_tasks(self, open_slots)

      Triggers tasks. Instead of calling ``execute_async`` we just add the
      task instance to the ``tasks_to_run`` queue.

      :param open_slots: Number of open slots
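
      A method-shaped sketch of this hand-off (illustrative; the tuple
      layout of ``queued_tasks`` and the lack of priority ordering are
      assumptions, not guaranteed by this page):

      .. code-block:: python

          def trigger_tasks(self, open_slots):
              # Move up to `open_slots` queued task instances onto the
              # in-memory tasks_to_run list; nothing is dispatched via
              # execute_async and no subprocess is spawned.
              for _ in range(min(open_slots, len(self.queued_tasks))):
                  key, (_cmd, _priority, _queue, ti) = self.queued_tasks.popitem()
                  self.running.add(key)
                  self.tasks_to_run.append(ti)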

   .. method:: end(self)

      When this method is called, we set the state of any still-queued tasks
      to ``UPSTREAM_FAILED``, marking them as not executed.
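
      A method-shaped sketch of that clean-up (illustrative; attribute
      names follow the sketches earlier on this page):

      .. code-block:: python

          from airflow.utils.state import State

          def end(self):
              # Anything still waiting in tasks_to_run was never executed.
              for ti in self.tasks_to_run:
                  ti.set_state(State.UPSTREAM_FAILED)
                  self.change_state(ti.key, State.UPSTREAM_FAILED)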

   .. method:: terminate(self)

   .. method:: change_state(self, key, state)