| :py:mod:`airflow.models.dag` |
| ============================ |
| |
| .. py:module:: airflow.models.dag |
| |
| |
| Module Contents |
| --------------- |
| |
| Classes |
| ~~~~~~~ |
| |
| .. autoapisummary:: |
| |
| airflow.models.dag.DAG |
| airflow.models.dag.DagTag |
| airflow.models.dag.DagModel |
| airflow.models.dag.DagContext |
| |
| |
| |
| Functions |
| ~~~~~~~~~ |
| |
| .. autoapisummary:: |
| |
| airflow.models.dag.create_timetable |
| airflow.models.dag.get_last_dagrun |
| airflow.models.dag.dag |
| |
| |
| |
| Attributes |
| ~~~~~~~~~~ |
| |
| .. autoapisummary:: |
| |
| airflow.models.dag.log |
| airflow.models.dag.DEFAULT_VIEW_PRESETS |
| airflow.models.dag.ORIENTATION_PRESETS |
| airflow.models.dag.ScheduleIntervalArgNotSet |
| airflow.models.dag.DagStateChangeCallback |
| airflow.models.dag.ScheduleInterval |
| airflow.models.dag.ScheduleIntervalArg |
| airflow.models.dag.DEFAULT_SCHEDULE_INTERVAL |
| |
| |
| .. py:data:: log |
| |
| |
| |
| |
| .. py:data:: DEFAULT_VIEW_PRESETS |
| :annotation: = ['tree', 'graph', 'duration', 'gantt', 'landing_times'] |
| |
| |
| |
| .. py:data:: ORIENTATION_PRESETS |
| :annotation: = ['LR', 'TB', 'RL', 'BT'] |
| |
| |
| |
| .. py:data:: ScheduleIntervalArgNotSet |
| |
| |
| |
| |
| .. py:data:: DagStateChangeCallback |
| |
| |
| |
| |
| .. py:data:: ScheduleInterval |
| |
| |
| |
| |
| .. py:data:: ScheduleIntervalArg |
| |
| |
| |
| |
| .. py:data:: DEFAULT_SCHEDULE_INTERVAL |
| |
| |
| |
| |
| .. py:exception:: InconsistentDataInterval(instance: Any, start_field_name: str, end_field_name: str) |
| |
| Bases: :py:obj:`airflow.exceptions.AirflowException` |
| |
| Exception raised when a model populates data interval fields incorrectly. |
| |
| The data interval fields should either both be None (for runs scheduled |
| prior to AIP-39), or both be datetime (for runs scheduled after AIP-39 is |
| implemented). This is raised if exactly one of the fields is None. |
| |
| .. py:method:: __str__(self) -> str |
| |
| Return str(self). |
| |
| |
| |
| .. py:function:: create_timetable(interval: ScheduleIntervalArg, timezone: datetime.tzinfo) -> airflow.timetables.base.Timetable |
| |
| Create a Timetable instance from a ``schedule_interval`` argument. |
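| |
| A minimal usage sketch. The timetable classes named in the comments |
| (``CronDataIntervalTimetable``, ``DeltaDataIntervalTimetable``, ``NullTimetable``) |
| are assumptions based on Airflow 2.2's built-in timetables: |
| |
| .. code-block:: python |
| |
|     from datetime import timedelta |
| |
|     import pendulum |
| |
|     from airflow.models.dag import create_timetable |
| |
|     utc = pendulum.timezone("UTC") |
|     create_timetable("0 0 * * *", utc)  # cron string -> CronDataIntervalTimetable (assumed) |
|     create_timetable(timedelta(days=1), utc)  # delta -> DeltaDataIntervalTimetable (assumed) |
|     create_timetable(None, utc)  # no schedule -> NullTimetable (assumed) |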
| |
| |
| .. py:function:: get_last_dagrun(dag_id, session, include_externally_triggered=False) |
| |
| Returns the last dag run for a dag; None if there was none. |
| The last dag run can be any type of run, e.g. scheduled or backfilled. |
| Overridden DagRuns are ignored. |
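| |
| A minimal usage sketch; ``"example_dag"`` is a placeholder dag id: |
| |
| .. code-block:: python |
| |
|     from airflow.models.dag import get_last_dagrun |
|     from airflow.utils.session import create_session |
| |
|     with create_session() as session: |
|         last_run = get_last_dagrun("example_dag", session=session) |
|         if last_run is not None: |
|             print(last_run.execution_date) |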
| |
| |
| .. py:class:: DAG(dag_id: str, description: Optional[str] = None, schedule_interval: ScheduleIntervalArg = ScheduleIntervalArgNotSet, timetable: Optional[airflow.timetables.base.Timetable] = None, start_date: Optional[datetime.datetime] = None, end_date: Optional[datetime.datetime] = None, full_filepath: Optional[str] = None, template_searchpath: Optional[Union[str, Iterable[str]]] = None, template_undefined: Type[jinja2.StrictUndefined] = jinja2.StrictUndefined, user_defined_macros: Optional[Dict] = None, user_defined_filters: Optional[Dict] = None, default_args: Optional[Dict] = None, concurrency: Optional[int] = None, max_active_tasks: int = conf.getint('core', 'max_active_tasks_per_dag'), max_active_runs: int = conf.getint('core', 'max_active_runs_per_dag'), dagrun_timeout: Optional[datetime.timedelta] = None, sla_miss_callback: Optional[Callable[[DAG, str, str, List[str], List[airflow.models.taskinstance.TaskInstance]], None]] = None, default_view: str = conf.get('webserver', 'dag_default_view').lower(), orientation: str = conf.get('webserver', 'dag_orientation'), catchup: bool = conf.getboolean('scheduler', 'catchup_by_default'), on_success_callback: Optional[DagStateChangeCallback] = None, on_failure_callback: Optional[DagStateChangeCallback] = None, doc_md: Optional[str] = None, params: Optional[Dict] = None, access_control: Optional[Dict] = None, is_paused_upon_creation: Optional[bool] = None, jinja_environment_kwargs: Optional[Dict] = None, render_template_as_native_obj: bool = False, tags: Optional[List[str]] = None) |
| |
| Bases: :py:obj:`airflow.utils.log.logging_mixin.LoggingMixin` |
| |
| A dag (directed acyclic graph) is a collection of tasks with directional |
| dependencies. A dag also has a schedule, a start date and an end date |
| (optional). For each schedule (say daily or hourly), the DAG needs to run |
| each individual task as its dependencies are met. Certain tasks have |
| the property of depending on their own past, meaning that they can't run |
| until their previous schedule (and upstream tasks) have completed. |
| |
| DAGs essentially act as namespaces for tasks. A task_id can only be |
| added once to a DAG. |
| |
| Note that if you plan to use time zones all the dates provided should be pendulum |
| dates. See :ref:`timezone_aware_dags`. |
| |
| :param dag_id: The id of the DAG; must consist exclusively of alphanumeric |
| characters, dashes, dots and underscores (all ASCII) |
| :type dag_id: str |
| :param description: The description for the DAG to e.g. be shown on the webserver |
| :type description: str |
| :param schedule_interval: Defines how often the DAG runs; this |
| timedelta object gets added to your latest task instance's |
| execution_date to figure out the next schedule |
| :type schedule_interval: datetime.timedelta or |
| dateutil.relativedelta.relativedelta or str that acts as a cron |
| expression |
| :param timetable: Specify which timetable to use (in which case schedule_interval |
| must not be set). See :doc:`/howto/timetable` for more information |
| :type timetable: airflow.timetables.base.Timetable |
| :param start_date: The timestamp from which the scheduler will |
| attempt to backfill |
| :type start_date: datetime.datetime |
| :param end_date: A date beyond which your DAG won't run, leave to None |
| for open ended scheduling |
| :type end_date: datetime.datetime |
| :param template_searchpath: This list of folders (non relative) |
| defines where jinja will look for your templates. Order matters. |
| Note that jinja/airflow includes the path of your DAG file by |
| default |
| :type template_searchpath: str or list[str] |
| :param template_undefined: Template undefined type. |
| :type template_undefined: jinja2.StrictUndefined |
| :param user_defined_macros: a dictionary of macros that will be exposed |
| in your jinja templates. For example, passing ``dict(foo='bar')`` |
| to this argument allows you to use ``{{ foo }}`` in all jinja |
| templates related to this DAG. Note that you can pass any |
| type of object here. |
| :type user_defined_macros: dict |
| :param user_defined_filters: a dictionary of filters that will be exposed |
| in your jinja templates. For example, passing |
| ``dict(hello=lambda name: 'Hello %s' % name)`` to this argument allows |
| you to use ``{{ 'world' | hello }}`` in all jinja templates related to |
| this DAG. |
| :type user_defined_filters: dict |
| :param default_args: A dictionary of default parameters to be used |
| as constructor keyword parameters when initialising operators. |
| Note that operators accept the same arguments directly, and such |
| directly-passed values take precedence over those defined here: if your |
| dict contains ``'depends_on_past': True`` here and the operator is called |
| with ``depends_on_past=False``, the actual value will be ``False`` |
| (see the sketch after this parameter list). |
| :type default_args: dict |
| :param params: a dictionary of DAG level parameters that are made |
| accessible in templates, namespaced under `params`. These |
| params can be overridden at the task level. |
| :type params: dict |
| :param max_active_tasks: the number of task instances allowed to run |
| concurrently |
| :type max_active_tasks: int |
| :param max_active_runs: maximum number of active DAG runs, beyond this |
| number of DAG runs in a running state, the scheduler won't create |
| new active DAG runs |
| :type max_active_runs: int |
| :param dagrun_timeout: specify how long a DagRun should be up before |
| timing out / failing, so that new DagRuns can be created. The timeout |
| is only enforced for scheduled DagRuns. |
| :type dagrun_timeout: datetime.timedelta |
| :param sla_miss_callback: specify a function to call when reporting SLA |
| timeouts. See :ref:`sla_miss_callback<concepts:sla_miss_callback>` for |
| more information about the function signature and parameters that are |
| passed to the callback. |
| :type sla_miss_callback: callable |
| :param default_view: Specify DAG default view (tree, graph, duration, |
| gantt, landing_times), default tree |
| :type default_view: str |
| :param orientation: Specify DAG orientation in graph view (LR, TB, RL, BT), default LR |
| :type orientation: str |
| :param catchup: Perform scheduler catchup (or only run latest)? Defaults to True |
| :type catchup: bool |
| :param on_failure_callback: A function to be called when a DagRun of this dag fails. |
| A context dictionary is passed as a single parameter to this function. |
| :type on_failure_callback: callable |
| :param on_success_callback: Much like the ``on_failure_callback`` except |
| that it is executed when the dag succeeds. |
| :type on_success_callback: callable |
| :param access_control: Specify optional DAG-level actions, e.g., |
| "{'role1': {'can_read'}, 'role2': {'can_read', 'can_edit'}}" |
| :type access_control: dict |
| :param is_paused_upon_creation: Specifies if the dag is paused when created for the first time. |
| If the dag exists already, this flag will be ignored. If this optional parameter |
| is not specified, the global config setting will be used. |
| :type is_paused_upon_creation: bool or None |
| :param jinja_environment_kwargs: additional configuration options to be passed to Jinja |
| ``Environment`` for template rendering |
| |
| **Example**: to prevent Jinja from removing a trailing newline from template strings :: |
| |
|     DAG( |
|         dag_id='my-dag', |
|         jinja_environment_kwargs={ |
|             'keep_trailing_newline': True, |
|             # some other jinja2 Environment options here |
|         }, |
|     ) |
| |
| **See**: `Jinja Environment documentation |
| <https://jinja.palletsprojects.com/en/2.11.x/api/#jinja2.Environment>`_ |
| |
| :type jinja_environment_kwargs: dict |
| :param render_template_as_native_obj: If True, uses a Jinja ``NativeEnvironment`` |
| to render templates as native Python types. If False, a Jinja |
| ``Environment`` is used to render templates as string values. |
| :type render_template_as_native_obj: bool |
| :param tags: List of tags to help filtering DAGs in the UI. |
| :type tags: List[str] |
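| |
| A minimal instantiation sketch (the operator and task ids are illustrative) |
| showing that an argument passed directly to an operator overrides the same |
| key in ``default_args``: |
| |
| .. code-block:: python |
| |
|     import pendulum |
| |
|     from airflow.models.dag import DAG |
|     from airflow.operators.dummy import DummyOperator |
| |
|     with DAG( |
|         dag_id="example_defaults", |
|         schedule_interval="@daily", |
|         start_date=pendulum.datetime(2021, 1, 1, tz="UTC"), |
|         default_args={"retries": 2}, |
|     ) as dag: |
|         t1 = DummyOperator(task_id="t1")  # inherits retries=2 from default_args |
|         t2 = DummyOperator(task_id="t2", retries=0)  # explicit value wins |
|         t1 >> t2 |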
| |
| .. py:attribute:: fileloc |
| :annotation: :str |
| |
| File path that needs to be imported to load this DAG or subdag. |
| |
| This may not be an actual file on disk in the case when this DAG is loaded |
| from a ZIP file or other DAG distribution format. |
| |
| |
| .. py:method:: __repr__(self) |
| |
| Return repr(self). |
| |
| |
| .. py:method:: __eq__(self, other) |
| |
| Return self==value. |
| |
| |
| .. py:method:: __ne__(self, other) |
| |
| Return self!=value. |
| |
| |
| .. py:method:: __lt__(self, other) |
| |
| Return self<value. |
| |
| |
| .. py:method:: __hash__(self) |
| |
| Return hash(self). |
| |
| |
| .. py:method:: __enter__(self) |
| |
| |
| .. py:method:: __exit__(self, _type, _value, _tb) |
| |
| |
| .. py:method:: date_range(self, start_date: datetime.datetime, num: Optional[int] = None, end_date: Optional[datetime.datetime] = timezone.utcnow()) -> List[datetime.datetime] |
| |
| |
| .. py:method:: is_fixed_time_schedule(self) |
| |
| |
| .. py:method:: following_schedule(self, dttm) |
| |
| Calculates the following schedule for this dag in UTC. |
| |
| :param dttm: utc datetime |
| :return: utc datetime |
| |
| |
| .. py:method:: previous_schedule(self, dttm) |
| |
| |
| .. py:method:: get_next_data_interval(self, dag_model: DagModel) -> Optional[airflow.timetables.base.DataInterval] |
| |
| Get the data interval of the next scheduled run. |
| |
| For compatibility, this method infers the data interval from the DAG's |
| schedule if the run does not have an explicit one set, which is possible |
| for runs created prior to AIP-39. |
| |
| This function is private to Airflow core and should not be depended upon as |
| part of the Python API. |
| |
| :meta private: |
| |
| |
| .. py:method:: get_run_data_interval(self, run: airflow.models.dagrun.DagRun) -> airflow.timetables.base.DataInterval |
| |
| Get the data interval of this run. |
| |
| For compatibility, this method infers the data interval from the DAG's |
| schedule if the run does not have an explicit one set, which is possible for |
| runs created prior to AIP-39. |
| |
| This function is private to Airflow core and should not be depended upon as |
| part of the Python API. |
| |
| :meta private: |
| |
| |
| .. py:method:: infer_automated_data_interval(self, logical_date: datetime.datetime) -> airflow.timetables.base.DataInterval |
| |
| Infer a data interval for a run against this DAG. |
| |
| This method is used to bridge runs created prior to AIP-39 |
| implementation, which do not have an explicit data interval. Therefore, |
| this method only considers ``schedule_interval`` values valid prior to |
| Airflow 2.2. |
| |
| DO NOT use this method if there is a known data interval. |
| |
| |
| .. py:method:: next_dagrun_info(self, last_automated_dagrun: Union[None, datetime.datetime, airflow.timetables.base.DataInterval], *, restricted: bool = True) -> Optional[airflow.timetables.base.DagRunInfo] |
| |
| Get information about the next DagRun of this dag after ``last_automated_dagrun``. |
| |
| This calculates what time interval the next DagRun should operate on |
| (its execution date) and when it can be scheduled, according to the |
| dag's timetable, start_date, end_date, etc. This doesn't check max |
| active runs or any other "max_active_tasks" type limits, but only |
| performs calculations based on the various date and interval fields of |
| this dag and its tasks. |
| |
| :param last_automated_dagrun: The ``max(execution_date)`` of |
| existing "automated" DagRuns for this dag (scheduled or backfill, |
| but not manual). |
| :param restricted: If set to *False* (default is *True*), ignore |
| ``start_date``, ``end_date``, and ``catchup`` specified on the DAG |
| or tasks. |
| :return: DagRunInfo of the next dagrun, or None if a dagrun is not |
| going to be scheduled. |
| |
| |
| .. py:method:: next_dagrun_after_date(self, date_last_automated_dagrun: Optional[pendulum.DateTime]) |
| |
| |
| .. py:method:: iter_dagrun_infos_between(self, earliest: Optional[pendulum.DateTime], latest: pendulum.DateTime, *, align: bool = True) -> Iterable[airflow.timetables.base.DagRunInfo] |
| |
| Yield DagRunInfo using this DAG's timetable over the given interval. |
| |
| DagRunInfo instances are yielded if their ``logical_date`` is not earlier |
| than ``earliest``, nor later than ``latest``. The instances are ordered |
| by their ``logical_date`` from earliest to latest. |
| |
| If ``align`` is ``False``, the first run will happen immediately on |
| ``earliest``, even if it does not fall on the logical timetable schedule. |
| The default is ``True``, but subdags will ignore this value and always |
| behave as if this is set to ``False`` for backward compatibility. |
| |
| Example: A DAG is scheduled to run every midnight (``0 0 * * *``). If |
| ``earliest`` is ``2021-06-03 23:00:00``, the first DagRunInfo would be |
| ``2021-06-03 23:00:00`` if ``align=False``, and ``2021-06-04 00:00:00`` |
| if ``align=True``. |
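| |
| A sketch of iterating over that midnight-scheduled DAG's run infos |
| (assuming ``dag`` is such a DAG): |
| |
| .. code-block:: python |
| |
|     import pendulum |
| |
|     for info in dag.iter_dagrun_infos_between( |
|         earliest=pendulum.datetime(2021, 6, 1, tz="UTC"), |
|         latest=pendulum.datetime(2021, 6, 7, tz="UTC"), |
|     ): |
|         print(info.logical_date, info.data_interval) |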
| |
| |
| .. py:method:: get_run_dates(self, start_date, end_date=None) |
| |
| Returns a list of dates within the interval defined by the given parameters, |
| following this dag's schedule interval. Returned dates can be used as execution dates. |
| |
| :param start_date: The start date of the interval. |
| :type start_date: datetime |
| :param end_date: The end date of the interval. Defaults to ``timezone.utcnow()``. |
| :type end_date: datetime |
| :return: A list of dates within the interval following the dag's schedule. |
| :rtype: list |
| |
| |
| .. py:method:: normalize_schedule(self, dttm) |
| |
| |
| .. py:method:: get_last_dagrun(self, session=None, include_externally_triggered=False) |
| |
| |
| .. py:method:: has_dag_runs(self, session=None, include_externally_triggered=True) -> bool |
| |
| |
| .. py:method:: dag_id(self) -> str |
| :property: |
| |
| |
| .. py:method:: full_filepath(self) -> str |
| :property: |
| |
| :meta private: |
| |
| |
| .. py:method:: concurrency(self) -> int |
| :property: |
| |
| |
| .. py:method:: max_active_tasks(self) -> int |
| :property: |
| |
| |
| .. py:method:: access_control(self) |
| :property: |
| |
| |
| .. py:method:: description(self) -> Optional[str] |
| :property: |
| |
| |
| .. py:method:: default_view(self) -> str |
| :property: |
| |
| |
| .. py:method:: pickle_id(self) -> Optional[int] |
| :property: |
| |
| |
| .. py:method:: param(self, name: str, default=None) -> airflow.models.param.DagParam |
| |
| Return a DagParam object for current dag. |
| |
| :param name: dag parameter name. |
| :param default: fallback value for dag parameter. |
| :return: DagParam instance for specified name and current dag. |
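| |
| A brief sketch (dag and param names are illustrative); the returned DagParam |
| resolves at runtime and can be passed to operator arguments: |
| |
| .. code-block:: python |
| |
|     from airflow.models.dag import DAG |
| |
|     with DAG(dag_id="example_params", params={"threshold": 10}) as dag: |
|         threshold = dag.param("threshold", default=5) |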
| |
| |
| .. py:method:: tasks(self) -> List[airflow.models.baseoperator.BaseOperator] |
| :property: |
| |
| |
| .. py:method:: task_ids(self) -> List[str] |
| :property: |
| |
| |
| .. py:method:: task_group(self) -> airflow.utils.task_group.TaskGroup |
| :property: |
| |
| |
| .. py:method:: filepath(self) -> str |
| :property: |
| |
| :meta private: |
| |
| |
| .. py:method:: relative_fileloc(self) -> pathlib.Path |
| :property: |
| |
| File location of the importable dag 'file' relative to the configured DAGs folder. |
| |
| |
| .. py:method:: folder(self) -> str |
| :property: |
| |
| Folder location of where the DAG object is instantiated. |
| |
| |
| .. py:method:: owner(self) -> str |
| :property: |
| |
| Return all owners found in DAG tasks as a comma-separated string. |
| |
| :return: Comma-separated list of owners in DAG tasks |
| :rtype: str |
| |
| |
| .. py:method:: allow_future_exec_dates(self) -> bool |
| :property: |
| |
| |
| .. py:method:: get_concurrency_reached(self, session=None) -> bool |
| |
| Returns a boolean indicating whether the max_active_tasks limit for this DAG |
| has been reached |
| |
| |
| .. py:method:: concurrency_reached(self) |
| :property: |
| |
| This attribute is deprecated. Please use `airflow.models.DAG.get_concurrency_reached` method. |
| |
| |
| .. py:method:: get_is_active(self, session=None) -> Optional[bool] |
| |
| Returns a boolean indicating whether this DAG is active |
| |
| |
| .. py:method:: get_is_paused(self, session=None) -> Optional[bool] |
| |
| Returns a boolean indicating whether this DAG is paused |
| |
| |
| .. py:method:: is_paused(self) |
| :property: |
| |
| This attribute is deprecated. Please use `airflow.models.DAG.get_is_paused` method. |
| |
| |
| .. py:method:: normalized_schedule_interval(self) -> Optional[ScheduleInterval] |
| :property: |
| |
| |
| .. py:method:: handle_callback(self, dagrun, success=True, reason=None, session=None) |
| |
| Triggers the appropriate callback depending on the value of success, namely the |
| on_failure_callback or on_success_callback. This method gets the context of a |
| single TaskInstance that is part of this DagRun and passes that to the callable |
| along with a 'reason', primarily to differentiate DagRun failures. |
| |
| .. note:: The logs end up in |
| ``$AIRFLOW_HOME/logs/scheduler/latest/PROJECT/DAG_FILE.py.log`` |
| |
| :param dagrun: DagRun object |
| :param success: Flag to specify if failure or success callback should be called |
| :param reason: Completion reason |
| :param session: Database session |
| |
| |
| .. py:method:: get_active_runs(self) |
| |
| Returns a list of execution dates of the currently running dag runs |
| |
| :return: List of execution dates |
| |
| |
| .. py:method:: get_num_active_runs(self, external_trigger=None, only_running=True, session=None) |
| |
| Returns the number of active "running" dag runs |
| |
| :param external_trigger: True for externally triggered active dag runs |
| :type external_trigger: bool |
| :param session: |
| :return: the number of active dag runs |
| |
| |
| .. py:method:: get_dagrun(self, execution_date: Optional[str] = None, run_id: Optional[str] = None, session: Optional[sqlalchemy.orm.session.Session] = None) |
| |
| Returns the dag run for a given execution date or run_id if it exists, otherwise |
| None. |
| |
| :param execution_date: The execution date of the DagRun to find. |
| :param run_id: The run_id of the DagRun to find. |
| :param session: |
| :return: The DagRun if found, otherwise None. |
| |
| |
| .. py:method:: get_dagruns_between(self, start_date, end_date, session=None) |
| |
| Returns the list of dag runs between start_date (inclusive) and end_date (inclusive). |
| |
| :param start_date: The starting execution date of the DagRun to find. |
| :param end_date: The ending execution date of the DagRun to find. |
| :param session: |
| :return: The list of DagRuns found. |
| |
| |
| .. py:method:: get_latest_execution_date(self, session: sqlalchemy.orm.session.Session) -> Optional[datetime.datetime] |
| |
| Returns the latest date for which at least one dag run exists |
| |
| |
| .. py:method:: latest_execution_date(self) |
| :property: |
| |
| This attribute is deprecated. Please use `airflow.models.DAG.get_latest_execution_date` method. |
| |
| |
| .. py:method:: subdags(self) |
| :property: |
| |
| Returns a list of the subdag objects associated with this DAG |
| |
| |
| .. py:method:: resolve_template_files(self) |
| |
| |
| .. py:method:: get_template_env(self) -> jinja2.Environment |
| |
| Build a Jinja2 environment. |
| |
| |
| .. py:method:: set_dependency(self, upstream_task_id, downstream_task_id) |
| |
| Simple utility method to set dependency between two tasks that |
| have already been added to the DAG using ``add_task()`` |
| |
| |
| .. py:method:: get_task_instances_before(self, base_date: datetime.datetime, num: int, *, session: sqlalchemy.orm.session.Session) -> List[airflow.models.taskinstance.TaskInstance] |
| |
| Get ``num`` task instances before (including) ``base_date``. |
| |
| The returned list may contain exactly ``num`` task instances. It can |
| have fewer if there are fewer than ``num`` scheduled DAG runs before |
| ``base_date``, or more if there are manual task runs within the |
| requested period; those do not count toward ``num``. |
| |
| |
| .. py:method:: get_task_instances(self, start_date=None, end_date=None, state=None, session=None) -> List[airflow.models.taskinstance.TaskInstance] |
| |
| |
| .. py:method:: set_task_instance_state(self, task_id: str, execution_date: datetime.datetime, state: airflow.utils.state.State, upstream: Optional[bool] = False, downstream: Optional[bool] = False, future: Optional[bool] = False, past: Optional[bool] = False, commit: Optional[bool] = True, session=None) -> List[airflow.models.taskinstance.TaskInstance] |
| |
| Set the state of a TaskInstance to the given state, and clear its downstream tasks that are |
| in failed or upstream_failed state. |
| |
| :param task_id: Task ID of the TaskInstance |
| :type task_id: str |
| :param execution_date: execution_date of the TaskInstance |
| :type execution_date: datetime |
| :param state: State to set the TaskInstance to |
| :type state: State |
| :param upstream: Include all upstream tasks of the given task_id |
| :type upstream: bool |
| :param downstream: Include all downstream tasks of the given task_id |
| :type downstream: bool |
| :param future: Include all future TaskInstances of the given task_id |
| :type future: bool |
| :param commit: Commit changes |
| :type commit: bool |
| :param past: Include all past TaskInstances of the given task_id |
| :type past: bool |
| |
| |
| .. py:method:: roots(self) -> List[airflow.models.baseoperator.BaseOperator] |
| :property: |
| |
| Return nodes with no parents. These are first to execute and are called roots or root nodes. |
| |
| |
| .. py:method:: leaves(self) -> List[airflow.models.baseoperator.BaseOperator] |
| :property: |
| |
| Return nodes with no children. These are last to execute and are called leaves or leaf nodes. |
| |
| |
| .. py:method:: topological_sort(self, include_subdag_tasks: bool = False) |
| |
| Sorts tasks in topological order, such that a task comes after any of its |
| upstream dependencies. |
| |
| Heavily inspired by: |
| http://blog.jupo.org/2012/04/06/topological-sorting-acyclic-directed-graphs/ |
| |
| :param include_subdag_tasks: whether to include tasks in subdags, defaults to False |
| :return: list of tasks in topological order |
| |
| |
| .. py:method:: set_dag_runs_state(self, state: str = State.RUNNING, session: sqlalchemy.orm.session.Session = None, start_date: Optional[datetime.datetime] = None, end_date: Optional[datetime.datetime] = None, dag_ids: List[str] = None) -> None |
| |
| |
| .. py:method:: clear(self, task_ids=None, start_date=None, end_date=None, only_failed=False, only_running=False, confirm_prompt=False, include_subdags=True, include_parentdag=True, dag_run_state: airflow.utils.state.DagRunState = DagRunState.QUEUED, dry_run=False, session=None, get_tis=False, recursion_depth=0, max_recursion_depth=None, dag_bag=None, exclude_task_ids: FrozenSet[str] = frozenset({})) |
| |
| Clears a set of task instances associated with the current dag for |
| a specified date range. |
| |
| :param task_ids: List of task ids to clear |
| :type task_ids: List[str] |
| :param start_date: The minimum execution_date to clear |
| :type start_date: datetime.datetime or None |
| :param end_date: The maximum execution_date to clear |
| :type end_date: datetime.datetime or None |
| :param only_failed: Only clear failed tasks |
| :type only_failed: bool |
| :param only_running: Only clear running tasks. |
| :type only_running: bool |
| :param confirm_prompt: Ask for confirmation |
| :type confirm_prompt: bool |
| :param include_subdags: Clear tasks in subdags and clear external tasks |
| indicated by ExternalTaskMarker |
| :type include_subdags: bool |
| :param include_parentdag: Clear tasks in the parent dag of the subdag. |
| :type include_parentdag: bool |
| :param dag_run_state: state to set DagRun to. If set to False, dagrun state will not |
| be changed. |
| :param dry_run: Find the tasks to clear but don't clear them. |
| :type dry_run: bool |
| :param session: The sqlalchemy session to use |
| :type session: sqlalchemy.orm.session.Session |
| :param dag_bag: The DagBag used to find the dags subdags (Optional) |
| :type dag_bag: airflow.models.dagbag.DagBag |
| :param exclude_task_ids: A set of ``task_id`` that should not be cleared |
| :type exclude_task_ids: frozenset |
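| |
| A dry-run sketch (assuming ``dag`` is defined; dates are illustrative) that |
| previews which failed task instances would be cleared without changing state: |
| |
| .. code-block:: python |
| |
|     import datetime |
| |
|     tis = dag.clear( |
|         start_date=datetime.datetime(2021, 6, 1), |
|         end_date=datetime.datetime(2021, 6, 2), |
|         only_failed=True, |
|         dry_run=True,  # find the matching task instances but don't clear them |
|     ) |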
| |
| |
| .. py:method:: clear_dags(cls, dags, start_date=None, end_date=None, only_failed=False, only_running=False, confirm_prompt=False, include_subdags=True, include_parentdag=False, dag_run_state=DagRunState.QUEUED, dry_run=False) |
| :classmethod: |
| |
| |
| .. py:method:: __deepcopy__(self, memo) |
| |
| |
| .. py:method:: sub_dag(self, *args, **kwargs) |
| |
| This method is deprecated in favor of ``partial_subset``. |
| |
| |
| .. py:method:: partial_subset(self, task_ids_or_regex: Union[str, airflow.typing_compat.RePatternType, Iterable[str]], include_downstream=False, include_upstream=True, include_direct_upstream=False) |
| |
| Returns a subset of the current dag, as a deep copy of it, based on a regex |
| that should match one or many tasks; upstream and downstream neighbours are |
| included based on the flags passed. |
| |
| :param task_ids_or_regex: Either a list of task_ids, or a regex to |
| match against task ids (as a string, or compiled regex pattern). |
| :type task_ids_or_regex: [str] or str or re.Pattern |
| :param include_downstream: Include all downstream tasks of matched |
| tasks, in addition to matched tasks. |
| :param include_upstream: Include all upstream tasks of matched tasks, |
| in addition to matched tasks. |
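| |
| A sketch (assuming ``dag`` is defined and its task ids follow an |
| ``extract_*`` naming scheme): |
| |
| .. code-block:: python |
| |
|     subset = dag.partial_subset( |
|         task_ids_or_regex=r"^extract_", |
|         include_upstream=False, |
|         include_downstream=True, |
|     ) |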
| |
| |
| .. py:method:: has_task(self, task_id: str) |
| |
| |
| .. py:method:: get_task(self, task_id: str, include_subdags: bool = False) -> airflow.models.baseoperator.BaseOperator |
| |
| |
| .. py:method:: pickle_info(self) |
| |
| |
| .. py:method:: pickle(self, session=None) -> airflow.models.dagpickle.DagPickle |
| |
| |
| .. py:method:: tree_view(self) -> None |
| |
| Print an ASCII tree representation of the DAG. |
| |
| |
| .. py:method:: task(self) |
| :property: |
| |
| |
| .. py:method:: add_task(self, task) |
| |
| Add a task to the DAG |
| |
| :param task: the task you want to add |
| :type task: task |
| |
| |
| .. py:method:: add_tasks(self, tasks) |
| |
| Add a list of tasks to the DAG |
| |
| :param tasks: a list of tasks you want to add |
| :type tasks: list of tasks |
| |
| |
| .. py:method:: run(self, start_date=None, end_date=None, mark_success=False, local=False, executor=None, donot_pickle=conf.getboolean('core', 'donot_pickle'), ignore_task_deps=False, ignore_first_depends_on_past=True, pool=None, delay_on_limit_secs=1.0, verbose=False, conf=None, rerun_failed_tasks=False, run_backwards=False, run_at_least_once=False) |
| |
| Runs the DAG. |
| |
| :param start_date: the start date of the range to run |
| :type start_date: datetime.datetime |
| :param end_date: the end date of the range to run |
| :type end_date: datetime.datetime |
| :param mark_success: True to mark jobs as succeeded without running them |
| :type mark_success: bool |
| :param local: True to run the tasks using the LocalExecutor |
| :type local: bool |
| :param executor: The executor instance to run the tasks |
| :type executor: airflow.executor.base_executor.BaseExecutor |
| :param donot_pickle: True to avoid pickling the DAG object and sending it to workers |
| :type donot_pickle: bool |
| :param ignore_task_deps: True to skip upstream tasks |
| :type ignore_task_deps: bool |
| :param ignore_first_depends_on_past: True to ignore depends_on_past |
| dependencies for the first set of tasks only |
| :type ignore_first_depends_on_past: bool |
| :param pool: Resource pool to use |
| :type pool: str |
| :param delay_on_limit_secs: Time in seconds to wait before next attempt to run |
| dag run when max_active_runs limit has been reached |
| :type delay_on_limit_secs: float |
| :param verbose: Make logging output more verbose |
| :type verbose: bool |
| :param conf: user defined dictionary passed from CLI |
| :type conf: dict |
| :param rerun_failed_tasks: |
| :type rerun_failed_tasks: bool |
| :param run_backwards: |
| :type run_backwards: bool |
| :param run_at_least_once: If true, always run the DAG at least once even |
| if no logical run exists within the time range. |
| :type run_at_least_once: bool |
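| |
| A backfill-style sketch (assuming ``dag`` is defined; dates are illustrative): |
| |
| .. code-block:: python |
| |
|     import datetime |
| |
|     dag.run( |
|         start_date=datetime.datetime(2021, 6, 1), |
|         end_date=datetime.datetime(2021, 6, 7), |
|         local=True,  # run the tasks with the LocalExecutor |
|         rerun_failed_tasks=True, |
|     ) |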
| |
| |
| .. py:method:: cli(self) |
| |
| Exposes a CLI specific to this DAG |
| |
| |
| .. py:method:: create_dagrun(self, state: airflow.utils.state.DagRunState, execution_date: Optional[datetime.datetime] = None, run_id: Optional[str] = None, start_date: Optional[datetime.datetime] = None, external_trigger: Optional[bool] = False, conf: Optional[dict] = None, run_type: Optional[airflow.utils.types.DagRunType] = None, session=None, dag_hash: Optional[str] = None, creating_job_id: Optional[int] = None, data_interval: Optional[Tuple[datetime.datetime, datetime.datetime]] = None) |
| |
| Creates a dag run from this dag including the tasks associated with this dag. |
| Returns the dag run. |
| |
| :param run_id: defines the run id for this dag run |
| :type run_id: str |
| :param run_type: type of DagRun |
| :type run_type: airflow.utils.types.DagRunType |
| :param execution_date: the execution date of this dag run |
| :type execution_date: datetime.datetime |
| :param state: the state of the dag run |
| :type state: airflow.utils.state.DagRunState |
| :param start_date: the date this dag run should be evaluated |
| :type start_date: datetime |
| :param external_trigger: whether this dag run is externally triggered |
| :type external_trigger: bool |
| :param conf: Dict containing configuration/parameters to pass to the DAG |
| :type conf: dict |
| :param creating_job_id: id of the job creating this DagRun |
| :type creating_job_id: int |
| :param session: database session |
| :type session: sqlalchemy.orm.session.Session |
| :param dag_hash: Hash of Serialized DAG |
| :type dag_hash: str |
| :param data_interval: Data interval of the DagRun |
| :type data_interval: tuple[datetime, datetime] | None |
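| |
| A sketch of creating a manual run (assumes ``dag`` and a ``logical_date`` |
| datetime are defined, and a metadata database is configured): |
| |
| .. code-block:: python |
| |
|     from airflow.utils.state import DagRunState |
|     from airflow.utils.types import DagRunType |
| |
|     run = dag.create_dagrun( |
|         state=DagRunState.QUEUED, |
|         execution_date=logical_date, |
|         run_type=DagRunType.MANUAL, |
|         external_trigger=True, |
|     ) |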
| |
| |
| .. py:method:: bulk_sync_to_db(cls, dags: Collection[DAG], session=None) |
| :classmethod: |
| |
| This method is deprecated in favor of ``bulk_write_to_db``. |
| |
| |
| .. py:method:: bulk_write_to_db(cls, dags: Collection[DAG], session=None) |
| :classmethod: |
| |
| Ensure the DagModel rows for the given dags are up-to-date in the dag table in the DB, including |
| calculated fields. |
| |
| Note that this method can be called for both DAGs and SubDAGs. A SubDag is actually a SubDagOperator. |
| |
| :param dags: the DAG objects to save to the DB |
| :type dags: List[airflow.models.dag.DAG] |
| :return: None |
| |
| |
| .. py:method:: sync_to_db(self, session=None) |
| |
| Save attributes about this DAG to the DB. Note that this method |
| can be called for both DAGs and SubDAGs. A SubDag is actually a |
| SubDagOperator. |
| |
| :return: None |
| |
| |
| .. py:method:: get_default_view(self) |
| |
| This only exists for backward-compatible Jinja2 templates. |
| |
| |
| .. py:method:: deactivate_unknown_dags(active_dag_ids, session=None) |
| :staticmethod: |
| |
| Given a list of known DAGs, deactivate any other DAGs that are |
| marked as active in the ORM |
| |
| :param active_dag_ids: list of DAG IDs that are active |
| :type active_dag_ids: list[unicode] |
| :return: None |
| |
| |
| .. py:method:: deactivate_stale_dags(expiration_date, session=None) |
| :staticmethod: |
| |
| Deactivate any DAGs that were last touched by the scheduler before |
| the expiration date. These DAGs were likely deleted. |
| |
| :param expiration_date: set inactive DAGs that were touched before this |
| time |
| :type expiration_date: datetime |
| :return: None |
| |
| |
| .. py:method:: get_num_task_instances(dag_id, task_ids=None, states=None, session=None) |
| :staticmethod: |
| |
| Returns the number of task instances in the given DAG. |
| |
| :param session: ORM session |
| :param dag_id: ID of the DAG to get the task concurrency of |
| :type dag_id: unicode |
| :param task_ids: A list of valid task IDs for the given DAG |
| :type task_ids: list[unicode] |
| :param states: A list of states to filter by if supplied |
| :type states: list[state] |
| :return: The number of matching task instances |
| :rtype: int |
| |
| |
| .. py:method:: get_serialized_fields(cls) |
| :classmethod: |
| |
| Stringified DAGs and operators contain exactly these fields. |
| |
| |
| .. py:method:: get_edge_info(self, upstream_task_id: str, downstream_task_id: str) -> airflow.utils.types.EdgeInfoType |
| |
| Returns edge information for the given pair of tasks if present, and |
| None if there is no information. |
| |
| |
| .. py:method:: set_edge_info(self, upstream_task_id: str, downstream_task_id: str, info: airflow.utils.types.EdgeInfoType) |
| |
| Sets the given edge information on the DAG. Note that this will overwrite, |
| rather than merge with, existing info. |
| |
| |
| .. py:method:: validate_schedule_and_params(self) |
| |
| Validates the DAG and raises an exception if any Param in the DAG neither has a default |
| value nor allows null in its ``schema['type']`` list, while the DAG has a ``schedule_interval`` that is not None. |
| |
| |
| |
| .. py:class:: DagTag |
| |
| Bases: :py:obj:`airflow.models.base.Base` |
| |
| A tag name per dag, to allow quick filtering in the DAG view. |
| |
| .. py:attribute:: __tablename__ |
| :annotation: = dag_tag |
| |
| |
| |
| .. py:attribute:: name |
| |
| |
| |
| |
| .. py:attribute:: dag_id |
| |
| |
| |
| |
| .. py:method:: __repr__(self) |
| |
| Return repr(self). |
| |
| |
| |
| .. py:class:: DagModel(concurrency=None, **kwargs) |
| |
| Bases: :py:obj:`airflow.models.base.Base` |
| |
| Table containing DAG properties |
| |
| .. py:attribute:: __tablename__ |
| :annotation: = dag |
| |
| These items are stored in the database for state-related information |
| |
| |
| .. py:attribute:: dag_id |
| |
| |
| |
| |
| .. py:attribute:: root_dag_id |
| |
| |
| |
| |
| .. py:attribute:: is_paused_at_creation |
| |
| |
| |
| |
| .. py:attribute:: is_paused |
| |
| |
| |
| |
| .. py:attribute:: is_subdag |
| |
| |
| |
| |
| .. py:attribute:: is_active |
| |
| |
| |
| |
| .. py:attribute:: last_parsed_time |
| |
| |
| |
| |
| .. py:attribute:: last_pickled |
| |
| |
| |
| |
| .. py:attribute:: last_expired |
| |
| |
| |
| |
| .. py:attribute:: scheduler_lock |
| |
| |
| |
| |
| .. py:attribute:: pickle_id |
| |
| |
| |
| |
| .. py:attribute:: fileloc |
| |
| |
| |
| |
| .. py:attribute:: owners |
| |
| |
| |
| |
| .. py:attribute:: description |
| |
| |
| |
| |
| .. py:attribute:: default_view |
| |
| |
| |
| |
| .. py:attribute:: schedule_interval |
| |
| |
| |
| |
| .. py:attribute:: tags |
| |
| |
| |
| |
| .. py:attribute:: max_active_tasks |
| |
| |
| |
| |
| .. py:attribute:: max_active_runs |
| |
| |
| |
| |
| .. py:attribute:: has_task_concurrency_limits |
| |
| |
| |
| |
| .. py:attribute:: has_import_errors |
| |
| |
| |
| |
| .. py:attribute:: next_dagrun |
| |
| |
| |
| |
| .. py:attribute:: next_dagrun_data_interval_start |
| |
| |
| |
| |
| .. py:attribute:: next_dagrun_data_interval_end |
| |
| |
| |
| |
| .. py:attribute:: next_dagrun_create_after |
| |
| |
| |
| |
| .. py:attribute:: __table_args__ |
| |
| |
| |
| |
| .. py:attribute:: parent_dag |
| |
| |
| |
| |
| .. py:attribute:: NUM_DAGS_PER_DAGRUN_QUERY |
| |
| |
| |
| |
| .. py:method:: __repr__(self) |
| |
| Return repr(self). |
| |
| |
| .. py:method:: next_dagrun_data_interval(self) -> Optional[airflow.timetables.base.DataInterval] |
| :property: |
| |
| |
| .. py:method:: timezone(self) |
| :property: |
| |
| |
| .. py:method:: get_dagmodel(dag_id, session=None) |
| :staticmethod: |
| |
| |
| .. py:method:: get_current(cls, dag_id, session=None) |
| :classmethod: |
| |
| |
| .. py:method:: get_last_dagrun(self, session=None, include_externally_triggered=False) |
| |
| |
| .. py:method:: get_paused_dag_ids(dag_ids: List[str], session: sqlalchemy.orm.session.Session = None) -> Set[str] |
| :staticmethod: |
| |
| Given a list of dag_ids, get a set of paused dag_ids. |
| |
| :param dag_ids: List of dag_ids |
| :param session: ORM Session |
| :return: Set of paused dag_ids |
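| |
| A usage sketch (dag ids are placeholders): |
| |
| .. code-block:: python |
| |
|     from airflow.models.dag import DagModel |
|     from airflow.utils.session import create_session |
| |
|     with create_session() as session: |
|         paused = DagModel.get_paused_dag_ids(["dag_a", "dag_b"], session=session) |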
| |
| |
| .. py:method:: get_default_view(self) -> str |
| |
| Get the default DAG view; returns the default config value if DagModel does |
| not have a value. |
| |
| |
| .. py:method:: safe_dag_id(self) |
| :property: |
| |
| |
| .. py:method:: relative_fileloc(self) -> Optional[pathlib.Path] |
| :property: |
| |
| File location of the importable dag 'file' relative to the configured DAGs folder. |
| |
| |
| .. py:method:: set_is_paused(self, is_paused: bool, including_subdags: bool = True, session=None) -> None |
| |
| Pause/Un-pause a DAG. |
| |
| :param is_paused: Is the DAG paused |
| :param including_subdags: whether to include the DAG's subdags |
| :param session: session |
| |
| |
| .. py:method:: deactivate_deleted_dags(cls, alive_dag_filelocs: List[str], session=None) |
| :classmethod: |
| |
| Set ``is_active=False`` on the DAGs for which the DAG files have been removed. |
| |
| :param alive_dag_filelocs: file paths of alive DAGs |
| :param session: ORM Session |
| |
| |
| .. py:method:: dags_needing_dagruns(cls, session: sqlalchemy.orm.session.Session) |
| :classmethod: |
| |
| Return (and lock) a list of Dag objects that are due to create a new DagRun. |
| |
| This will return a resultset of rows that is row-level-locked with a "SELECT ... FOR UPDATE" query, |
| so you should ensure that any scheduling decisions are made in a single transaction; as soon as |
| the transaction is committed it will be unlocked. |
| |
| |
| .. py:method:: calculate_dagrun_date_fields(self, dag: DAG, most_recent_dag_run: Union[None, datetime.datetime, airflow.timetables.base.DataInterval]) -> None |
| |
| Calculate ``next_dagrun`` and ``next_dagrun_create_after``. |
| |
| :param dag: The DAG object |
| :param most_recent_dag_run: DataInterval (or datetime) of the most recent run of this dag, or None |
| if not yet scheduled. |
| |
| |
| |
| .. py:function:: dag(*dag_args, **dag_kwargs) |
| |
| Python dag decorator that wraps a function into an Airflow DAG. |
| Accepts kwargs that are forwarded to the DAG object; can be used to parameterize DAGs. |
| |
| :param dag_args: Arguments for DAG object |
| :type dag_args: Any |
| :param dag_kwargs: Kwargs for DAG object. |
| :type dag_kwargs: Any |
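| |
| A minimal TaskFlow-style sketch, assuming the ``task`` decorator from |
| ``airflow.decorators``: |
| |
| .. code-block:: python |
| |
|     import pendulum |
| |
|     from airflow.decorators import task |
|     from airflow.models.dag import dag |
| |
|     @dag(schedule_interval="@daily", start_date=pendulum.datetime(2021, 1, 1, tz="UTC")) |
|     def example_pipeline(): |
|         @task |
|         def hello(): |
|             print("hello") |
| |
|         hello() |
| |
|     example_dag = example_pipeline()  # calling the decorated function creates the DAG |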
| |
| |
| .. py:class:: DagContext |
| |
| DagContext is used to keep track of the current DAG when a DAG is used as a context manager. |
| |
| You can use DAG as context: |
| |
| .. code-block:: python |
| |
|     with DAG( |
|         dag_id="example_dag", |
|         default_args=default_args, |
|         schedule_interval="0 0 * * *", |
|         dagrun_timeout=timedelta(minutes=60), |
|     ) as dag: |
|         ... |
| |
| If you do this, the context stores the DAG and, whenever a new task is created, |
| it will use the stored DAG as the parent DAG. |
| |
| |
| .. py:method:: push_context_managed_dag(cls, dag: DAG) |
| :classmethod: |
| |
| |
| .. py:method:: pop_context_managed_dag(cls) -> Optional[DAG] |
| :classmethod: |
| |
| |
| .. py:method:: get_current_dag(cls) -> Optional[DAG] |
| :classmethod: |
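| |
| A sketch of how the context stack behaves around a ``with DAG(...)`` block: |
| |
| .. code-block:: python |
| |
|     from airflow.models.dag import DAG, DagContext |
| |
|     with DAG(dag_id="example_context") as dag: |
|         assert DagContext.get_current_dag() is dag |
| |
|     assert DagContext.get_current_dag() is None  # the stack is popped on exit |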
| |
| |
| |