Apache Airflow v2.0.2

Bug Fixes
"""""""""

* Bugfix: ``TypeError`` when Serializing & sorting iterable properties of DAGs (#15395)
* Fix missing ``on_load`` trigger for folder-based plugins (#15208)
* ``kubernetes cleanup-pods`` subcommand will only clean up Airflow-created Pods (#15204)
* Fix password masking in CLI action_logging (#15143)
* Fix url generation for TriggerDagRunOperatorLink (#14990)
* Restore base lineage backend (#14146)
* Fix inability to trigger backfill or manual jobs with the Kubernetes executor (#14160)
* Bugfix: Task docs are not shown in the Task Instance Detail View (#15191)
* Bugfix: Fix overriding ``pod_template_file`` in KubernetesExecutor (#15197)
* Bugfix: resources in ``executor_config`` breaks Graph View in UI (#15199)
* Fix celery executor bug trying to call len on map (#14883)
* Fix bug in airflow.stats timing that broke dogstatsd mode (#15132)
* Avoid scheduler/parser manager deadlock by using non-blocking IO (#15112)
* Re-introduce ``dagrun.schedule_delay`` metric (#15105)
* Compare string values, not object identity, in the Kubernetes executor (#14942)
* Pass queue to BaseExecutor.execute_async like in airflow 1.10 (#14861)
* Scheduler: Remove TIs from starved pools from the critical path. (#14476)
* Remove extra/needless deprecation warnings from airflow.contrib module (#15065)
* Fix support for long dag_id and task_id in KubernetesExecutor (#14703)
* Sort lists, sets and tuples in Serialized DAGs (#14909)
* Simplify cleaning string passed to origin param (#14738) (#14905)
* Fix error when running tasks with Sentry integration enabled. (#13929)
* Webserver: Sanitize string passed to origin param (#14738)
* Fix losing duration < 1 secs in tree (#13537)
* Pin SQLAlchemy to <1.4 due to breakage of sqlalchemy-utils (#14812)
* Fix KubernetesExecutor issue with deleted pending pods (#14810)
* Default to Celery Task model when backend model does not exist (#14612)
* Bugfix: Plugins endpoint was unauthenticated (#14570)
* BugFix: fix DAG doc display (especially for TaskFlow DAGs) (#14564)
* BugFix: TypeError in airflow.kubernetes.pod_launcher's monitor_pod (#14513)
* Bugfix: Fix wrong output of tags and owners in dag detail API endpoint (#14490)
* Fix logging error with task error when JSON logging is enabled (#14456)
* Fix statsd metrics not sending when using daemon mode (#14454)
* Gracefully handle missing start_date and end_date for DagRun (#14452)
* BugFix: Serialize max_retry_delay as a timedelta (#14436)
* Fix crash when user clicks on "Task Instance Details" caused by start_date being None (#14416)
* BugFix: Fix TaskInstance API call fails if a task is removed from running DAG (#14381)
* Scheduler should not fail when invalid ``executor_config`` is passed (#14323)
* Fix bug allowing task instances to survive when dagrun_timeout is exceeded (#14321)
* Fix bug where DAG timezone was not always shown correctly in UI tooltips (#14204)
* Use ``Lax`` for ``cookie_samesite`` when empty string is passed (#14183)
* [AIRFLOW-6076] Fix ``dag.cli()`` KeyError (#13647)
* Fix running child tasks in a subdag after clearing a successful subdag (#14776)

Improvements
""""""""""""

* Remove unused JS packages causing false security alerts (#15383)
* Change default of ``[kubernetes] enable_tcp_keepalive`` for new installs to ``True`` (#15338)
* Fixed #14270: Add error message in OOM situations (#15207)
* Better compatibility/diagnostics for arbitrary UID in docker image (#15162)
* Update 3.6 limits for latest versions of a few libraries (#15209)
* Add Blinker dependency which is missing after recent changes (#15182)
* Remove 'conf' from search_columns in DagRun View (#15099)
* Use a more appropriate default value for namespace in the K8S cleanup-pods CLI (#15060)
* Faster default role syncing during webserver start (#15017)
* Speed up webserver start when there are many DAGs (#14993)
* Much easier to use and better documented Docker image (#14911)
* Use ``libyaml`` C library when available. (#14577)
* Don't create unittest.cfg when not running in unit test mode (#14420)
* Webserver: Allow Filtering TaskInstances by queued_dttm (#14708)
* Update Flask-AppBuilder dependency to allow 3.2 (and all 3.x series) (#14665)
* Remember expanded task groups in browser local storage (#14661)
* Add plain format output to cli tables (#14546)
* Make ``airflow dags show`` command display TaskGroups (#14269)
* Increase maximum size of ``extra`` connection field. (#12944)
* Speed up clear_task_instances by doing a single sql delete for TaskReschedule (#14048)
* Add more flexibility with FAB menu links (#13903)
* Add better description and guidance in case of sqlite version mismatch (#14209)

Doc only changes
""""""""""""""""

* Add documentation for creating/updating community providers (#15061)
* Fix mistake and typos in airflow.utils.timezone docstrings (#15180)
* Replace new URL for Stable Airflow Docs (#15169)
* Docs: Clarify behavior of delete_worker_pods_on_failure (#14958)
* Create a documentation package for Docker image (#14846)
* Multiple minor doc (OpenAPI) fixes (#14917)
* Replace Graph View Screenshot to show Auto-refresh (#14571)

Misc/Internal
"""""""""""""

* Import Connection lazily in hooks to avoid cycles (#15361)
* Rename ``last_scheduler_run`` to ``last_parsed_time``, and ensure it is updated in the DB (#14581)
* Make ``TaskInstance.pool_slots`` not nullable, with a default of 1 (#14406)
* Log migrations info in consistent way (#14158)

Apache Airflow

Apache Airflow (or simply Airflow) is a platform to programmatically author, schedule, and monitor workflows.

When workflows are defined as code, they become more maintainable, versionable, testable, and collaborative.

Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The Airflow scheduler executes your tasks on an array of workers while following the specified dependencies. Rich command line utilities make performing complex surgeries on DAGs a snap. The rich user interface makes it easy to visualize pipelines running in production, monitor progress, and troubleshoot issues when needed.
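
For illustration, here is a minimal sketch of what a DAG definition looks like in Python (the DAG id, dates, and echo commands below are placeholders for this example, not anything shipped with the release):

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # A DAG groups tasks and defines when and how often they run.
    with DAG(
        dag_id="example_dag",  # placeholder name
        start_date=datetime(2021, 1, 1),
        schedule_interval="@daily",
        default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
    ) as dag:
        extract = BashOperator(task_id="extract", bash_command="echo extract")
        load = BashOperator(task_id="load", bash_command="echo load")

        # The >> operator declares the dependency: extract runs before load.
        extract >> load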

Project Focus

Airflow works best with workflows that are mostly static and slowly changing. When DAG structure is similar from one run to the next, it allows for clarity around unit of work and continuity. Other similar projects include Luigi, Oozie and Azkaban.

Airflow is commonly used to process data, but has the opinion that tasks should ideally be idempotent (i.e. results of the task will be the same, and will not create duplicated data in a destination system), and should not pass large quantities of data from one task to the next (though tasks can pass metadata using Airflow's Xcom feature). For high-volume, data-intensive tasks, a best practice is to delegate to external services that specialize on that type of work.
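
As a sketch of that "pass metadata, not data" guidance, a task can return a small value (for example a file URI) that a downstream task picks up via XCom. This uses the TaskFlow API available since Airflow 2.0; the DAG name and URI here are illustrative only:

    from datetime import datetime

    from airflow.decorators import dag, task

    @dag(schedule_interval=None, start_date=datetime(2021, 1, 1), catchup=False)
    def metadata_example():  # illustrative DAG name
        @task
        def produce_path() -> str:
            # Return only a small piece of metadata, not the data itself.
            return "s3://my-bucket/exports/2021-04-16.csv"  # placeholder URI

        @task
        def process(path: str):
            # Heavy processing would be delegated to an external service here.
            print(f"processing {path}")

        process(produce_path())

    example = metadata_example()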

Airflow is not a streaming solution, but it is often used to process real-time data, pulling data off streams in batches.

Principles

  • Dynamic: Airflow pipelines are configuration as code (Python), allowing for dynamic pipeline generation: you can write code that instantiates pipelines dynamically (see the sketch after this list).
  • Extensible: Easily define your own operators, executors and extend the library so that it fits the level of abstraction that suits your environment.
  • Elegant: Airflow pipelines are lean and explicit. Parameterizing your scripts is built into the core of Airflow using the powerful Jinja templating engine.
  • Scalable: Airflow has a modular architecture and uses a message queue to orchestrate an arbitrary number of workers.
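
As a small sketch of the "Dynamic" principle above, tasks (or whole DAGs) can be generated from ordinary Python data; the table names here are placeholders:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="dynamic_example",  # placeholder name
        start_date=datetime(2021, 1, 1),
        schedule_interval="@daily",
    ) as dag:
        # One export task per table, generated in a plain Python loop.
        for table in ["users", "orders", "invoices"]:  # placeholder table names
            BashOperator(
                task_id=f"export_{table}",
                bash_command=f"echo exporting {table}",
            )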

Requirements

Apache Airflow is tested with:

|            | Master version (dev) | Stable version (2.0.1) | Previous version (1.10.14) |
|------------|----------------------|------------------------|----------------------------|
| Python     | 3.6, 3.7, 3.8        | 3.6, 3.7, 3.8          | 2.7, 3.5, 3.6, 3.7, 3.8    |
| PostgreSQL | 9.6, 10, 11, 12, 13  | 9.6, 10, 11, 12, 13    | 9.6, 10, 11, 12, 13        |
| MySQL      | 5.7, 8               | 5.7, 8                 | 5.6, 5.7                   |
| SQLite     | 3.15.0+              | 3.15.0+                | 3.15.0+                    |
| Kubernetes | 1.20, 1.19, 1.18     | 1.20, 1.19, 1.18       | 1.18, 1.17, 1.16           |

Note: MySQL 5.x versions cannot run multiple schedulers, or have limitations with doing so -- please see the “Scheduler” docs. MariaDB is not tested/recommended.

Note: SQLite is used in Airflow tests. Do not use it in production. We recommend using the latest stable version of SQLite for local development.

Support for Python versions

As of Airflow 2.0 we agreed to certain rules we follow for Python support. They are based on the official release schedule of Python, nicely summarized in the Python Developer's Guide.

  1. We finish support for Python versions when they reach EOL (for Python 3.6 this means we will stop supporting it on 23.12.2021).

  2. The “oldest” supported version of Python is the default one. “Default” is only meaningful in terms of “smoke tests” in CI PRs which are run using this default version.

  3. We support a new version of Python after it is officially released, as soon as we manage to make it work in our CI pipeline (which might not be immediate) and release a new version of Airflow (non-patch version) based on this CI set-up.

Additional notes on Python version requirements

  • The previous version requires at least Python 3.5.3 when using Python 3.

Getting started

Visit the official Airflow website documentation (latest stable release) for help with installing Airflow, getting started, or walking through a more complete tutorial.

Note: If you're looking for documentation for the master branch (latest development branch), you can find it on s.apache.org/airflow-docs.

For more information on Airflow's Roadmap or Airflow Improvement Proposals (AIPs), visit the Airflow Wiki.

Official Docker (container) images for Apache Airflow are described in IMAGES.rst.

Installing from PyPI

We publish Apache Airflow as the apache-airflow package on PyPI. Installing it, however, can sometimes be tricky because Airflow is a bit of both a library and an application. Libraries usually keep their dependencies open and applications usually pin them, but we should do neither and both at the same time. We decided to keep our dependencies as open as possible (in setup.py) so users can install different versions of libraries if needed. This means that from time to time plain pip install apache-airflow will not work or will produce an unusable Airflow installation.

In order to have a repeatable installation, we also keep a set of “known-to-be-working” constraint files in the orphan constraints-master, constraints-2-0 and constraints-1-10 branches (introduced in Airflow 1.10.10 and updated in Airflow 1.10.12). We keep those “known-to-be-working” constraint files separately per major/minor Python version. You can use them as constraint files when installing Airflow from PyPI. Note that you have to specify the correct Airflow tag/version/branch and Python version in the URL.

  1. Installing just Airflow:

NOTE!!!

In November 2020, a new version of pip (20.3) was released with a new, 2020 resolver. This resolver might work with Apache Airflow as of 20.3.3, but it might lead to errors in installation, depending on your choice of extras. In order to install Airflow reliably, you might need to either downgrade pip to version 20.2.4 (pip install --upgrade pip==20.2.4) or, if you use pip 20.3, add the option --use-deprecated legacy-resolver to your pip install command. While pip 20.3.3 solved most of the teething problems of 20.3, this note will remain here until we set pip 20.3 as the official version in our CI pipeline, where we also test the installation. Due to those constraints, only pip installation is currently officially supported.

While there have been some successes with using other tools like poetry or pip-tools, they do not share the same workflow as pip, especially when it comes to constraint vs. requirements management. Installing via Poetry or pip-tools is not currently supported.

If you wish to install Airflow using those tools, you should use the constraint files and convert them to the appropriate format and workflow that your tool requires.

pip install apache-airflow==2.0.1 \
 --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.0.1/constraints-3.7.txt"

  2. Installing with extras (for example postgres, google):

pip install apache-airflow[postgres,google]==2.0.1 \
 --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.0.1/constraints-3.7.txt"

For information on installing provider packages check providers.

Official source code

Apache Airflow is an Apache Software Foundation (ASF) project, and our official source code releases are made according to the ASF Release Policy.

Following the ASF rules, the source packages released must be sufficient for a user to build and test the release provided they have access to the appropriate platform and tools.

Convenience packages

There are other ways of installing and using Airflow. Those are “convenience” methods - they are not “official releases” as stated by the ASF Release Policy, but they can be used by users who do not want to build the software themselves.

Those are - in the order of most common ways people install Airflow:

  • PyPI releases to install Airflow using standard pip tool
  • Docker Images to install Airflow via the docker tool, use them in Kubernetes, Helm Charts, docker-compose, docker swarm etc. You can read more about using, customising, and extending the images in the Latest docs, and learn details on the internals in the IMAGES.rst document.
  • Tags in GitHub to retrieve the git project sources that were used to generate official source packages via git

None of those artifacts are official releases, but they are prepared using officially released sources. Some of those artifacts are “development” or “pre-release” ones, and they are clearly marked as such following the ASF Policy.

User Interface

  • DAGs: Overview of all DAGs in your environment.

  • Tree View: Tree representation of a DAG that spans across time.

  • Graph View: Visualization of a DAG's dependencies and their current status for a specific run.

  • Task Duration: Total time spent on different tasks over time.

  • Gantt View: Duration and overlap of a DAG.

  • Code View: Quick way to view source code of a DAG.

Contributing

Want to help build Apache Airflow? Check out our contributing documentation.

Who uses Apache Airflow?

More than 350 organizations are using Apache Airflow in the wild.

Who Maintains Apache Airflow?

Airflow is the work of the community, but the core committers/maintainers are responsible for reviewing and merging PRs as well as steering conversation around new feature requests. If you would like to become a maintainer, please review the Apache Airflow committer requirements.

Can I use the Apache Airflow logo in my presentation?

Yes! Be sure to abide by the Apache Foundation trademark policies and the Apache Airflow Brandbook. The most up-to-date logos are found in this repo and on the Apache Software Foundation website.

Airflow merchandise

If you would love to have Apache Airflow stickers, t-shirts, etc., then check out the Redbubble Shop.

Links