Apache Airflow

Apache Airflow (or simply Airflow) is a platform to programmatically author, schedule, and monitor workflows.

When workflows are defined as code, they become more maintainable, versionable, testable, and collaborative.

Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The Airflow scheduler executes your tasks on an array of workers while following the specified dependencies. Rich command line utilities make performing complex surgeries on DAGs a snap. The rich user interface makes it easy to visualize pipelines running in production, monitor progress, and troubleshoot issues when needed.
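
For example, here is a minimal sketch of a DAG with two dependent tasks, assuming Airflow 1.10.x import paths (the dag_id, schedule, and bash commands are illustrative):

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator

    # Arguments applied to every task in the DAG unless overridden.
    default_args = {
        "owner": "airflow",
        "retries": 1,
        "retry_delay": timedelta(minutes=5),
    }

    with DAG(
        dag_id="example_etl",
        default_args=default_args,
        start_date=datetime(2020, 1, 1),
        schedule_interval="@daily",  # one DAG run per day
    ) as dag:
        extract = BashOperator(task_id="extract", bash_command="echo extracting")
        transform = BashOperator(task_id="transform", bash_command="echo transforming")

        # The >> operator declares the dependency: extract runs before transform.
        extract >> transform

The scheduler creates one DAG run per schedule interval and dispatches each task to a worker once its upstream dependencies have succeeded.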

Table of contents

  • Requirements
  • Getting started
  • Beyond the Horizon
  • Principles
  • User Interface

Requirements

Apache Airflow is tested with:

Master version (2.0.0dev)

  • Python versions: 3.6, 3.7
  • Postgres DB: 9.6, 10
  • MySQL DB: 5.7
  • SQLite: latest stable (used mainly for development purposes)

Stable version (1.10.9)

  • Python versions: 2.7, 3.5, 3.6, 3.7
  • Postgres DB: 9.6, 10
  • MySQL DB: 5.6, 5.7
  • SQLite: latest stable (used mainly for development purposes)

Getting started

Please visit the Airflow Platform documentation (latest stable release) for help with installing Airflow, getting started quickly, or working through a more complete tutorial.

Documentation for GitHub master (the latest development branch) is available at ReadTheDocs Documentation.

For further information, please visit the Airflow Wiki.

Beyond the Horizon

Airflow is not a data streaming solution. Tasks do not move data from one task to another (though tasks can exchange metadata!). Airflow is not in the Spark Streaming or Storm space; it is more comparable to Oozie or Azkaban.
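
That metadata exchange happens through Airflow's XCom mechanism, which is intended for small values rather than bulk data. A minimal sketch, again assuming Airflow 1.10.x import paths (the DAG id, task ids, and the row_count key are illustrative):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python_operator import PythonOperator

    def push_metadata(**context):
        # Push a small piece of metadata (not bulk data) for downstream tasks.
        context["ti"].xcom_push(key="row_count", value=42)

    def pull_metadata(**context):
        # Pull the value the upstream task pushed.
        row_count = context["ti"].xcom_pull(task_ids="push_task", key="row_count")
        print("upstream reported %s rows" % row_count)

    with DAG(
        dag_id="example_xcom",
        start_date=datetime(2020, 1, 1),
        schedule_interval=None,  # triggered manually
    ) as dag:
        push_task = PythonOperator(
            task_id="push_task",
            python_callable=push_metadata,
            provide_context=True,  # needed on 1.10.x so **context is passed in
        )
        pull_task = PythonOperator(
            task_id="pull_task",
            python_callable=pull_metadata,
            provide_context=True,
        )
        push_task >> pull_task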

Workflows are expected to be mostly static or slowly changing. You can think of the structure of the tasks in your workflow as slightly more dynamic than a database structure would be. Airflow workflows are expected to look similar from one run to the next; this allows for clarity around the unit of work and continuity.

Principles

  • Dynamic: Airflow pipelines are configuration as code (Python), allowing you to write code that instantiates pipelines dynamically (see the first sketch after this list).
  • Extensible: Easily define your own operators and executors, and extend the library so that it fits the level of abstraction that suits your environment.
  • Elegant: Airflow pipelines are lean and explicit. Parameterizing your scripts is built into the core of Airflow using the powerful Jinja templating engine (see the second sketch after this list).
  • Scalable: Airflow has a modular architecture and uses a message queue to orchestrate an arbitrary number of workers.
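
To illustrate the Dynamic principle above, here is a minimal sketch of generating tasks in a loop, assuming Airflow 1.10.x import paths (the table names and commands are hypothetical):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator

    # Hypothetical list of tables; in practice this might come from configuration.
    TABLES = ["users", "orders", "payments"]

    with DAG(
        dag_id="example_dynamic",
        start_date=datetime(2020, 1, 1),
        schedule_interval="@daily",
    ) as dag:
        # One export task is instantiated per table when the DAG file is parsed.
        for table in TABLES:
            BashOperator(
                task_id="export_%s" % table,
                bash_command="echo exporting %s" % table,
            )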
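
And for the Elegant principle, a minimal sketch of Jinja templating in a BashOperator command; {{ ds }} and {{ ds_nodash }} are built-in Airflow macros for the execution date, while the echoed paths are hypothetical:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator

    # bash_command is a templated field, so Airflow renders the Jinja macros
    # with the context of each run before the command executes.
    templated_command = """
    echo "processing data for {{ ds }}"
    echo "writing to /tmp/report_{{ ds_nodash }}.txt"
    """

    with DAG(
        dag_id="example_templating",
        start_date=datetime(2020, 1, 1),
        schedule_interval="@daily",
    ) as dag:
        run_report = BashOperator(
            task_id="run_report",
            bash_command=templated_command,
        )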

User Interface

  • DAGs: Overview of all DAGs in your environment.

  • Tree View: Tree representation of a DAG that spans across time.

  • Graph View: Visualization of a DAG's dependencies and their current status for a specific run.

  • Task Duration: Total time spent on different tasks over time.

  • Gantt View: Duration and overlap of the tasks in a DAG run.

  • Code View: Quick way to view source code of a DAG.