
Airflow System Tests

System tests verify the correctness of Airflow Operators by running them in DAGs and allowing them to communicate with external services. A system test tries to look as close to a regular DAG as possible, and it generally checks the “happy path” (a scenario featuring no errors), ensuring that the Operator works as expected.

The purpose of these tests is to:

  • ensure high quality of providers and their integration with Airflow core,
  • avoid regressions in providers when making changes to Airflow,
  • autogenerate documentation for Operators from code,
  • provide runnable example DAGs with use cases for different Operators,
  • serve both as examples and test files.

This is the new design of system tests, which temporarily exists alongside the old one documented in TESTING.rst and will soon replace it completely. The new design is based on AIP-47. Please use it and write any new system tests according to this documentation.
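For illustration, below is a minimal sketch of what a system test file typically looks like under the new design. The dag_id, operator and task are placeholders, and the schedule parameter name may differ slightly between Airflow versions; real tests follow the conventions described in AIP-47 and in the provider's own examples.

# Minimal sketch of a system test DAG (illustrative placeholders only).
import os
from datetime import datetime

from airflow import models
from airflow.operators.bash import BashOperator

# A unique environment id keeps resources created by parallel test runs apart.
ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")
DAG_ID = "example_hello_world"

with models.DAG(
    DAG_ID,
    schedule="@once",  # run once per triggered test
    start_date=datetime(2021, 1, 1),
    catchup=False,
    tags=["example"],
) as dag:
    # The "happy path": exercise the operator under test with valid inputs.
    hello = BashOperator(task_id="hello", bash_command="echo hello")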

How to run system tests

There are multiple ways of running system tests. Each system test is a self-contained DAG, so it can be run like any other DAG. Some tests may require access to external services, enabled APIs or specific permissions. Make sure to prepare your environment correctly, depending on which system tests you want to run - some may require additional configuration, which should be documented by the relevant providers in their subdirectory tests/system/providers/<provider_name>/README.md.

Running via Airflow

If you have a working Airflow environment with a scheduler and a webserver, you can import system test files into your Airflow instance and they will be automatically triggered. If the environment is set up correctly (depending on the type of tests you want to run), they should be executed without any issues. The instructions on how to set up the environment are documented in each provider's system tests directory. Make sure that all resources required by the tests are also imported.

Running via Pytest

Running system tests with pytest is easiest with Breeze. Thanks to it, you don't need to worry about setting up an environment that is able to execute the tests. You can either run them using your IDE (if you have a plugin/widget supporting pytest installed) or using a command like the following example:

# pytest --system [provider_name] [path_to_test(s)]
pytest --system google tests/system/providers/google/cloud/bigquery/example_bigquery_queries.py

You can specify several --system flags if you want to execute tests for several providers:

pytest --system google --system aws tests/system
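What makes these DAG files collectable by pytest is a small hook at the bottom of each test file that wraps the DAG in a test callable. A sketch of that hook is shown below; the get_test_run helper is part of the new design and lives under tests/system/utils, though the exact import path should be treated as an assumption for your checkout.

# Appended at the bottom of a system test file, after the DAG definition
# (sketch; the exact import path may differ in your checkout).
from tests.system.utils import get_test_run  # noqa: E402

# Wraps the DAG defined above in a callable that pytest can discover and run.
test_run = get_test_run(dag)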

Running via Airflow CLI

It is possible to run system tests using the Airflow CLI. To execute a specific system test, you need to provide the dag_id of the test to be run, an execution_date (preferably one from the past), and the -S/--subdir option followed by the path where the tests are stored (by default, the command looks into $AIRFLOW_HOME/dags):

# airflow dags test -S [path_to_tests] [dag_id] [execution date]
airflow dags test -S tests/system bigquery_dataset 2022-01-01

Some additional setup may be required to use the Airflow CLI. Please refer here for documentation.

How to write system tests

If you are going to implement new system tests, it is recommended to familiarize yourself with the content of AIP-47. There are many changes compared to the old design documented in TESTING.rst, so you need to be aware of them and make sure your tests are compliant with the new design.

To make it easier to migrate old system tests or write new ones, we have documented the whole migration process in detail (which can be found here) and also prepared an example of a test (located just below the migration details).
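As a rough orientation until you reach that example, the sketch below shows the pieces the new design adds on top of a regular DAG: a teardown task with trigger_rule="all_done", a watcher task that propagates task failures to the whole test run, and the pytest hook shown earlier. The operators, dag_id and helper import paths here are assumptions used only for illustration.

# Illustrative sketch of the new-design boilerplate (helper paths are assumptions).
import os
from datetime import datetime

from airflow import models
from airflow.operators.bash import BashOperator

from tests.system.utils import get_test_run
from tests.system.utils.watcher import watcher

ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")
DAG_ID = "example_sketch"

with models.DAG(
    DAG_ID,
    schedule="@once",
    start_date=datetime(2021, 1, 1),
    catchup=False,
    tags=["example"],
) as dag:
    create = BashOperator(task_id="create_resource", bash_command="echo create")
    run = BashOperator(task_id="run_operator_under_test", bash_command="echo run")
    # Teardown must run even when an earlier task fails.
    delete = BashOperator(
        task_id="delete_resource",
        bash_command="echo delete",
        trigger_rule="all_done",
    )

    create >> run >> delete

    # Without the watcher, a failed task followed by a successful teardown
    # would let the whole DAG run appear green.
    list(dag.tasks) >> watcher()

# Makes the file runnable with pytest, as described above.
test_run = get_test_run(dag)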