Contributions are welcome and are greatly appreciated! Every little bit helps, and credit will always be given.
Report bugs through Apache Jira.
Please report relevant information and preferably code that exhibits the problem.
Look through the Jira issues for bugs. Anything is open to whoever wants to implement it.
Look through the Apache Jira for features. Any unassigned “Improvement” issue is open to whoever wants to implement it.
We've created the operators, hooks, macros and executors we needed, but we made sure that this part of Airflow is extensible. New operators, hooks, macros and executors are very welcome!
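For orientation, here is a minimal sketch of what a custom operator can look like; the `HelloOperator` class and its `name` parameter are hypothetical, but the pattern of subclassing `BaseOperator` and implementing `execute` is the standard one:

```python
# Hypothetical minimal operator: logs a greeting when its task runs.
from airflow.models import BaseOperator
from airflow.utils.decorators import apply_defaults


class HelloOperator(BaseOperator):
    """Logs a friendly greeting."""

    @apply_defaults
    def __init__(self, name, *args, **kwargs):
        super(HelloOperator, self).__init__(*args, **kwargs)
        self.name = name

    def execute(self, context):
        # execute() is invoked when the task instance runs.
        self.log.info('Hello, %s!', self.name)
```

Hooks follow the same idea: subclass `BaseHook`, keep the connection logic in the hook, and keep the operator thin.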
Airflow could always use better documentation, whether as part of the official Airflow docs, in docstrings, in docs/*.rst, or even on the web as blog posts or articles.
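If you improve docstrings, note that the codebase generally uses Sphinx-style parameter fields, which the documentation build renders; a minimal, hypothetical example (the function itself is made up):

```python
def greet(name, punctuation='!'):
    """
    Build a greeting for ``name``.

    :param name: who to greet
    :type name: str
    :param punctuation: trailing punctuation appended to the greeting
    :type punctuation: str
    :return: the rendered greeting
    :rtype: str
    """
    return 'Hello, {}{}'.format(name, punctuation)
```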
The best way to send feedback is to open an issue on Apache Jira.
If you are proposing a feature:

- Explain in detail how it would work.
- Keep the scope as narrow as possible, to make it easier to implement.
- Remember that this is a volunteer-driven project, and that contributions are welcome :)
The latest API documentation is usually available here. To generate a local version, you need to have set up an Airflow development environment (see below). Also install the `doc` extra.
```bash
pip install -e '.[doc]'
```
Generate the documentation by running:
```bash
cd docs && ./build.sh
```
Only a subset of the API reference documentation builds. Install additional extras to build the full API reference.
There are three ways to set up an Apache Airflow development environment.
First, install Python (2.7.x or 3.4.x), MySQL, and libxml using a system-level package manager such as yum or apt-get on Linux, or Homebrew on macOS. Refer to the base CI Dockerfile for a comprehensive list of required packages.
Then install the Python development requirements. It is usually best to work in a virtualenv:

```bash
cd $AIRFLOW_HOME
virtualenv env
source env/bin/activate
pip install -e '.[devel]'
```
Go to your Airflow directory and start a new Docker container. You can choose between Python 2 or 3, whichever you prefer.
```bash
# Start docker in your Airflow directory
docker run -t -i -v `pwd`:/airflow/ -w /airflow/ python:3 bash

# To install all of airflow's dependencies to run all tests (this is a lot)
pip install -e .

# To run only certain tests install the devel requirements and whatever is required
# for your test. See setup.py for the possible requirements. For example:
pip install -e '.[gcp,devel]'

# Init the database
airflow initdb

nosetests -v tests/hooks/test_druid_hook.py
test_get_first_record (tests.hooks.test_druid_hook.TestDruidDbApiHook) ... ok
test_get_records (tests.hooks.test_druid_hook.TestDruidDbApiHook) ... ok
test_get_uri (tests.hooks.test_druid_hook.TestDruidDbApiHook) ... ok
test_get_conn_url (tests.hooks.test_druid_hook.TestDruidHook) ... ok
test_submit_gone_wrong (tests.hooks.test_druid_hook.TestDruidHook) ... ok
test_submit_ok (tests.hooks.test_druid_hook.TestDruidHook) ... ok
test_submit_timeout (tests.hooks.test_druid_hook.TestDruidHook) ... ok
test_submit_unknown_response (tests.hooks.test_druid_hook.TestDruidHook) ... ok

----------------------------------------------------------------------
Ran 8 tests in 3.036s

OK
```
The Airflow code is mounted inside the Docker container, so if you change something using your favorite IDE, you can directly test it in the container.
Start a Docker container through Compose for development to avoid installing the packages directly on your system. The following will give you a shell inside a container, run all required service containers (MySQL, PostgreSQL, krb5 and so on) and install all the dependencies:
```bash
docker-compose -f scripts/ci/docker-compose.yml run airflow-testing bash
# From the container
export TOX_ENV=py27-backend_mysql-env_docker
/app/scripts/ci/run-ci.sh
```
If you wish to run individual tests inside the Docker environment, you can do so as follows:
```bash
# From the container (with your desired environment) with druid hook
export TOX_ENV=py27-backend_mysql-env_docker
/app/scripts/ci/run-ci.sh -- tests/hooks/test_druid_hook.py
```
To run tests locally, once your unit test environment is set up (directly on your system or through our Docker setup), you should be able to simply run `./run_unit_tests.sh` at will.
For example, in order to just execute the “core” unit tests, run the following:
```bash
./run_unit_tests.sh tests.core:CoreTest -s --logging-level=DEBUG
```
or a single test method:
```bash
./run_unit_tests.sh tests.core:CoreTest.test_check_operators -s --logging-level=DEBUG
```
or another example:
```bash
./run_unit_tests.sh tests.contrib.operators.test_dataproc_operator:DataprocClusterCreateOperatorTest.test_create_cluster_deletes_error_cluster -s --logging-level=DEBUG
```
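The `module:ClassName.method` address used above maps directly onto ordinary `unittest` classes; a hypothetical sketch of what such a test file might contain (the file, class, and method names are made up):

```python
# Hypothetical contents of tests/operators/test_hello_operator.py
import unittest


class HelloOperatorTest(unittest.TestCase):

    def test_greeting(self):
        # Would be runnable on its own as:
        #   ./run_unit_tests.sh tests.operators.test_hello_operator:HelloOperatorTest.test_greeting
        self.assertEqual('Hello, Airflow!', 'Hello, {}!'.format('Airflow'))


if __name__ == '__main__':
    unittest.main()
```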
To run the whole test suite with Docker Compose, do:
```bash
# Install Docker Compose first, then this will run the tests
docker-compose -f scripts/ci/docker-compose.yml run airflow-testing /app/scripts/ci/run-ci.sh
```
Alternatively, you can also set up Travis CI on your repo to automate this. It is free for open source projects.
Another great way of automating linting and testing is to use Git hooks. For example, you could create a `pre-commit` file based on the Travis CI pipeline, so that a local pipeline is triggered before each commit; if the pipeline fails (returns an exit code other than 0), the commit is rejected. In theory, this has the advantage that you cannot commit any failing code, which in turn reduces errors in the Travis CI pipelines. Since there are a lot of tests, running them all would take very long, so you should probably only test your new feature locally. The following example of a `pre-commit` file activates a virtualenv, lints the project with flake8, and runs the tests for a given feature in both Python 2 and Python 3 Docker containers:
```bash
#!/bin/sh

GREEN='\033[0;32m'
NO_COLOR='\033[0m'

setup_python_env() {
    local venv_path=${1}

    echo -e "${GREEN}Activating python virtual environment ${venv_path}..${NO_COLOR}"
    source ${venv_path}
}

run_linting() {
    local project_dir=$(git rev-parse --show-toplevel)

    echo -e "${GREEN}Running flake8 over directory ${project_dir}..${NO_COLOR}"
    flake8 ${project_dir}
}

run_testing_in_docker() {
    local feature_path=${1}
    local airflow_py2_container=${2}
    local airflow_py3_container=${3}

    echo -e "${GREEN}Running tests in ${feature_path} in airflow python 2 docker container..${NO_COLOR}"
    docker exec -i -w /airflow/ ${airflow_py2_container} nosetests -v ${feature_path}
    echo -e "${GREEN}Running tests in ${feature_path} in airflow python 3 docker container..${NO_COLOR}"
    docker exec -i -w /airflow/ ${airflow_py3_container} nosetests -v ${feature_path}
}

set -e

# NOTE: Before running this make sure you have set the function arguments correctly.
setup_python_env /Users/feluelle/venv/bin/activate
run_linting
run_testing_in_docker tests/contrib/hooks/test_imap_hook.py dazzling_chatterjee quirky_stallman
```
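To activate the hook, save the script as `.git/hooks/pre-commit` in your local clone and make it executable (`chmod +x .git/hooks/pre-commit`); Git will then run it automatically before every commit.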
For more information on how to run a subset of the tests, take a look at the nosetests docs.
See also the list of test classes and methods in `tests/core.py`.
Feel free to customize based on the extras available in `setup.py`.
Before you submit a pull request from your forked repo, check that it meets these guidelines. In particular, make sure your branch passes flake8:

```bash
flake8 airflow tests
```

`git diff upstream/master -u -- "*.py" | flake8 --diff` will return any changed files in your branch that require linting.

When developing features, the need may arise to persist information to the metadata database. Airflow has Alembic built in to handle all schema changes. Alembic must be installed on your development machine before continuing.
```bash
# starting at the root of the project
$ pwd
~/airflow

# change to the airflow directory
$ cd airflow

$ alembic revision -m "add new field to db"
Generating ~/airflow/airflow/migrations/versions/12341123_add_new_field_to_db.py
```
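The generated revision file contains empty `upgrade()` and `downgrade()` functions for you to fill in. A hypothetical sketch of a filled-in migration (the table, column, and revision identifiers are made up):

```python
# Hypothetical contents of 12341123_add_new_field_to_db.py
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic.
revision = '12341123'
down_revision = 'abcd1234'  # Alembic fills in the actual previous revision


def upgrade():
    # Apply the schema change: add the new nullable column.
    op.add_column('task_instance', sa.Column('new_field', sa.String(length=50), nullable=True))


def downgrade():
    # Revert the schema change.
    op.drop_column('task_instance', 'new_field')
```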
`airflow/www_rbac/` contains all npm-managed, front-end assets. Flask-Appbuilder itself comes bundled with jQuery and bootstrap. While these may be phased out over time, these packages are currently not managed with npm.
Make sure you are using recent versions of node and npm. No problems have been found with `node>=8.11.3` and `npm>=6.1.3`.
First, npm must be available in your environment. If it is not, you can run the following commands (taken from this source):

```bash
brew install node --without-npm
echo prefix=~/.npm-packages >> ~/.npmrc
curl -L https://www.npmjs.com/install.sh | sh
```
The final step is to add `~/.npm-packages/bin` to your `PATH` so commands you install globally are usable. Add something like this to your `.bashrc` file, then `source ~/.bashrc` to reflect the change.

```bash
export PATH="$HOME/.npm-packages/bin:$PATH"
```
To install third party libraries defined in `package.json`, run the following within the `airflow/www_rbac/` directory, which will install them in a new `node_modules/` folder within `www_rbac/`.
```bash
# from the root of the repository, move to where our JS package.json lives
cd airflow/www_rbac/
# run npm install to fetch all the dependencies
npm install
```
To parse and generate bundled files for Airflow, run either of the following commands. The `dev` flag will keep the npm script running and re-run it upon any changes within the assets directory.
```bash
# Compiles the production / optimized js & css
npm run prod

# Start a web server that manages and updates your assets as you modify them
npm run dev
```
Should you add or upgrade an npm package, which involves changing `package.json`, you'll need to re-run `npm install` and push the newly generated `package-lock.json` file so we get a reproducible build.
We try to enforce a more consistent style and follow the JS community guidelines. Once you add or modify any JavaScript code in the project, please make sure it follows the guidelines defined in the Airbnb JavaScript Style Guide. Apache Airflow uses ESLint as a tool for identifying and reporting on patterns in JavaScript, which can be run with any of the following commands.
```bash
# Check JS code in .js and .html files, and report any errors/warnings
npm run lint

# Check JS code in .js and .html files, report any errors/warnings and fix them if possible
npm run lint:fix
```