Install prek with `uv tool install prek`, then enable the hooks with `prek install`. The Breeze development tooling lives under `dev/` (mounted as `/opt/airflow/dev/` inside Breeze).

Placeholders used below: `<PROJECT>` is the folder where the `pyproject.toml` of the package you want to test is located, for example `airflow-core` or `providers/amazon`. `<target_branch>` is the branch the PR will be merged into: usually `main`, but it could be `v3-1-test` when creating a PR for the 3.1 branch.
- `uv run --project <PROJECT> pytest path/to/test.py::TestClass::test_method -xvs`
- `uv run --project <PROJECT> pytest path/to/test.py -xvs`
- `uv run --project <PROJECT> pytest path/to/package -xvs`
- `breeze run pytest <tests> -xvs`
- `uv run --project <PROJECT> python dev/my_script.py`
- `breeze testing <test_group> --run-in-parallel` (test groups: `core-tests`, `providers-tests`)
- `breeze testing <test_group> --run-db-tests-only --run-in-parallel` (test groups: `core-tests`, `providers-tests`)
- `breeze testing <test_group> --skip-db-tests --use-xdist` (test groups: `core-tests`, `providers-tests`)
- `breeze testing providers-tests --test-type "Providers[PROVIDERS_LIST]"` (e.g., `Providers[google]`, `Providers[amazon]`, or `"Providers[amazon,google]"`)
- `breeze testing helm-tests --use-xdist`
- `breeze testing helm-tests --use-xdist --kubernetes-version 1.35.0`
- `breeze testing helm-tests --use-xdist --test-type <type>` (types: `airflow_aux`, `airflow_core`, `apiserver`, `dagprocessor`, `other`, `redis`, `security`, `statsd`, `webserver`)
- `breeze testing <test_group>` (test groups: `airflow-ctl-tests`, `docker-compose-tests`, `task-sdk-tests`)
- `uv run --project scripts pytest scripts/tests/ -xvs`
- `breeze run airflow dags list`
- `breeze run mypy path/to/code`
- `prek run ruff --from-ref <target_branch>`
- `prek run ruff-format --from-ref <target_branch>`
- `prek run --from-ref <target_branch> --stage pre-commit`
- `prek run --from-ref <target_branch> --stage manual`
- `breeze build-docs`
- `breeze selective-checks --commit-ref <commit_with_squashed_changes>`

SQLite is the default backend. Use `--backend postgres` or `--backend mysql` for integration tests that need those databases. If Docker networking fails, run `docker network prune`.
This repository is a `uv` workspace monorepo. Key paths:
- `airflow-core/src/airflow/` — core scheduler, API, CLI, models
  - `models/` — SQLAlchemy models (`DagModel`, `TaskInstance`, `DagRun`, `Asset`, etc.)
  - `jobs/` — scheduler, triggerer, Dag processor runners
  - `api_fastapi/core_api/` — public REST API v2, UI endpoints
  - `api_fastapi/execution_api/` — task execution communication API
  - `dag_processing/` — Dag parsing and validation
  - `cli/` — command-line interface
  - `ui/` — React/TypeScript web interface (Vite)
- `task-sdk/` — lightweight SDK for Dag authoring and task execution runtime (`airflow.sdk`)
  - `src/airflow/sdk/execution_time/` — task runner, supervisor
- `providers/` — 100+ provider packages, each with its own `pyproject.toml`
- `airflow-ctl/` — management CLI tool
- `chart/` — Helm chart for Kubernetes deployment
- `dev/` — development utilities and scripts used to bootstrap the environment, releases, Breeze dev env
- `scripts/` — utility scripts for CI, Docker, and prek hooks (workspace distribution `apache-airflow-scripts`)
  - `ci/prek/` — prek (pre-commit) hook scripts; shared utilities in `common_prek_utils.py`
  - `tests/` — pytest tests for the scripts; run with `uv run --project scripts pytest scripts/tests/`

The repository uses the `uv` workspace feature to keep all the distributions sharing dependencies and a single venv; code used by several distributions lives in a shared folder. The `uv --project <FOLDER> sync` command acts on the selected project in the monorepo with only the dependencies that it has (`airflow-core` or `task-sdk`, for example).

Coding rules:

- Run `uv run ruff format <file_path>` and `uv run ruff check --fix <file_path>` for every Python file you create or modify, before moving on to the next step.
- Do not use `assert` in production code.
- Use `time.monotonic()` for durations, not `time.time()`.
- In `airflow-core`, functions with a `session` parameter must not call `session.commit()`. Use keyword-only `session` parameters.
- Keep typing-only imports inside `TYPE_CHECKING` blocks.
- Guard heavy imports (e.g., `kubernetes.client`) with `TYPE_CHECKING` in multi-process code paths.
- Raise specific exceptions such as `ValueError` instead of raising the broad `AirflowException` directly.
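A minimal sketch of the keyword-only `session` and `TYPE_CHECKING` rules above. The `count_active_runs` function and the `execute_query` call on the session are hypothetical, made up for illustration; only the pattern (typing-only import, keyword-only `session`, no `session.commit()` inside the function) mirrors the rules:

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Typing-only import: seen by type checkers, never executed at runtime,
    # so multi-process code paths do not pay the import cost.
    from sqlalchemy.orm import Session


def count_active_runs(dag_id: str, *, session: Session) -> int:
    # `session` is keyword-only; the caller owns the transaction,
    # so this function queries but never calls session.commit().
    return len(session.execute_query(dag_id))  # hypothetical query helper
```

Because `session` is keyword-only, calling `count_active_runs("demo", some_session)` positionally fails with a `TypeError`, keeping session-passing explicit at every call site.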
Each error case should have a specific exception type that conveys what went wrong.

Testing rules:

- Do not use `unittest.TestCase`; write plain pytest-style tests.
- Use `spec`/`autospec` when mocking.
- Use `time_machine` for time-dependent tests.
- Use `@pytest.mark.parametrize` for multiple similar inputs.
- Use `@pytest.mark.db_test` for tests that require database access.
- Shared pytest fixtures live in `devel-common/src/tests_common/pytest_plugin.py`.
- Test files mirror the source layout: `airflow/cli/cli_parser.py` → `tests/cli/test_cli_parser.py`.

Write commit messages focused on user impact, not implementation details.
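The `spec`/`autospec` guidance can be sketched with the standard-library `unittest.mock`; the `HttpClient` class here is a made-up stand-in, not an Airflow API:

```python
from unittest import mock


class HttpClient:
    """Made-up collaborator used only to demonstrate autospec."""

    def get(self, url: str, timeout: float = 10.0) -> str:
        raise NotImplementedError


# An autospec'd mock copies HttpClient's real attributes and signatures,
# so typos like client.gett(...) or calls with wrong arguments fail
# loudly instead of silently passing as they would with a bare mock.Mock().
client = mock.create_autospec(HttpClient, instance=True)
client.get.return_value = "ok"

assert client.get("https://example.com") == "ok"
client.get.assert_called_once_with("https://example.com")
```

With a plain `mock.Mock()`, both a misspelled method and a call missing the required `url` argument would go unnoticed until production.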
Example commit messages:

- `Fix airflow dags test command failure without serialized Dags`
- `UI: Fix Grid view not refreshing after task actions`
- `Initialize DAG bundles in CLI get_dag function`

Add a newsfragment for user-visible changes: `echo "Brief description" > airflow-core/newsfragments/{PR_NUMBER}.{bugfix|feature|improvement|doc|misc|significant}.rst`
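As a concrete sketch, creating a bugfix newsfragment for a hypothetical PR number 12345 (substitute the real PR number and the appropriate change type):

```shell
# Illustrative values; replace with the real PR number and change type.
PR_NUMBER=12345
CHANGE_TYPE=bugfix   # one of: bugfix feature improvement doc misc significant
mkdir -p airflow-core/newsfragments
echo "Fix Grid view not refreshing after task actions" \
  > "airflow-core/newsfragments/${PR_NUMBER}.${CHANGE_TYPE}.rst"
```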
Always push to the user's fork, not to the upstream apache/airflow repo. Never push directly to main.
Before pushing, determine the fork remote. Check `git remote -v`: if `origin` does not point to `apache/airflow`, use `origin` (it's the user's fork). If `origin` points to `apache/airflow`, look for another remote that points to the user's fork. If no fork remote exists, create one:
gh repo fork apache/airflow --remote --remote-name fork
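The remote-selection logic above can be sketched as a small shell helper. `choose_push_remote` is a hypothetical function name; it only encodes the rule "if origin is upstream, push to a remote named `fork`, otherwise push to `origin`":

```shell
# Decide which remote to push to, given the URL of `origin`.
choose_push_remote() {
  case "$1" in
    *apache/airflow*) echo "fork" ;;   # origin is upstream: use the fork remote
    *)                echo "origin" ;; # origin is already the user's fork
  esac
}

choose_push_remote "git@github.com:apache/airflow.git"    # prints: fork
choose_push_remote "https://github.com/alice/airflow.git" # prints: origin
```

In a real checkout you would feed it the output of `git remote get-url origin`.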
Before pushing, perform a self-review of your changes following the Gen-AI review guidelines in contributing-docs/05_pull_requests.rst and the code review checklist in .github/instructions/code-review.instructions.md:
- Review the full diff (`git diff main...HEAD`) and verify every change is intentional and related to the task; remove any unrelated changes.
- Read `.github/instructions/code-review.instructions.md` and check your diff against every rule: architecture boundaries, database correctness, code quality, testing requirements, API correctness, and AI-generated code signals. Fix any violations before pushing.
- Run the pre-commit-stage hooks (`prek run --from-ref <target_branch> --stage pre-commit`) and fix any failures.
- Run the manual-stage hooks (`prek run --from-ref <target_branch> --stage manual`) and fix any failures.

Before pushing, always rebase your branch onto the latest target branch (usually `main`) to avoid merge conflicts and ensure CI runs against up-to-date code:
git fetch <upstream-remote> <target_branch>
git rebase <upstream-remote>/<target_branch>
If there are conflicts, resolve them and continue the rebase. If the rebase is too complex, ask the user for guidance.
Then push the branch to the fork remote and open the PR creation page in the browser with the body pre-filled (including the generative AI disclosure already checked):
git push -u <fork-remote> <branch-name>
gh pr create --web --title "Short title (under 70 chars)" --body "$(cat <<'EOF'
Brief description of the changes.

closes: #ISSUE (if applicable)

---

##### Was generative AI tooling used to co-author this PR?

- [X] Yes — <Agent Name and Version>

Generated-by: <Agent Name and Version> following [the guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#gen-ai-assisted-contributions)
EOF
)"
The --web flag opens the browser so the user can review and submit. The --body flag pre-fills the PR template with the generative AI disclosure already completed.
Remind the user to:
- Reference the related issue in the PR description (`closes: #ISSUE` or `related: #ISSUE`).