| commit | a0bed169a6ce63b6d861601b59f1ad6f12afdf67 | |
|---|---|---|
| author | Jens Scheffler <j_scheffler@gmx.de> | Fri May 02 13:52:20 2025 +0200 | 
| committer | Jens Scheffler <j_scheffler@gmx.de> | Fri May 02 13:52:20 2025 +0200 | 
| tree | 83c00e46b2fc99ccaa2ddb7b33e076e92bdf94cf | |
| parent | f27957e7fb3a9f93fa71dfba3af543adea61f015 | |
Change notifications to commits@airflow.apache.org
Sync S3 to GitHub: Use the `scripts/s3_to_github.py` script to download the latest documentation from S3 to your `./docs-archive` folder. It has the following command line arguments:

- `--bucket-path`: The S3 bucket path where the documentation is stored.
- `--local-path`: The local path where the documentation will be downloaded.
- `--document-folder`: The folder in the S3 bucket to sync (optional; provide a folder name such as `apache-airflow-providers-amazon` when only that folder needs to be synced).

```
uv run ./scripts/s3_to_github.py --bucket-path s3://staging-docs-airflow-apache-org/docs/ --local-path ./docs-archive
```
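For example, to pull down only a single provider's documentation rather than the whole archive, the optional `--document-folder` argument can be added to the same command (a sketch based on the arguments described above; it assumes AWS credentials for the staging bucket are already configured in the environment):

```shell
# Download only the Amazon provider docs from the staging bucket
# into ./docs-archive (assumes AWS credentials are available).
uv run ./scripts/s3_to_github.py \
  --bucket-path s3://staging-docs-airflow-apache-org/docs/ \
  --local-path ./docs-archive \
  --document-folder apache-airflow-providers-amazon
```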
Sync GitHub to S3: Use the `scripts/github_to_s3.py` script to upload the latest documentation from your `./docs-archive` folder to S3. It has two sync modes, `last_commit` and `full_sync`, selected with `--sync-type`, and the following command line arguments:

- `--bucket-path`: The S3 bucket path where the documentation will be stored.
- `--local-path`: The local path where the documentation is stored.
- `--document-folder`: The folder in the local path to sync (optional; provide a folder name such as `apache-airflow-providers-amazon` when only that folder needs to be synced).
- `--sync-type`: The type of sync to perform. Can be either `last_commit` or `full_sync`.
- `--commit-sha`: The commit SHA to sync to S3. This is only required if the sync type is `last_commit`.

```
uv run ./scripts/github_to_s3.py --bucket-path s3://staging-docs-airflow-apache-org/docs/ --local-path ./docs-archive --sync-type last_commit
```
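The two sync modes above can be contrasted with a pair of invocations (a sketch following the arguments described in this message; the commit SHA is a placeholder and AWS credentials for the staging bucket are assumed to be configured):

```shell
# Full sync: upload everything under ./docs-archive to the bucket
# (assumes AWS credentials are available).
uv run ./scripts/github_to_s3.py \
  --bucket-path s3://staging-docs-airflow-apache-org/docs/ \
  --local-path ./docs-archive \
  --sync-type full_sync

# Incremental sync: upload only the documentation touched by one
# commit; <commit-sha> is a placeholder for a real commit SHA.
uv run ./scripts/github_to_s3.py \
  --bucket-path s3://staging-docs-airflow-apache-org/docs/ \
  --local-path ./docs-archive \
  --sync-type last_commit \
  --commit-sha <commit-sha>
```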