| commit | 482fce358ee61a8771de24f0dfe0ec27c0ef1366 | [log] [tgz] |
|---|---|---|
| author | Jens Scheffler <j_scheffler@gmx.de> | Fri May 02 14:10:55 2025 +0200 |
| committer | Jens Scheffler <j_scheffler@gmx.de> | Fri May 02 14:10:55 2025 +0200 |
| tree | eaa2b696ab3fff3f854e575a8b352c6aecb85d78 | |
| parent | a0bed169a6ce63b6d861601b59f1ad6f12afdf67 [diff] |
Add branch protection later...
Sync S3 to GitHub: Use the `scripts/s3_to_github.py` script to download the latest documentation from S3 to your `./docs-archive` folder. It takes the following command-line arguments:

- `--bucket-path`: The S3 bucket path where the documentation is stored.
- `--local-path`: The local path where the documentation will be downloaded.
- `--document-folder`: Optional; if only a particular folder needs to be synced, provide its name (e.g. `apache-airflow-providers-amazon`).

Example:

```
uv run ./scripts/s3_to_github.py --bucket-path s3://staging-docs-airflow-apache-org/docs/ --local-path ./docs-archive
```
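To illustrate how these arguments interact, here is a minimal sketch of the key-to-path mapping such a download step performs. This is an assumption about the general approach, not the actual internals of `s3_to_github.py`; the `plan_downloads` helper and its parameter names are hypothetical.

```python
from pathlib import PurePosixPath


def plan_downloads(keys, prefix="docs/", document_folder=None,
                   local_path="./docs-archive"):
    """Map S3 object keys to local destination paths.

    keys            -- object keys listed from the bucket
    prefix          -- key prefix corresponding to --bucket-path
    document_folder -- optional filter, like --document-folder
    local_path      -- destination root, like --local-path
    """
    plan = {}
    for key in keys:
        # Only consider objects under the documentation prefix.
        if not key.startswith(prefix):
            continue
        relative = key[len(prefix):]
        # When a document folder is given, sync only that folder.
        if document_folder and not relative.startswith(document_folder + "/"):
            continue
        plan[key] = str(PurePosixPath(local_path) / relative)
    return plan
```

With `--document-folder apache-airflow-providers-amazon`, only keys under that folder would be mapped to local files; without it, everything under the bucket path is downloaded.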
Sync GitHub to S3: Use the `scripts/github_to_s3.py` script to upload the latest documentation from your `./docs-archive` folder to S3. It has two modes, `last_commit` and `full_sync`, and takes the following command-line arguments:

- `--bucket-path`: The S3 bucket path where the documentation will be stored.
- `--local-path`: The local path where the documentation is stored.
- `--document-folder`: Optional; if only a particular folder needs to be synced, provide its name (e.g. `apache-airflow-providers-amazon`).
- `--sync-type`: The type of sync to perform. Can be either `last_commit` or `full_sync`.
- `--commit-sha`: The commit SHA to sync to S3. This is only required if the sync type is `last_commit`.

Example:

```
uv run ./scripts/github_to_s3.py --bucket-path s3://staging-docs-airflow-apache-org/docs/ --local-path ./docs-archive --sync-type last-commit
```
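The difference between the two sync modes can be sketched as a file-selection step: `full_sync` pushes everything, while `last_commit` pushes only the files touched by one commit (which in practice could be obtained from something like `git diff-tree`). This is a hypothetical helper illustrating the mode logic described above, not the actual implementation of `github_to_s3.py`.

```python
def select_files_to_upload(sync_type, all_files,
                           changed_files=None, commit_sha=None):
    """Pick which local files to upload, following the two sync modes.

    full_sync   -- upload every file under the local path.
    last_commit -- upload only files changed in the given commit;
                   requires a commit SHA, matching --commit-sha.
    """
    if sync_type == "full_sync":
        return list(all_files)
    if sync_type == "last_commit":
        if not commit_sha:
            # Mirrors the documented rule: --commit-sha is only
            # required for the last_commit sync type.
            raise ValueError("--commit-sha is required for last_commit")
        tracked = set(all_files)
        return [f for f in (changed_files or []) if f in tracked]
    raise ValueError(f"unknown sync type: {sync_type!r}")
```

In `last_commit` mode this restricts the upload to the intersection of the commit's changed files and the files actually present locally, which keeps incremental syncs small compared to a `full_sync`.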