| commit | f27957e7fb3a9f93fa71dfba3af543adea61f015 | [log] [tgz] |
|---|---|---|
| author | Jarek Potiuk <jarek@potiuk.com> | Fri May 02 11:14:08 2025 +0200 |
| committer | GitHub <noreply@github.com> | Fri May 02 11:14:08 2025 +0200 |
| tree | 4cf5a6401e319e95f82ea47d0645c8cb0b868f4d | |
| parent | 53f58e86f8f416b0c78e914e2976ad4672bbd58e [diff] | |
| parent | bc6deadf4f018c562f073e970e3648b792939313 [diff] |
Merge pull request #4 from apache/fix-nospace Fix nospace issue
Sync S3 to GitHub: Use the `scripts/s3_to_github.py` script to download the latest documentation from S3 to your `./docs-archive` folder. It has the following command line arguments:

- `--bucket-path`: The S3 bucket path where the documentation is stored.
- `--local-path`: The local path where the documentation will be downloaded.
- `--document-folder`: The folder in the S3 bucket where the documentation is stored (optional; if only a particular folder needs to be synced, provide its name, e.g. `apache-airflow-providers-amazon`).

```
uv run ./scripts/s3_to_github.py --bucket-path s3://staging-docs-airflow-apache-org/docs/ --local-path ./docs-archive
```
Sync GitHub to S3: Use the `scripts/github_to_s3.py` script to upload the latest documentation from your `./docs-archive` folder to S3. It has two modes, `last_commit` and `full_sync`, and the following command line arguments:

- `--bucket-path`: The S3 bucket path where the documentation will be stored.
- `--local-path`: The local path where the documentation is stored.
- `--document-folder`: The folder in the local path where the documentation is stored (optional; if only a particular folder needs to be synced, provide its name, e.g. `apache-airflow-providers-amazon`).
- `--sync-type`: The type of sync to perform. Can be either `last_commit` or `full_sync`.
- `--commit-sha`: The commit SHA to sync to S3. Only required if the sync type is `last_commit`.

```
uv run ./scripts/github_to_s3.py --bucket-path s3://staging-docs-airflow-apache-org/docs/ --local-path ./docs-archive --sync-type last_commit
```
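To make the two modes concrete, here is a hedged sketch of how a `last_commit` sync could decide what to upload: list the files touched by the given commit with `git diff-tree`, then keep only those under the docs tree (and under one `--document-folder`, if given). A `full_sync` would skip the diff and upload the whole tree. Both helper names are illustrative assumptions, not the actual script's functions.

```python
import subprocess


def changed_in_commit(commit_sha: str, repo_root: str = ".") -> list[str]:
    """Hypothetical helper: files touched by one commit, via git diff-tree."""
    result = subprocess.run(
        ["git", "diff-tree", "--no-commit-id", "--name-only", "-r", commit_sha],
        cwd=repo_root,
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout.splitlines()


def filter_docs_files(paths, local_path="docs-archive", document_folder=None):
    """Keep only files under the docs tree (and one folder, if requested)."""
    wanted = local_path.rstrip("/") + "/"
    if document_folder:
        wanted += document_folder.rstrip("/") + "/"
    return [p for p in paths if p.startswith(wanted)]
```

Uploading only the commit's changed files keeps routine syncs fast, while `full_sync` remains available to repair the bucket when it has drifted from the repository.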