:py:mod:`airflow.providers.google.cloud.example_dags.example_cloud_storage_transfer_service_aws`
================================================================================================
.. py:module:: airflow.providers.google.cloud.example_dags.example_cloud_storage_transfer_service_aws
.. autoapi-nested-parse::
Example Airflow DAG that demonstrates interactions with Google Cloud Transfer. This DAG relies on
the following OS environment variables:
Note that you need to provide a large enough set of data so that operations do not execute too quickly.
Otherwise, the DAG will fail.
* GCP_PROJECT_ID - Google Cloud Project to use for the Google Cloud Transfer Service.
* GCP_DESCRIPTION - Description of the transfer job.
* GCP_TRANSFER_SOURCE_AWS_BUCKET - Amazon Web Services Storage bucket from which files are copied.
* GCP_TRANSFER_SECOND_TARGET_BUCKET - Google Cloud Storage bucket to which files are copied.
* WAIT_FOR_OPERATION_POKE_INTERVAL - interval, in seconds, at which to check the status of the operation.
A value smaller than the default accelerates the system test and ensures its correct execution with
smaller quantities of files in the source bucket.
See the documentation of :class:`~airflow.operators.sensors.BaseSensorOperator` for more information.
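A minimal sketch of how these environment variables might be read by the example DAG (the variable
names come from the list above; the default values here are illustrative assumptions, not the
provider's actual defaults):

.. code-block:: python

    import os

    # Read example-DAG configuration from OS environment variables.
    # The fallback values are placeholders for illustration only.
    GCP_PROJECT_ID = os.environ.get("GCP_PROJECT_ID", "example-project")
    GCP_DESCRIPTION = os.environ.get("GCP_DESCRIPTION", "example transfer")
    GCP_TRANSFER_SOURCE_AWS_BUCKET = os.environ.get("GCP_TRANSFER_SOURCE_AWS_BUCKET", "example-aws-bucket")
    GCP_TRANSFER_SECOND_TARGET_BUCKET = os.environ.get("GCP_TRANSFER_SECOND_TARGET_BUCKET", "example-gcs-bucket")
    WAIT_FOR_OPERATION_POKE_INTERVAL = int(os.environ.get("WAIT_FOR_OPERATION_POKE_INTERVAL", 5))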
Module Contents
---------------
.. py:data:: GCP_PROJECT_ID
.. py:data:: GCP_DESCRIPTION
.. py:data:: GCP_TRANSFER_TARGET_BUCKET
.. py:data:: WAIT_FOR_OPERATION_POKE_INTERVAL
.. py:data:: GCP_TRANSFER_SOURCE_AWS_BUCKET
.. py:data:: GCP_TRANSFER_FIRST_TARGET_BUCKET
.. py:data:: GCP_TRANSFER_JOB_NAME
.. py:data:: aws_to_gcs_transfer_body
.. py:data:: create_transfer_job_from_aws
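The data names listed above suggest that the DAG builds a transfer job body and creates the job with
the provider's transfer-service operator. A minimal sketch, assuming the Storage Transfer Service REST
field names and placeholder project, bucket, and schedule values (the body used by the actual example
DAG may differ):

.. code-block:: python

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.cloud_storage_transfer_service import (
        CloudDataTransferServiceCreateJobOperator,
    )

    # Hypothetical transfer job body; AWS credentials are typically supplied via the
    # operator's AWS connection rather than hard-coded here.
    aws_to_gcs_transfer_body = {
        "description": "example transfer job",
        "status": "ENABLED",
        "projectId": "example-project",
        "schedule": {"scheduleStartDate": {"day": 1, "month": 1, "year": 2021}},
        "transferSpec": {
            "awsS3DataSource": {"bucketName": "example-aws-bucket"},
            "gcsDataSink": {"bucketName": "example-gcs-bucket"},
        },
    }

    with DAG(
        dag_id="example_gcp_transfer_aws",
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,
        catchup=False,
    ) as dag:
        create_transfer_job_from_aws = CloudDataTransferServiceCreateJobOperator(
            task_id="create_transfer_job_from_aws",
            body=aws_to_gcs_transfer_body,
        )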