:py:mod:`airflow.providers.google.cloud.triggers.bigquery_dts`
==============================================================

.. py:module:: airflow.providers.google.cloud.triggers.bigquery_dts


Module Contents
---------------

Classes
~~~~~~~

.. autoapisummary::

   airflow.providers.google.cloud.triggers.bigquery_dts.BigQueryDataTransferRunTrigger
.. py:class:: BigQueryDataTransferRunTrigger(project_id, config_id, run_id, poll_interval = 10, gcp_conn_id = 'google_cloud_default', location = None, impersonation_chain = None)

   Bases: :py:obj:`airflow.triggers.base.BaseTrigger`

   Trigger class that watches the Transfer Run state to determine when the job is done.
   :param project_id: The BigQuery project id where the transfer configuration is located.
   :param config_id: ID of the transfer configuration whose Transfer Run should be watched.
   :param run_id: ID of the Transfer Run which should be watched.
   :param poll_interval: Optional. Interval, in seconds, which defines how often the trigger checks the status of the job.
   :param gcp_conn_id: The connection ID used to connect to Google Cloud.
   :param location: BigQuery Transfer Service location for regional transfers.
   :param impersonation_chain: Optional service account to impersonate using short-term
       credentials, or chained list of accounts required to get the access_token
       of the last account in the list, which will be impersonated in the request.
       If set as a string, the account must grant the originating account
       the Service Account Token Creator IAM role.
       If set as a sequence, the identities from the list must grant the
       Service Account Token Creator IAM role to the directly preceding identity, with the first
       account from the list granting this role to the originating account (templated).
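
   A trigger of this class is usually handed off from a deferrable operator through
   ``self.defer()``. The sketch below illustrates that hand-off; the operator class,
   the ``execute_complete`` callback, and the concrete ``project_id``/``config_id``/``run_id``
   values are hypothetical placeholders, not part of this module.

   .. code-block:: python

      from airflow.exceptions import AirflowException
      from airflow.models.baseoperator import BaseOperator
      from airflow.providers.google.cloud.triggers.bigquery_dts import (
          BigQueryDataTransferRunTrigger,
      )


      class WaitForTransferRunOperator(BaseOperator):
          """Hypothetical operator that defers until a DTS Transfer Run finishes."""

          def execute(self, context):
              # Hand control to the triggerer process; the trigger polls the
              # Transfer Run every ``poll_interval`` seconds until it is terminal.
              self.defer(
                  trigger=BigQueryDataTransferRunTrigger(
                      project_id="my-project",  # hypothetical values
                      config_id="my-transfer-config",
                      run_id="my-run-id",
                      poll_interval=10,
                  ),
                  method_name="execute_complete",
              )

          def execute_complete(self, context, event):
              # Invoked when the trigger fires; ``event`` is the payload of the
              # TriggerEvent yielded by ``run()`` (assumed here to carry a "status" key).
              if event.get("status") == "failed":
                  raise AirflowException(event.get("message"))
              return event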
   .. py:method:: serialize()

      Serialize class arguments and classpath.
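
      As with any :py:obj:`~airflow.triggers.base.BaseTrigger`, the return value is a
      ``(classpath, kwargs)`` pair that the triggerer uses to rebuild the trigger in
      another process. A minimal sketch of that round trip, with hypothetical argument
      values:

      .. code-block:: python

         from airflow.providers.google.cloud.triggers.bigquery_dts import (
             BigQueryDataTransferRunTrigger,
         )

         trigger = BigQueryDataTransferRunTrigger(
             project_id="my-project",  # hypothetical values
             config_id="my-transfer-config",
             run_id="my-run-id",
         )

         classpath, kwargs = trigger.serialize()
         # ``classpath`` points back at this class:
         # "airflow.providers.google.cloud.triggers.bigquery_dts.BigQueryDataTransferRunTrigger"
         # ``kwargs`` holds the constructor arguments, so the trigger can be rebuilt:
         rebuilt = BigQueryDataTransferRunTrigger(**kwargs)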
   .. py:method:: run()
      :async:

      Yield a TriggerEvent object once the Transfer Run reaches a terminal state.
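
      Outside Airflow's triggerer the async generator can be driven directly, which can
      be useful in tests. A minimal sketch, assuming valid Google Cloud credentials on
      the default connection and reusing the hypothetical argument values from above:

      .. code-block:: python

         import asyncio

         from airflow.providers.google.cloud.triggers.bigquery_dts import (
             BigQueryDataTransferRunTrigger,
         )


         async def wait_for_run():
             trigger = BigQueryDataTransferRunTrigger(
                 project_id="my-project",  # hypothetical values
                 config_id="my-transfer-config",
                 run_id="my-run-id",
                 poll_interval=10,
             )
             # run() keeps polling and yields a single TriggerEvent once the
             # Transfer Run reaches a terminal state.
             async for event in trigger.run():
                 return event.payload


         print(asyncio.run(wait_for_run()))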