:mod:`airflow.contrib.sensors.bigquery_sensor`
==============================================

.. py:module:: airflow.contrib.sensors.bigquery_sensor


Module Contents
---------------

.. py:class:: BigQueryTableSensor(project_id, dataset_id, table_id, bigquery_conn_id='bigquery_default_conn', delegate_to=None, *args, **kwargs)

   Bases: :class:`airflow.sensors.base_sensor_operator.BaseSensorOperator`

   Checks for the existence of a table in Google BigQuery.

   :param project_id: The Google Cloud project in which to look for the table.
       The connection supplied to the hook must provide
       access to the specified project.
   :type project_id: str
   :param dataset_id: The name of the dataset in which to look for the table.
   :type dataset_id: str
   :param table_id: The name of the table to check the existence of.
   :type table_id: str
   :param bigquery_conn_id: The connection ID to use when connecting to
       Google BigQuery.
   :type bigquery_conn_id: str
   :param delegate_to: The account to impersonate, if any.
       For this to work, the service account making the request must
       have domain-wide delegation enabled.
   :type delegate_to: str

   .. attribute:: template_fields
      :annotation: = ['project_id', 'dataset_id', 'table_id']

   .. attribute:: ui_color
      :annotation: = #f0eee4

   .. method:: poke(self, context)
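
A minimal usage sketch in a DAG (the DAG id, schedule, and project/dataset/table names below are illustrative placeholders, not values from this module; ``poke_interval`` and ``timeout`` are inherited from :class:`~airflow.sensors.base_sensor_operator.BaseSensorOperator`):

.. code-block:: python

    from datetime import datetime

    from airflow import DAG
    from airflow.contrib.sensors.bigquery_sensor import BigQueryTableSensor

    with DAG(
        dag_id="example_bigquery_sensor",   # illustrative DAG id
        start_date=datetime(2020, 1, 1),
        schedule_interval="@daily",
    ) as dag:
        # Wait until the table exists before downstream tasks run.
        wait_for_table = BigQueryTableSensor(
            task_id="wait_for_table",
            project_id="my-gcp-project",    # placeholder project
            dataset_id="my_dataset",        # placeholder dataset
            table_id="my_table",            # placeholder table
            poke_interval=60,               # re-check every 60 seconds
            timeout=60 * 60,                # give up after one hour
        )

Because ``project_id``, ``dataset_id``, and ``table_id`` are listed in ``template_fields``, they may also be given as Jinja templates (e.g. a table name derived from the execution date).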