:mod:`airflow.contrib.operators.bigquery_to_mysql_operator`
===========================================================
.. py:module:: airflow.contrib.operators.bigquery_to_mysql_operator
.. autoapi-nested-parse::
This module contains a Google BigQuery to MySQL operator.
Module Contents
---------------
.. py:class:: BigQueryToMySqlOperator(dataset_table, mysql_table, selected_fields=None, gcp_conn_id='google_cloud_default', mysql_conn_id='mysql_default', database=None, delegate_to=None, replace=False, batch_size=1000, *args, **kwargs)
Bases: :class:`airflow.models.BaseOperator`
Fetches data from a BigQuery table (alternatively fetches data for selected columns)
and inserts that data into a MySQL table.

.. note::

    If you pass fields to ``selected_fields`` in an order that differs from the
    order of the columns already in the BQ table, the data will still be returned
    in the order of the BQ table. For example, if the BQ table has three columns
    ``[A,B,C]`` and you pass ``'B,A'`` in ``selected_fields``, the data will still
    be of the form ``'A,B'`` and is passed to MySQL in that form (see the usage
    sketch after the example below).
**Example**: ::

    transfer_data = BigQueryToMySqlOperator(
        task_id='task_id',
        dataset_table='origin_bq_table',
        mysql_table='dest_table_name',
        replace=True,
    )
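
A more complete sketch, assuming a typical DAG setup; the DAG id, schedule,
connection ids and table names below are illustrative placeholders rather than
values required by this operator: ::

    from datetime import datetime

    from airflow import DAG
    from airflow.contrib.operators.bigquery_to_mysql_operator import (
        BigQueryToMySqlOperator,
    )

    with DAG(
        dag_id='example_bigquery_to_mysql',   # hypothetical DAG id
        start_date=datetime(2020, 1, 1),
        schedule_interval=None,
    ) as dag:
        transfer_selected = BigQueryToMySqlOperator(
            task_id='bq_to_mysql_selected',
            dataset_table='my_dataset.origin_bq_table',  # dotted <dataset>.<table>
            mysql_table='dest_table_name',
            selected_fields='B,A',   # rows still arrive in BQ column order (A,B)
            gcp_conn_id='google_cloud_default',
            mysql_conn_id='mysql_default',
            batch_size=1000,
            replace=False,
        )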
:param dataset_table: A dotted ``<dataset>.<table>``: the BigQuery table of origin
:type dataset_table: str
:param max_results: The maximum number of records (rows) to be fetched
from the table. (templated)
:type max_results: str
:param selected_fields: List of fields to return (comma-separated). If
unspecified, all fields are returned.
:type selected_fields: str
:param gcp_conn_id: reference to a specific GCP hook.
:type gcp_conn_id: str
:param delegate_to: The account to impersonate, if any.
For this to work, the service account making the request must have domain-wide
delegation enabled.
:type delegate_to: str
:param mysql_conn_id: reference to a specific MySQL hook.
:type mysql_conn_id: str
:param database: name of the database which, if set, overrides the one defined in the connection
:type database: str
:param replace: Whether to replace instead of insert
:type replace: bool
:param batch_size: The number of rows to take in each batch
:type batch_size: int
.. attribute:: template_fields
:annotation: = ['dataset_id', 'table_id', 'mysql_table']
.. method:: _bq_get_data(self)
.. method:: execute(self, context)
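
``execute`` drives a batched copy: ``_bq_get_data`` pages rows out of BigQuery
``batch_size`` at a time and each batch is written to the MySQL table. The following
is a minimal standalone sketch of that pattern, assuming the contrib ``BigQueryHook``
cursor's ``get_tabledata`` method and ``MySqlHook.insert_rows``; the helper name
``copy_bq_table_to_mysql`` is hypothetical and the operator's exact internals may
differ: ::

    from airflow.contrib.hooks.bigquery_hook import BigQueryHook
    from airflow.hooks.mysql_hook import MySqlHook


    def copy_bq_table_to_mysql(dataset_id, table_id, mysql_table,
                               gcp_conn_id='google_cloud_default',
                               mysql_conn_id='mysql_default',
                               selected_fields=None, database=None,
                               replace=False, batch_size=1000):
        """Hypothetical helper mirroring the batched fetch/insert flow."""
        bq_cursor = BigQueryHook(bigquery_conn_id=gcp_conn_id).get_conn().cursor()
        mysql_hook = MySqlHook(mysql_conn_id=mysql_conn_id, schema=database)

        batch_index = 0
        while True:
            # Page through the BigQuery table, batch_size rows at a time.
            response = bq_cursor.get_tabledata(
                dataset_id=dataset_id,
                table_id=table_id,
                max_results=batch_size,
                selected_fields=selected_fields,
                start_index=batch_index * batch_size,
            )
            rows = response.get('rows')
            if not rows:
                break
            # Each row comes back as {'f': [{'v': value}, ...]}; flatten to tuples.
            table_data = [tuple(cell['v'] for cell in row['f']) for row in rows]
            # replace=True issues REPLACE statements instead of INSERT.
            mysql_hook.insert_rows(mysql_table, table_data, replace=replace)
            batch_index += 1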