:mod:`airflow.contrib.operators.mysql_to_gcs`
=============================================
.. py:module:: airflow.contrib.operators.mysql_to_gcs
.. autoapi-nested-parse::
MySQL to GCS operator.
Module Contents
---------------
.. data:: PY3
.. py:class:: MySqlToGoogleCloudStorageOperator(mysql_conn_id='mysql_default', ensure_utc=False, *args, **kwargs)
Bases: :class:`airflow.contrib.operators.sql_to_gcs.BaseSQLToGoogleCloudStorageOperator`
Copy data from MySQL to Google Cloud Storage in JSON or CSV format.
:param mysql_conn_id: Reference to a specific MySQL hook.
:type mysql_conn_id: str
:param ensure_utc: Ensure TIMESTAMP columns are exported as UTC. If set to
``False``, TIMESTAMP columns will be exported using the MySQL server's
default timezone.
:type ensure_utc: bool
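A minimal usage sketch is shown below. The ``sql``, ``bucket``, ``filename``, and
``export_format`` arguments are inherited from
:class:`~airflow.contrib.operators.sql_to_gcs.BaseSQLToGoogleCloudStorageOperator`;
the DAG id, table, bucket name, and connection id are illustrative assumptions,
not values taken from this module.

.. code-block:: python

    from airflow import DAG
    from airflow.contrib.operators.mysql_to_gcs import MySqlToGoogleCloudStorageOperator
    from airflow.utils.dates import days_ago

    with DAG(
        dag_id="example_mysql_to_gcs",  # illustrative DAG id
        start_date=days_ago(1),
        schedule_interval=None,
    ) as dag:
        export_orders = MySqlToGoogleCloudStorageOperator(
            task_id="export_orders",
            mysql_conn_id="mysql_default",           # MySQL connection to read from
            sql="SELECT * FROM orders",              # query whose result set is exported (assumed table)
            bucket="my-gcs-bucket",                  # destination GCS bucket (assumed name)
            filename="exports/orders/part-{}.json",  # GCS object name template
            export_format="json",                    # or "csv"
            ensure_utc=True,                         # export TIMESTAMP columns as UTC
        )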
.. attribute:: ui_color
:annotation: = #a0e08c
.. attribute:: type_map
.. method:: query(self)
Queries MySQL and returns a cursor to the results.
.. method:: field_to_bigquery(self, field)
.. method:: convert_type(self, value, schema_type)
Takes a value from MySQLdb and converts it to a value that's safe for
JSON/Google Cloud Storage/BigQuery. Dates are converted to UTC seconds.
Decimals are converted to floats. Binary type fields are encoded with base64,
as imported BYTES data must be base64-encoded according to the BigQuery SQL
data type documentation: https://cloud.google.com/bigquery/data-types
:param value: MySQLdb column value
:type value: Any
:param schema_type: BigQuery data type
:type schema_type: str
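The conversions described above can be restated in a short, standalone sketch.
This is an illustrative approximation of the documented behaviour, not the
module's actual implementation, and the helper name is hypothetical.

.. code-block:: python

    import base64
    import calendar
    import datetime
    import decimal


    def convert_type_sketch(value, schema_type):
        """Hypothetical helper restating the documented conversions."""
        if isinstance(value, (datetime.datetime, datetime.date)):
            # Dates and timestamps become UTC seconds since the epoch.
            return calendar.timegm(value.timetuple())
        if isinstance(value, decimal.Decimal):
            # Decimals are converted to floats.
            return float(value)
        if schema_type == "BYTES" and isinstance(value, bytes):
            # Binary fields are base64-encoded, since BigQuery expects
            # imported BYTES data to be base64-encoded.
            return base64.standard_b64encode(value).decode("ascii")
        return value

For example, ``convert_type_sketch(decimal.Decimal("1.5"), "FLOAT")`` returns ``1.5``.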