:py:mod:`airflow.providers.apache.hive.transfers.mssql_to_hive`
===============================================================
.. py:module:: airflow.providers.apache.hive.transfers.mssql_to_hive
.. autoapi-nested-parse::
This module contains an operator to move data from MSSQL to Hive.
Module Contents
---------------
Classes
~~~~~~~
.. autoapisummary::
airflow.providers.apache.hive.transfers.mssql_to_hive.MsSqlToHiveOperator
.. py:class:: MsSqlToHiveOperator(*, sql, hive_table, create = True, recreate = False, partition = None, delimiter = chr(1), mssql_conn_id = 'mssql_default', hive_cli_conn_id = 'hive_cli_default', tblproperties = None, **kwargs)
Bases: :py:obj:`airflow.models.BaseOperator`
Moves data from Microsoft SQL Server to Hive. The operator runs
your query against Microsoft SQL Server and stores the result in a
local file before loading it into a Hive table. If the ``create`` or
``recreate`` arguments are set to ``True``,
``CREATE TABLE`` and ``DROP TABLE`` statements are generated accordingly.
Hive data types are inferred from the cursor's metadata.
Note that the table generated in Hive uses ``STORED AS textfile``,
which isn't the most efficient serialization format. If a
large amount of data is loaded and/or if the table is
queried heavily, you may want to use this operator only to
stage the data into a temporary table before loading it into its
final destination using a ``HiveOperator`` (see the example below).
:param sql: SQL query to execute against the Microsoft SQL Server
database. (templated)
:param hive_table: target Hive table, use dot notation to target a specific
database. (templated)
:param create: whether to create the table if it doesn't exist
:param recreate: whether to drop and recreate the table at every execution
:param partition: target partition as a dict of partition columns and
values. (templated)
:param delimiter: field delimiter in the file
:param mssql_conn_id: source Microsoft SQL Server connection
:param hive_cli_conn_id: Reference to the
:ref:`Hive CLI connection id <howto/connection:hive_cli>`.
:param tblproperties: TBLPROPERTIES of the Hive table being created
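
A minimal usage sketch, assuming placeholder connection ids, table names, and query
(none of these values come from this module):

.. code-block:: python

    import pendulum

    from airflow import DAG
    from airflow.providers.apache.hive.transfers.mssql_to_hive import MsSqlToHiveOperator

    with DAG(
        dag_id="mssql_to_hive_example",  # hypothetical DAG id
        start_date=pendulum.datetime(2022, 1, 1, tz="UTC"),
        schedule_interval=None,
        catchup=False,
    ) as dag:
        # Stage MSSQL rows into a temporary Hive table stored as textfile.
        stage_orders = MsSqlToHiveOperator(
            task_id="stage_orders",
            sql="SELECT order_id, amount FROM dbo.orders",  # hypothetical query
            hive_table="staging.orders_tmp",  # hypothetical target table
            create=True,
            recreate=True,
            mssql_conn_id="mssql_default",
            hive_cli_conn_id="hive_cli_default",
        )

As the note above suggests, a follow-on ``HiveOperator`` could then move the staged
rows from the temporary table into a more efficiently stored final table.
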
.. py:attribute:: template_fields
:annotation: :Sequence[str] = ['sql', 'partition', 'hive_table']
.. py:attribute:: template_ext
:annotation: :Sequence[str] = ['.sql']
.. py:attribute:: template_fields_renderers
.. py:attribute:: ui_color
:annotation: = #a0e08c
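
Because ``sql``, ``partition``, and ``hive_table`` are template fields and ``.sql``
is a template extension, the query can also live in a Jinja-templated file. A hedged
sketch, assuming a ``queries/orders.sql`` file reachable via the DAG's
``template_searchpath`` and a hypothetical ``load_date`` partition column:

.. code-block:: python

    from airflow.providers.apache.hive.transfers.mssql_to_hive import MsSqlToHiveOperator

    # queries/orders.sql might contain, for example:
    #   SELECT order_id, amount FROM dbo.orders WHERE order_date = '{{ ds }}'
    stage_daily_orders = MsSqlToHiveOperator(
        task_id="stage_daily_orders",
        sql="queries/orders.sql",  # rendered because of the .sql template extension
        hive_table="staging.orders_tmp",
        partition={"load_date": "{{ ds }}"},  # hypothetical partition column
        recreate=True,
    )
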
.. py:method:: type_map(cls, mssql_type)
:classmethod:
Maps a MSSQL type to a Hive type.
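
The actual mapping lives in the provider source; purely to illustrate the pattern
such a classmethod follows, a sketch with assumed type codes (not the provider's
real table) might look like:

.. code-block:: python

    def type_map(mssql_type: int) -> str:
        """Illustrative sketch: map a DB-API type code to a Hive column type."""
        # Keys would be DB-API type codes reported by the MSSQL cursor
        # (the codes below are placeholders, not the provider's values).
        map_dict = {
            3: "INT",    # hypothetical integer type code
            5: "FLOAT",  # hypothetical decimal/numeric type code
        }
        # Anything unmapped falls back to STRING, which Hive can always ingest.
        return map_dict.get(mssql_type, "STRING")
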
.. py:method:: execute(self, context)
This is the main method to derive when creating an operator.
Context is the same dictionary used when rendering Jinja templates.
Refer to ``get_template_context`` for more context.