:py:mod:`airflow.providers.microsoft.azure.transfers.oracle_to_azure_data_lake`
===============================================================================
.. py:module:: airflow.providers.microsoft.azure.transfers.oracle_to_azure_data_lake
Module Contents
---------------
Classes
~~~~~~~
.. autoapisummary::
airflow.providers.microsoft.azure.transfers.oracle_to_azure_data_lake.OracleToAzureDataLakeOperator
.. py:class:: OracleToAzureDataLakeOperator(*, filename, azure_data_lake_conn_id, azure_data_lake_path, oracle_conn_id, sql, sql_params = None, delimiter = ',', encoding = 'utf-8', quotechar = '"', quoting = csv.QUOTE_MINIMAL, **kwargs)
Bases: :py:obj:`airflow.models.BaseOperator`
Moves data from Oracle to Azure Data Lake. The operator runs the query against
Oracle, writes the result set to a local CSV file, and then uploads that file to Azure Data Lake.
:param filename: name to give the CSV file written to Azure Data Lake.
:param azure_data_lake_conn_id: destination Azure Data Lake connection.
:param azure_data_lake_path: destination path in Azure Data Lake where the file is placed.
:param oracle_conn_id: :ref:`Source Oracle connection <howto/connection:oracle>`.
:param sql: SQL query to execute against the Oracle database. (templated)
:param sql_params: Parameters to use in the SQL query. (templated)
:param delimiter: field delimiter in the file.
:param encoding: encoding type for the file.
:param quotechar: Character used to quote fields in the file.
:param quoting: Quoting strategy, e.g. ``csv.QUOTE_MINIMAL``. See the :mod:`csv` module's quoting constants for more information.
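A minimal usage sketch follows; the DAG settings, connection IDs (``oracle_default``, ``azure_data_lake_default``), table, path, and query are illustrative assumptions, not values defined by the provider.

.. code-block:: python

    import csv
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.microsoft.azure.transfers.oracle_to_azure_data_lake import (
        OracleToAzureDataLakeOperator,
    )

    with DAG(
        dag_id="example_oracle_to_adl",
        start_date=datetime(2024, 1, 1),
        catchup=False,
    ) as dag:
        extract_orders = OracleToAzureDataLakeOperator(
            task_id="extract_orders",
            oracle_conn_id="oracle_default",
            azure_data_lake_conn_id="azure_data_lake_default",
            azure_data_lake_path="raw/orders",
            # ``filename``, ``sql`` and ``sql_params`` are templated fields.
            filename="orders_{{ ds_nodash }}.csv",
            sql="SELECT * FROM orders WHERE order_date = :run_date",
            sql_params={"run_date": "{{ ds }}"},
            delimiter=",",
            encoding="utf-8",
            quotechar='"',
            quoting=csv.QUOTE_MINIMAL,
        )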
.. py:attribute:: template_fields
:annotation: :Sequence[str] = ['filename', 'sql', 'sql_params']
.. py:attribute:: template_fields_renderers
.. py:attribute:: ui_color
:annotation: = #e08c8c
.. py:method:: execute(self, context)
This is the main method to derive when creating an operator. ``context`` is the same dictionary used when rendering Jinja templates; refer to ``get_template_context`` for more details.
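As a rough orientation only, the flow described above (run the query against Oracle, write a local CSV, upload it to Azure Data Lake) could be sketched with the Oracle and Azure provider hooks as below. This is an illustrative approximation, not the operator's actual implementation; the connection IDs are placeholders and the hook details should be treated as assumptions.

.. code-block:: python

    import csv
    import os
    from tempfile import TemporaryDirectory

    from airflow.providers.microsoft.azure.hooks.data_lake import AzureDataLakeHook
    from airflow.providers.oracle.hooks.oracle import OracleHook


    def oracle_to_adl_sketch(sql, sql_params, filename, azure_data_lake_path):
        """Fetch rows from Oracle, write them to a temporary CSV, upload to ADL (sketch)."""
        oracle_hook = OracleHook(oracle_conn_id="oracle_default")
        adl_hook = AzureDataLakeHook(azure_data_lake_conn_id="azure_data_lake_default")

        conn = oracle_hook.get_conn()
        cursor = conn.cursor()
        cursor.execute(sql, sql_params or {})
        with TemporaryDirectory(prefix="oracle_to_adl_") as tmp:
            local_path = os.path.join(tmp, filename)
            with open(local_path, "w", newline="", encoding="utf-8") as f:
                writer = csv.writer(f, delimiter=",", quoting=csv.QUOTE_MINIMAL)
                # Header row from the cursor description, then the data rows.
                writer.writerow([col[0] for col in cursor.description])
                writer.writerows(cursor)
            # Upload the temporary file to the configured Data Lake path.
            adl_hook.upload_file(
                local_path=local_path,
                remote_path=os.path.join(azure_data_lake_path, filename),
            )
        cursor.close()
        conn.close()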