:py:mod:`airflow.providers.google.cloud.operators.datafusion`
=============================================================
.. py:module:: airflow.providers.google.cloud.operators.datafusion
.. autoapi-nested-parse::
This module contains Google DataFusion operators.
Module Contents
---------------
Classes
~~~~~~~
.. autoapisummary::
airflow.providers.google.cloud.operators.datafusion.DataFusionPipelineLinkHelper
airflow.providers.google.cloud.operators.datafusion.DataFusionInstanceLink
airflow.providers.google.cloud.operators.datafusion.DataFusionPipelineLink
airflow.providers.google.cloud.operators.datafusion.DataFusionPipelinesLink
airflow.providers.google.cloud.operators.datafusion.CloudDataFusionRestartInstanceOperator
airflow.providers.google.cloud.operators.datafusion.CloudDataFusionDeleteInstanceOperator
airflow.providers.google.cloud.operators.datafusion.CloudDataFusionCreateInstanceOperator
airflow.providers.google.cloud.operators.datafusion.CloudDataFusionUpdateInstanceOperator
airflow.providers.google.cloud.operators.datafusion.CloudDataFusionGetInstanceOperator
airflow.providers.google.cloud.operators.datafusion.CloudDataFusionCreatePipelineOperator
airflow.providers.google.cloud.operators.datafusion.CloudDataFusionDeletePipelineOperator
airflow.providers.google.cloud.operators.datafusion.CloudDataFusionListPipelinesOperator
airflow.providers.google.cloud.operators.datafusion.CloudDataFusionStartPipelineOperator
airflow.providers.google.cloud.operators.datafusion.CloudDataFusionStopPipelineOperator
Attributes
~~~~~~~~~~
.. autoapisummary::
airflow.providers.google.cloud.operators.datafusion.BASE_LINK
airflow.providers.google.cloud.operators.datafusion.DATAFUSION_INSTANCE_LINK
airflow.providers.google.cloud.operators.datafusion.DATAFUSION_PIPELINES_LINK
airflow.providers.google.cloud.operators.datafusion.DATAFUSION_PIPELINE_LINK
.. py:data:: BASE_LINK
:annotation: = https://console.cloud.google.com/data-fusion
.. py:data:: DATAFUSION_INSTANCE_LINK
.. py:data:: DATAFUSION_PIPELINES_LINK
:annotation: = {uri}/cdap/ns/default/pipelines
.. py:data:: DATAFUSION_PIPELINE_LINK
:annotation: = {uri}/pipelines/ns/default/view/{pipeline_name}
.. py:class:: DataFusionPipelineLinkHelper
Helper class for Pipeline links.
.. py:method:: get_project_id(instance)
:staticmethod:
.. py:class:: DataFusionInstanceLink
Bases: :py:obj:`airflow.providers.google.cloud.links.base.BaseGoogleLink`
Helper class for constructing a Data Fusion Instance link.
.. py:attribute:: name
:annotation: = Data Fusion Instance
.. py:attribute:: key
:annotation: = instance_conf
.. py:attribute:: format_str
.. py:method:: persist(context, task_instance, project_id)
:staticmethod:
.. py:class:: DataFusionPipelineLink
Bases: :py:obj:`airflow.providers.google.cloud.links.base.BaseGoogleLink`
Helper class for constructing a Data Fusion Pipeline link.
.. py:attribute:: name
:annotation: = Data Fusion Pipeline
.. py:attribute:: key
:annotation: = pipeline_conf
.. py:attribute:: format_str
.. py:method:: persist(context, task_instance, uri)
:staticmethod:
.. py:class:: DataFusionPipelinesLink
Bases: :py:obj:`airflow.providers.google.cloud.links.base.BaseGoogleLink`
Helper class for constructing a link to the list of Data Fusion Pipelines.
.. py:attribute:: name
:annotation: = Data Fusion Pipelines
.. py:attribute:: key
:annotation: = pipelines_conf
.. py:attribute:: format_str
.. py:method:: persist(context, task_instance, uri)
:staticmethod:
.. py:class:: CloudDataFusionRestartInstanceOperator(*, instance_name, location, project_id = None, api_version = 'v1beta1', gcp_conn_id = 'google_cloud_default', delegate_to = None, impersonation_chain = None, **kwargs)
Bases: :py:obj:`airflow.models.BaseOperator`
Restart a single Data Fusion instance.
At the end of the operation the instance is fully restarted.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudDataFusionRestartInstanceOperator`
:param instance_name: The name of the instance to restart.
:param location: The Cloud Data Fusion location in which to handle the request.
:param project_id: The ID of the Google Cloud project that the instance belongs to.
:param api_version: The version of the API that will be requested, for example 'v3'.
:param gcp_conn_id: The connection ID to use when fetching connection info.
:param delegate_to: The account to impersonate using domain-wide delegation of authority,
if any. For this to work, the service account making the request must have
domain-wide delegation enabled.
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
.. py:attribute:: template_fields
:annotation: :Sequence[str] = ['instance_name', 'impersonation_chain']
.. py:attribute:: operator_extra_links
.. py:method:: execute(self, context)
This is the main method to derive when creating an operator.
Context is the same dictionary used as when rendering jinja templates.
Refer to get_template_context for more context.
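A minimal usage sketch for this operator; the task id, instance name, region, and project id below are illustrative assumptions, not values taken from this module:

.. code-block:: python

    from airflow.providers.google.cloud.operators.datafusion import (
        CloudDataFusionRestartInstanceOperator,
    )

    # Illustrative values -- typically declared inside a DAG definition
    # (e.g. within a ``with DAG(...):`` block).
    restart_instance = CloudDataFusionRestartInstanceOperator(
        task_id="restart_datafusion_instance",
        instance_name="example-instance",
        location="us-central1",
        project_id="example-project",
    )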
.. py:class:: CloudDataFusionDeleteInstanceOperator(*, instance_name, location, project_id = None, api_version = 'v1beta1', gcp_conn_id = 'google_cloud_default', delegate_to = None, impersonation_chain = None, **kwargs)
Bases: :py:obj:`airflow.models.BaseOperator`
Deletes a single Data Fusion instance.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudDataFusionDeleteInstanceOperator`
:param instance_name: The name of the instance to delete.
:param location: The Cloud Data Fusion location in which to handle the request.
:param project_id: The ID of the Google Cloud project that the instance belongs to.
:param api_version: The version of the API that will be requested, for example 'v3'.
:param gcp_conn_id: The connection ID to use when fetching connection info.
:param delegate_to: The account to impersonate using domain-wide delegation of authority,
if any. For this to work, the service account making the request must have
domain-wide delegation enabled.
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
.. py:attribute:: template_fields
:annotation: :Sequence[str] = ['instance_name', 'impersonation_chain']
.. py:method:: execute(self, context)
This is the main method to derive when creating an operator.
Context is the same dictionary used as when rendering jinja templates.
Refer to get_template_context for more context.
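A minimal usage sketch for deleting an instance; all parameter values here are illustrative assumptions:

.. code-block:: python

    from airflow.providers.google.cloud.operators.datafusion import (
        CloudDataFusionDeleteInstanceOperator,
    )

    # Illustrative values -- substitute your own instance, region, and project.
    delete_instance = CloudDataFusionDeleteInstanceOperator(
        task_id="delete_datafusion_instance",
        instance_name="example-instance",
        location="us-central1",
        project_id="example-project",
    )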
.. py:class:: CloudDataFusionCreateInstanceOperator(*, instance_name, instance, location, project_id = None, api_version = 'v1beta1', gcp_conn_id = 'google_cloud_default', delegate_to = None, impersonation_chain = None, **kwargs)
Bases: :py:obj:`airflow.models.BaseOperator`
Creates a new Data Fusion instance in the specified project and location.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudDataFusionCreateInstanceOperator`
:param instance_name: The name of the instance to create.
:param instance: An instance of Instance.
https://cloud.google.com/data-fusion/docs/reference/rest/v1beta1/projects.locations.instances#Instance
:param location: The Cloud Data Fusion location in which to handle the request.
:param project_id: The ID of the Google Cloud project that the instance belongs to.
:param api_version: The version of the API that will be requested, for example 'v3'.
:param gcp_conn_id: The connection ID to use when fetching connection info.
:param delegate_to: The account to impersonate using domain-wide delegation of authority,
if any. For this to work, the service account making the request must have
domain-wide delegation enabled.
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
.. py:attribute:: template_fields
:annotation: :Sequence[str] = ['instance_name', 'instance', 'impersonation_chain']
.. py:attribute:: operator_extra_links
.. py:method:: execute(self, context)
This is the main method to derive when creating an operator.
Context is the same dictionary used as when rendering jinja templates.
Refer to get_template_context for more context.
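A minimal usage sketch; the ``instance`` body follows the Instance resource linked above, and the concrete values are illustrative assumptions:

.. code-block:: python

    from airflow.providers.google.cloud.operators.datafusion import (
        CloudDataFusionCreateInstanceOperator,
    )

    # Minimal Instance body; see the Instance resource reference above for all fields.
    INSTANCE = {"type": "BASIC", "displayName": "example-instance"}

    create_instance = CloudDataFusionCreateInstanceOperator(
        task_id="create_datafusion_instance",
        instance_name="example-instance",
        instance=INSTANCE,
        location="us-central1",
        project_id="example-project",
    )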
.. py:class:: CloudDataFusionUpdateInstanceOperator(*, instance_name, instance, update_mask, location, project_id = None, api_version = 'v1beta1', gcp_conn_id = 'google_cloud_default', delegate_to = None, impersonation_chain = None, **kwargs)
Bases: :py:obj:`airflow.models.BaseOperator`
Updates a single Data Fusion instance.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudDataFusionUpdateInstanceOperator`
:param instance_name: The name of the instance to update.
:param instance: An instance of Instance.
https://cloud.google.com/data-fusion/docs/reference/rest/v1beta1/projects.locations.instances#Instance
:param update_mask: Field mask is used to specify the fields that the update will overwrite
in an instance resource. The fields specified in the updateMask are relative to the resource,
not the full request. A field will be overwritten if it is in the mask. If the user does not
provide a mask, all the supported fields (labels and options currently) will be overwritten.
A comma-separated list of fully qualified names of fields. Example: "user.displayName,photo".
https://developers.google.com/protocol-buffers/docs/reference/google.protobuf#google.protobuf.FieldMask
:param location: The Cloud Data Fusion location in which to handle the request.
:param project_id: The ID of the Google Cloud project that the instance belongs to.
:param api_version: The version of the API that will be requested, for example 'v3'.
:param gcp_conn_id: The connection ID to use when fetching connection info.
:param delegate_to: The account to impersonate using domain-wide delegation of authority,
if any. For this to work, the service account making the request must have
domain-wide delegation enabled.
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
.. py:attribute:: template_fields
:annotation: :Sequence[str] = ['instance_name', 'instance', 'impersonation_chain']
.. py:attribute:: operator_extra_links
.. py:method:: execute(self, context)
This is the main method to derive when creating an operator.
Context is the same dictionary used as when rendering jinja templates.
Refer to get_template_context for more context.
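A minimal usage sketch; the ``update_mask`` value and the partial instance body are illustrative assumptions -- consult the FieldMask and Instance references above for the exact fields your update needs:

.. code-block:: python

    from airflow.providers.google.cloud.operators.datafusion import (
        CloudDataFusionUpdateInstanceOperator,
    )

    # Only the fields named in update_mask are overwritten (here: labels).
    update_instance = CloudDataFusionUpdateInstanceOperator(
        task_id="update_datafusion_instance",
        instance_name="example-instance",
        instance={"labels": {"env": "dev"}},  # assumed partial Instance body
        update_mask="labels",                 # assumed field mask
        location="us-central1",
        project_id="example-project",
    )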
.. py:class:: CloudDataFusionGetInstanceOperator(*, instance_name, location, project_id = None, api_version = 'v1beta1', gcp_conn_id = 'google_cloud_default', delegate_to = None, impersonation_chain = None, **kwargs)
Bases: :py:obj:`airflow.models.BaseOperator`
Gets details of a single Data Fusion instance.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudDataFusionGetInstanceOperator`
:param instance_name: The name of the instance.
:param location: The Cloud Data Fusion location in which to handle the request.
:param project_id: The ID of the Google Cloud project that the instance belongs to.
:param api_version: The version of the API that will be requested, for example 'v3'.
:param gcp_conn_id: The connection ID to use when fetching connection info.
:param delegate_to: The account to impersonate using domain-wide delegation of authority,
if any. For this to work, the service account making the request must have
domain-wide delegation enabled.
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
.. py:attribute:: template_fields
:annotation: :Sequence[str] = ['instance_name', 'impersonation_chain']
.. py:attribute:: operator_extra_links
.. py:method:: execute(self, context)
This is the main method to derive when creating an operator.
Context is the same dictionary used as when rendering jinja templates.
Refer to get_template_context for more context.
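A minimal usage sketch; the values are illustrative assumptions:

.. code-block:: python

    from airflow.providers.google.cloud.operators.datafusion import (
        CloudDataFusionGetInstanceOperator,
    )

    # Illustrative values -- fetch the details of an existing instance.
    get_instance = CloudDataFusionGetInstanceOperator(
        task_id="get_datafusion_instance",
        instance_name="example-instance",
        location="us-central1",
        project_id="example-project",
    )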
.. py:class:: CloudDataFusionCreatePipelineOperator(*, pipeline_name, pipeline, instance_name, location, namespace = 'default', project_id = None, api_version = 'v1beta1', gcp_conn_id = 'google_cloud_default', delegate_to = None, impersonation_chain = None, **kwargs)
Bases: :py:obj:`airflow.models.BaseOperator`
Creates a Cloud Data Fusion pipeline.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudDataFusionCreatePipelineOperator`
:param pipeline_name: Your pipeline name.
:param pipeline: The pipeline definition. For more information check:
https://docs.cdap.io/cdap/current/en/developer-manual/pipelines/developing-pipelines.html#pipeline-configuration-file-format
:param instance_name: The name of the instance.
:param location: The Cloud Data Fusion location in which to handle the request.
:param namespace: If your pipeline belongs to a Basic edition instance, the namespace ID
is always default. If your pipeline belongs to an Enterprise edition instance, you
can create a namespace.
:param api_version: The version of the API that will be requested, for example 'v3'.
:param gcp_conn_id: The connection ID to use when fetching connection info.
:param delegate_to: The account to impersonate using domain-wide delegation of authority,
if any. For this to work, the service account making the request must have
domain-wide delegation enabled.
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
.. py:attribute:: template_fields
:annotation: :Sequence[str] = ['instance_name', 'pipeline_name', 'impersonation_chain']
.. py:attribute:: operator_extra_links
.. py:method:: execute(self, context)
This is the main method to derive when creating an operator.
Context is the same dictionary used as when rendering jinja templates.
Refer to get_template_context for more context.
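A minimal usage sketch; ``PIPELINE`` stands in for a full CDAP pipeline configuration as described in the link above, and all names are illustrative assumptions:

.. code-block:: python

    from airflow.providers.google.cloud.operators.datafusion import (
        CloudDataFusionCreatePipelineOperator,
    )

    # Placeholder for a CDAP pipeline configuration; see the pipeline
    # configuration file format linked above for the real structure.
    PIPELINE = {"name": "example-pipeline", "artifact": {}, "config": {}}

    create_pipeline = CloudDataFusionCreatePipelineOperator(
        task_id="create_datafusion_pipeline",
        pipeline_name="example-pipeline",
        pipeline=PIPELINE,
        instance_name="example-instance",
        location="us-central1",
        namespace="default",
        project_id="example-project",
    )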
.. py:class:: CloudDataFusionDeletePipelineOperator(*, pipeline_name, instance_name, location, version_id = None, namespace = 'default', project_id = None, api_version = 'v1beta1', gcp_conn_id = 'google_cloud_default', delegate_to = None, impersonation_chain = None, **kwargs)
Bases: :py:obj:`airflow.models.BaseOperator`
Deletes a Cloud Data Fusion pipeline.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudDataFusionDeletePipelineOperator`
:param pipeline_name: Your pipeline name.
:param version_id: Version of the pipeline to delete.
:param instance_name: The name of the instance.
:param location: The Cloud Data Fusion location in which to handle the request.
:param namespace: If your pipeline belongs to a Basic edition instance, the namespace ID
is always default. If your pipeline belongs to an Enterprise edition instance, you
can create a namespace.
:param api_version: The version of the API that will be requested, for example 'v3'.
:param gcp_conn_id: The connection ID to use when fetching connection info.
:param delegate_to: The account to impersonate using domain-wide delegation of authority,
if any. For this to work, the service account making the request must have
domain-wide delegation enabled.
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
.. py:attribute:: template_fields
:annotation: :Sequence[str] = ['instance_name', 'version_id', 'pipeline_name', 'impersonation_chain']
.. py:method:: execute(self, context)
This is the main method to derive when creating an operator.
Context is the same dictionary used as when rendering jinja templates.
Refer to get_template_context for more context.
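A minimal usage sketch; the values are illustrative assumptions:

.. code-block:: python

    from airflow.providers.google.cloud.operators.datafusion import (
        CloudDataFusionDeletePipelineOperator,
    )

    delete_pipeline = CloudDataFusionDeletePipelineOperator(
        task_id="delete_datafusion_pipeline",
        pipeline_name="example-pipeline",
        instance_name="example-instance",
        location="us-central1",
        project_id="example-project",
    )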
.. py:class:: CloudDataFusionListPipelinesOperator(*, instance_name, location, artifact_name = None, artifact_version = None, namespace = 'default', project_id = None, api_version = 'v1beta1', gcp_conn_id = 'google_cloud_default', delegate_to = None, impersonation_chain = None, **kwargs)
Bases: :py:obj:`airflow.models.BaseOperator`
Lists Cloud Data Fusion pipelines.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudDataFusionListPipelinesOperator`
:param instance_name: The name of the instance.
:param location: The Cloud Data Fusion location in which to handle the request.
:param artifact_version: Artifact version to filter pipelines by.
:param artifact_name: Artifact name to filter pipelines by.
:param namespace: If your pipeline belongs to a Basic edition instance, the namespace ID
is always default. If your pipeline belongs to an Enterprise edition instance, you
can create a namespace.
:param api_version: The version of the API that will be requested, for example 'v3'.
:param gcp_conn_id: The connection ID to use when fetching connection info.
:param delegate_to: The account to impersonate using domain-wide delegation of authority,
if any. For this to work, the service account making the request must have
domain-wide delegation enabled.
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
.. py:attribute:: template_fields
:annotation: :Sequence[str] = ['instance_name', 'artifact_name', 'artifact_version', 'impersonation_chain']
.. py:attribute:: operator_extra_links
.. py:method:: execute(self, context)
This is the main method to derive when creating an operator.
Context is the same dictionary used as when rendering jinja templates.
Refer to get_template_context for more context.
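A minimal usage sketch; the artifact filter and other values shown are illustrative assumptions:

.. code-block:: python

    from airflow.providers.google.cloud.operators.datafusion import (
        CloudDataFusionListPipelinesOperator,
    )

    list_pipelines = CloudDataFusionListPipelinesOperator(
        task_id="list_datafusion_pipelines",
        instance_name="example-instance",
        location="us-central1",
        artifact_name="cdap-data-pipeline",  # optional filter; assumed value
        project_id="example-project",
    )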
.. py:class:: CloudDataFusionStartPipelineOperator(*, pipeline_name, instance_name, location, runtime_args = None, success_states = None, namespace = 'default', pipeline_timeout = 5 * 60, project_id = None, api_version = 'v1beta1', gcp_conn_id = 'google_cloud_default', delegate_to = None, impersonation_chain = None, asynchronous=False, **kwargs)
Bases: :py:obj:`airflow.models.BaseOperator`
Starts a Cloud Data Fusion pipeline. Works for both batch and stream pipelines.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudDataFusionStartPipelineOperator`
:param pipeline_name: Your pipeline name.
:param instance_name: The name of the instance.
:param success_states: If provided, the operator will wait for the pipeline to be in one of
the provided states.
:param pipeline_timeout: How long (in seconds) the operator should wait for the pipeline to reach
one of ``success_states``. Works only if ``success_states`` are provided.
:param location: The Cloud Data Fusion location in which to handle the request.
:param runtime_args: Optional runtime args to be passed to the pipeline.
:param namespace: If your pipeline belongs to a Basic edition instance, the namespace ID
is always default. If your pipeline belongs to an Enterprise edition instance, you
can create a namespace.
:param api_version: The version of the API that will be requested, for example 'v3'.
:param gcp_conn_id: The connection ID to use when fetching connection info.
:param delegate_to: The account to impersonate using domain-wide delegation of authority,
if any. For this to work, the service account making the request must have
domain-wide delegation enabled.
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
:param asynchronous: Flag to return after submitting the pipeline ID to the Data Fusion API.
This is useful for submitting long-running pipelines and
waiting on them asynchronously using the CloudDataFusionPipelineStateSensor.
.. py:attribute:: template_fields
:annotation: :Sequence[str] = ['instance_name', 'pipeline_name', 'runtime_args', 'impersonation_chain']
.. py:attribute:: operator_extra_links
.. py:method:: execute(self, context)
This is the main method to derive when creating an operator.
Context is the same dictionary used as when rendering jinja templates.
Refer to get_template_context for more context.
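A minimal usage sketch; the runtime argument names and other values are illustrative assumptions:

.. code-block:: python

    from airflow.providers.google.cloud.operators.datafusion import (
        CloudDataFusionStartPipelineOperator,
    )

    start_pipeline = CloudDataFusionStartPipelineOperator(
        task_id="start_datafusion_pipeline",
        pipeline_name="example-pipeline",
        instance_name="example-instance",
        location="us-central1",
        runtime_args={"input.path": "gs://example-bucket/input"},  # assumed args
        project_id="example-project",
    )

With ``asynchronous=True`` the task returns right after submitting the run, and completion can be awaited separately, for example with the CloudDataFusionPipelineStateSensor mentioned above.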
.. py:class:: CloudDataFusionStopPipelineOperator(*, pipeline_name, instance_name, location, namespace = 'default', project_id = None, api_version = 'v1beta1', gcp_conn_id = 'google_cloud_default', delegate_to = None, impersonation_chain = None, **kwargs)
Bases: :py:obj:`airflow.models.BaseOperator`
Stops a Cloud Data Fusion pipeline. Works for both batch and stream pipelines.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudDataFusionStopPipelineOperator`
:param pipeline_name: Your pipeline name.
:param instance_name: The name of the instance.
:param location: The Cloud Data Fusion location in which to handle the request.
:param namespace: If your pipeline belongs to a Basic edition instance, the namespace ID
is always default. If your pipeline belongs to an Enterprise edition instance, you
can create a namespace.
:param api_version: The version of the API that will be requested, for example 'v3'.
:param gcp_conn_id: The connection ID to use when fetching connection info.
:param delegate_to: The account to impersonate using domain-wide delegation of authority,
if any. For this to work, the service account making the request must have
domain-wide delegation enabled.
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
.. py:attribute:: template_fields
:annotation: :Sequence[str] = ['instance_name', 'pipeline_name', 'impersonation_chain']
.. py:attribute:: operator_extra_links
.. py:method:: execute(self, context)
This is the main method to derive when creating an operator.
Context is the same dictionary used as when rendering jinja templates.
Refer to get_template_context for more context.
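A minimal usage sketch; the values are illustrative assumptions:

.. code-block:: python

    from airflow.providers.google.cloud.operators.datafusion import (
        CloudDataFusionStopPipelineOperator,
    )

    stop_pipeline = CloudDataFusionStopPipelineOperator(
        task_id="stop_datafusion_pipeline",
        pipeline_name="example-pipeline",
        instance_name="example-instance",
        location="us-central1",
        project_id="example-project",
    )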