:mod:`airflow.contrib.operators.datastore_import_operator`
==========================================================

.. py:module:: airflow.contrib.operators.datastore_import_operator


Module Contents
---------------

.. py:class:: DatastoreImportOperator(bucket, file, namespace=None, entity_filter=None, labels=None, datastore_conn_id='google_cloud_default', delegate_to=None, polling_interval_in_seconds=10, xcom_push=False, *args, **kwargs)

   Bases: :class:`airflow.models.BaseOperator`

   Import entities from Cloud Storage to Google Cloud Datastore.

   :param bucket: container in Cloud Storage holding the backup data to import
   :type bucket: str
   :param file: path of the backup metadata file in the specified Cloud Storage
       bucket. It should have the extension .overall_export_metadata
   :type file: str
   :param namespace: optional namespace of the backup metadata file in
       the specified Cloud Storage bucket.
   :type namespace: str
   :param entity_filter: description of which data from the project is included
       in the import; refer to
       https://cloud.google.com/datastore/docs/reference/rest/Shared.Types/EntityFilter
   :type entity_filter: dict
   :param labels: client-assigned labels for the import request
   :type labels: dict
   :param datastore_conn_id: the name of the connection to use
   :type datastore_conn_id: str
   :param delegate_to: The account to impersonate, if any.
       For this to work, the service account making the request must have
       domain-wide delegation enabled.
   :type delegate_to: str
   :param polling_interval_in_seconds: number of seconds to wait before polling
       for execution status again
   :type polling_interval_in_seconds: int
   :param xcom_push: push the operation name to XCom for reference
   :type xcom_push: bool

   .. method:: execute(self, context)
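
   A minimal usage sketch in a DAG, assuming the bucket name, metadata file
   path, namespace, and entity filter below are illustrative placeholders:

   .. code-block:: python

      from airflow import DAG
      from airflow.contrib.operators.datastore_import_operator import (
          DatastoreImportOperator,
      )
      from airflow.utils.dates import days_ago

      with DAG(
          dag_id="datastore_import_example",  # hypothetical DAG id
          start_date=days_ago(1),
          schedule_interval=None,
      ) as dag:
          import_backup = DatastoreImportOperator(
              task_id="import_from_gcs",
              # hypothetical bucket and backup metadata file produced by a
              # prior Datastore export
              bucket="my-datastore-backups",
              file="backups/example.overall_export_metadata",
              namespace="example_namespace",          # optional
              entity_filter={"kinds": ["Customer"]},  # optional EntityFilter dict
              datastore_conn_id="google_cloud_default",
              polling_interval_in_seconds=10,
              xcom_push=True,  # push the operation name to XCom
          )

   Setting ``xcom_push=True`` makes the operation name available over XCom for
   downstream tasks.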