:mod:`airflow.providers.amazon.aws.operators.s3_bucket_tagging`
===============================================================
.. py:module:: airflow.providers.amazon.aws.operators.s3_bucket_tagging
.. autoapi-nested-parse::
This module contains AWS S3 bucket tagging operators.
Module Contents
---------------
.. data:: BUCKET_DOES_NOT_EXIST_MSG
:annotation: = Bucket with name: %s doesn't exist
.. py:class:: S3GetBucketTaggingOperator(bucket_name: str, aws_conn_id: Optional[str] = 'aws_default', **kwargs)
Bases: :class:`airflow.models.BaseOperator`
This operator gets the tag set from an S3 bucket.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:S3GetBucketTaggingOperator`
:param bucket_name: This is the name of the bucket to get tags from.
:type bucket_name: str
:param aws_conn_id: The Airflow connection used for AWS credentials.
If this is None or empty then the default boto3 behaviour is used. If
running Airflow in a distributed manner and aws_conn_id is None or
empty, then the default boto3 configuration is used (and must be
maintained on each worker node).
:type aws_conn_id: Optional[str]
.. attribute:: template_fields
:annotation: = ['bucket_name']
.. method:: execute(self, context)
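
A minimal usage sketch for this operator; the bucket name, ``task_id``, and DAG
settings below are illustrative placeholders, not values from this reference:

.. code-block:: python

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.amazon.aws.operators.s3_bucket_tagging import (
        S3GetBucketTaggingOperator,
    )

    with DAG(
        dag_id="example_s3_get_bucket_tagging",  # placeholder DAG id
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,
    ) as dag:
        # Reads the bucket's current tag set using the default AWS
        # connection (aws_conn_id="aws_default").
        get_tagging = S3GetBucketTaggingOperator(
            task_id="get_bucket_tagging",
            bucket_name="my-example-bucket",  # placeholder bucket name
        )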
.. py:class:: S3PutBucketTaggingOperator(bucket_name: str, key: Optional[str] = None, value: Optional[str] = None, tag_set: Optional[List[Dict[str, str]]] = None, aws_conn_id: Optional[str] = 'aws_default', **kwargs)
Bases: :class:`airflow.models.BaseOperator`
This operator puts tagging for an S3 bucket.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:S3PutBucketTaggingOperator`
:param bucket_name: The name of the bucket to add tags to.
:type bucket_name: str
:param key: The key portion of the key/value pair for a tag to be added.
If a key is provided, a value must be provided as well.
:type key: str
:param value: The value portion of the key/value pair for a tag to be added.
If a value is provided, a key must be provided as well.
:type value: str
:param tag_set: A list of key/value pairs to set as tags on the bucket.
:type tag_set: List[Dict[str, str]]
:param aws_conn_id: The Airflow connection used for AWS credentials.
If this is None or empty then the default boto3 behaviour is used. If
running Airflow in a distributed manner and aws_conn_id is None or
empty, then the default boto3 configuration is used (and must be
maintained on each worker node).
:type aws_conn_id: Optional[str]
.. attribute:: template_fields
:annotation: = ['bucket_name']
.. method:: execute(self, context)
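
A minimal usage sketch showing both ways to pass tags; the bucket name, task
ids, and DAG settings are placeholders, and the ``tag_set`` entries assume the
S3 ``TagSet`` dictionary shape implied by the ``List[Dict[str, str]]`` type:

.. code-block:: python

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.amazon.aws.operators.s3_bucket_tagging import (
        S3PutBucketTaggingOperator,
    )

    with DAG(
        dag_id="example_s3_put_bucket_tagging",  # placeholder DAG id
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,
    ) as dag:
        # Add a single tag via the key/value shorthand; key and value
        # must be provided together.
        put_single_tag = S3PutBucketTaggingOperator(
            task_id="put_single_tag",
            bucket_name="my-example-bucket",  # placeholder bucket name
            key="environment",
            value="staging",
        )

        # Or supply a full tag set (assumed S3 TagSet shape).
        put_tag_set = S3PutBucketTaggingOperator(
            task_id="put_tag_set",
            bucket_name="my-example-bucket",
            tag_set=[{"Key": "team", "Value": "data"}],
        )

Note that the underlying S3 ``PutBucketTagging`` API replaces the bucket's
entire tag set rather than appending to it, so existing tags not included in
the request are removed.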
.. py:class:: S3DeleteBucketTaggingOperator(bucket_name: str, aws_conn_id: Optional[str] = 'aws_default', **kwargs)
Bases: :class:`airflow.models.BaseOperator`
This operator deletes tagging from an S3 bucket.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:S3DeleteBucketTaggingOperator`
:param bucket_name: This is the name of the bucket to delete tags from.
:type bucket_name: str
:param aws_conn_id: The Airflow connection used for AWS credentials.
If this is None or empty then the default boto3 behaviour is used. If
running Airflow in a distributed manner and aws_conn_id is None or
empty, then the default boto3 configuration is used (and must be
maintained on each worker node).
:type aws_conn_id: Optional[str]
.. attribute:: template_fields
:annotation: = ['bucket_name']
.. method:: execute(self, context)
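
A minimal usage sketch for this operator; as above, the bucket name,
``task_id``, and DAG settings are illustrative placeholders:

.. code-block:: python

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.amazon.aws.operators.s3_bucket_tagging import (
        S3DeleteBucketTaggingOperator,
    )

    with DAG(
        dag_id="example_s3_delete_bucket_tagging",  # placeholder DAG id
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,
    ) as dag:
        # Removes all tags from the bucket using the default AWS
        # connection (aws_conn_id="aws_default").
        delete_tagging = S3DeleteBucketTaggingOperator(
            task_id="delete_bucket_tagging",
            bucket_name="my-example-bucket",  # placeholder bucket name
        )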