:mod:`airflow.providers.amazon.aws.hooks.logs`
==============================================
.. py:module:: airflow.providers.amazon.aws.hooks.logs
.. autoapi-nested-parse::
This module contains a hook (AwsLogsHook) with some very basic
functionality for interacting with AWS CloudWatch Logs.
Module Contents
---------------
.. py:class:: AwsLogsHook(*args, **kwargs)
Bases: :class:`airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook`
Interact with AWS CloudWatch Logs.
Additional arguments (such as ``aws_conn_id``) may be specified and
are passed down to the underlying AwsBaseHook.
.. seealso::
:class:`~airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook`
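As a quick illustration (not part of the generated reference), the hook is constructed like any other AWS hook; the connection id and region below are placeholder values, not defaults mandated by this module:

.. code-block:: python

    from airflow.providers.amazon.aws.hooks.logs import AwsLogsHook

    # Placeholder connection id and region; substitute values from your environment.
    hook = AwsLogsHook(aws_conn_id="aws_default", region_name="us-east-1")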
.. method:: get_log_events(self, log_group: str, log_stream_name: str, start_time: int = 0, skip: int = 0, start_from_head: bool = True)
A generator for log items in a single stream. This will yield all the
items that are available at the current moment.
:param log_group: The name of the log group.
:type log_group: str
:param log_stream_name: The name of the specific stream.
:type log_stream_name: str
:param start_time: The timestamp value to start reading the logs from (default: 0).
:type start_time: int
:param skip: The number of log entries to skip at the start (default: 0).
    This is useful when multiple entries share the same timestamp.
:type skip: int
:param start_from_head: Whether to start reading from the beginning of the log stream (True)
    or from the end (False).
:type start_from_head: bool
:rtype: dict
:return: | A CloudWatch log event with the following key-value pairs:
| 'timestamp' (int): The time in milliseconds of the event.
| 'message' (str): The log event data.
| 'ingestionTime' (int): The time in milliseconds the event was ingested.
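A minimal usage sketch, assuming an existing log group and stream (the group and stream names here are placeholders):

.. code-block:: python

    from airflow.providers.amazon.aws.hooks.logs import AwsLogsHook

    hook = AwsLogsHook(aws_conn_id="aws_default")

    # Iterate over the events currently available in a single stream.
    # "/my/log/group" and "my-stream" are placeholder names.
    for event in hook.get_log_events(
        log_group="/my/log/group",
        log_stream_name="my-stream",
        start_time=0,
        skip=0,
        start_from_head=True,
    ):
        print(event["timestamp"], event["message"])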