import ChangeLog from '../changelog/connector-sls.md';

# Sls

> Sls sink connector

## Support Those Engines

- Spark
- Flink
- SeaTunnel Zeta

## Key Features

## Description

Sink connector for Aliyun Sls.

Write data to the Aliyun Sls log service.

## Supported DataSource Info

In order to use the Sls connector, the following dependencies are required. They can be downloaded via install-plugin.sh or from the Maven central repository.

| Datasource | Supported Versions | Maven    |
|------------|--------------------|----------|
| Sls        | Universal          | Download |

## Sink Options

| Name              | Type   | Required | Default          | Description                                   |
|-------------------|--------|----------|------------------|-----------------------------------------------|
| project           | String | Yes      | -                | Aliyun Sls project                            |
| logstore          | String | Yes      | -                | Aliyun Sls logstore                           |
| endpoint          | String | Yes      | -                | Aliyun Sls access endpoint                    |
| access_key_id     | String | Yes      | -                | Aliyun AccessKey ID                           |
| access_key_secret | String | Yes      | -                | Aliyun AccessKey Secret                       |
| source            | String | No       | SeaTunnel-Source | Source tag attached to the data written to Sls |
| topic             | String | No       | SeaTunnel-Topic  | Topic tag attached to the data written to Sls  |
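
For reference, the sketch below shows how these options map onto a `Sls` sink block, including the optional `source` and `topic` tags with their default values; the endpoint, project, logstore, and credential values are placeholders.

```hocon
sink {
  Sls {
    # Required connection options (placeholder values)
    endpoint          = "cn-hangzhou-intranet.log.aliyuncs.com"
    project           = "project1"
    logstore          = "logstore1"
    access_key_id     = "xxxxxxxxxxxxxxxxxxxxxxxx"
    access_key_secret = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
    # Optional tags attached to the data written to Sls (defaults shown)
    source = "SeaTunnel-Source"
    topic  = "SeaTunnel-Topic"
  }
}
```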

## Task Example

### Simple Example

This example writes data to logstore1 of Sls. If you have not yet installed and deployed SeaTunnel, you need to install and deploy it by following the instructions in Install SeaTunnel. Then follow the instructions in [Quick Start With SeaTunnel Engine](../../Start-v2/locale/Quick-Start SeaTunnel Engine.md) to run this job.

Create a RAM user and grant it authorization. Please make sure the RAM user has sufficient permissions to read and manage the data. Reference: RAM custom authorization examples.

```hocon
# Defining the runtime environment
env {
  parallelism = 2
  job.mode = "STREAMING"
  checkpoint.interval = 30000
}

source {
  FakeSource {
    row.num = 10
    map.size = 10
    array.size = 10
    bytes.length = 10
    string.length = 10
    schema = {
      fields = {
        id = "int"
        name = "string"
        description = "string"
        weight = "string"
      }
    }
  }
}

sink {
  Sls {
    endpoint = "cn-hangzhou-intranet.log.aliyuncs.com"
    project = "project1"
    logstore = "logstore1"
    access_key_id = "xxxxxxxxxxxxxxxxxxxxxxxx"
    access_key_secret = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
  }
}
```
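
If you want to run the same job as a one-shot load instead of a continuous stream, only the env block needs to change. The sketch below is a minimal batch-mode variant; the source and sink blocks above stay unchanged.

```hocon
env {
  # Run once over the generated FakeSource rows instead of streaming
  parallelism = 2
  job.mode = "BATCH"
}
```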

## Changelog

<ChangeLog />