# Camel-Kafka-connector AWS2 S3 MinIO Source
## Introduction
This is an example of using the Camel-Kafka-connector AWS2-S3 source connector with MinIO.
## What is needed
- A running MinIO instance
- An S3 bucket (it will be auto-created if it does not exist)
## Running Kafka
```
$KAFKA_HOME/bin/zookeeper-server-start.sh $KAFKA_HOME/config/zookeeper.properties
$KAFKA_HOME/bin/kafka-server-start.sh $KAFKA_HOME/config/server.properties
$KAFKA_HOME/bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic test1
```
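If you want to check that the topic was created before moving on, you can list the topics on the broker (assuming Kafka is running on `localhost:9092` as above):

```
$KAFKA_HOME/bin/kafka-topics.sh --list --bootstrap-server localhost:9092
```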
## Setting up the needed bits and running the example
You'll need to set up the `plugin.path` property in your Kafka Connect configuration.
Open `$KAFKA_HOME/config/connect-standalone.properties`
and set the `plugin.path` property to your chosen location.
In this example we'll use `/home/oscerd/connectors/`:
```
> cd /home/oscerd/connectors/
> wget https://repo1.maven.org/maven2/org/apache/camel/kafkaconnector/camel-aws2-s3-kafka-connector/0.10.1/camel-aws2-s3-kafka-connector-0.10.1-package.tar.gz
> tar -xvzf camel-aws2-s3-kafka-connector-0.10.1-package.tar.gz
```
Now we need to set up a MinIO instance:
```
> wget https://dl.min.io/server/minio/release/linux-amd64/minio
> chmod +x minio
> export MINIO_ACCESS_KEY=minio
> export MINIO_SECRET_KEY=miniostorage
> mkdir data
> ./minio server data/
```
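The connector will auto-create the bucket, but you can also pre-create it with the MinIO client (`mc`). A sketch, assuming MinIO is listening on `http://localhost:9000` with the credentials above (the alias name `local` and bucket name `bucket` are just examples matching the configuration below):

```
> wget https://dl.min.io/client/mc/release/linux-amd64/mc
> chmod +x mc
> ./mc alias set local http://localhost:9000 minio miniostorage
> ./mc mb local/bucket
```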
Now it's time to set up the connector.
Open the AWS2 S3 source configuration file for MinIO:
```
name=CamelAWS2S3SourceConnector
connector.class=org.apache.camel.kafkaconnector.aws2s3.CamelAws2s3SourceConnector
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.converters.ByteArrayConverter
camel.source.maxPollDuration=10000
topics=test1
camel.source.path.bucketNameOrArn=bucket
camel.source.endpoint.overrideEndpoint=true
camel.source.endpoint.uriEndpointOverride=http://localhost:9000
camel.component.aws2-s3.accessKey=minio
camel.component.aws2-s3.secretKey=miniostorage
camel.component.aws2-s3.region=eu-west-1
```
Now you can run the example:
```
$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties config/CamelAWSS3SourceConnector.properties
```
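Once the worker is up, you can verify that the connector started correctly through the Kafka Connect REST API (by default on port 8083; the connector name matches the `name` property set above):

```
curl -s http://localhost:8083/connectors/CamelAWS2S3SourceConnector/status
```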
Connect to your MinIO console and upload a file into your bucket.
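As an alternative to the web console, you can upload a sample file from the command line with the MinIO client (`mc`); the alias name `local` is just an example, set here against the instance and credentials used above:

```
> ./mc alias set local http://localhost:9000 minio miniostorage
> echo "S3 to Kafka through Camel" > test.txt
> ./mc cp test.txt local/bucket/
```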
In a different terminal, run the kafka-console-consumer and you should see the messages from the MinIO bucket arriving through the Kafka broker.
```
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test1 --from-beginning
S3 to Kafka through Camel
```