A producer publishes messages to Kafka topics. The message itself contains the information about which topic and partition to publish to, so you can publish to different topics with the same producer.
The Apache Pekko Connectors Kafka @apidoc[SendProducer] does not integrate with Apache Pekko Streams. Instead, it offers a wrapper of the Apache Kafka @javadoc[KafkaProducer](org.apache.kafka.clients.producer.KafkaProducer) to send data to Kafka topics in a per-element fashion with a @scala[`Future`-based]@java[`CompletionStage`-based] API.
It supports the same @ref[settings](producer.md#settings) as Apache Pekko Connectors @apidoc[Producer$] flows and sinks and supports @ref[service discovery](discovery.md).
After use, the `SendProducer` needs to be properly closed via the asynchronous `close()` method.
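As a hedged sketch (not the page's own snippet): creating a `SendProducer` from `ProducerSettings` and closing it afterwards. The system name and the `localhost:9092` bootstrap address are assumptions for illustration.

```scala
import org.apache.pekko.Done
import org.apache.pekko.actor.ActorSystem
import org.apache.pekko.kafka.ProducerSettings
import org.apache.pekko.kafka.scaladsl.SendProducer
import org.apache.kafka.common.serialization.StringSerializer

import scala.concurrent.Future

implicit val system: ActorSystem = ActorSystem("send-producer-example")

// The same ProducerSettings used by Producer flows and sinks.
val settings: ProducerSettings[String, String] =
  ProducerSettings(system, new StringSerializer, new StringSerializer)
    .withBootstrapServers("localhost:9092") // assumption: a locally running broker

val producer: SendProducer[String, String] = SendProducer(settings)

// ... send elements ...

// After use, close asynchronously; completes when buffered sends are done.
val closed: Future[Done] = producer.close()
```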
The Send Producer offers methods for sending

* `send` a @javadoc[ProducerRecord](org.apache.kafka.clients.producer.ProducerRecord)
* `sendEnvelope` (similar to `Producer.flexiFlow`)

After use, the Send Producer should be closed with `close()`.
Produce a @javadoc[ProducerRecord](org.apache.kafka.clients.producer.ProducerRecord) to a topic.
Scala : @@ snip snip { #record }
Java : @@ snip snip { #record }
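A minimal sketch of `send`, assuming an already-created `SendProducer[String, String]` named `producer`; the topic name `topic1` and the key/value strings are illustrative only.

```scala
import org.apache.kafka.clients.producer.{ ProducerRecord, RecordMetadata }
import scala.concurrent.Future

// The record itself names the target topic (and may also fix the partition).
val record = new ProducerRecord[String, String]("topic1", "key", "value")

// send completes the Future once the broker has acknowledged the record.
val metadata: Future[RecordMetadata] = producer.send(record)
```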
The @apidoc[ProducerMessage.Envelope] can be used to send one record, or a list of @javadoc[ProducerRecord](org.apache.kafka.clients.producer.ProducerRecord)s, to produce a single message or multiple messages to Kafka topics. The envelope can be used to pass through an arbitrary value which will be attached to the result.
Scala : @@ snip snip { #multiMessage }
Java : @@ snip snip { #multiMessage }
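A sketch of sending several records in one envelope via `sendEnvelope`, again assuming a `producer: SendProducer[String, String]`; the topic names and the `"contextual-id"` pass-through value are assumptions.

```scala
import org.apache.pekko.kafka.ProducerMessage
import org.apache.kafka.clients.producer.ProducerRecord

import scala.collection.immutable
import scala.concurrent.Future

// A MultiMessage envelope produces several records; the pass-through value
// is carried along and attached to the result.
val envelope: ProducerMessage.Envelope[String, String, String] =
  ProducerMessage.multi(
    immutable.Seq(
      new ProducerRecord[String, String]("topic1", "k1", "v1"),
      new ProducerRecord[String, String]("topic2", "k2", "v2")
    ),
    "contextual-id"
  )

val result: Future[ProducerMessage.Results[String, String, String]] =
  producer.sendEnvelope(envelope)
```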
After successful sending, a @apidoc[ProducerMessage.Message] will return a @apidoc[org.apache.pekko.kafka.ProducerMessage.Result] element containing:

* the original input message
* the record metadata (Kafka `RecordMetadata`)
* access to the `passThrough` within the message

A @apidoc[ProducerMessage.MultiMessage] will return a @apidoc[org.apache.pekko.kafka.ProducerMessage.MultiResult] containing:

* a list of `MultiResultPart`s with
    * the original input message
    * the record metadata (Kafka `RecordMetadata`)
* the `passThrough` data.
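Assuming a `result: Future[ProducerMessage.Results[String, String, String]]` obtained from `sendEnvelope` (a hypothetical value for illustration), the result variants can be distinguished by pattern matching:

```scala
import org.apache.pekko.kafka.ProducerMessage
import scala.concurrent.ExecutionContext.Implicits.global

result.foreach {
  case ProducerMessage.Result(metadata, message) =>
    // Single message: record metadata plus the original message with its passThrough.
    println(s"${metadata.topic}@${metadata.offset}: ${message.passThrough}")
  case ProducerMessage.MultiResult(parts, passThrough) =>
    // Multiple messages: one MultiResultPart per record, plus the passThrough.
    parts.foreach(part => println(s"${part.metadata.topic}@${part.metadata.offset}"))
    println(s"passThrough: $passThrough")
  case ProducerMessage.PassThroughResult(passThrough) =>
    // A PassThroughMessage produced no record; only the passThrough travels on.
    println(s"passed through: $passThrough")
}
```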