# flink-storm-examples
This module contains multiple versions of a simple Word-Count example to illustrate the usage of the compatibility layer:
* the usage of spouts and bolts within a regular Flink streaming program (i.e., embedded mode); a code sketch follows this list
  1. `SpoutSourceWordCount` uses a spout as data source within a Flink streaming program
  2. `BoltTokenizerWordCount` uses a bolt to split sentences into words within a Flink streaming program
     * `BoltTokenizerWordCountWithNames` uses `Tuple` input type and accesses attributes by field names (rather than by index)
     * `BoltTokenizerWordCountPOJO` uses POJO input type and accesses attributes by field names (rather than by index)
* how to submit a whole Storm topology to Flink; a code sketch appears at the end of this page
  3. `WordCountTopology` plugs a Storm topology together
     * `StormWordCountLocal` submits the topology to a local Flink cluster (similar to a `LocalCluster` in Storm)
       (`WordCountLocalByName` accesses attributes by field names rather than by index)
     * `WordCountRemoteByClient` submits the topology to a remote Flink cluster (similar to the usage of `NimbusClient` in Storm)
     * `WordCountRemoteBySubmitter` submits the topology to a remote Flink cluster (similar to the usage of `StormSubmitter` in Storm)
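
In embedded mode, an unmodified spout is wrapped in a `SpoutWrapper` and used as a regular Flink source, while an unmodified bolt is wrapped in a `BoltWrapper` and used as a regular Flink operator. The following is a minimal sketch of this pattern; `SentenceSpout` and `TokenizerBolt` are hypothetical placeholders for any Storm `IRichSpout`/`IRichBolt` (such as those in the `operators` package mentioned at the end of this page), and the exact wrapper constructor signatures as well as the Storm package names (`backtype.storm` vs. `org.apache.storm`) depend on the compatibility-layer and Storm versions.

```java
import backtype.storm.utils.Utils;

import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.api.java.typeutils.TypeExtractor;
import org.apache.flink.storm.wrappers.BoltWrapper;
import org.apache.flink.storm.wrappers.SpoutWrapper;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class EmbeddedWordCountSketch {

    public static void main(final String[] args) throws Exception {
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Use an unmodified Storm spout as a Flink source by wrapping it in a SpoutWrapper.
        // SentenceSpout is a placeholder for any IRichSpout that emits a single String field;
        // declaring the default output stream as "raw" lets the wrapper emit plain Strings.
        final DataStream<String> sentences = env.addSource(
                new SpoutWrapper<String>(new SentenceSpout(), new String[] { Utils.DEFAULT_STREAM_ID }),
                TypeExtractor.getForClass(String.class));

        // Use an unmodified Storm bolt inside the Flink program by wrapping it in a BoltWrapper.
        // TokenizerBolt is a placeholder for any IRichBolt that splits sentences into (word, 1) pairs.
        final DataStream<Tuple2<String, Integer>> tokenized = sentences.transform(
                "tokenizer",
                TypeExtractor.getForObject(new Tuple2<String, Integer>("", 0)),
                new BoltWrapper<String, Tuple2<String, Integer>>(new TokenizerBolt()));

        // From here on, regular Flink operators can be applied.
        tokenized.keyBy(0).sum(1).print();

        env.execute("Embedded Storm word count (sketch)");
    }
}
```

Once wrapped, the resulting streams combine freely with native Flink operators, as the final `keyBy`/`sum` step shows.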
Additionally, this module packages the three example Word-Count programs as jar files to be submitted to a Flink cluster via `bin/flink run example.jar`.
(The valid jars are `WordCount-SpoutSource.jar`, `WordCount-BoltTokenizer.jar`, and `WordCount-StormTopology.jar`.)
The package `org.apache.flink.storm.wordcount.operators` contains original spouts and bolts that can be used unmodified within Storm or Flink.
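
For the whole-topology mode, a Storm `TopologyBuilder` is assembled as usual (possibly reusing those unmodified spouts and bolts) and handed to the Flink counterparts of Storm's submission classes. The sketch below shows local submission; `SentenceSpout` and `TokenizerBolt` are again hypothetical placeholders, and the exact way a `TopologyBuilder` is turned into a `FlinkTopology` (as well as the `backtype.storm` vs. `org.apache.storm` package names) varies between versions of the compatibility layer and Storm.

```java
import backtype.storm.Config;
import backtype.storm.topology.TopologyBuilder;

import org.apache.flink.storm.api.FlinkLocalCluster;
import org.apache.flink.storm.api.FlinkTopology;

public class LocalTopologySubmissionSketch {

    public static void main(final String[] args) throws Exception {
        // Assemble the topology the Storm way; SentenceSpout and TokenizerBolt are
        // placeholders for unmodified Storm spouts/bolts such as those in the operators package.
        final TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("source", new SentenceSpout());
        builder.setBolt("tokenizer", new TokenizerBolt()).shuffleGrouping("source");

        // Submit the Storm topology to an in-process Flink cluster
        // (the Flink counterpart of Storm's LocalCluster).
        final FlinkLocalCluster cluster = FlinkLocalCluster.getLocalCluster();
        cluster.submitTopology("wordCount", new Config(), FlinkTopology.createTopology(builder));

        Thread.sleep(10_000); // let the topology run for a while
        cluster.shutdown();
    }
}
```

For remote submission, the `WordCountRemoteBySubmitter` and `WordCountRemoteByClient` examples use the analogous `FlinkSubmitter` and `FlinkClient` classes in place of Storm's `StormSubmitter` and `NimbusClient`.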