Apache StreamPipes (incubating) enables flexible modeling of stream processing pipelines by providing a graphical modeling editor on top of existing stream processing frameworks.
It empowers non-technical users to quickly define and execute processing pipelines based on an easily extensible toolbox of data sources, data processors and data sinks. StreamPipes has an exchangeable runtime execution layer and executes pipelines using one of the provided wrappers, e.g., standalone or distributed in Apache Flink.
Pipeline elements in StreamPipes can be installed at runtime - the built-in SDK makes it easy to implement new pipeline elements according to your needs. Pipeline elements are standalone microservices that can run anywhere: centrally on your server, in a large-scale cluster, or close to the edge.
StreamPipes allows you to connect IoT data sources using the SDK or the built-in graphical tool StreamPipes Connect.
The extensible toolbox of data processors and sinks supports use cases such as
The quickest way to run StreamPipes including the latest extensions (adapters, pipeline elements) is by using our Docker-based installation & operation options, namely:
NOTE: StreamPipes CLI & k8s are highly recommended for developers or operators. Standard users should stick to StreamPipes Compose.
Please follow the instructions provided in the corresponding README.md to get started.
For a more in-depth manual, read the installation guide.
To properly build the StreamPipes core, the following tools should be installed:
To build the core project, do the following:
mvn clean package
To build the UI, switch to the ui folder and perform the following steps:
npm install    # for NPM > v7, run npm install --legacy-peer-deps
npm run build
To start StreamPipes, run docker-compose up --build -d from the root directory.
You can also use the installer or CLI as described in the installation guide.
StreamPipes includes a repository of extensions for
A description of the standard elements can be found in streampipes-extensions.
You can easily add your own data streams, processors or sinks. A Java-based SDK and several run-time wrappers for popular streaming frameworks such as Apache Flink and Apache Kafka Streams (and also plain Java programs) can be used to integrate your existing processing logic into StreamPipes. Pipeline elements are packaged as Docker images and can be installed at runtime, whenever your requirements change.
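The processing logic you plug in is ordinary Java. As a rough, self-contained sketch of the kind of per-event logic you would wrap as a StreamPipes data processor (the class name, method name, and event fields below are hypothetical illustrations, not part of the StreamPipes SDK):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical per-event processing logic: enriches each incoming event
// with a Fahrenheit reading derived from a Celsius "temperature" field.
// In StreamPipes, logic of this shape would be packaged as a data
// processor via the Java SDK and one of the runtime wrappers.
public class CelsiusToFahrenheitProcessor {

    public Map<String, Object> onEvent(Map<String, Object> event) {
        // Copy the incoming event so the original fields are preserved
        Map<String, Object> out = new HashMap<>(event);
        double celsius = ((Number) event.get("temperature")).doubleValue();
        out.put("temperatureF", celsius * 9.0 / 5.0 + 32.0);
        return out;
    }
}
```

Because such logic is plain Java, it can be unit-tested on its own before being wrapped and deployed as a pipeline element.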
Check our developer guide at https://streampipes.apache.org/docs/docs/dev-guide-introduction.
If you've found a bug or have a feature that you'd love to see in StreamPipes, feel free to create an issue in our Jira: https://issues.apache.org/jira/projects/STREAMPIPES
If you have any problems during the installation or questions around StreamPipes, you'll get help through one of our community channels:
And don't forget to follow us on Twitter!
We welcome contributions to StreamPipes. If you are interested in contributing to StreamPipes, let us know! You'll get to know an open-minded and motivated team working together to build the next IIoT analytics toolbox.
Here are some first steps in case you want to contribute:
We'd love to hear your feedback! Subscribe to firstname.lastname@example.org
Apache License 2.0