Fix broken links
diff --git a/docs/06_extend-sdk-stream-requirements.md b/docs/06_extend-sdk-stream-requirements.md
index 63ac0c7..409c516 100644
--- a/docs/06_extend-sdk-stream-requirements.md
+++ b/docs/06_extend-sdk-stream-requirements.md
@@ -10,7 +10,7 @@
 Once users create pipelines in the StreamPipes Pipeline Editor, these requirements are verified against the connected event stream.
 By using this feature, StreamPipes ensures that only pipeline elements that are syntactically and semantically valid can be connected.
 
-This guide covers the creation of stream requirements. Before reading this section, we recommend that you make yourself familiar with the SDK guide on [data processors](dev-guide-processor-sdk.md) and [data sinks](dev-guide-sink-sdk.md).
+This guide covers the creation of stream requirements. Before reading this section, we recommend that you make yourself familiar with the SDK guide on [data processors](extend-first-processor).
 
 
 :::tip Code on Github
@@ -90,7 +90,7 @@
   }
 ```
 
-See also the developer guide on [static properties](dev-guide-static-properties.md) to better understand the usage of ``MappingProperties``.
+See also the developer guide on [static properties](extend-sdk-static-properties) to better understand the usage of ``MappingProperties``.
 
 Requirements on primitive fields can be specified for all common datatypes:
 
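As a hedged illustration of the requirements API discussed in the hunk above: a minimal sketch of a `declareModel()` method combining a mapping requirement with the SDK builders, assuming the current `org.apache.streampipes` package layout; the element id, labels and property scope are placeholders, not names taken from this commit.

```java
import org.apache.streampipes.model.graph.DataProcessorDescription;
import org.apache.streampipes.model.schema.PropertyScope;
import org.apache.streampipes.sdk.builder.ProcessingElementBuilder;
import org.apache.streampipes.sdk.builder.StreamRequirementsBuilder;
import org.apache.streampipes.sdk.helpers.EpRequirements;
import org.apache.streampipes.sdk.helpers.Labels;
import org.apache.streampipes.sdk.helpers.OutputStrategies;

// inside a class extending StreamPipesDataProcessor
@Override
public DataProcessorDescription declareModel() {
  return ProcessingElementBuilder
      .create("org.example.requirements-demo", "Requirements Demo", "Demands a numeric field")
      .requiredStream(StreamRequirementsBuilder.create()
          // the connected stream must provide at least one numeric field;
          // the user maps it to this requirement in the pipeline editor
          .requiredPropertyWithUnaryMapping(
              EpRequirements.numberReq(),
              Labels.from("number-mapping", "Number field", "The field to check"),
              PropertyScope.MEASUREMENT_PROPERTY)
          .build())
      .outputStrategy(OutputStrategies.keep())
      .build();
}
```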
diff --git a/docs/user-guide-first-steps.md b/docs/user-guide-first-steps.md
index b031145..b7ca89f 100644
--- a/docs/user-guide-first-steps.md
+++ b/docs/user-guide-first-steps.md
@@ -205,5 +205,5 @@
 
 We hope we gave you an easy quick start into StreamPipes.
 If you have any questions or suggestions, just send us an email.
-From here on you can explore all features in the [User Guide](user-guide-introduction.md) or go to the [Developer Guide](dev-guide-introduction.md) to learn how to write your own StreamPipes processing elements.
+From here on you can explore all features in the [User Guide](user-guide-introduction) or go to the [Developer Guide](extend-setup) to learn how to write your own StreamPipes processing elements.
 
diff --git a/docs/user-guide-introduction.md b/docs/user-guide-introduction.md
deleted file mode 100644
index 0f84725..0000000
--- a/docs/user-guide-introduction.md
+++ /dev/null
@@ -1,61 +0,0 @@
----
-id: user-guide-introduction-old
-title: Introduction
-sidebar_label: Introduction
----
-
-StreamPipes is a framework that enables users to work with data streams.
-It uses many different technologies, especially from the fields of big data, distributed computing and the semantic web.
-One of the core concepts of StreamPipes is to add a higher semantic layer on top of big data processing technologies to ease their usage.
-StreamPipes is not just a UI: it is a framework with many different capabilities, like modelling new data processing pipelines and executing them in a distributed environment.
-On top of that, it uses semantics to guide non-technical people in analyzing their data streams in a self-service manner.
-
-
-
-## Pipelines
-The core concept of StreamPipes is the data processing pipeline.
-Those pipelines use data from different sources (Data Streams), transform it via Processing Elements, and store it in a database or send it to third-party systems (Data Sinks).
-A brief introduction is given in the following sections.
-On the next page, a detailed tour through StreamPipes explains all the different features that are available.
-
-
-## Data Streams
-Data Streams represent the primary source for data in StreamPipes.
-A stream is an ordered sequence of events, where an event is described as one or more observation values.
-Those events can come from different sources such as sensors, machines or log files.
-It does not matter what kind of serialization format the events have or which kind of transportation protocol the individual data streams use.
-As long as a semantic description is provided, StreamPipes is capable of processing the data.
-
-
-## Processing Elements
-Processing Elements are defined as processors that transform one or more input event streams into an output event stream.
-Those transformations can be rather simple, like filtering out events based on a predefined rule, or more complex, like applying advanced algorithms to the data.
-Processing elements define stream requirements that are a set of minimum properties an incoming event stream must provide. 
-Furthermore, Processing Elements describe their output based on a set of output strategies.
-They also describe further (human) input in form of configuration parameters.
-The Processing Elements can be implemented in multiple technologies.
-This information is not needed when constructing a pipeline: the user does not need to know where and how the actual algorithm is deployed and executed.
-During the modelling phase it is possible to set configuration parameters, which are then injected into the program when it is started.
-A description is provided for all parameters, and the system ensures that the user can only enter semantically correct values.
-
-
-## Data Sinks
-Data Sinks consume event streams similar to Processing Elements, with the difference that they do not provide an output stream; instead, they perform some action or trigger a visualization as a result of a stream transformation.
-The sinks also define stream requirements that must be fulfilled.
-In a pipeline it is not necessary to use a processing element to transform data.
-Often it can make sense to just use a data sink and connect it directly to the sensor to store the raw data in a data store for offline analysis.
-This is very simple with StreamPipes, and no additional code needs to be written to create such a data lake.
-
-
-## Target Audience
-StreamPipes focuses on multiple target groups.
-This guide is for users who interact with the graphical user interface in the browser.
-If you are interested in the technical details or plan to extend the system with new algorithms, please read the Developer Guide.
-The graphical user interface is designed for domain experts who want to analyze data, but are not interested in technical details and do not want to write code.
-The SDK can be used by software developers to extend the framework with new functionality.
-After importing newly developed pipeline elements, they are available to all users of StreamPipes.
-
-
-## Next Steps
-To test StreamPipes on your local environment go to the [installation guide](user-guide-installation.md).
-If you are further interested in the concepts of StreamPipes continue with the [tour](user-guide-tour.md).
diff --git a/docs/user-guide-software-components.md b/docs/user-guide-software-components.md
deleted file mode 100644
index fc92dfc..0000000
--- a/docs/user-guide-software-components.md
+++ /dev/null
@@ -1,334 +0,0 @@
----
-id: user-guide-software-components
-title: Software Components
-sidebar_label: Software Components
----
-
-This page contains all the software components that can be used within the StreamPipes framework.
-Some of them are mandatory, while others are only necessary for special capabilities.
-In the [Installation Guide](user-guide-installation.md#installation_1) we already provide a docker-compose.yml file with all the necessary components
-for a minimal setup.
-Extend this configuration file with further containers described on this page and configure StreamPipes
-according to your needs.
-
-
-## StreamPipes Framework
-
-<details class="tip">
-<summary>StreamPipes Backend</summary>
-
-#### Description
-The StreamPipes Backend is the main component of the StreamPipes Framework. It contains the application logic to create and execute pipelines.
-Furthermore, it provides a REST-API that is used by other components for communication.
-
-#### Docker Compose
-```yaml
-backend:
-  image: streampipes/backend
-  depends_on:
-    - "consul"
-  ports:
-    - "8030:8030"
-  volumes:
-    - ./config:/root/.streampipes
-    - ./config/aduna:/root/.aduna
-  networks:
-    spnet:
-```
-</details>
-
-
-<details class="tip">
-<summary>StreamPipes UI</summary>
-
-#### Description
-This service uses nginx and contains the UI of StreamPipes.
-The UI can, for example, be used to import new pipeline elements, create new pipelines and manage the pipeline
-execution. The UI communicates with the backend via the REST interface.
-
-#### Docker Compose
-```yaml
-nginx:
-  image: streampipes/ui
-  ports:
-    - "80:80"
-  depends_on:
-    - backend
-  networks:
-    spnet:
-```
-</details>
-
-## StreamPipes Services
-
-<details class="tip">
-<summary>Consul</summary>
-#### Description
-Consul is used to store configuration parameters of the backend service and processing elements.
-It is further used for service discovery. Once a processing element container is started in the network, it is
-automatically discovered via the service discovery feature of Consul.
-
-#### Docker Compose
-```yaml
-consul:
-    image: consul
-    environment:
-      - "CONSUL_LOCAL_CONFIG={\"disable_update_check\": true}"
-      - "CONSUL_BIND_INTERFACE=eth0"
-      - "CONSUL_HTTP_ADDR=0.0.0.0"
-    entrypoint:
-      - consul
-      - agent
-      - -server
-      - -bootstrap-expect=1
-      - -data-dir=/consul/data
-      - -node=consul-one
-      - -bind={{ GetInterfaceIP "eth0" }}
-      - -client=0.0.0.0
-      - -enable-script-checks=true
-      - -ui
-    volumes:
-      - ./config/consul:/consul/data
-    ports:
-      - "8500:8500"
-      - "8600:8600"
-    networks:
-      spnet:
-        ipv4_address: 172.30.0.9
-```
-</details>
-
-<details class="tip">
-<summary>Zookeeper</summary>
-#### Description
-Apache Kafka and Apache Flink require zookeeper to manage their clusters.
-
-#### Docker Compose
-```yaml
-zookeeper:
-    image: wurstmeister/zookeeper
-    ports:
-      - "2181:2181"
-    networks:
-      spnet:
-```
-</details>
-
-<details class="tip">
-<summary>Kafka</summary>
-
-#### Description
-Kafka is used as the primary message broker. It is possible to use other brokers or even multiple message brokers in a single pipeline, but Kafka is the
-default. The communication between the processing elements in a pipeline is mostly done via Kafka.
-
-#### Docker Compose
-```yaml
-  kafka:
-    image: wurstmeister/kafka:0.10.0.1
-    ports:
-      - "9092:9092"
-    environment:
-      KAFKA_ADVERTISED_HOST_NAME: ###TODO ADD HOSTNAME HERE ###
-      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
-    volumes:
-      - /var/run/docker.sock:/var/run/docker.sock
-    networks:
-      spnet:
-```
-</details>
-
-<details class="tip">
-<summary>ActiveMQ</summary>
-#### Description
-ActiveMQ is another message broker which can be used in addition to Kafka. Currently, the main purpose is to provide
-an endpoint for the websocket connections required by the real-time dashboard of the StreamPipes UI.
-
-#### Docker Compose
-```yaml
-activemq:
-  image: streampipes/activemq
-  ports:
-    - "61616:61616"
-    - "61614:61614"
-    - "8161:8161"
-  networks:
-    spnet:
-```
-</details>
-
-<details class="tip">
-<summary>CouchDB</summary>
-
-#### Description
-CouchDB is the main database for StreamPipes data that needs to be persisted such as pipelines, users and visualizations created in the dashboard.
-
-#### Docker Compose
-```yaml
-couchdb:
-  image: couchdb
-  ports:
-    - "5984:5984"
-  volumes:
-    - ./config/couchdb/data:/usr/local/var/lib/couchdb
-  networks:
-    spnet:
-```
-</details>
-
-<details class="tip">
-<summary>Flink</summary>
-#### Description
-This service sets up a sample Flink cluster with one jobmanager and one taskmanager. Although this cluster can be used for testing, it is not recommended for production use.
-
-#### Docker Compose
-```yaml
-jobmanager:
-  image: streampipes/flink
-  ports:
-    - "8081:8099"
-  command: jobmanager
-  networks:
-    spnet:
-
-
-taskmanager:
-  image: ipe-wim-gitlab.fzi.de:5000/streampipes/services/flink
-  command: taskmanager
-  environment:
-    - FLINK_NUM_SLOTS=20
-  networks:
-    spnet:
-```
-</details>
-
-
-## Processing Elements
-
-<details class="tip">
-<summary>PE Examples Sources</summary>
-#### Description
-This Processing Element Container contains several sample data sources that can be used to work with StreamPipes.
-It consists of source descriptions and data simulators that constantly produce data.
-
-#### Docker Compose
-```yaml
-    pe-examples-sources:
-      image: streampipes/pe-examples-sources
-      depends_on:
-        - "consul"
-      ports:
-        - "8098:8090"
-      networks:
-        spnet:
-```
-</details>
-
-<details class="tip">
-<summary>PE Examples JVM</summary>
-
-#### Description
-This Processing Element Container contains some sink example implementations, such as the real-time
-dashboard, which can be used to visualize data within StreamPipes.
-
-#### Docker Compose
-```yaml
-      pe-examples-jvm:
-        image: streampipes/pe-examples-jvm
-        depends_on:
-          - "consul"
-        environment:
-          - STREAMPIPES_HOST=###TODO ADD HOSTNAME HERE ###
-        ports:
-          - "8096:8090"
-        networks:
-          spnet:
-```
-</details>
-
-<details class="tip">
-<summary>PE Examples Flink</summary>
-
-#### Description
-The Flink Samples Processing Element Container contains some example algorithms that can be used within processing
-pipelines in the pipeline editor. Those algorithms are deployed to a Flink cluster once the pipeline is started.
-
-#### Docker Compose
-```yaml
-  pe-flink-samples:
-    image: streampipes/pe-examples-flink
-    depends_on:
-      - "consul"
-    ports:
-      - "8094:8090"
-    volumes:
-      - ./config:/root/.streampipes
-    networks:
-      spnet:
-```
-</details>
-
-### Third Party Services
-
-<details class="tip">
-<summary>Elasticsearch</summary>
-
-#### Description
-This service can be used to run Elasticsearch. Data can be written into Elasticsearch with the Elasticsearch
-sink of the PE Flink samples container.
-
-#### Docker Compose
-```yaml
-elasticsearch:
-  image: ipe-wim-gitlab.fzi.de:5000/streampipes/services/elasticsearch
-  ports:
-    - "9200:9200"
-    - "9300:9300"
-  volumes:
-    - ./config/elasticsearch/data:/usr/share/elasticsearch/data
-  networks:
-    spnet:
-```
-</details>
-
-<details class="tip">
-<summary>Kibana</summary>
-#### Description
-Kibana is used to visualize data that is written into Elasticsearch. It can be used in addition to our live dashboard
-to analyze and visualize historical data.
-
-#### Docker Compose
-```yaml
-kibana:
-  image: kibana:5.2.2
-  ports:
-    - "5601:5601"
-  volumes:
-    - ./config/kibana/kibana.yml:/opt/kibana/config/kibana.yml
-  environment:
-    - ELASTICSEARCH_URL=http://elasticsearch:9200
-  networks:
-    spnet:
-```
-</details>
-
-
-<details class="tip">
-<summary>Kafka Web Console</summary>
-
-#### Description
-The Kafka web console can be used to monitor the Kafka cluster. This is a good tool for debugging your newly
-developed pipeline elements.
-
-#### Docker Compose
-```yaml
-kafka-web-console:
-  image: hwestphal/kafka-web-console
-  ports:
-    - "9000:9000"
-  volumes:
-    - ./config:/data
-  networks:
-    spnet:
-```
-</details>
diff --git a/website-v2/blog/2022-03-21_release-0690.md b/website-v2/blog/2022-03-21_release-0690.md
index 99d13cd..41662fa 100644
--- a/website-v2/blog/2022-03-21_release-0690.md
+++ b/website-v2/blog/2022-03-21_release-0690.md
@@ -81,5 +81,5 @@
 
 
 ## Migration
-While we are not yet ready for automatic migration, a [migration guide](dev-guide-migration.md) explains several new concepts introduced in this StreamPipes version.
+While we are not yet ready for automatic migration, a migration guide explains several new concepts introduced in this StreamPipes version.
 We aim at providing a backwards compatible version with release 1.0, planned for later this year.
diff --git a/website-v2/src/navbar/navbar.js b/website-v2/src/navbar/navbar.js
index cd18680..79c3bba 100644
--- a/website-v2/src/navbar/navbar.js
+++ b/website-v2/src/navbar/navbar.js
@@ -28,7 +28,6 @@
     "position": "right"
   },
   {
-    "to": "/resources",
     "label": "Resources",
     "position": "right",
     items: [
@@ -56,7 +55,6 @@
     "position": "right"
   },
   {
-    "to": "/community",
     "label": "Community",
     "position": "right",
     items: [
@@ -79,7 +77,6 @@
     ]
   },
   {
-    "to": "/apache",
     "label": "Apache",
     "position": "right",
     items: [
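All three hunks above apply the same pattern: a navbar dropdown parent should not carry its own `to` route once the page behind it is gone. A hedged sketch of the resulting item shape in the Docusaurus navbar config (the child entries are illustrative, not taken from this commit):

```js
// a dropdown parent needs only a label, a position and children;
// omitting "to" keeps the parent itself from linking anywhere
{
  label: "Resources",
  position: "right",
  items: [
    { label: "Blog", to: "/blog" },
    { label: "Documentation", to: "/docs/user-guide-introduction" },
  ],
},
```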
diff --git a/website-v2/src/pages/versions.js b/website-v2/src/pages/versions.js
deleted file mode 100644
index 5f4f2b1..0000000
--- a/website-v2/src/pages/versions.js
+++ /dev/null
@@ -1,129 +0,0 @@
-
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements.  See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License.  You may obtain a copy of the License at
- *
- *    http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *
- */
-
-const React = require('react');
-const CompLibrary = {
-  Container: props => <div {...props}></div>,
-  GridBlock: props => <div {...props}></div>,
-  MarkdownBlock: props => <div {...props}></div>
-};
-const Container = CompLibrary.Container;
-const CWD = process.cwd();
-
-const siteConfig = require(`${CWD}/siteConfig.js`);
-const versions = require(`${CWD}/versions.json`);
-
-import Layout from "@theme/Layout";
-
-function Versions(props) {
-  const latestVersion = versions[0];
-  const repoUrl = `https://github.com/apache/streampipes`;
-  return (
-    <div className="docMainWrapper wrapper">
-      <Container className="mainContainer versionsContainer">
-        <div className="post">
-          <header className="postHeader">
-            <h1>{siteConfig.title} Versions</h1>
-          </header>
-          <p>New versions of this project are released every so often.</p>
-          <h3 id="latest">Current version (Stable)</h3>
-          <table className="versions">
-            <tbody>
-              <tr>
-                <th>{latestVersion}</th>
-                <td>
-                  <a href={`${siteConfig.baseUrl}docs/user-guide-introduction`}>&#x1F4DA; Documentation</a>
-                </td>
-                <td>
-                  <a href={`${repoUrl}/releases/tag/release/${latestVersion}`}>&#x1F5DE; Release Notes</a>
-                </td>
-              </tr>
-            </tbody>
-          </table>
-          <p>
-            This is the version that is configured automatically when you first
-            install this project.
-          </p>
-          <h3 id="rc">Pre-release versions</h3>
-          <p>The current development status can be found in our dev-branch on GitHub.<br/>
-            Please note that the software is still under development, which means that features may still change or disappear or behave error-prone until the next release.<br/>
-            If you have ideas for new features, you can discuss them on the mailing list or create an entry in the issue tracker.</p>
-          <table className="versions">
-            <tbody>
-              <tr>
-                <td>
-                  <a href="https://github.com/apache/streampipes/tree/dev">&#128421; Development Branch</a>
-                </td>
-                <td>
-                  <a href={`${siteConfig.baseUrl}docs/next/user-guide-introduction`}>&#x1F6A7; Documentation</a>
-                </td>
-                <td>
-                  <a href="https://streampipes.apache.org/mailinglists.html">&#x1F4EF; Mailing list</a>
-                </td>
-                <td>
-                  <a href="https://github.com/apache/streampipes/issues">&#x1F4A1; Issue Tracker</a>
-                </td>
-              </tr>
-            </tbody>
-          </table>
-          <p></p>
-          <h3 id="archive">Past Versions</h3>
-          <table className="versions">
-            <tbody>
-              {versions.map(
-                version =>
-                  version !== latestVersion && !version.includes("pre-asf") && (
-                    <tr>
-                      <th>{version}</th>
-                      <td>
-                        <a href={`${siteConfig.baseUrl}docs/${version}/user-guide-introduction`}>&#x1F4DA; Documentation</a>
-                      </td>
-                      <td>
-                        <a href={`${repoUrl}/releases/tag/release/${version}`}>&#x1F5DE; Release Notes</a>
-                      </td>
-                    </tr>
-                  ),
-              )}
-              {versions.map(
-                  version =>
-                      version !== latestVersion && version.includes("pre-asf") && (
-                          <tr>
-                            <th>{version}</th>
-                            <td>
-                              <a href={`${siteConfig.baseUrl}docs/${version}/user-guide-introduction`}>&#x1F4DA; Documentation</a>
-                            </td>
-                            <td>
-                              <a href={`${repoUrl}/releases/tag/${version.replace("-pre-asf", "")}`}>&#x1F5DE; Release Notes</a>
-                            </td>
-                          </tr>
-                      ),
-              )}
-            </tbody>
-          </table>
-          <p>
-            You can find past versions of this project on{' '}
-            <a href={`${repoUrl}/releases`}>GitHub</a>
-          </p>
-        </div>
-      </Container>
-    </div>
-  );
-}
-
-export default props => <Layout><Versions {...props} /></Layout>;
diff --git a/website-v2/versioned_docs/version-0.70.0/01_try-installation.md b/website-v2/versioned_docs/version-0.70.0/01_try-installation.md
index 858e0ca..e179eb2d 100644
--- a/website-v2/versioned_docs/version-0.70.0/01_try-installation.md
+++ b/website-v2/versioned_docs/version-0.70.0/01_try-installation.md
@@ -41,9 +41,7 @@
 
 ## Install StreamPipes
 
-<ul style="padding-left:0">
-  <DownloadSection version={'0.70.0'}></DownloadSection>
-</ul>
+<DownloadSection version={'0.70.0'}></DownloadSection>
 
 ## Setup StreamPipes
 
@@ -70,4 +68,4 @@
 
 ## Next Steps
 
-That's it! To ease your first steps with StreamPipes, we've created an [interactive tutorial](01_try-tutorial.md).
+That's it! To ease your first steps with StreamPipes, we've created an [interactive tutorial](try-tutorial).
diff --git a/website-v2/versioned_docs/version-0.70.0/01_try-overview.md b/website-v2/versioned_docs/version-0.70.0/01_try-overview.md
index 8be91ea..708ed05 100644
--- a/website-v2/versioned_docs/version-0.70.0/01_try-overview.md
+++ b/website-v2/versioned_docs/version-0.70.0/01_try-overview.md
@@ -7,7 +7,7 @@
 
 This is the documentation of Apache StreamPipes.
 
-<img class="docs-image docs-image-small docs-image-no-shadow" style="padding: 10px;" src="/img/01_try-overview/01_streampipes-overview.png" alt="StreamPipes Overview"/>
+<img class="docs-image docs-image-small docs-image-no-shadow" style={{padding: '10px'}} src="/img/01_try-overview/01_streampipes-overview.png" alt="StreamPipes Overview"/>
 
 
 <div class="container grid col-3">
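The `style` change is needed because these pages are compiled as MDX and rendered by React, which expects the JSX `style` prop to be an object rather than a CSS string. A hedged before/after sketch (the image path is illustrative):

```jsx
{/* fails at render time: style passed as an HTML string */}
<img style="padding: 10px;" src="/img/example.png" alt="Example"/>

{/* works: style passed as a JSX object expression */}
<img style={{padding: '10px'}} src="/img/example.png" alt="Example"/>
```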
diff --git a/website-v2/versioned_docs/version-0.70.0/06_extend-sdk-event-model.md b/website-v2/versioned_docs/version-0.70.0/06_extend-sdk-event-model.md
index 1152010..f4bb8ed 100644
--- a/website-v2/versioned_docs/version-0.70.0/06_extend-sdk-event-model.md
+++ b/website-v2/versioned_docs/version-0.70.0/06_extend-sdk-event-model.md
@@ -11,11 +11,11 @@
 
 ## Prerequisites
 
-This guide assumes that you are already familiar with the basic setup of [data processors](dev-guide-processor-sdk.md) and [data sinks](dev-guide-sink-sdk.md).
+This guide assumes that you are already familiar with the basic setup of [data processors](extend-first-processor).
 
 ### Property Selectors
 
-In most cases, fields that are subject to be transformed by pipeline elements are provided by the assigned ``MappingProperty`` (see the guide on [static properties](dev-guide-static-properties.md)).
+In most cases, fields that are subject to be transformed by pipeline elements are provided by the assigned ``MappingProperty`` (see the guide on [static properties](extend-sdk-static-properties)).
 
 Mapping properties return a ``PropertySelector`` that identifies a field based on (i) the **streamIndex** and (ii) the runtime name of the field.
 Let's assume we have an event with the following structure:
@@ -42,7 +42,7 @@
 
 ``s0`` identifies the stream (in this case, only one input stream exists, but as data processors might require more than one input stream, a stream identifier is required), while the suffix identifies the runtime name.
 
-Note: If you add a new field to an input event, you don't need to provide the selector, you can just assign the runtime name as defined by the [output strategy](dev-guide-output-strategies.md).
+Note: If you add a new field to an input event, you don't need to provide the selector, you can just assign the runtime name as defined by the [output strategy](extend-sdk-output-strategies).
 
 ### Reading Fields
 
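As a hedged sketch of the selector mechanics described in this hunk: reading a mapped field and appending a new one inside a processor's `onEvent` method. `temperatureSelector` is a placeholder that would be obtained from a mapping property during invocation; the `org.apache.streampipes` packages are assumptions, not names from this commit.

```java
import org.apache.streampipes.model.runtime.Event;
import org.apache.streampipes.wrapper.routing.SpOutputCollector;

// temperatureSelector holds a selector such as "s0::temperature"
public void onEvent(Event event, SpOutputCollector collector) {
  float fahrenheit = event
      .getFieldBySelector(temperatureSelector)
      .getAsPrimitive()
      .getAsFloat();
  // new fields are appended by runtime name only -- no selector needed
  event.addField("temperatureCelsius", (fahrenheit - 32) / 1.8f);
  collector.collect(event);
}
```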
diff --git a/website-v2/versioned_docs/version-0.70.0/06_extend-sdk-stream-requirements.md b/website-v2/versioned_docs/version-0.70.0/06_extend-sdk-stream-requirements.md
index 6f1213a..98a9215 100644
--- a/website-v2/versioned_docs/version-0.70.0/06_extend-sdk-stream-requirements.md
+++ b/website-v2/versioned_docs/version-0.70.0/06_extend-sdk-stream-requirements.md
@@ -88,7 +88,7 @@
   }
 ```
 
-See also the developer guide on [static properties](dev-guide-static-properties.md) to better understand the usage of ``MappingProperties``.
+See also the developer guide on [static properties](extend-sdk-static-properties) to better understand the usage of ``MappingProperties``.
 
 Requirements on primitive fields can be specified for all common datatypes:
 
diff --git a/website-v2/versioned_docs/version-0.70.0/user-guide-installation.md b/website-v2/versioned_docs/version-0.70.0/user-guide-installation.md
deleted file mode 100644
index 2ac726a..0000000
--- a/website-v2/versioned_docs/version-0.70.0/user-guide-installation.md
+++ /dev/null
@@ -1,121 +0,0 @@
----
-id: user-guide-installation
-title: Installation
-sidebar_label: Installation
-original_id: user-guide-installation
----
-## Prerequisites
-
-### Software
-
-* Docker (latest version, see instructions below)
-* Docker Compose (latest version, see instructions below)
-
-### Supported operating systems
-We rely on Docker and support three operating systems for the StreamPipes system:
-
-* Linux
-* OSX
-* Windows 10
-    * Please note that older Windows versions are not compatible with Docker. Linux VMs under Windows might also not work, due to network problems with Docker.
-
-### Web Browser
-StreamPipes is a modern web application; therefore, you need a recent version of Chrome (recommended), Firefox or Edge.
-
-### Docker
-You need to have Docker installed on your system before you continue with the installation guide.
-
-
-<div class="admonition info">
-<div class="admonition-title">Install Docker</div>
-<p>Go to https://docs.docker.com/installation/ and follow the instructions to install Docker for your OS. Make sure Docker can be started as a non-root user (described in the installation manual, don’t forget to log out and in again) and check that Docker is installed correctly by executing docker run hello-world</p>
-</div>
-
-<div class="admonition info">
-<div class="admonition-title">Configure Docker</div>
-<p>By default, Docker uses only a limited number of CPU cores and memory.
-       If you run StreamPipes on Windows or on a Mac you need to adjust the default settings.
-       To do that, click on the Docker icon in your tab bar and open the preferences.
-       Go to the advanced preferences and set the **number of CPUs to 6** (recommended) and the **Memory to 4GB**.
-       After changing the settings, Docker needs to be restarted.</p></div>
-
-
-## Install StreamPipes
-
-<div class="tab-content" id="myTabContent">
-    <div class="tab-pane fade show active" id="linux" role="tabpanel" aria-labelledby="linux-tab">
-        <ul style="padding-left:0">
-            <li class="installation-step">
-                <div class="wrapper-container" style="align-items: center;justify-content: center;">
-                    <div class="wrapper-step">
-                        <span class="fa-stack fa-2x">
-                             <i class="fas fa-circle fa-stack-2x sp-color-green"></i>
-                             <strong class="fa-stack-1x" style="color:white;">1</strong>
-                        </span>
-                    </div>
-                    <div class="wrapper-instruction">
-                        <a href="https://www.apache.org/dyn/mirrors/mirrors.cgi?action=download&filename=streampipes/installer/0.70.0/apache-streampipes-installer-0.70.0-incubating-source-release.zip">Download</a>
-                        the latest Apache StreamPipes release and extract the zip file to a directory of your choice.
-                    </div>
-                </div>
-            </li>
-            <li class="installation-step">
-                <div class="wrapper-container" style="align-items: center;justify-content: center;">
-                    <div class="wrapper-step">
-                        <span class="fa-stack fa-2x">
-                             <i class="fas fa-circle fa-stack-2x sp-color-green"></i>
-                             <strong class="fa-stack-1x" style="color:white;">2</strong>
-                        </span>
-                    </div>
-                    <div class="wrapper-instruction">
-                       In a command prompt, open the folder <code>/compose</code> and run <code>docker-compose up -d</code>.
-                    </div>
-                </div>
-            </li>
-            <li class="installation-step">
-                <div class="wrapper-container" style="align-items: center;justify-content: center;">
-                    <div class="wrapper-step">
-                        <span class="fa-stack fa-2x">
-                             <i class="fas fa-circle fa-stack-2x sp-color-green"></i>
-                             <strong class="fa-stack-1x" style="color:white;">3</strong>
-                        </span>
-                    </div>
-                    <div class="wrapper-instruction">
-                        Open your browser, navigate to http://localhost:80 (or the domain name of your server) and finish the setup according to the instructions below.
-                    </div>
-                </div>
-            </li>
-        </ul>
-        </div>
-    </div>
-
-## Setup StreamPipes
-
-Once you've opened the browser at the URL given above, you should see the StreamPipes application as shown below.
-To set up the system, enter an email address and a password and click on install.
-At this point, it is not necessary to change anything in the advanced settings menu.
-The installation might take some time; once all components are successfully configured, continue by clicking on "Go to login page".
-
-
-On the login page, enter your credentials, then you should be forwarded to the home page.
-
-Congratulations! You've successfully managed to install StreamPipes. Now we're ready to build our first pipeline!
-
-<div class="my-carousel">
-    <img src="/img/quickstart/setup/01_register_user.png" alt="Set Up User"/>
-    <img src="/img/quickstart/setup/02_user_set_up.png" alt="SetUp StreamPipes Components"/>
-    <img src="/img/quickstart/setup/03_login.png" alt="Go to login page"/>
-    <img src="/img/quickstart/setup/04_home.png" alt="Home page"/>
-</div>
-
-<div class="admonition error">
-<div class="admonition-title">Errors during the installation process</div>
-<p>In most cases, errors during the installation are due to an under-powered system.<br/>
-If there is a problem with any of the components, please restart the whole system and delete the "config" directory on the server.
-   This directory is in the same folder as the docker-compose.yml file.<br/>
-   Please also make sure that your system meets the hardware requirements as mentioned in the first section of the installation guide.</p>
-</div>
-
-## Next Steps
-
-Now you can continue with the tutorial on page [First steps](user-guide-first-steps.md).
diff --git a/website-v2/versioned_docs/version-0.70.0/user-guide-introduction.md b/website-v2/versioned_docs/version-0.70.0/user-guide-introduction.md
deleted file mode 100644
index 1e3471a..0000000
--- a/website-v2/versioned_docs/version-0.70.0/user-guide-introduction.md
+++ /dev/null
@@ -1,62 +0,0 @@
----
-id: user-guide-introduction-old
-title: Introduction
-sidebar_label: Introduction
-original_id: user-guide-introduction-old
----
-
-StreamPipes is a framework that enables users to work with data streams.
-It uses many different technologies, especially from the fields of big data, distributed computing and the semantic web.
-One of the core concepts of StreamPipes is to add a higher semantic layer on top of big data processing technologies to ease their usage.
-StreamPipes is not just a UI: it is a framework with many different capabilities, like modelling new data processing pipelines and executing them in a distributed environment.
-On top of that, it uses semantics to guide non-technical people in analyzing their data streams in a self-service manner.
-
-
-
-## Pipelines
-The core concept of StreamPipes is the data processing pipeline.
-Those pipelines use data from different sources (Data Streams), transform it via Processing Elements, and store it in a database or send it to third-party systems (Data Sinks).
-A brief introduction is given in the following sections.
-On the next page, a detailed tour through StreamPipes explains all the different features that are available.
-
-
-## Data Streams
-Data Streams represent the primary source for data in StreamPipes.
-A stream is an ordered sequence of events, where an event is described as one or more observation values.
-Those events can come from different sources such as sensors, machines or log files.
-It does not matter what kind of serialization format the events have or which kind of transportation protocol the individual data streams use.
-As long as a semantic description is provided, StreamPipes is capable of processing the data.
-
-
-## Processing Elements
-Processing Elements are defined as processors that transform one or more input event streams into an output event stream.
-Those transformations can be rather simple, like filtering out events based on a predefined rule, or more complex, like applying advanced algorithms to the data.
-Processing elements define stream requirements that are a set of minimum properties an incoming event stream must provide. 
-Furthermore, Processing Elements describe their output based on a set of output strategies.
-They also describe further (human) input in form of configuration parameters.
-The Processing Elements can be implemented in multiple technologies.
-This information is not needed when constructing a pipeline: the user does not need to know where and how the actual algorithm is deployed and executed.
-During the modelling phase it is possible to set configuration parameters, which are then injected into the program when it is started.
-A description is provided for all parameters, and the system ensures that the user can only enter semantically correct values.
-
-
-## Data Sinks
-Data Sinks consume event streams similar to Processing Elements, with the difference that they do not provide an output stream; instead, they perform some action or trigger a visualization as a result of a stream transformation.
-The sinks also define stream requirements that must be fulfilled.
-In a pipeline it is not necessary to use a processing element to transform data.
-Often it can make sense to just use a data sink and connect it directly to the sensor to store the raw data in a data store for offline analysis.
-This is very simple with StreamPipes, and no additional code needs to be written to create such a data lake.
-
-
-## Target Audience
-StreamPipes focuses on multiple target groups.
-This guide is for users who interact with the graphical user interface in the browser.
-If you are interested in the technical details or plan to extend the system with new algorithms, please read the Developer Guide.
-The graphical user interface is designed for domain experts who want to analyze data, but are not interested in technical details and do not want to write code.
-The SDK can be used by software developers to extend the framework with new functionality.
-After importing newly developed pipeline elements, they are available to all users of StreamPipes.
-
-
-## Next Steps
-To test StreamPipes on your local environment go to the [installation guide](user-guide-installation.md).
-If you are further interested in the concepts of StreamPipes continue with the [tour](user-guide-tour.md).
diff --git a/website-v2/versioned_docs/version-0.90.0/01_try-installation.md b/website-v2/versioned_docs/version-0.90.0/01_try-installation.md
index 98afbfa..2f6c131 100644
--- a/website-v2/versioned_docs/version-0.90.0/01_try-installation.md
+++ b/website-v2/versioned_docs/version-0.90.0/01_try-installation.md
@@ -35,9 +35,7 @@
 
 ## Install StreamPipes
 
-<ul style="padding-left:0">
-    <DownloadSection version={'0.90.0'}></DownloadSection>
-</ul>
+<DownloadSection version={'0.90.0'}></DownloadSection>
 
 ## Setup StreamPipes
 
diff --git a/website-v2/versioned_docs/version-0.90.0/01_try-overview.md b/website-v2/versioned_docs/version-0.90.0/01_try-overview.md
index a16f27a..db798a0 100644
--- a/website-v2/versioned_docs/version-0.90.0/01_try-overview.md
+++ b/website-v2/versioned_docs/version-0.90.0/01_try-overview.md
@@ -7,7 +7,7 @@
 
 This is the documentation of Apache StreamPipes.
 
-<img class="docs-image docs-image-small docs-image-no-shadow" style="padding: 10px;" src="/img/01_try-overview/01_streampipes-overview.png" alt="StreamPipes Overview"/>
+<img class="docs-image docs-image-small docs-image-no-shadow" style={{padding: '10px'}} src="/img/01_try-overview/01_streampipes-overview.png" alt="StreamPipes Overview"/>
 
 
 <div class="container grid col-3">
diff --git a/website-v2/versioned_docs/version-0.90.0/06_extend-sdk-event-model.md b/website-v2/versioned_docs/version-0.90.0/06_extend-sdk-event-model.md
index 1152010..f4bb8ed 100644
--- a/website-v2/versioned_docs/version-0.90.0/06_extend-sdk-event-model.md
+++ b/website-v2/versioned_docs/version-0.90.0/06_extend-sdk-event-model.md
@@ -11,11 +11,11 @@
 
 ## Prerequisites
 
-This guide assumes that you are already familiar with the basic setup of [data processors](dev-guide-processor-sdk.md) and [data sinks](dev-guide-sink-sdk.md).
+This guide assumes that you are already familiar with the basic setup of [data processors](extend-first-processor).
 
 ### Property Selectors
 
-In most cases, fields that are subject to be transformed by pipeline elements are provided by the assigned ``MappingProperty`` (see the guide on [static properties](dev-guide-static-properties.md)).
+In most cases, fields that are subject to be transformed by pipeline elements are provided by the assigned ``MappingProperty`` (see the guide on [static properties](extend-sdk-static-properties)).
 
 Mapping properties return a ``PropertySelector`` that identifies a field based on (i) the **streamIndex** and (ii) the runtime name of the field.
 Let's assume we have an event with the following structure:
@@ -42,7 +42,7 @@
 
 ``s0`` identifies the stream (in this case, only one input stream exists, but as data processors might require more than one input stream, a stream identifier is required), while the suffix identifies the runtime name.
 
-Note: If you add a new field to an input event, you don't need to provide the selector, you can just assign the runtime name as defined by the [output strategy](dev-guide-output-strategies.md).
+Note: If you add a new field to an input event, you don't need to provide the selector, you can just assign the runtime name as defined by the [output strategy](extend-sdk-output-strategies).
 
 ### Reading Fields
 
diff --git a/website-v2/versioned_docs/version-0.90.0/06_extend-sdk-stream-requirements.md b/website-v2/versioned_docs/version-0.90.0/06_extend-sdk-stream-requirements.md
index 6f1213a..98a9215 100644
--- a/website-v2/versioned_docs/version-0.90.0/06_extend-sdk-stream-requirements.md
+++ b/website-v2/versioned_docs/version-0.90.0/06_extend-sdk-stream-requirements.md
@@ -88,7 +88,7 @@
   }
 ```
 
-See also the developer guide on [static properties](dev-guide-static-properties.md) to better understand the usage of ``MappingProperties``.
+See also the developer guide on [static properties](extend-sdk-static-properties) to better understand the usage of ``MappingProperties``.
 
 Requirements on primitive fields can be specified for all common datatypes:
 
diff --git a/website-v2/versioned_docs/version-0.90.0/user-guide-first-steps.md b/website-v2/versioned_docs/version-0.90.0/user-guide-first-steps.md
index b031145..b7ca89f 100644
--- a/website-v2/versioned_docs/version-0.90.0/user-guide-first-steps.md
+++ b/website-v2/versioned_docs/version-0.90.0/user-guide-first-steps.md
@@ -205,5 +205,5 @@
 
 We hope we gave you an easy quick start into StreamPipes.
 If you have any questions or suggestions, just send us an email.
-From here on you can explore all features in the [User Guide](user-guide-introduction.md) or go to the [Developer Guide](dev-guide-introduction.md) to learn how to write your own StreamPipes processing elements.
+From here on you can explore all features in the [User Guide](user-guide-introduction) or go to the [Developer Guide](extend-setup) to learn how to write your own StreamPipes processing elements.
 
diff --git a/website-v2/versioned_docs/version-0.90.0/user-guide-installation.md b/website-v2/versioned_docs/version-0.90.0/user-guide-installation.md
deleted file mode 100644
index 2ac726a..0000000
--- a/website-v2/versioned_docs/version-0.90.0/user-guide-installation.md
+++ /dev/null
@@ -1,121 +0,0 @@
----
-id: user-guide-installation
-title: Installation
-sidebar_label: Installation
-original_id: user-guide-installation
----
-## Prerequisites
-
-### Software
-
-* Docker (latest version, see instructions below)
-* Docker Compose (latest version, see instructions below)
-
-### Supported operating systems
-We rely on Docker and support three operating systems for the StreamPipes system:
-
-* Linux
-* OSX
-* Windows 10
-    * Please note that older Windows versions are not compatible with Docker. Linux VMs under Windows might also not work, due to network problems with Docker.
-
-### Web Browser
-StreamPipes is a modern web application; therefore, you need a recent version of Chrome (recommended), Firefox or Edge.
-
-### Docker
-You need to have Docker installed on your system before you continue with the installation guide.
-
-
-<div class="admonition info">
-<div class="admonition-title">Install Docker</div>
-<p>Go to https://docs.docker.com/installation/ and follow the instructions to install Docker for your OS. Make sure Docker can be started as a non-root user (described in the installation manual, don’t forget to log out and in again) and check that Docker is installed correctly by executing docker run hello-world</p>
-</div>
-
-<div class="admonition info">
-<div class="admonition-title">Configure Docker</div>
-<p>By default, Docker uses only a limited number of CPU cores and memory.
-       If you run StreamPipes on Windows or on a Mac you need to adjust the default settings.
-       To do that, click on the Docker icon in your tab bar and open the preferences.
-       Go to the advanced preferences and set the **number of CPUs to 6** (recommended) and the **Memory to 4GB**.
-       After changing the settings, Docker needs to be restarted.</p></div>
-
-
-## Install StreamPipes
-
-<div class="tab-content" id="myTabContent">
-    <div class="tab-pane fade show active" id="linux" role="tabpanel" aria-labelledby="linux-tab">
-        <ul style="padding-left:0">
-            <li class="installation-step">
-                <div class="wrapper-container" style="align-items: center;justify-content: center;">
-                    <div class="wrapper-step">
-                        <span class="fa-stack fa-2x">
-                             <i class="fas fa-circle fa-stack-2x sp-color-green"></i>
-                             <strong class="fa-stack-1x" style="color:white;">1</strong>
-                        </span>
-                    </div>
-                    <div class="wrapper-instruction">
-                        <a href="https://www.apache.org/dyn/mirrors/mirrors.cgi?action=download&filename=streampipes/installer/0.70.0/apache-streampipes-installer-0.70.0-incubating-source-release.zip">Download</a>
-                        the latest Apache StreamPipes release and extract the zip file to a directory of your choice.
-                    </div>
-                </div>
-            </li>
-            <li class="installation-step">
-                <div class="wrapper-container" style="align-items: center;justify-content: center;">
-                    <div class="wrapper-step">
-                        <span class="fa-stack fa-2x">
-                             <i class="fas fa-circle fa-stack-2x sp-color-green"></i>
-                             <strong class="fa-stack-1x" style="color:white;">2</strong>
-                        </span>
-                    </div>
-                    <div class="wrapper-instruction">
-                       In a command prompt, open the folder <code>/compose</code> and run <code>docker-compose up -d</code>.
-                    </div>
-                </div>
-            </li>
-            <li class="installation-step">
-                <div class="wrapper-container" style="align-items: center;justify-content: center;">
-                    <div class="wrapper-step">
-                        <span class="fa-stack fa-2x">
-                             <i class="fas fa-circle fa-stack-2x sp-color-green"></i>
-                             <strong class="fa-stack-1x" style="color:white;">3</strong>
-                        </span>
-                    </div>
-                    <div class="wrapper-instruction">
-                        Open your browser, navigate to http://localhost:80 (or the domain name of your server) and finish the setup according to the instructions below.
-                    </div>
-                </div>
-            </li>
-        </ul>
-        </div>
-    </div>
-
-## Setup StreamPipes
-
-Once you've opened the browser at the URL given above, you should see the StreamPipes application as shown below.
-To set up the system, enter an email address and a password and click on install.
-At this point, it is not necessary to change anything in the advanced settings menu.
-The installation might take some time; once all components are successfully configured, continue by clicking on "Go to login page".
-
-
-On the login page, enter your credentials, then you should be forwarded to the home page.
-
-Congratulations! You've successfully managed to install StreamPipes. Now we're ready to build our first pipeline!
-
-<div class="my-carousel">
-    <img src="/img/quickstart/setup/01_register_user.png" alt="Set Up User"/>
-    <img src="/img/quickstart/setup/02_user_set_up.png" alt="SetUp StreamPipes Components"/>
-    <img src="/img/quickstart/setup/03_login.png" alt="Go to login page"/>
-    <img src="/img/quickstart/setup/04_home.png" alt="Home page"/>
-</div>
-
-<div class="admonition error">
-<div class="admonition-title">Errors during the installation process</div>
-<p>In most cases, errors during the installation are due to an under-powered system.<br/>
-If there is a problem with any of the components, please restart the whole system and delete the "config" directory on the server.
-   This directory is in the same folder as the docker-compose.yml file.<br/>
-   Please also make sure that your system meets the hardware requirements as mentioned in the first section of the installation guide.</p>
-</div>
-
-## Next Steps
-
-Now you can continue with the tutorial on page [First steps](user-guide-first-steps.md).
diff --git a/website-v2/versioned_docs/version-0.90.0/user-guide-introduction.md b/website-v2/versioned_docs/version-0.90.0/user-guide-introduction.md
deleted file mode 100644
index 0f84725..0000000
--- a/website-v2/versioned_docs/version-0.90.0/user-guide-introduction.md
+++ /dev/null
@@ -1,61 +0,0 @@
----
-id: user-guide-introduction-old
-title: Introduction
-sidebar_label: Introduction
----
-
-StreamPipes is a framework that enables users to work with data streams.
-It uses many different technologies, especially from the fields of big data, distributed computing and the semantic web.
-One of the core concepts of StreamPipes is to add a higher semantic layer on top of big data processing technologies to ease their usage.
-StreamPipes is not just a UI: it is a framework with many different capabilities, like modelling new data processing pipelines and executing them in a distributed environment.
-On top of that, it uses semantics to guide non-technical people in analyzing their data streams in a self-service manner.
-
-
-
-## Pipelines
-The core concept of StreamPipes is the data processing pipeline.
-Those pipelines use data from different sources (Data Streams), transform it via Processing Elements, and store it in a database or send it to third-party systems (Data Sinks).
-A brief introduction is given in the following sections.
-On the next page, a detailed tour through StreamPipes explains all the different features that are available.
-
-
-## Data Streams
-Data Streams represent the primary source for data in StreamPipes.
-A stream is an ordered sequence of events, where an event is described as one or more observation values.
-Those events can come from different sources such as sensors, machines or log files.
-It does not matter what kind of serialization format the events have or which kind of transportation protocol the individual data streams use.
-As long as a semantic description is provided, StreamPipes is capable of processing the data.
-
-
-## Processing Elements
-Processing Elements are defined as processors that transform one or more input event streams into an output event stream.
-Those transformations can be rather simple, like filtering out events based on a predefined rule, or more complex, like applying advanced algorithms to the data.
-Processing elements define stream requirements that are a set of minimum properties an incoming event stream must provide. 
-Furthermore, Processing Elements describe their output based on a set of output strategies.
-They also describe further (human) input in form of configuration parameters.
-The Processing Elements can be implemented in multiple technologies.
-This information is not needed when constructing a pipeline: the user does not need to know where and how the actual algorithm is deployed and executed.
-During the modelling phase it is possible to set configuration parameters, which are then injected into the program when it is started.
-A description is provided for all parameters, and the system ensures that the user can only enter semantically correct values.
-
-
-## Data Sinks
-Data Sinks consume event streams similar to Processing Elements, with the difference that they do not provide an output stream; instead, they perform some action or trigger a visualization as a result of a stream transformation.
-The sinks also define stream requirements that must be fulfilled.
-In a pipeline it is not necessary to use a processing element to transform data.
-Often it can make sense to just use a data sink and connect it directly to the sensor to store the raw data in a data store for offline analysis.
-This is very simple with StreamPipes, and no additional code needs to be written to create such a data lake.
-
-
-## Target Audience
-StreamPipes focuses on multiple target groups.
-This guide is for users who interact with the graphical user interface in the browser.
-If you are interested in the technical details or plan to extend the system with new algorithms, please read the Developer Guide.
-The graphical user interface is designed for domain experts who want to analyze data, but are not interested in technical details and do not want to write code.
-The SDK can be used by software developers to extend the framework with new functionality.
-After importing newly developed pipeline elements, they are available to all users of StreamPipes.
-
-
-## Next Steps
-To test StreamPipes on your local environment go to the [installation guide](user-guide-installation.md).
-If you are further interested in the concepts of StreamPipes continue with the [tour](user-guide-tour.md).
diff --git a/website-v2/versioned_docs/version-0.90.0/user-guide-software-components.md b/website-v2/versioned_docs/version-0.90.0/user-guide-software-components.md
deleted file mode 100644
index cbcd7a3..0000000
--- a/website-v2/versioned_docs/version-0.90.0/user-guide-software-components.md
+++ /dev/null
@@ -1,334 +0,0 @@
----
-id: user-guide-software-components
-title: Software Components
-sidebar_label: Software Components
----
-
-This page contains all the software components that can be used within the StreamPipes framework.
-Some of them are mandatory, while others are only necessary for special capabilities.
-In the [Installation Guide](user-guide-installation.md#installation_1) we already provide a docker-compose.yml file with all the necessary components
-for a minimal setup.
-Extend this configuration file with further containers described on this page and configure StreamPipes
-according to your needs.
-
-
-## StreamPipes Framework
-
-<details class="tip">
-<summary>StreamPipes Backend</summary>
-
-#### Description
-The StreamPipes Backend is the main component of the StreamPipes Framework. It contains the application logic to create and execute pipelines.
-Furthermore, it provides a REST-API that is used by other components for communication.
-
-#### Docker Compose
-```yaml
-backend:
-  image: streampipes/backend
-  depends_on:
-    - "consul"
-  ports:
-    - "8030:8030"
-  volumes:
-    - ./config:/root/.streampipes
-    - ./config/aduna:/root/.aduna
-  networks:
-    spnet:
-```
-</details>
-
-
-<details class="tip">
-<summary>StreamPipes UI</summary>
-
-#### Description
-This service uses nginx and contains the UI of StreamPipes.
-The UI can, for example, be used to import new pipeline elements, create new pipelines and manage the pipeline
-execution. The UI communicates with the backend via the REST interface.
-
-#### Docker Compose
-```yaml
-nginx:
-  image: streampipes/ui
-  ports:
-    - "80:80"
-  depends_on:
-    - backend
-  networks:
-    spnet:
-```
-</details>
-
-## StreamPipes Services
-
-<details class="tip">
-<summary>Consul</summary>
-#### Description
-Consul is used to store configuration parameters of the backend service and processing elements.
-It is further used for service discovery. Once a processing element container is started in the network, it is
-automatically discovered via the service discovery feature of Consul.
-
-#### Docker Compose
-```yaml
-consul:
-    image: consul
-    environment:
-      - "CONSUL_LOCAL_CONFIG={'{\"disable_update_check\": true}'}"
-      - "CONSUL_BIND_INTERFACE=eth0"
-      - "CONSUL_HTTP_ADDR=0.0.0.0"
-    entrypoint:
-      - consul
-      - agent
-      - -server
-      - -bootstrap-expect=1
-      - -data-dir=/consul/data
-      - -node=consul-one
-      - -bind={'{{ GetInterfaceIP "eth0" }}'}
-      - -client=0.0.0.0
-      - -enable-script-checks=true
-      - -ui
-    volumes:
-      - ./config/consul:/consul/data
-    ports:
-      - "8500:8500"
-      - "8600:8600"
-    networks:
-      spnet:
-        ipv4_address: 172.30.0.9
-```
-</details>
-
-<details class="tip">
-<summary>Zookeeper</summary>
-#### Description
-Apache Kafka and Apache Flink require zookeeper to manage their clusters.
-
-#### Docker Compose
-```yaml
-zookeeper:
-    image: wurstmeister/zookeeper
-    ports:
-      - "2181:2181"
-    networks:
-      spnet:
-```
-</details>
-
-<details class="tip">
-<summary>Kafka</summary>
-
-#### Description
-Kafka is used as the primary message broker. It is possible to use other brokers or even multiple message brokers in a single pipeline, but Kafka is the
-default. The communication between the processing elements in a pipeline is mostly done via Kafka.
-
-#### Docker Compose
-```yaml
-  kafka:
-    image: wurstmeister/kafka:0.10.0.1
-    ports:
-      - "9092:9092"
-    environment:
-      KAFKA_ADVERTISED_HOST_NAME: ###TODO ADD HOSTNAME HERE ###
-      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
-    volumes:
-      - /var/run/docker.sock:/var/run/docker.sock
-    networks:
-      spnet:
-```
-</details>
-
-<details class="tip">
-<summary>ActiveMQ</summary>
-#### Description
-ActiveMQ is another message broker which can be used in addition to Kafka. Currently, the main purpose is to provide
-an endpoint for the websocket connections required by the real-time dashboard of the StreamPipes UI.
-
-#### Docker Compose
-```yaml
-activemq:
-  image: streampipes/activemq
-  ports:
-    - "61616:61616"
-    - "61614:61614"
-    - "8161:8161"
-  networks:
-    spnet:
-```
-</details>
-
-<details class="tip">
-<summary>CouchDB</summary>
-
-#### Description
-CouchDB is the main database for StreamPipes data that needs to be persisted such as pipelines, users and visualizations created in the dashboard.
-
-#### Docker Compose
-```yaml
-couchdb:
-  image: couchdb
-  ports:
-    - "5984:5984"
-  volumes:
-    - ./config/couchdb/data:/usr/local/var/lib/couchdb
-  networks:
-    spnet:
-```
-</details>
-
-<details class="tip">
-<summary>Flink</summary>
-#### Description
-This service sets up a sample Flink cluster with one jobmanager and one taskmanager. Although this cluster can be used for testing, it is not recommended for production use.
-
-#### Docker Compose
-```yaml
-jobmanager:
-  image: streampipes/flink
-  ports:
-    - "8081:8099"
-  command: jobmanager
-  networks:
-    spnet:
-
-
-taskmanager:
-  image: ipe-wim-gitlab.fzi.de:5000/streampipes/services/flink
-  command: taskmanager
-  environment:
-    - FLINK_NUM_SLOTS=20
-  networks:
-    spnet:
-```
-</details>
-
-
-## Processing Elements
-
-<details class="tip">
-<summary>PE Examples Sources</summary>
-#### Description
-This Processing Element Container contains several sample data sources that can be used to work with StreamPipes.
-It consists of source descriptions and data simulators that constantly produce data.
-
-#### Docker Compose
-```yaml
-    pe-examples-sources:
-      image: streampipes/pe-examples-sources
-      depends_on:
-        - "consul"
-      ports:
-        - "8098:8090"
-      networks:
-        spnet:
-```
-</details>
-
-<details class="tip">
-<summary>PE Examples JVM</summary>
-
-#### Description
-This Processing Element Container contains some sink example implementations, such as the real-time
-dashboard. This can be used to visualize data within StreamPipes.
-
-#### Docker Compose
-```yaml
-      pe-examples-jvm:
-        image: streampipes/pe-examples-jvm
-        depends_on:
-          - "consul"
-        environment:
-          - STREAMPIPES_HOST=###TODO ADD HOSTNAME HERE ###
-        ports:
-          - "8096:8090"
-        networks:
-          spnet:
-```
-</details>
-
-<details class="tip">
-<summary>PE Examples Flink</summary>
-
-#### Description
-The Flink Samples Processing Element Container contains some example algorithms that can be used within processing
-pipelines in the pipeline editor. Those algorithms are deployed to a Flink cluster once the pipeline is started.
-
-#### Docker Compose
-```yaml
-  pe-flink-samples:
-    image: streampipes/pe-examples-flink
-    depends_on:
-      - "consul"
-    ports:
-      - "8094:8090"
-    volumes:
-      - ./config:/root/.streampipes
-    networks:
-      spnet:
-```
-</details>
-
-### Third Party Services
-
-<details class="tip">
-<summary>Elasticsearch</summary>
-
-#### Description
-This service can be used to run Elasticsearch. Data can be written into Elasticsearch with the Elasticsearch
-sink of the PE Flink samples container.
-
-#### Docker Compose
-```yaml
-elasticsearch:
-  image: ipe-wim-gitlab.fzi.de:5000/streampipes/services/elasticsearch
-  ports:
-    - "9200:9200"
-    - "9300:9300"
-  volumes:
-    - ./config/elasticsearch/data:/usr/share/elasticsearch/data
-  networks:
-    spnet:
-```
-</details>
-
-<details class="tip">
-<summary>Kibana</summary>
-#### Description
-Kibana is used to visualize data that is written into Elasticsearch. It can be used in addition to our live dashboard
-to analyse and visualize historic data.
-
-#### Docker Compose
-```yaml
-kibana:
-  image: kibana:5.2.2
-  ports:
-    - "5601:5601"
-  volumes:
-    - ./config/kibana/kibana.yml:/opt/kibana/config/kibana.yml
-  environment:
-    - ELASTICSEARCH_URL=http://elasticsearch:9200
-  networks:
-    spnet:
-```
-</details>
-
-
-<details class="tip">
-<summary>Kafka Web Console</summary>
-
-#### Description
-The Kafka web console can be used to monitor the Kafka cluster. This is a good tool for debugging your newly
-developed pipeline elements.
-
-#### Docker Compose
-```yaml
-kafka-web-console:
-  image: hwestphal/kafka-web-console
-  ports:
-    - "9000:9000"
-  volumes:
-    - ./config:/data
-  networks:
-    spnet:
-```
-</details>
diff --git a/website-v2/versioned_docs/version-0.91.0/01_try-installation.md b/website-v2/versioned_docs/version-0.91.0/01_try-installation.md
index 3d7dab0..6860357 100644
--- a/website-v2/versioned_docs/version-0.91.0/01_try-installation.md
+++ b/website-v2/versioned_docs/version-0.91.0/01_try-installation.md
@@ -35,9 +35,7 @@
 
 ## Install StreamPipes
 
-<ul style="padding-left:0">
-    <DownloadSection version={'0.91.0'}></DownloadSection>
-</ul>
+<DownloadSection version={'0.91.0'}></DownloadSection>
 
 ## Setup StreamPipes
 
diff --git a/website-v2/versioned_docs/version-0.91.0/01_try-overview.md b/website-v2/versioned_docs/version-0.91.0/01_try-overview.md
index 8be91ea..708ed05 100644
--- a/website-v2/versioned_docs/version-0.91.0/01_try-overview.md
+++ b/website-v2/versioned_docs/version-0.91.0/01_try-overview.md
@@ -7,7 +7,7 @@
 
 This is the documentation of Apache StreamPipes.
 
-<img class="docs-image docs-image-small docs-image-no-shadow" style="padding: 10px;" src="/img/01_try-overview/01_streampipes-overview.png" alt="StreamPipes Overview"/>
+<img class="docs-image docs-image-small docs-image-no-shadow" style={{padding: '10px'}} src="/img/01_try-overview/01_streampipes-overview.png" alt="StreamPipes Overview"/>
 
 
 <div class="container grid col-3">
diff --git a/website-v2/versioned_docs/version-0.91.0/06_extend-sdk-event-model.md b/website-v2/versioned_docs/version-0.91.0/06_extend-sdk-event-model.md
index 1152010..f4bb8ed 100644
--- a/website-v2/versioned_docs/version-0.91.0/06_extend-sdk-event-model.md
+++ b/website-v2/versioned_docs/version-0.91.0/06_extend-sdk-event-model.md
@@ -11,11 +11,11 @@
 
 ## Prerequisites
 
-This guide assumes that you are already familiar with the basic setup of [data processors](dev-guide-processor-sdk.md) and [data sinks](dev-guide-sink-sdk.md).
+This guide assumes that you are already familiar with the basic setup of [data processors](extend-first-processor).
 
 ### Property Selectors
 
-In most cases, fields that are subject to be transformed by pipeline elements are provided by the assigned ``MappingProperty`` (see the guide on [static properties](dev-guide-static-properties.md)).
+In most cases, fields that are subject to be transformed by pipeline elements are provided by the assigned ``MappingProperty`` (see the guide on [static properties](extend-sdk-static-properties)).
 
 Mapping properties return a ``PropertySelector`` that identifies a field based on (i) the **streamIndex** and (ii) the runtime name of the field.
 Let's assume we have an event with the following structure:
@@ -42,7 +42,7 @@
 
 ``s0`` identifies the stream (in this case, only one input stream exists, but as data processors might require more than one input stream, a stream identifier is required), while the suffix identifies the runtime name.
 
-Note: If you add a new field to an input event, you don't need to provide the selector, you can just assign the runtime name as defined by the [output strategy](dev-guide-output-strategies.md).
+Note: If you add a new field to an input event, you don't need to provide the selector, you can just assign the runtime name as defined by the [output strategy](extend-sdk-output-strategies).
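
For context on the selector format discussed above, here is a minimal sketch of resolving such a selector inside a data processor (StreamPipes Java SDK; the internal id ``number-mapping`` is an illustrative assumption, and ``extractor`` is the processor's parameter extractor):

```java
// Sketch: resolve the field the user mapped in the pipeline editor.
// "number-mapping" is a hypothetical internal id used for illustration.
String selector = extractor.mappingPropertyValue("number-mapping");
// The resolved selector has the form "s0::<runtimeName>", e.g. "s0::temperature".
```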
 
 ### Reading Fields
 
diff --git a/website-v2/versioned_docs/version-0.91.0/06_extend-sdk-stream-requirements.md b/website-v2/versioned_docs/version-0.91.0/06_extend-sdk-stream-requirements.md
index 6f1213a..98a9215 100644
--- a/website-v2/versioned_docs/version-0.91.0/06_extend-sdk-stream-requirements.md
+++ b/website-v2/versioned_docs/version-0.91.0/06_extend-sdk-stream-requirements.md
@@ -88,7 +88,7 @@
   }
 ```
 
-See also the developer guide on [static properties](dev-guide-static-properties.md) to better understand the usage of ``MappingProperties``.
+See also the developer guide on [static properties](extend-sdk-static-properties) to better understand the usage of ``MappingProperties``.
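
As a companion to the ``MappingProperties`` mentioned above, a minimal sketch of declaring one inside the stream requirements (StreamPipes Java SDK; the internal id ``number-mapping`` is an illustrative assumption):

```java
// Sketch: require one numeric field and let the user map it via a dropdown
// in the pipeline editor.
.requiredStream(StreamRequirementsBuilder.create()
    .requiredPropertyWithUnaryMapping(
        EpRequirements.numberReq(),       // the mapped field must be numeric
        Labels.withId("number-mapping"),  // hypothetical internal id
        PropertyScope.MEASUREMENT_PROPERTY)
    .build())
```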
 
 Requirements on primitive fields can be specified for all common datatypes:
 
diff --git a/website-v2/versioned_docs/version-0.91.0/user-guide-first-steps.md b/website-v2/versioned_docs/version-0.91.0/user-guide-first-steps.md
index 95374b7..a4b9b2d 100644
--- a/website-v2/versioned_docs/version-0.91.0/user-guide-first-steps.md
+++ b/website-v2/versioned_docs/version-0.91.0/user-guide-first-steps.md
@@ -211,4 +211,4 @@
 
 We hope we gave you an easy quick start into StreamPipes.
 If you have any questions or suggestions, just send us an email.
-From here on you can explore all features in the [User Guide](user-guide-introduction.md) or go to the [Developer Guide](dev-guide-introduction.md) to learn how to write your own StreamPipes processing elements.
+From here on you can explore all features in the [User Guide](user-guide-introduction) or go to the [Developer Guide](extend-setup) to learn how to write your own StreamPipes processing elements.
diff --git a/website-v2/versioned_docs/version-0.91.0/user-guide-installation.md b/website-v2/versioned_docs/version-0.91.0/user-guide-installation.md
deleted file mode 100644
index 4c78d3b..0000000
--- a/website-v2/versioned_docs/version-0.91.0/user-guide-installation.md
+++ /dev/null
@@ -1,120 +0,0 @@
----
-id: user-guide-installation
-title: Installation
-sidebar_label: Installation
----
-## Prerequisites
-
-### Hardware
-
--   Docker (latest version, see instructions below)
--   Docker Compose (latest version, see instructions below)
-
-### Supported operating systems
-
-We rely on Docker and support three operating systems for the StreamPipes system
-
--   Linux
--   OSX
--   Windows 10
-    -   Please note that older Windows versions are not compatible with Docker. Also, Linux VMs under Windows might not work due to network problems with Docker.
-
-### Web Browser
-
-StreamPipes is a modern web application, therefore you need a recent version of Chrome (recommended), Firefox or Edge.
-
-### Docker
-
-You need to have Docker installed on your system before you continue with the installation guide.
-
-<div className="admonition info">
-<div className="admonition-title">Install Docker</div>
-<p>Go to https://docs.docker.com/installation/ and follow the instructions to install Docker for your OS. Make sure Docker can be started as a non-root user (described in the installation manual, don’t forget to log out and in again) and check that Docker is installed correctly by executing docker run hello-world</p>
-</div>
-
-<div className="admonition info">
-<div className="admonition-title">Configure Docker</div>
-<p>By default, Docker uses only a limited number of CPU cores and memory.
-       If you run StreamPipes on Windows or on a Mac you need to adjust the default settings.
-       To do that, click on the Docker icon in your tab bar and open the preferences.
-       Go to the advanced preferences and set the **number of CPUs to 6** (recommended) and the **Memory to 4GB**.
-       After changing the settings, Docker needs to be restarted.</p></div>
-
-## Install StreamPipes
-
-<div className="tab-content" id="myTabContent">
-    <div className="tab-pane fade show active" id="linux" role="tabpanel" aria-labelled-by="linux-tab">
-        <ul style={{paddingLeft: "0"}}>
-            <li className="installation-step">
-                <div className="wrapper-container" style={{alignItems: "center", justifyContent: "center"}}>
-                    <div className="wrapper-step">
-                        <span className="fa-stack fa-2x">
-                             <i className="fas fa-circle fa-stack-2x sp-color-green" />
-                             <strong className="fa-stack-1x" style={{color: "white"}}>1</strong>
-                        </span>
-                    </div>
-                    <div className="wrapper-instruction">
-                        <a href="https://www.apache.org/dyn/mirrors/mirrors.cgi?action=download&filename=streampipes/installer/0.70.0/apache-streampipes-installer-0.70.0-incubating-source-release.zip">Download</a>
-                        the latest Apache StreamPipes release and extract the zip file to a directory of your choice.
-                    </div>
-                </div>
-            </li>
-            <li className="installation-step">
-                <div className="wrapper-container" style={{alignItems: "center", justifyContent: "center"}}>
-                    <div className="wrapper-step">
-                        <span className="fa-stack fa-2x">
-                             <i className="fas fa-circle fa-stack-2x sp-color-green" />
-                             <strong className="fa-stack-1x" style={{color: "white"}}>2</strong>
-                        </span>
-                    </div>
-                    <div className="wrapper-instruction">
-                       In a command prompt, open the folder <code>/compose</code> and run <code>docker-compose up -d</code>.
-                    </div>
-                </div>
-            </li>
-            <li className="installation-step">
-                <div className="wrapper-container" style={{alignItems: "center", justifyContent: "center"}}>
-                    <div className="wrapper-step">
-                        <span className="fa-stack fa-2x">
-                             <i className="fas fa-circle fa-stack-2x sp-color-green" />
-                             <strong className="fa-stack-1x" style={{color: "white"}}>3</strong>
-                        </span>
-                    </div>
-                    <div className="wrapper-instruction">
-                        Open your browser, navigate to http://localhost:80 (or the domain name of your server) and finish the setup according to the instructions below.
-                    </div>
-                </div>
-            </li>
-        </ul>
-        </div>
-    </div>
-
-## Setup StreamPipes
-
-Once you've opened the browser at the URL given above, you should see the StreamPipes application as shown below.
-To set up the system, enter an email address and a password and click on install.
-At this point, it is not necessary to change anything in the advanced settings menu.
-The installation might take some time; once all components are successfully configured, continue by clicking on "Go to login page".
-
-On the login page, enter your credentials, then you should be forwarded to the home page.
-
-Congratulations! You've successfully managed to install StreamPipes. Now we're ready to build our first pipeline!
-
-<div className="my-carousel">
-    <img src="/img/quickstart/setup/01_register_user.png" alt="Set Up User" />
-    <img src="/img/quickstart/setup/02_user_set_up.png" alt="SetUp StreamPipes Components" />
-    <img src="/img/quickstart/setup/03_login.png" alt="Go to login page" />
-    <img src="/img/quickstart/setup/04_home.png" alt="Home page" />
-</div>
-
-<div className="admonition error">
-<div className="admonition-title">Errors during the installation process</div>
-<p>In most cases, errors during the installation are due to an under-powered system.<br />
-If there is a problem with any of the components, please restart the whole system and delete the "config" directory on the server.
-   This directory is in the same folder as the docker-compose.yml file.<br />
-   Please also make sure that your system meets the hardware requirements as mentioned in the first section of the installation guide.</p>
-</div>
-
-## Next Steps
-
-Now you can continue with the tutorial on page [First steps](user-guide-first-steps.md).
diff --git a/website-v2/versioned_docs/version-0.91.0/user-guide-introduction.md b/website-v2/versioned_docs/version-0.91.0/user-guide-introduction.md
deleted file mode 100644
index 484a640..0000000
--- a/website-v2/versioned_docs/version-0.91.0/user-guide-introduction.md
+++ /dev/null
@@ -1,59 +0,0 @@
----
-id: user-guide-introduction-old
-title: Introduction
-sidebar_label: Introduction
----
-StreamPipes is a framework that enables users to work with data streams.
-It uses a lot of different technologies, especially from the fields of big data, distributed computing, and the semantic web.
-One of the core concepts of StreamPipes is to add a higher semantic layer on top of big data processing technologies to ease their usage.
-StreamPipes is not just a UI; it is a framework with a lot of different capabilities, like modelling new data processing pipelines and executing them in a distributed environment.
-On top of that, it uses semantics to guide non-technical people in analyzing their data streams in a self-service manner.
-
-## Pipelines
-
-The core concept of StreamPipes is the data processing pipeline.
-Those pipelines use data from different sources (Data Streams), transform it via Processing Elements, and store it in a database or send it to third-party systems (Data Sinks).
-A brief introduction is given in the following sections.
-On the next page, a detailed tour through StreamPipes explains all the different features that are available.
-
-## Data Streams
-
-Data Streams represent the primary source for data in StreamPipes.
-A stream is an ordered sequence of events, where an event is described as one or more observation values.
-Those events can come from different sources like sensors, machines, log files, and many more.
-It does not matter what kind of serialization format the events have or which kind of transportation protocol the individual data streams use.
-As long as a semantic description is provided, StreamPipes is capable of processing the data.
-
-## Processing Elements
-
-Processing Elements are defined as processors that transform one or more input event streams into an output event stream.
-Those transformations can be rather simple, like filtering out events based on a predefined rule, or more complex, like applying algorithms to the data.
-Processing elements define stream requirements that are a set of minimum properties an incoming event stream must provide.
-Furthermore, Processing Elements describe their output based on a set of output strategies.
-They also describe further (human) input in the form of configuration parameters.
-The Processing Elements can be implemented in multiple technologies.
-This information is not necessary when constructing a pipeline; the user does not need to know where and how the actual algorithm is deployed and executed.
-During the modelling phase it is possible to set configuration parameters, which are then injected into the program when it is started.
-A description is provided for all parameters, and the system ensures that the user can only enter semantically correct values.
-
-## Data Sinks
-
-Data Sinks consume event streams similar to processing elements, with the difference that sinks do not provide an output stream, i.e., they perform some action or trigger a visualization as a result of a stream transformation.
-The sinks also define stream requirements that must be fulfilled.
-In a pipeline it is not necessary to use a processing element to transform data.
-Often it can make sense to just use a data sink and connect it directly to the sensor to store the raw data in a data store for offline analysis.
-This is very simple with StreamPipes, and no additional code needs to be written to create such a data lake.
-
-## Target Audience
-
-StreamPipes focuses on multiple target groups.
-This guide is for users who interact with the graphical user interface in the browser.
-If you are interested in the technical details or plan to extend the system with new algorithms, please read the Developer Guide.
-The graphical user interface is designed for domain experts who want to analyze data, but are not interested in technical details and do not want to write code.
-The SDK can be used by software developers to extend the framework with new functionality.
-After importing newly developed pipeline elements, they are available to all users of StreamPipes.
-
-## Next Steps
-
-To test StreamPipes on your local environment, go to the [installation guide](user-guide-installation.md).
-If you are further interested in the concepts of StreamPipes continue with the [tour](user-guide-tour.md).
diff --git a/website-v2/versioned_docs/version-0.91.0/user-guide-software-components.md b/website-v2/versioned_docs/version-0.91.0/user-guide-software-components.md
deleted file mode 100644
index 6fb343d..0000000
--- a/website-v2/versioned_docs/version-0.91.0/user-guide-software-components.md
+++ /dev/null
@@ -1,388 +0,0 @@
----
-id: user-guide-software-components
-title: Software Components
-sidebar_label: Software Components
----
-This page contains all the software components that can be used within the StreamPipes framework.
-Some of them are mandatory, while others are only necessary for special capabilities.
-In the [Installation Guide](user-guide-installation.md#installation_1) we already provide a docker-compose.yml file with all the necessary components
-for a minimal setup.
-Extend this configuration file with further containers described on this page and configure StreamPipes
-according to your needs.
-
-## StreamPipes Framework
-
-<details className="tip">
-<summary>StreamPipes Backend</summary>
-
-#### Description
-
-The StreamPipes Backend is the main component of the StreamPipes Framework. It contains the application logic to create and execute pipelines.
-Furthermore, it provides a REST-API that is used by other components for communication.
-
-#### Docker Compose
-
-```yaml
-
-backend:
-  image: streampipes/backend
-  depends_on:
-    - "consul"
-  ports:
-    - "8030:8030"
-  volumes:
-    - ./config:/root/.streampipes
-    - ./config/aduna:/root/.aduna
-  networks:
-    spnet:
-
-```
-
-</details>
-
-<details className="tip">
-<summary>StreamPipes UI</summary>
-
-#### Description
-
-This service uses nginx and contains the UI of StreamPipes.
-The UI can, for example, be used to import new pipeline elements, create new pipelines and manage the pipeline
-execution. The UI communicates with the backend via the REST interface.
-
-#### Docker Compose
-
-```yaml
-
-nginx:
-  image: streampipes/ui
-  ports:
-    - "80:80"
-  depends_on:
-    - backend
-  networks:
-    spnet:
-
-```
-
-</details>
-
-## StreamPipes Services
-
-<details className="tip">
-<summary>Consul</summary>
-#### Description
-Consul is used to store configuration parameters of the backend service and processing elements.
-It is further used for service discovery. Once a processing element container is started in the network, it is
-automatically discovered via the service discovery feature of Consul.
-
-#### Docker Compose
-
-```yaml
-
-consul:
-    image: consul
-    environment:
-      - "CONSUL_LOCAL_CONFIG={'{\"disable_update_check\": true}'}"
-      - "CONSUL_BIND_INTERFACE=eth0"
-      - "CONSUL_HTTP_ADDR=0.0.0.0"
-    entrypoint:
-      - consul
-      - agent
-      - -server
-      - -bootstrap-expect=1
-      - -data-dir=/consul/data
-      - -node=consul-one
-      - -bind={'{{ GetInterfaceIP "eth0" }}'}
-      - -client=0.0.0.0
-      - -enable-script-checks=true
-      - -ui
-    volumes:
-      - ./config/consul:/consul/data
-    ports:
-      - "8500:8500"
-      - "8600:8600"
-    networks:
-      spnet:
-        ipv4_address: 172.30.0.9
-
-```
-
-</details>
-
-<details className="tip">
-<summary>Zookeeper</summary>
-#### Description
-Apache Kafka and Apache Flink require ZooKeeper to manage their clusters.
-
-#### Docker Compose
-
-```yaml
-
-zookeeper:
-    image: wurstmeister/zookeeper
-    ports:
-      - "2181:2181"
-    networks:
-      spnet:
-
-```
-
-</details>
-
-<details className="tip">
-<summary>Kafka</summary>
-
-#### Description
-
-Kafka is used as the primary message broker. It is possible to use other brokers or even multiple message brokers in a single pipeline, but Kafka is the
-default. The communication between the processing elements in a pipeline is mostly done via Kafka.
-
-#### Docker Compose
-
-```yaml
-
-  kafka:
-    image: wurstmeister/kafka:0.10.0.1
-    ports:
-      - "9092:9092"
-    environment:
-      KAFKA_ADVERTISED_HOST_NAME: ###TODO ADD HOSTNAME HERE ###
-      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
-    volumes:
-      - /var/run/docker.sock:/var/run/docker.sock
-    networks:
-      spnet:
-
-```
-
-</details>
-
-<details className="tip">
-<summary>ActiveMQ</summary>
-#### Description
-ActiveMQ is another message broker which can be used in addition to Kafka. Currently, the main purpose is to provide
-an endpoint for the websocket connections required by the real-time dashboard of the StreamPipes UI.
-
-#### Docker Compose
-
-```yaml
-
-activemq:
-  image: streampipes/activemq
-  ports:
-    - "61616:61616"
-    - "61614:61614"
-    - "8161:8161"
-  networks:
-    spnet:
-
-```
-</details>
-
-<details className="tip">
-<summary>CouchDB</summary>
-
-#### Description
-CouchDB is the main database for StreamPipes data that needs to be persisted such as pipelines, users and visualizations created in the dashboard.
-
-#### Docker Compose
-```yaml
-couchdb:
-  image: couchdb
-  ports:
-    - "5984:5984"
-  volumes:
-    - ./config/couchdb/data:/usr/local/var/lib/couchdb
-  networks:
-    spnet:
-
-```
-
-</details>
-
-<details className="tip">
-<summary>Flink</summary>
-#### Description
-This service sets up a sample Flink cluster with one jobmanager and one taskmanager. Although this cluster can be used for testing, it is not recommended for production use.
-
-#### Docker Compose
-
-```yaml
-
-jobmanager:
-  image: streampipes/flink
-  ports:
-    - "8081:8099"
-  command: jobmanager
-  networks:
-    spnet:
-
-
-taskmanager:
-  image: ipe-wim-gitlab.fzi.de:5000/streampipes/services/flink
-  command: taskmanager
-  environment:
-    - FLINK_NUM_SLOTS=20
-  networks:
-    spnet:
-
-```
-
-</details>
-
-## Processing Elements
-
-<details className="tip">
-<summary>PE Examples Sources</summary>
-#### Description
-This Processing Element Container contains several sample data sources that can be used to work with StreamPipes.
-It consists of source descriptions and data simulators that constantly produce data.
-
-#### Docker Compose
-
-```yaml
-
-    pe-examples-sources:
-      image: streampipes/pe-examples-sources
-      depends_on:
-        - "consul"
-      ports:
-        - "8098:8090"
-      networks:
-        spnet:
-
-```
-
-</details>
-
-<details className="tip">
-<summary>PE Examples JVM</summary>
-
-#### Description
-
-This Processing Element Container contains some sink example implementations, such as the real-time
-dashboard. This can be used to visualize data within StreamPipes.
-
-#### Docker Compose
-
-```yaml
-
-      pe-examples-jvm:
-        image: streampipes/pe-examples-jvm
-        depends_on:
-          - "consul"
-        environment:
-          - STREAMPIPES_HOST=###TODO ADD HOSTNAME HERE ###
-        ports:
-          - "8096:8090"
-        networks:
-          spnet:
-
-```
-
-</details>
-
-<details className="tip">
-<summary>PE Examples Flink</summary>
-
-#### Description
-
-The Flink Samples Processing Element Container contains some example algorithms that can be used within processing
-pipelines in the pipeline editor. Those algorithms are deployed to a Flink cluster once the pipeline is started.
-
-#### Docker Compose
-
-```yaml
-
-  pe-flink-samples:
-    image: streampipes/pe-examples-flink
-    depends_on:
-      - "consul"
-    ports:
-      - "8094:8090"
-    volumes:
-      - ./config:/root/.streampipes
-    networks:
-      spnet:
-
-```
-
-</details>
-
-### Third Party Services
-
-<details className="tip">
-<summary>Elasticsearch</summary>
-
-#### Description
-
-This service can be used to run Elasticsearch. Data can be written into Elasticsearch with the Elasticsearch
-sink of the PE Flink samples container.
-
-#### Docker Compose
-
-```yaml
-
-elasticsearch:
-  image: ipe-wim-gitlab.fzi.de:5000/streampipes/services/elasticsearch
-  ports:
-    - "9200:9200"
-    - "9300:9300"
-  volumes:
-    - ./config/elasticsearch/data:/usr/share/elasticsearch/data
-  networks:
-    spnet:
-
-```
-
-</details>
-
-<details className="tip">
-<summary>Kibana</summary>
-#### Description
-Kibana is used to visualize data that is written into Elasticsearch. It can be used in addition to our live dashboard
-to analyse and visualize historic data.
-
-#### Docker Compose
-
-```yaml
-
-kibana:
-  image: kibana:5.2.2
-  ports:
-    - "5601:5601"
-  volumes:
-    - ./config/kibana/kibana.yml:/opt/kibana/config/kibana.yml
-  environment:
-    - ELASTICSEARCH_URL=http://elasticsearch:9200
-  networks:
-    spnet:
-
-```
-
-</details>
-
-<details className="tip">
-<summary>Kafka Web Console</summary>
-
-#### Description
-
-The Kafka web console can be used to monitor the Kafka cluster. This is a good tool for debugging your newly
-developed pipeline elements.
-
-#### Docker Compose
-
-```yaml
-
-kafka-web-console:
-  image: hwestphal/kafka-web-console
-  ports:
-    - "9000:9000"
-  volumes:
-    - ./config:/data
-  networks:
-    spnet:
-
-```
-
-</details>
diff --git a/website-v2/versioned_docs/version-0.92.0/01_try-installation.md b/website-v2/versioned_docs/version-0.92.0/01_try-installation.md
index 329db72..1443124 100644
--- a/website-v2/versioned_docs/version-0.92.0/01_try-installation.md
+++ b/website-v2/versioned_docs/version-0.92.0/01_try-installation.md
@@ -35,9 +35,7 @@
 
 ## Install StreamPipes
 
-<ul style="padding-left:0">
-    <DownloadSection version={'0.92.0'} releaseDate={'2023-06-16'}></DownloadSection>
-</ul>
+<DownloadSection version={'0.92.0'} releaseDate={'2023-06-16'}></DownloadSection>
 
 ## Setup StreamPipes
 
diff --git a/website-v2/versioned_docs/version-0.92.0/06_extend-sdk-event-model.md b/website-v2/versioned_docs/version-0.92.0/06_extend-sdk-event-model.md
index 5a21d9c..a9f48ba 100644
--- a/website-v2/versioned_docs/version-0.92.0/06_extend-sdk-event-model.md
+++ b/website-v2/versioned_docs/version-0.92.0/06_extend-sdk-event-model.md
@@ -10,11 +10,11 @@
 
 ## Prerequisites
 
-This guide assumes that you are already familiar with the basic setup of [data processors](dev-guide-processor-sdk.md) and [data sinks](dev-guide-sink-sdk.md).
+This guide assumes that you are already familiar with the basic setup of [data processors](extend-first-processor).
 
 ### Property Selectors
 
-In most cases, fields that are subject to be transformed by pipeline elements are provided by the assigned ``MappingProperty`` (see the guide on [static properties](dev-guide-static-properties.md)).
+In most cases, fields that are subject to be transformed by pipeline elements are provided by the assigned ``MappingProperty`` (see the guide on [static properties](extend-sdk-static-properties)).
 
 Mapping properties return a ``PropertySelector`` that identifies a field based on (i) the **streamIndex** and (ii) the runtime name of the field.
 Let's assume we have an event with the following structure:
@@ -41,7 +41,7 @@
 
 ``s0`` identifies the stream (in this case, only one input stream exists, but as data processors might require more than one input stream, a stream identifier is required), while the suffix identifies the runtime name.
 
-Note: If you add a new field to an input event, you don't need to provide the selector, you can just assign the runtime name as defined by the [output strategy](dev-guide-output-strategies.md).
+Note: If you add a new field to an input event, you don't need to provide the selector, you can just assign the runtime name as defined by the [output strategy](extend-sdk-output-strategies).
 
 ### Reading Fields
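
To illustrate the field-reading API this section covers, a minimal sketch using the runtime ``Event`` model (the selector ``s0::temperature`` is an assumed example value):

```java
// Sketch: read a primitive field from an incoming event by selector.
// "s0::temperature" assumes stream index 0 and runtime name "temperature".
float value = event
    .getFieldBySelector("s0::temperature")
    .getAsPrimitive()
    .getAsFloat();
```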
 
diff --git a/website-v2/versioned_docs/version-0.92.0/06_extend-sdk-stream-requirements.md b/website-v2/versioned_docs/version-0.92.0/06_extend-sdk-stream-requirements.md
index 9d06735..a035274 100644
--- a/website-v2/versioned_docs/version-0.92.0/06_extend-sdk-stream-requirements.md
+++ b/website-v2/versioned_docs/version-0.92.0/06_extend-sdk-stream-requirements.md
@@ -87,7 +87,7 @@
   }
 ```
 
-See also the developer guide on [static properties](dev-guide-static-properties.md) to better understand the usage of ``MappingProperties``.
+See also the developer guide on [static properties](extend-sdk-static-properties) to better understand the usage of ``MappingProperties``.
 
 Requirements on primitive fields can be specified for all common datatypes:
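
The datatype table itself lies outside this hunk; as a quick reference, a sketch of the corresponding ``EpRequirements`` helpers (method names as provided by the StreamPipes SDK; the pairing with use cases is illustrative):

```java
// Sketch: requirements on primitive fields for common datatypes.
EpRequirements.booleanReq();   // requires a boolean field
EpRequirements.integerReq();   // requires an integer field
EpRequirements.doubleReq();    // requires a double field
EpRequirements.stringReq();    // requires a string field
EpRequirements.numberReq();    // requires any numeric field
EpRequirements.timestampReq(); // requires a timestamp field
```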
 
diff --git a/website-v2/versioned_docs/version-0.92.0/user-guide-first-steps.md b/website-v2/versioned_docs/version-0.92.0/user-guide-first-steps.md
index b031145..b7ca89f 100644
--- a/website-v2/versioned_docs/version-0.92.0/user-guide-first-steps.md
+++ b/website-v2/versioned_docs/version-0.92.0/user-guide-first-steps.md
@@ -205,5 +205,5 @@
 
 We hope we gave you an easy quick start into StreamPipes.
 If you have any questions or suggestions, just send us an email.
-From here on you can explore all features in the [User Guide](user-guide-introduction.md) or go to the [Developer Guide](dev-guide-introduction.md) to learn how to write your own StreamPipes processing elements.
+From here on you can explore all features in the [User Guide](user-guide-introduction) or go to the [Developer Guide](extend-setup) to learn how to write your own StreamPipes processing elements.
 
diff --git a/website-v2/versioned_docs/version-0.92.0/user-guide-installation.md b/website-v2/versioned_docs/version-0.92.0/user-guide-installation.md
deleted file mode 100644
index 74cf1fa..0000000
--- a/website-v2/versioned_docs/version-0.92.0/user-guide-installation.md
+++ /dev/null
@@ -1,120 +0,0 @@
----
-id: user-guide-installation
-title: Installation
-sidebar_label: Installation
----
-## Prerequisites
-
-### Hardware
-
-* Docker (latest version, see instructions below)
-* Docker Compose (latest version, see instructions below)
-
-### Supported operating systems
-We rely on Docker and support three operating systems for the StreamPipes system
-
-* Linux
-* OSX
-* Windows 10
-    * Please note that older Windows versions are not compatible with Docker. Also, Linux VMs under Windows might not work due to network problems with Docker.
-
-### Web Browser
-StreamPipes is a modern web application, therefore you need a recent version of Chrome (recommended), Firefox or Edge.
-
-### Docker
-You need to have Docker installed on your system before you continue with the installation guide.
-
-
-<div class="admonition info">
-<div class="admonition-title">Install Docker</div>
-<p>Go to https://docs.docker.com/installation/ and follow the instructions to install Docker for your OS. Make sure Docker can be started as a non-root user (described in the installation manual, don’t forget to log out and in again) and check that Docker is installed correctly by executing docker run hello-world</p>
-</div>
-
-<div class="admonition info">
-<div class="admonition-title">Configure Docker</div>
-<p>By default, Docker uses only a limited number of CPU cores and memory.
-       If you run StreamPipes on Windows or on a Mac you need to adjust the default settings.
-       To do that, click on the Docker icon in your tab bar and open the preferences.
-       Go to the advanced preferences and set the **number of CPUs to 6** (recommended) and the **Memory to 4GB**.
-       After changing the settings, Docker needs to be restarted.</p></div>
-
-
-## Install StreamPipes
-
-<div class="tab-content" id="myTabContent">
-    <div class="tab-pane fade show active" id="linux" role="tabpanel" aria-labelledby="linux-tab">
-        <ul style="padding-left:0">
-            <li class="installation-step">
-                <div class="wrapper-container" style="align-items: center;justify-content: center;">
-                    <div class="wrapper-step">
-                        <span class="fa-stack fa-2x">
-                             <i class="fas fa-circle fa-stack-2x sp-color-green"></i>
-                             <strong class="fa-stack-1x" style="color:white;">1</strong>
-                        </span>
-                    </div>
-                    <div class="wrapper-instruction">
-                        <a href="https://www.apache.org/dyn/mirrors/mirrors.cgi?action=download&filename=streampipes/installer/0.70.0/apache-streampipes-installer-0.70.0-incubating-source-release.zip">Download</a>
-                        the latest Apache StreamPipes release and extract the zip file to a directory of your choice.
-                    </div>
-                </div>
-            </li>
-            <li class="installation-step">
-                <div class="wrapper-container" style="align-items: center;justify-content: center;">
-                    <div class="wrapper-step">
-                        <span class="fa-stack fa-2x">
-                             <i class="fas fa-circle fa-stack-2x sp-color-green"></i>
-                             <strong class="fa-stack-1x" style="color:white;">2</strong>
-                        </span>
-                    </div>
-                    <div class="wrapper-instruction">
-                       In a command prompt, open the folder <code>/compose</code> and run <code>docker-compose up -d</code>.
-                    </div>
-                </div>
-            </li>
-            <li class="installation-step">
-                <div class="wrapper-container" style="align-items: center;justify-content: center;">
-                    <div class="wrapper-step">
-                        <span class="fa-stack fa-2x">
-                             <i class="fas fa-circle fa-stack-2x sp-color-green"></i>
-                             <strong class="fa-stack-1x" style="color:white;">3</strong>
-                        </span>
-                    </div>
-                    <div class="wrapper-instruction">
-                        Open your browser, navigate to http://localhost:80 (or the domain name of your server) and finish the setup according to the instructions below.
-                    </div>
-                </div>
-            </li>
-        </ul>
-        </div>
-    </div>
-
-## Setup StreamPipes
-
-Once you've opened the browser at the URL given above, you should see the StreamPipes application as shown below.
-To set up the system, enter an email address and a password and click on install.
-At this point, it is not necessary to change anything in the advanced settings menu.
-The installation might take some time; once all components are successfully configured, continue by clicking on "Go to login page".
-
-
-On the login page, enter your credentials, then you should be forwarded to the home page.
-
-Congratulations! You've successfully managed to install StreamPipes. Now we're ready to build our first pipeline!
-
-<div class="my-carousel">
-    <img src="/img/quickstart/setup/01_register_user.png" alt="Set Up User"/>
-    <img src="/img/quickstart/setup/02_user_set_up.png" alt="SetUp StreamPipes Components"/>
-    <img src="/img/quickstart/setup/03_login.png" alt="Go to login page"/>
-    <img src="/img/quickstart/setup/04_home.png" alt="Home page"/>
-</div>
-
-<div class="admonition error">
-<div class="admonition-title">Errors during the installation process</div>
-<p>In most cases, errors during the installation are due to an under-powered system.<br/>
-If there is a problem with any of the components, please restart the whole system and delete the "config" directory on the server.
-   This directory is in the same folder as the docker-compose.yml file.<br/>
-   Please also make sure that your system meets the hardware requirements as mentioned in the first section of the installation guide.</p>
-</div>
-
-## Next Steps
-
-Now you can continue with the tutorial on page [First steps](user-guide-first-steps.md).
diff --git a/website-v2/versioned_docs/version-0.92.0/user-guide-introduction.md b/website-v2/versioned_docs/version-0.92.0/user-guide-introduction.md
deleted file mode 100644
index 0f84725..0000000
--- a/website-v2/versioned_docs/version-0.92.0/user-guide-introduction.md
+++ /dev/null
@@ -1,61 +0,0 @@
----
-id: user-guide-introduction-old
-title: Introduction
-sidebar_label: Introduction
----
-
-StreamPipes is a framework that enables users to work with data streams.
-It uses a lot of different technologies, especially from the fields of big data, distributed computing, and the semantic web.
-One of the core concepts of StreamPipes is to add a higher semantic layer on top of big data processing technologies to ease their usage.
-StreamPipes is not just a UI; it is a framework with a lot of different capabilities, like modelling new data processing pipelines and executing them in a distributed environment.
-On top of that, it uses semantics to guide non-technical people in analyzing their data streams in a self-service manner.
-
-
-
-## Pipelines
-The core concept of StreamPipes is the data processing pipeline.
-Those pipelines use data from different sources (Data Streams), transform it via Processing Elements, and store it in a database or send it to third-party systems (Data Sinks).
-A brief introduction is given in the following sections.
-On the next page, a detailed tour through StreamPipes explains all the different features that are available.
-
-
-## Data Streams
-Data Streams represent the primary source for data in StreamPipes.
-A stream is an ordered sequence of events, where an event is described as one or more observation values.
-Those events can come from different sources like sensors, machines, log files, and many more.
-It does not matter what kind of serialization format the events have or which kind of transportation protocol the individual data streams use.
-As long as a semantic description is provided, StreamPipes is capable of processing the data.
-
-
-## Processing Elements
-Processing Elements are defined as processors that transform one or more input event streams into an output event stream.
-Those transformations can be rather simple, like filtering out events based on a predefined rule, or more complex, like applying algorithms to the data.
-Processing elements define stream requirements that are a set of minimum properties an incoming event stream must provide.
-Furthermore, Processing Elements describe their output based on a set of output strategies.
-They also describe further (human) input in the form of configuration parameters.
-The Processing Elements can be implemented in multiple technologies.
-This information is not necessary when constructing a pipeline; the user does not need to know where and how the actual algorithm is deployed and executed.
-During the modelling phase it is possible to set configuration parameters, which are then injected into the program when it is started.
-A description is provided for all parameters, and the system ensures that the user can only enter semantically correct values.
-
-
-## Data Sinks
-Data Sinks consume event streams similar to processing elements, with the difference that sinks do not provide an output stream, i.e., they perform some action or trigger a visualization as a result of a stream transformation.
-The sinks also define stream requirements that must be fulfilled.
-In a pipeline it is not necessary to use a processing element to transform data.
-Often it can make sense to just use a data sink and connect it directly to the sensor to store the raw data in a data store for offline analysis.
-This is very simple with StreamPipes, and no additional code needs to be written to create such a data lake.
-
-
-## Target Audience
-StreamPipes focuses on multiple target groups.
-This guide is for users who interact with the graphical user interface in the browser.
-If you are interested in the technical details or plan to extend the system with new algorithms, please read the Developer Guide.
-The graphical user interface is designed for domain experts who want to analyze data, but are not interested in technical details and do not want to write code.
-The SDK can be used by software developers to extend the framework with new functionality.
-After importing newly developed pipeline elements, they are available to all users of StreamPipes.
-
-
-## Next Steps
-To test StreamPipes on your local environment, go to the [installation guide](user-guide-installation.md).
-If you are further interested in the concepts of StreamPipes continue with the [tour](user-guide-tour.md).
diff --git a/website-v2/versioned_docs/version-0.92.0/user-guide-software-components.md b/website-v2/versioned_docs/version-0.92.0/user-guide-software-components.md
deleted file mode 100644
index cbcd7a3..0000000
--- a/website-v2/versioned_docs/version-0.92.0/user-guide-software-components.md
+++ /dev/null
@@ -1,334 +0,0 @@
----
-id: user-guide-software-components
-title: Software Components
-sidebar_label: Software Components
----
-
-This page contains all the software components that can be used within the StreamPipes framework.
-Some of them are mandatory, while others are only necessary for special capabilities.
-In the [Installation Guide](user-guide-installation.md#installation_1) we already provide a docker-compose.yml file with all the necessary components
-for a minimal setup.
-Extend this configuration file with further containers described on this page and configure StreamPipes
-according to your needs.
-
-
-## StreamPipes Framework
-
-<details class="tip">
-<summary>StreamPipes Backend</summary>
-
-#### Description
-The StreamPipes Backend is the main component of the StreamPipes Framework. It contains the application logic to create and execute pipelines.
-Furthermore, it provides a REST-API that is used by other components for communication.
-
-#### Docker Compose
-```yaml
-backend:
-  image: streampipes/backend
-  depends_on:
-    - "consul"
-  ports:
-    - "8030:8030"
-  volumes:
-    - ./config:/root/.streampipes
-    - ./config/aduna:/root/.aduna
-  networks:
-    spnet:
-```
-</details>
-
-
-<details class="tip">
-<summary>StreamPipes UI</summary>
-
-#### Description
-This service uses nginx and contains the UI of StreamPipes.
-The UI can, for example, be used to import new pipeline elements, create new pipelines and manage the pipeline
-execution. The UI communicates with the backend via the REST interface.
-
-#### Docker Compose
-```yaml
-nginx:
-  image: streampipes/ui
-  ports:
-    - "80:80"
-  depends_on:
-    - backend
-  networks:
-    spnet:
-```
-</details>
-
-## StreamPipes Services
-
-<details class="tip">
-<summary>Consul</summary>
-#### Description
-Consul is used to store configuration parameters of the backend service and processing elements.
-It is further used for service discovery. Once a processing element container is started in the network, it is
-automatically discovered via the service discovery feature of Consul.
-
-#### Docker Compose
-```yaml
-consul:
-    image: consul
-    environment:
-      - "CONSUL_LOCAL_CONFIG={'{\"disable_update_check\": true}'}"
-      - "CONSUL_BIND_INTERFACE=eth0"
-      - "CONSUL_HTTP_ADDR=0.0.0.0"
-    entrypoint:
-      - consul
-      - agent
-      - -server
-      - -bootstrap-expect=1
-      - -data-dir=/consul/data
-      - -node=consul-one
-      - -bind={'{{ GetInterfaceIP "eth0" }}'}
-      - -client=0.0.0.0
-      - -enable-script-checks=true
-      - -ui
-    volumes:
-      - ./config/consul:/consul/data
-    ports:
-      - "8500:8500"
-      - "8600:8600"
-    networks:
-      spnet:
-        ipv4_address: 172.30.0.9
-```
-</details>
-
-<details class="tip">
-<summary>Zookeeper</summary>
-#### Description
-Apache Kafka and Apache Flink require ZooKeeper to manage their clusters.
-
-#### Docker Compose
-```yaml
-zookeeper:
-    image: wurstmeister/zookeeper
-    ports:
-      - "2181:2181"
-    networks:
-      spnet:
-```
-</details>
-
-<details class="tip">
-<summary>Kafka</summary>
-
-#### Description
-Kafka is used as the primary message broker. It is possible to use other brokers or even multiple message brokers in a single pipeline, but Kafka is the
-default. The communication between the processing elements in a pipeline is mostly done via Kafka.
-
-#### Docker Compose
-```yaml
-  kafka:
-    image: wurstmeister/kafka:0.10.0.1
-    ports:
-      - "9092:9092"
-    environment:
-      KAFKA_ADVERTISED_HOST_NAME: ###TODO ADD HOSTNAME HERE ###
-      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
-    volumes:
-      - /var/run/docker.sock:/var/run/docker.sock
-    networks:
-      spnet:
-```
-</details>
-
-<details class="tip">
-<summary>ActiveMQ</summary>
-#### Description
-ActiveMQ is another message broker which can be used in addition to Kafka. Currently, the main purpose is to provide
-an endpoint for the websocket connections required by the real-time dashboard of the StreamPipes UI.
-
-#### Docker Compose
-```yaml
-activemq:
-  image: streampipes/activemq
-  ports:
-    - "61616:61616"
-    - "61614:61614"
-    - "8161:8161"
-  networks:
-    spnet:
-```
-</details>
-
-<details class="tip">
-<summary>CouchDB</summary>
-
-#### Description
-CouchDB is the main database for StreamPipes data that needs to be persisted such as pipelines, users and visualizations created in the dashboard.
-
-#### Docker Compose
-```yaml
-couchdb:
-  image: couchdb
-  ports:
-    - "5984:5984"
-  volumes:
-    - ./config/couchdb/data:/usr/local/var/lib/couchdb
-  networks:
-    spnet:
-```
-</details>
-
-<details class="tip">
-<summary>Flink</summary>
-#### Description
-This service sets up a sample Flink cluster with one jobmanager and one taskmanager. Although this cluster can be used for testing, it is not recommended for production use.
-
-#### Docker Compose
-```yaml
-jobmanager:
-  image: streampipes/flink
-  ports:
-    - "8081:8099"
-  command: jobmanager
-  networks:
-    spnet:
-
-
-taskmanager:
-  image: ipe-wim-gitlab.fzi.de:5000/streampipes/services/flink
-  command: taskmanager
-  environment:
-    - FLINK_NUM_SLOTS=20
-  networks:
-    spnet:
-```
-</details>
-
-
-## Processing Elements
-
-<details class="tip">
-<summary>PE Examples Sources</summary>
-#### Description
-This Processing Element Container contains several sample data sources that can be used to work with StreamPipes.
-It consists of source descriptions and data simulators that constantly produce data.
-
-#### Docker Compose
-```yaml
-    pe-examples-sources:
-      image: streampipes/pe-examples-sources
-      depends_on:
-        - "consul"
-      ports:
-        - "8098:8090"
-      networks:
-        spnet:
-```
-</details>
-
-<details class="tip">
-<summary>PE Examples JVM</summary>
-
-#### Description
-This Processing Element Container contains some sink example implementations, such as the real-time
-dashboard. This can be used to visualize data within StreamPipes.
-
-#### Docker Compose
-```yaml
-      pe-examples-jvm:
-        image: streampipes/pe-examples-jvm
-        depends_on:
-          - "consul"
-        environment:
-          - STREAMPIPES_HOST=###TODO ADD HOSTNAME HERE ###
-        ports:
-          - "8096:8090"
-        networks:
-          spnet:
-```
-</details>
-
-<details class="tip">
-<summary>PE Examples Flink</summary>
-
-#### Description
-The Flink Samples Processing Element Container contains some example algorithms that can be used within processing
-pipelines in the pipeline editor. Those algorithms are deployed to a Flink cluster once the pipeline is started.
-
-#### Docker Compose
-```yaml
-  pe-flink-samples:
-    image: streampipes/pe-examples-flink
-    depends_on:
-      - "consul"
-    ports:
-      - "8094:8090"
-    volumes:
-      - ./config:/root/.streampipes
-    networks:
-      spnet:
-```
-</details>
-
-### Third Party Services
-
-<details class="tip">
-<summary>Elasticsearch</summary>
-
-#### Description
-This service can be used to run Elasticsearch. Data can be written into Elasticsearch with the Elasticsearch
-sink of the PE Flink samples container.
-
-#### Docker Compose
-```yaml
-elasticsearch:
-  image: ipe-wim-gitlab.fzi.de:5000/streampipes/services/elasticsearch
-  ports:
-    - "9200:9200"
-    - "9300:9300"
-  volumes:
-    - ./config/elasticsearch/data:/usr/share/elasticsearch/data
-  networks:
-    spnet:
-```
-</details>
-
-<details class="tip">
-<summary>Kibana</summary>
-#### Description
-Kibana is used to visualize data that is written into Elasticsearch. It can be used in addition to our live dashboard
-to analyse and visualize historic data.
-
-#### Docker Compose
-```yaml
-kibana:
-  image: kibana:5.2.2
-  ports:
-    - "5601:5601"
-  volumes:
-    - ./config/kibana/kibana.yml:/opt/kibana/config/kibana.yml
-  environment:
-    - ELASTICSEARCH_URL=http://elasticsearch:9200
-  networks:
-    spnet:
-```
-</details>
-
-
-<details class="tip">
-<summary>Kafka Web Console</summary>
-
-#### Description
-The Kafka web console can be used to monitor the Kafka cluster. This is a good tool for debugging your newly
-developed pipeline elements.
-
-#### Docker Compose
-```yaml
-kafka-web-console:
-  image: hwestphal/kafka-web-console
-  ports:
-    - "9000:9000"
-  volumes:
-    - ./config:/data
-  networks:
-    spnet:
-```
-</details>