Merge branch 'livedoc' into develop

# Conflicts:
#	README.md
diff --git a/docs/manual/source/install/launch-aws.html.md.erb b/docs/manual/source/archived/launch-aws.html.md.erb
similarity index 100%
rename from docs/manual/source/install/launch-aws.html.md.erb
rename to docs/manual/source/archived/launch-aws.html.md.erb
diff --git a/docs/manual/source/community/contribute-documentation.html.md b/docs/manual/source/community/contribute-documentation.html.md
index 1645066..e4b1d55 100644
--- a/docs/manual/source/community/contribute-documentation.html.md
+++ b/docs/manual/source/community/contribute-documentation.html.md
@@ -196,7 +196,7 @@
 
 ```
 $ git remote -v
-$ git remote add apache https://git-wip-us.apache.org/repos/asf/predictionio.git
+$ git remote add apache https://gitbox.apache.org/repos/asf/predictionio.git
 ```
 
 Then, push the `livedoc` branch. (It will be published and synced with the public GitHub mirror):
diff --git a/docs/manual/source/index.html.md.erb b/docs/manual/source/index.html.md.erb
index 4289278..7c74901 100644
--- a/docs/manual/source/index.html.md.erb
+++ b/docs/manual/source/index.html.md.erb
@@ -49,9 +49,9 @@
 | --------------- | ---------------- | ------------------------------------ | ----------------- |
 | [Quick Intro](/start/) | [System Architecture](/system/) | [Demo: Recommending Comics](/demo/tapster/) | [Java](/sdk/java/) |
 | [Installation Guide](/install/) | [Event Server Overview](/datacollection/) | [Text Classification](/demo/textclassification/) | [PHP](/sdk/php/) |
-| [Downloading Template](/start/download/) | [Collecting Data](/datacollection/eventapi/) | [Community Contributed Demo](/demo/community/) | [Python](/sdk/python/) |
+| [Downloading Template](/start/download/) | [Collecting Data](/datacollection/eventapi/) | [Community Contributed Demo](/community/projects.html#demos) | [Python](/sdk/python/) |
 | [Deploying an Engine](/start/deploy/) | [Learning DASE](/customize/) |[Dimensionality Reduction](/machinelearning/dimensionalityreduction/)| [Ruby](/sdk/ruby/) |
-| [Customizing an Engine](/start/customize/) | [Implementing DASE](/customize/dase/) ||[Community Contributed](/sdk/community/) |
+| [Customizing an Engine](/start/customize/) | [Implementing DASE](/customize/dase/) ||[Community Contributed](/community/projects.html#sdks) |
 | [App Integration Overview](/appintegration/) | [Evaluation Overview](/evaluation/) |||
 || [Intellij IDEA Guide](/resources/intellij/) |||
 || [Scala API](/api/current/#package) |||
diff --git a/docs/manual/source/install/index.html.md.erb b/docs/manual/source/install/index.html.md.erb
index 5db10b2..7da2a94 100644
--- a/docs/manual/source/install/index.html.md.erb
+++ b/docs/manual/source/install/index.html.md.erb
@@ -55,11 +55,9 @@
 
 * [Installing Apache PredictionIO](install-sourcecode.html)
 
-You may also use one of the community-contributed packages to install
-Apache PredictionIO®:
+You may also use Docker to install Apache PredictionIO®:
 
-* [Installing Apache PredictionIO with
-  Docker](/community/projects.html#docker-images)
+* [Installing Apache PredictionIO with Docker](install-docker.html)
 
 
 [//]: # (* *(coming soon)* Installing Apache PredictionIO with Homebrew)
diff --git a/docs/manual/source/install/install-docker.html.md.erb b/docs/manual/source/install/install-docker.html.md.erb
new file mode 100644
index 0000000..44435af
--- /dev/null
+++ b/docs/manual/source/install/install-docker.html.md.erb
@@ -0,0 +1,127 @@
+---
+title: Installing Apache PredictionIO® with Docker
+---
+
+<!--
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+    http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+-->
+
+## Download and Start Docker
+
+Docker is a widely used container solution. Please install and start Docker by following the official [guide](https://www.docker.com/get-started).
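+
+Once Docker is installed, you can confirm that it is running before proceeding, for example:
+
+```bash
+docker --version
+docker info
+```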
+
+## Get PredictionIO and Dependencies Configuration
+
+Starting from v0.13.0, Apache PredictionIO® provides Docker support for production environments. The `Dockerfile` and dependency configurations can be found in the `docker` folder of the [git repository](https://github.com/apache/predictionio/tree/develop/docker).
+
+```bash
+git clone https://github.com/apache/predictionio.git
+cd predictionio/docker
+```
+
+INFO: This installation only requires the `docker` sub-directory of the repository. You can also fetch just that folder without cloning the whole project.
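+
+For example, a shallow sparse checkout fetches only that folder (assuming Git 2.25 or later for `git sparse-checkout`):
+
+```bash
+git clone --depth 1 --filter=blob:none --sparse https://github.com/apache/predictionio.git
+cd predictionio
+git sparse-checkout set docker
+cd docker
+```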
+
+## Build Docker Image
+
+To build the PredictionIO Docker image, a `Dockerfile` is provided in the `pio` sub-directory:
+
+```
+docker build -t predictionio/pio pio
+```
+
+The above command builds an image tagged `predictionio/pio:latest`.
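+
+You can confirm that the image was built, for example:
+
+```bash
+docker images predictionio/pio
+```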
+
+WARNING: You can also get the PredictionIO image from Docker Hub via `docker pull predictionio/pio`. However, since the image cannot run without properly configured storage, please follow the steps below to complete the installation.
+
+WARNING: The `predictionio/pio` image hosted on Docker Hub is **NOT** an official ASF release and may provide a PredictionIO version different from the one you want. It is recommended to build the image locally rather than pulling it directly from Docker Hub.
+
+## Pull Images and Start
+
+This repository provides configurations for PostgreSQL, MySQL, Elasticsearch, and the local file system as storage backends.
+
+### Supported Storage Backends
+
+ - Event Storage: PostgreSQL, MySQL, Elasticsearch
+ - Metadata Storage: PostgreSQL, MySQL, Elasticsearch
+ - Model Storage: PostgreSQL, MySQL, LocalFS
+
+You can use `docker-compose` with the appropriate `-f` options to pull and start the corresponding services. More details are provided in [this document](https://github.com/apache/predictionio/blob/develop/docker/README.md#run-predictionio-with-selectable-docker-compose-files).
+
+### Service Starting Sample
+
+```
+docker-compose -f docker-compose.yml \
+    -f pgsql/docker-compose.base.yml \
+    -f pgsql/docker-compose.meta.yml \
+    -f pgsql/docker-compose.event.yml \
+    -f pgsql/docker-compose.model.yml \
+    up
+```
+
+In this example, we pull and start the `predictionio/pio` image with `docker-compose.yml`, pull the `postgres:9` image with `pgsql/docker-compose.base.yml`, and configure PostgreSQL to store our metadata, events, and models with `pgsql/docker-compose.meta.yml`, `pgsql/docker-compose.event.yml`, and `pgsql/docker-compose.model.yml`.
+
+After pulling the images, Docker Compose starts PostgreSQL, Apache PredictionIO, and Apache Spark. The Event Server should be ready on port `7070`, and you should see logs like the following in the command line interface:
+
+```
+...
+pio_1       | [INFO] [Management$] Your system is all ready to go.
+pio_1       | [INFO] [Management$] Creating Event Server at 0.0.0.0:7070
+pio_1       | [INFO] [HttpListener] Bound to /0.0.0.0:7070
+pio_1       | [INFO] [EventServerActor] Bound received. EventServer is ready.
+```
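+
+To use other backends, swap in the corresponding override files. For example, a sketch that keeps metadata and events in Elasticsearch and models on the local file system (the file names here are assumed to follow the same `<backend>/docker-compose.<role>.yml` pattern as the PostgreSQL sample above; see the linked README for the exact files provided):
+
+```
+docker-compose -f docker-compose.yml \
+    -f elasticsearch/docker-compose.base.yml \
+    -f elasticsearch/docker-compose.meta.yml \
+    -f elasticsearch/docker-compose.event.yml \
+    -f localfs/docker-compose.model.yml \
+    up
+```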
+
+## Verifying Service
+
+A command-line tool `pio-docker` is provided to invoke the `pio` command inside the PredictionIO container. Add the directory containing `pio-docker` to your execution path and run `status` to check the current PredictionIO service:
+
+```bash
+$ export PATH=`pwd`/bin:$PATH
+$ pio-docker status
+```
+
+You should see output similar to the following, which means your system is ready to go:
+
+```
+[INFO] [Management$] Inspecting PredictionIO...
+[INFO] [Management$] PredictionIO 0.13.0 is installed at /usr/share/predictionio
+[INFO] [Management$] Inspecting Apache Spark...
+[INFO] [Management$] Apache Spark is installed at /usr/share/spark-2.2.2-bin-hadoop2.7
+[INFO] [Management$] Apache Spark 2.2.2 detected (meets minimum requirement of 1.3.0)
+[INFO] [Management$] Inspecting storage backend connections...
+[INFO] [Storage$] Verifying Meta Data Backend (Source: PGSQL)...
+[INFO] [Storage$] Verifying Model Data Backend (Source: PGSQL)...
+[INFO] [Storage$] Verifying Event Data Backend (Source: PGSQL)...
+[INFO] [Storage$] Test writing to Event Store (App Id 0)...
+[INFO] [Management$] Your system is all ready to go.
+```
+
+INFO: After the service is up, you can continue with further deployment by substituting `pio-docker` for `pio`. More details are provided in [this document](https://github.com/apache/predictionio/tree/develop/docker#tutorial).
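+
+For example, a sketch of the usual engine workflow with `pio-docker` (the app name is hypothetical; run the commands from an engine template directory that is visible to the container, as described in the linked tutorial):
+
+```bash
+pio-docker app new MyTestApp   # hypothetical app name
+pio-docker build --verbose
+pio-docker train
+pio-docker deploy
+```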
+
+## Community Docker Support
+
+[More PredictionIO Docker packages supported by our great community](/community/projects.html#docker-images).
diff --git a/docs/manual/source/resources/upgrade.html.md b/docs/manual/source/resources/upgrade.html.md
index 7c79f2b..2cffc75 100644
--- a/docs/manual/source/resources/upgrade.html.md
+++ b/docs/manual/source/resources/upgrade.html.md
@@ -219,7 +219,7 @@
 - remove `import org.apache.predictionio.data.storage.Storage` and replace it by `import org.apache.predictionio.data.store.LEventStore`
 - change `appId` to `appName` in the XXXAlgorithmParams class.
 - remove this line of code: `@transient lazy val lEventsDb = Storage.getLEvents()`
-- locate where `LEventStore.findByEntity()` is used, change it to `LEventStore.findByEntity()`:
+- locate where `lEventsDb.findSingleEntity()` is used, change it to `LEventStore.findByEntity()`:
 
     For example, change following code
 
diff --git a/docs/manual/source/start/index.html.md b/docs/manual/source/start/index.html.md
index 44b7858..9c40e06 100644
--- a/docs/manual/source/start/index.html.md
+++ b/docs/manual/source/start/index.html.md
@@ -21,7 +21,7 @@
 
 ## Overview
 
-PredictionIO consist of the following components:
+PredictionIO consists of the following components:
 
 * **PredictionIO platform** - our open source machine learning stack for building, evaluating and deploying engines with machine learning algorithms.
 * **Event Server** - our open source machine learning analytics layer for unifying events from multiple platforms
diff --git a/docs/manual/source/templates/recommendation/quickstart.html.md.erb b/docs/manual/source/templates/recommendation/quickstart.html.md.erb
index b5dae35..8ad2ff3 100644
--- a/docs/manual/source/templates/recommendation/quickstart.html.md.erb
+++ b/docs/manual/source/templates/recommendation/quickstart.html.md.erb
@@ -37,12 +37,12 @@
 - user 'rate' item events
 - user 'buy' item events
 
-NOTE: You can customize to use other event.
+NOTE: You can customize this engine to use other events.
 
 ### Input Query
 
 - user ID
-- num of recommended items
+- number of recommended items
 
 ### Output PredictedResult
 
@@ -67,7 +67,7 @@
 the Recommendation Engine Template supports 2 types of events: **rate** and
 **buy**. A user can give a rating score to an item or buy an item. This template requires user-view-item and user-buy-item events.
 
-INFO: This template can easily be customized to consider more user events such as *like*, *dislike* etc.
+INFO: This template can easily be customized to consider more user events such as *like*, *dislike*, etc.
 
 <%= partial 'shared/quickstart/collect_data' %>
 
@@ -258,8 +258,8 @@
 
 <%= partial 'shared/quickstart/import_sample_data' %>
 
-A Python import script `import_eventserver.py` is provided in the template to import the data to
-Event Server using Python SDK. Please upgrade to the latest Python SDK.
+A Python import script `import_eventserver.py` is provided in the template to import the data to the
+Event Server using the Python SDK. Please upgrade to the latest Python SDK.
 
 <%= partial 'shared/quickstart/install_python_sdk' %>
 
@@ -284,7 +284,7 @@
 
 <%= partial 'shared/quickstart/query_eventserver_short' %>
 
-INFO: By default, the template train the model with "rate" events (explicit rating). You can customize the engine to [read other custom events](/templates/recommendation/reading-custom-events/) and [handle events of implicit preference (such as, view, buy)](/templates/recommendation/training-with-implicit-preference/)
+INFO: By default, the template trains the model with "rate" events (explicit rating). You can customize the engine to [read other custom events](/templates/recommendation/reading-custom-events/) and [handle events of implicit preference (such as view or buy)](/templates/recommendation/training-with-implicit-preference/).
 
 ## 5. Deploy the Engine as a Service
 
@@ -294,12 +294,12 @@
 
 ## 6. Use the Engine
 
-Now, You can try to retrieve predicted results. To recommend 4 movies to user
+Now, you can try to retrieve predicted results. To recommend 4 movies to a user
 whose id is 1, you send this JSON `{ "user": "1", "num": 4 }` to the deployed
-engine and it will return a JSON of the recommended movies. Simply send a query
-by making a HTTP request or through the `EngineClient` of an SDK.
+engine and it will return a JSON result of the recommended movies. Simply send a query
+by making an HTTP request or through the `EngineClient` of an SDK.
 
-With the deployed engine running, open another terminal and run the following `curl` command or use SDK to send the query:
+With the deployed engine running, open another terminal and run the following `curl` command or use an SDK to send the query:
 
 <div class="tabs">
   <div data-tab="REST API" data-lang="json">