Licensed to the Apache Software Foundation (ASF) under one or more contributor license agreements. See the NOTICE file distributed with this work for additional information regarding copyright ownership. The ASF licenses this file to You under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
The Docker Compose definition and wrapper script that create a Bigtop virtual Hadoop cluster on top of Docker containers for you, by pulling from existing published Bigtop repositories. This cluster can be used to run Bigtop smoke tests and to test Bigtop Puppet provisioning.
This has been verified on Docker Engine 1.9.1 (API version 1.15) and Docker Compose 1.5.2 on the Amazon Linux 2015.09 release.
Install Docker
Install Docker Compose
Install Ruby
Start the Docker daemon
service docker start
Create a 3-node Docker based Bigtop Hadoop cluster:

./docker-hadoop.sh --create 3
Destroy the cluster:

./docker-hadoop.sh --destroy
Execute a command on a specific instance, specified by number or name:

./docker-hadoop.sh --exec 1 bash
./docker-hadoop.sh --exec 2 hadoop fs -ls /
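Since --exec targets one instance at a time, a small loop can cover every node. A minimal sketch (the `echo` makes this a dry run that only prints each invocation; drop it to actually run the command against a live cluster):

```shell
# Print the --exec invocation for each node of a 3-node cluster.
# Remove `echo` to really execute the command on every instance.
for i in 1 2 3; do
  echo ./docker-hadoop.sh --exec "$i" hadoop fs -ls /
done
```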
Deploy configuration changes:

./docker-hadoop.sh --provision
Run Bigtop smoke tests:

./docker-hadoop.sh --smoke-tests
./docker-hadoop.sh --create 5 --smoke-tests --destroy
Commands will be executed in the following order:
create 5 node cluster => run smoke tests => destroy the cluster
./docker-hadoop.sh -h
usage: docker-hadoop.sh [-C file ] args
  -C file                                   Use alternate file for config.yaml
commands:
  -c NUM_INSTANCES, --create NUM_INSTANCES  Create a Docker based Bigtop Hadoop cluster
  -d, --destroy                             Destroy the cluster
  -e, --exec INSTANCE_NO|INSTANCE_NAME      Execute command on a specific instance.
                                            Instance can be specified by name or number.
                                            For example:
                                              docker-hadoop.sh --exec 1 bash
                                              docker-hadoop.sh --exec docker_bigtop_1 bash
  -E, --env-check                           Check whether required tools has been installed
  -l, --list                                List out container status for the cluster
  -p, --provision                           Deploy configuration changes
  -s, --smoke-tests                         Run Bigtop smoke tests
  -h, --help
docker:
        memory_limit: "2g"
If you've built packages using a locally cloned Bigtop and produced the apt/yum repo, set the following to true to deploy those packages:
enable_local_repo: true
components: "hadoop, hbase, yarn,..."
By default, Apache Hadoop and YARN will be installed.
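Putting the options above together, a config.yaml might look like the following sketch (the values shown are illustrative; the exact set of supported keys is defined by the provisioner's own config.yaml):

```yaml
docker:
        memory_limit: "2g"

# deploy packages from a locally built apt/yum repo instead of published repos
enable_local_repo: true

# components to install; Hadoop and YARN are installed by default
components: "hadoop, yarn"
```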