Ozone is an Apache project. The bug tracking system for Ozone is under the Apache Jira project named HDDS.
This document summarizes the contribution process:
We welcome contributions of:
(e.g. with `ozone freon`), which can be used to test clusters and report problems.
If you have any questions, please don't hesitate to contact us.
Requirements to compile the code:
(Standard development tools such as make, gcc, etc. are required.)
After installing the requirements (especially Maven) build is as simple as:
mvn clean verify -DskipTests
Useful build options:
- `-DskipShade` to skip shaded Ozone FS jar file creation. Saves time, but you can't test integration with other software that uses Ozone as a Hadoop-compatible file system.
- `-DskipRecon` to skip building the Recon Web UI. It saves about 2 minutes.
- `-Pdist` to build the binary tarball, similar to the one that gets released.
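The optional flags can be combined. As a sketch (which flags to pick is situational, and `MVN_FLAGS` is just an illustrative helper variable, not an Ozone convention), a faster local build might look like:

```shell
# Combine the optional flags for a faster local build.
MVN_FLAGS="-DskipTests -DskipShade -DskipRecon"
# 'echo' only shows the resulting command; remove it to actually run the build.
echo mvn clean verify $MVN_FLAGS
# prints: mvn clean verify -DskipTests -DskipShade -DskipRecon
```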
Additional requirements for running Ozone in pseudo cluster (including acceptance tests):
After building Ozone locally, you can start your first pseudo cluster:
cd hadoop-ozone/dist/target/ozone-*-SNAPSHOT/compose/ozone
OZONE_REPLICATION_FACTOR=3 ./run.sh -d
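The `OZONE_REPLICATION_FACTOR=3` prefix sets an environment variable only for that one command. A minimal stand-alone sketch of the mechanism, using a throwaway `demo-run.sh` as a stand-in for the real `run.sh`:

```shell
# Stand-in for compose/ozone/run.sh: reads the variable, defaulting to 1.
demo=$(mktemp)
cat > "$demo" <<'EOF'
#!/bin/sh
echo "replication factor: ${OZONE_REPLICATION_FACTOR:-1}"
EOF
chmod +x "$demo"
"$demo"                              # prints: replication factor: 1
OZONE_REPLICATION_FACTOR=3 "$demo"   # prints: replication factor: 3
```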
We use GitHub pull requests for contributing changes to the repository. The main workflow is as follows:
Fork the `apache/ozone` repository (first time) and clone it to your local machine
Enable the `build-branch` GitHub Actions workflow (defined in `.github/workflows/post-commit.yml`) in your fork
Create a local branch for your contribution (e.g. `git checkout -b HDDS-1234`)
Wait for the `build-branch` workflow to complete successfully for your commit.
If you need to update your branch with the latest changes from the base branch (usually `master`), e.g. to resolve conflicts, please do so by merge, not rebase: `git merge --no-edit origin/master`.
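The branch-and-update flow can be sketched end-to-end in a throwaway repository (the branch name `HDDS-1234` and the commit messages are illustrative; in real work the merged-in branch would be `origin/master` rather than a local one):

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email dev@example.com
git config user.name Dev
git commit -q --allow-empty -m "initial commit"
base=$(git symbolic-ref --short HEAD)    # master or main, depending on git version
git checkout -q -b HDDS-1234             # topic branch named after the Jira issue
git commit -q --allow-empty -m "HDDS-1234. Example change"
git checkout -q "$base"
git commit -q --allow-empty -m "newer upstream commit"
git checkout -q HDDS-1234
git merge --no-edit -q "$base"           # update by merge, not rebase
git log --oneline --merges               # shows the single merge commit
```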
Basic code conventions followed by Ozone:
No `@author` tags; authorship is indicated by Git history
These are checked by tools like Checkstyle and RAT.
The `hadoop-ozone/dev-support/checks` directory contains scripts to build and check Ozone. Most of these are executed by CI for every commit and pull request. Running them before creating a pull request is strongly recommended. This can be achieved by enabling the `build-branch` workflow in your fork and letting GitHub run all of the checks, but most of the checks can also be run locally.
- `build.sh`: compiles Ozone
- `author.sh`: checks for `@author` tags
- `bats.sh`: unit tests for shell scripts
- `rat.sh`: checks for the Apache license header
- `docs.sh`: sanity checks for Ozone documentation
- `dependency.sh`: compares the list of jars in the build output with a known list
- `kubernetes.sh`: very limited set of tests run in a Kubernetes environment
- `unit.sh`: pure unit tests
- `integration.sh`: Java-based tests using a single-JVM “mini cluster”
- `acceptance.sh`: rather complete set of tests in a Docker Compose-based environment
The set of tests run by `acceptance.sh` may be limited via arguments; please see the scripts for details. CI uses this to run them in multiple splits to avoid taking too much time.
Some scripts require third-party tools, but most of these are installed during the first run, if needed.
Most scripts (except `build.sh`) write their results to an output directory.
If you have very good reasons, you can ignore any Findbugs warning. Your good reason can be persisted with the `@SuppressFBWarnings` annotation:
@SuppressFBWarnings(value = "AT_OPERATION_SEQUENCE_ON_CONCURRENT_ABSTRACTION",
    justification = "The method is synchronized and this is the only place "
        + "dnsToUuidMap is modified")
private synchronized void addEntryTodnsToUuidMap( ...
As Ozone uses Apache Maven, it can be developed from any IDE. IntelliJ IDEA is a common choice; here are some suggestions for using it for Ozone development.
Ozone components depend on the Maven classpath. We generate classpath descriptors from the Maven `pom.xml` files to use exactly the same classpath at runtime.
As a result, it's easy to start all the components from the IDE, as the right classpath (without the `provided` scope) has already been set.
To start Ozone from IntelliJ:
You can use the installed Run configurations in the following order:
Checkstyle plugin may help to detect violations directly from the IDE.
- Open the Checkstyle plugin settings and Add (`+`) a new configuration.
- Check `pom.xml` for the current version of the used checkstyle and use the same version with the plugin.
- Select the `Ozone` rule and execute the check.
IntelliJ may not pick up protoc-generated classes, as they can be very large. If the protoc-generated files can't be compiled, try the following:
Sometimes during an incremental build, IDEA encounters the following error:
bad class file: hadoop-hdds/common/target/classes/org/apache/hadoop/ozone/common/ChunkBufferImplWithByteBufferList$1.class
Usually this can be fixed by removing the class file (outside of the IDE), but sometimes only by a full Rebuild.
The Ozone project uses GitHub Actions for its CI system. The configuration is described in detail here.