---
layout: global
title: Troubleshooting Guide
description: Troubleshooting Guide
---

* This will become a table of contents (this text will be scraped).
{:toc}

## ClassNotFoundException for commons-math3

SystemML uses the Apache Commons Math library. The commons-math3 dependency is included with Spark and with newer versions of Hadoop. Running SystemML on an older Hadoop cluster can generate an error such as the following due to the missing commons-math3 dependency:

java.lang.ClassNotFoundException: org.apache.commons.math3.linear.RealMatrix

This issue can be fixed by changing the `commons-math3` scope in the `pom.xml` file from `provided` to `compile`.
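As a sketch, the updated dependency entry would look like the following (the version number is illustrative; keep the version already declared in the pom):

```xml
<!-- In pom.xml: commons-math3 with its scope changed from "provided" to "compile".
     The version below is illustrative; keep the version already declared in the pom. -->
<dependency>
  <groupId>org.apache.commons</groupId>
  <artifactId>commons-math3</artifactId>
  <version>3.1.1</version>
  <scope>compile</scope> <!-- was: provided -->
</dependency>
```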


SystemML can then be rebuilt with the commons-math3 dependency using Maven (`mvn clean package -P distribution`).

## OutOfMemoryError in Hadoop Reduce Phase

In Hadoop MapReduce, outputs from mapper nodes are copied to reducer nodes and then sorted (the shuffle phase) before being consumed by reducers. The shuffle phase uses several buffers that share memory space with other MapReduce tasks, so an `OutOfMemoryError` is thrown if the shuffle buffers take up too much space:

Error: java.lang.OutOfMemoryError: Java heap space
    at org.apache.hadoop.mapred.IFile$Reader.readNextBlock(
    at org.apache.hadoop.mapred.IFile$
    at org.apache.hadoop.mapred.Merger$
    at org.apache.hadoop.mapred.Merger$MergeQueue.adjustPriorityQueue(
    at org.apache.hadoop.mapred.Merger$
    at org.apache.hadoop.mapred.Merger.writeFile(

One way to fix this issue is to lower the following buffer thresholds:

mapred.job.shuffle.input.buffer.percent # default 0.70; try 0.20 
mapred.job.shuffle.merge.percent # default 0.66; try 0.20
mapred.job.reduce.input.buffer.percent # default 0.0; keep 0.0

These settings can be modified globally by inserting or modifying the corresponding property entries in `mapred-site.xml`.
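For example, the entries in `mapred-site.xml` might look like the following (standard Hadoop property syntax, using the lowered values suggested above):

```xml
<!-- Illustrative mapred-site.xml entries lowering the shuffle buffer thresholds -->
<property>
  <name>mapred.job.shuffle.input.buffer.percent</name>
  <value>0.2</value>
</property>
<property>
  <name>mapred.job.shuffle.merge.percent</name>
  <value>0.2</value>
</property>
<property>
  <name>mapred.job.reduce.input.buffer.percent</name>
  <value>0.0</value>
</property>
```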


They can also be configured on a per-SystemML-task basis by adding the corresponding entries to `SystemML-config.xml`.
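As a sketch, assuming SystemML forwards configuration entries whose element names match the Hadoop property names, the `SystemML-config.xml` entries would look like the following (placed inside the file's existing top-level element):

```xml
<!-- Illustrative SystemML-config.xml entries; element names mirror the Hadoop
     property names and are assumed to be forwarded to the MapReduce job configuration -->
<mapred.job.shuffle.input.buffer.percent>0.2</mapred.job.shuffle.input.buffer.percent>
<mapred.job.shuffle.merge.percent>0.2</mapred.job.shuffle.merge.percent>
<mapred.job.reduce.input.buffer.percent>0.0</mapred.job.reduce.input.buffer.percent>
```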


Note: The default `SystemML-config.xml` is located in `<path to SystemML root>/conf/`. It is passed to SystemML using the `-config` argument:

hadoop jar SystemML.jar [-? | -help | -f <filename>] (-config=<config_filename>) ([-args | -nvargs] <args-list>)
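For instance, a hypothetical invocation with a custom configuration file and named arguments (the script, data, and config file names below are illustrative, not part of the original guide) might be:

```bash
# Hypothetical invocation: run a DML script with a custom config file
# and named arguments. All file names here are illustrative.
hadoop jar SystemML.jar -f LinearRegression.dml \
    -config=/path/to/SystemML-config.xml \
    -nvargs X=train.csv Y=labels.csv
```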

See Invoking SystemML in Hadoop Batch Mode for details of the syntax.