---
layout: global
title: Spark Configuration
---

Spark provides three locations to configure the system:

  • Spark properties control most application parameters and can be set by passing a SparkConf object to SparkContext, or through Java system properties.
  • Environment variables can be used to set per-machine settings, such as the IP address, through the conf/spark-env.sh script on each node.
  • Logging can be configured through log4j.properties.

Spark Properties

Spark properties control most application settings and are configured separately for each application. The preferred way to set them is by passing a SparkConf object to your SparkContext constructor. Alternatively, Spark will also load them from Java system properties, for compatibility with older versions of Spark.

SparkConf lets you configure most of the common properties to initialize a cluster (e.g., master URL and application name), as well as arbitrary key-value pairs through the set() method. For example, we could initialize an application as follows:

{% highlight scala %}
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("local")
  .setAppName("My application")
  .set("spark.executor.memory", "1g")
val sc = new SparkContext(conf)
{% endhighlight %}
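As noted above, Spark can also pick these settings up from Java system properties for compatibility with older releases. A minimal sketch of that route, setting the property programmatically before the context is created (it could equally be passed with -D on the JVM command line):

{% highlight scala %}
import org.apache.spark.SparkContext

// System properties must be set before the SparkContext is constructed.
System.setProperty("spark.executor.memory", "1g")

// Older-style constructor taking the master URL and application name directly.
val sc = new SparkContext("local", "My application")
{% endhighlight %}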

Most of the properties control internal settings that have reasonable default values. However, there are at least five properties that you will commonly want to control:

Apart from these, the following properties are also available, and may be useful in some situations:
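Whichever properties you decide to tune, they are all set the same way through SparkConf. The sketch below is purely illustrative: the property names shown (spark.cores.max, spark.serializer, spark.default.parallelism) are ordinary Spark properties chosen as examples, and the master URL is hypothetical.

{% highlight scala %}
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("spark://master:7077")  // hypothetical standalone master URL
  .setAppName("My application")
  .set("spark.cores.max", "4")       // cap the total cores this application uses
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.default.parallelism", "8")
val sc = new SparkContext(conf)
{% endhighlight %}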

Viewing Spark Properties

The application web UI at http://<driver>:4040 lists Spark properties in the “Environment” tab. This is a useful place to check to make sure that your properties have been set correctly.
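If you prefer to check from code rather than the web UI, the driver can print its effective configuration; a minimal sketch, assuming your Spark version exposes SparkContext.getConf and SparkConf.toDebugString:

{% highlight scala %}
// `sc` is the SparkContext created earlier; getConf returns a copy of its
// configuration, so inspecting it here does not change the running application.
println(sc.getConf.toDebugString)
{% endhighlight %}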

Environment Variables

Certain Spark settings can be configured through environment variables, which are read from the conf/spark-env.sh script in the directory where Spark is installed (or conf/spark-env.cmd on Windows). These variables are intended for machine-specific settings, such as library search paths. While Spark properties can also be set there through SPARK_JAVA_OPTS, we recommend setting per-application properties within the application itself rather than in spark-env.sh, so that different applications can use different settings.

Note that conf/spark-env.sh does not exist by default when Spark is installed; you can create it by copying conf/spark-env.sh.template. Make sure you make the copy executable.

The following variables can be set in spark-env.sh:

  • JAVA_HOME, the location where Java is installed (if it's not on your default PATH).
  • PYSPARK_PYTHON, the Python binary to use for PySpark.
  • SPARK_LOCAL_IP, to configure which IP address of the machine to bind to.
  • SPARK_LIBRARY_PATH, to add search directories for native libraries.
  • SPARK_CLASSPATH, to add elements to Spark's classpath that you want to be present for all applications. Note that applications can also add dependencies for themselves through SparkContext.addJar -- we recommend doing that when possible.
  • SPARK_JAVA_OPTS, to add JVM options. This includes Java options like garbage collector settings and any system properties that you'd like to pass with -D. One use case is to set some Spark properties differently on this machine, e.g., -Dspark.local.dir=/disk1,/disk2.
  • Options for the Spark standalone cluster scripts, such as the number of cores to use on each machine and the maximum memory.

Since spark-env.sh is a shell script, some of these can be set programmatically -- for example, you might compute SPARK_LOCAL_IP by looking up the IP of a specific network interface.

Configuring Logging

Spark uses log4j for logging. You can configure it by adding a log4j.properties file in the conf directory. One way to start is to copy the existing log4j.properties.template located there.
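If you only need to adjust log levels from inside an application, you can also do so programmatically against log4j itself; a minimal sketch, assuming log4j 1.x (which Spark's logging uses) is on the classpath, and noting that this only affects the current JVM rather than every node:

{% highlight scala %}
import org.apache.log4j.{Level, Logger}

// Quiet Spark's own logging in this JVM; conf/log4j.properties remains the
// place to configure logging cluster-wide.
Logger.getLogger("org.apache.spark").setLevel(Level.WARN)
{% endhighlight %}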