Spark Properties and spark-defaults.conf Properties File
Spark properties are the means of tuning the execution environment for your Spark applications.
The default Spark properties file is $SPARK_HOME/conf/spark-defaults.conf, which can be overridden using the --properties-file command-line option of spark-submit.
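For illustration, a properties file lists one property per line with the key and value separated by whitespace, and can then be passed to spark-submit explicitly. The file name and values below are hypothetical:

```
# my-spark.conf (hypothetical)
spark.master          spark://master:7077
spark.executor.memory 2g
```

```
$ spark-submit --properties-file my-spark.conf --class MyApp my-app.jar
```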
| Environment Variable | Default Value | Description |
|---|---|---|
| SPARK_CONF_DIR | ${SPARK_HOME}/conf | Spark's configuration directory (with spark-defaults.conf) |
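To make Spark pick up configuration from a non-default location, export the variable before launching any Spark tool (the directory below is hypothetical):

```
$ export SPARK_CONF_DIR=/etc/spark/conf   # hypothetical path
$ spark-submit --version
```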
Tip: Read the official documentation of Apache Spark on [Spark Configuration](https://spark.apache.org/docs/latest/configuration.html).
| Property Name | Default | Description |
|---|---|---|
| spark.local.dir | /tmp | Comma-separated list of directories that are used as temporary storage for "scratch" space, including map output files and RDDs that get stored on disk. This should be on a fast, local disk in your system. It can also be a comma-separated list of multiple directories on different disks. |
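Besides spark-defaults.conf, Spark properties such as spark.local.dir can be set programmatically on a SparkConf before creating the SparkContext. A minimal sketch with illustrative values (note that in some cluster deployments the cluster manager may override spark.local.dir):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Illustrative values only, not recommendations
val conf = new SparkConf()
  .setAppName("MyApp")
  .setMaster("local[*]")
  .set("spark.local.dir", "/mnt/disk1,/mnt/disk2")

val sc = new SparkContext(conf)
```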
spark-defaults.conf — Default Spark Properties File
spark-defaults.conf (under SPARK_CONF_DIR or $SPARK_HOME/conf) is the default properties file that holds the Spark properties of your Spark applications.
Note: spark-defaults.conf is loaded by AbstractCommandBuilder's loadPropertiesFile internal method.
Calculating Path of Default Spark Properties — Utils.getDefaultPropertiesFile method
```scala
getDefaultPropertiesFile(env: Map[String, String] = sys.env): String
```
getDefaultPropertiesFile calculates the absolute path to the spark-defaults.conf properties file, which can live either in the directory specified by the SPARK_CONF_DIR environment variable or in the $SPARK_HOME/conf directory.
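The following is a minimal sketch of that resolution logic (not necessarily the exact Spark source): SPARK_CONF_DIR takes precedence over $SPARK_HOME/conf, and no path is returned when spark-defaults.conf exists in neither place.

```scala
import java.io.File

def defaultPropertiesFile(env: Map[String, String] = sys.env): String =
  env.get("SPARK_CONF_DIR")                                         // 1. SPARK_CONF_DIR takes precedence
    .orElse(env.get("SPARK_HOME").map(_ + File.separator + "conf")) // 2. fall back to $SPARK_HOME/conf
    .map(dir => new File(dir, "spark-defaults.conf"))
    .filter(_.isFile)                                               // only if the file actually exists
    .map(_.getAbsolutePath)
    .orNull                                                         // null when not found (assumption)
```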
Note: getDefaultPropertiesFile is part of the private[spark] org.apache.spark.util.Utils object.