
Spark Properties and spark-defaults.conf Properties File

Spark properties are the means of tuning the execution environment for your Spark applications.

The default Spark properties file is $SPARK_HOME/conf/spark-defaults.conf. It can be overridden using spark-submit with the --properties-file command-line option.
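For example, a custom properties file can be passed at submit time like this (the file path, class name, and jar below are hypothetical placeholders):

```shell
# Submit an application with a custom properties file instead of
# the default $SPARK_HOME/conf/spark-defaults.conf
# (paths, class name, and jar are illustrative only).
spark-submit \
  --properties-file /path/to/my-spark.conf \
  --class com.example.MyApp \
  my-app.jar
```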

Table 1. Environment Variables

| Environment Variable | Default Value | Description |
|---|---|---|
| SPARK_CONF_DIR | ${SPARK_HOME}/conf | Spark's configuration directory (with spark-defaults.conf) |

Tip
Read the official documentation of Apache Spark on Spark Configuration.
Table 2. Spark Application's Properties

| Property Name | Default | Description |
|---|---|---|
| spark.local.dir | /tmp | Directory used as temporary storage for "scratch" space, including map output files and RDDs that get stored on disk. This should be on a fast, local disk in your system. It can also be a comma-separated list of multiple directories on different disks. |
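In spark-defaults.conf, spreading scratch space over several disks could look like this (the mount points are hypothetical):

```
spark.local.dir  /mnt/ssd1/spark-scratch,/mnt/ssd2/spark-scratch
```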

spark-defaults.conf — Default Spark Properties File

spark-defaults.conf (under SPARK_CONF_DIR or $SPARK_HOME/conf) is the default properties file that holds the Spark properties of your Spark applications.
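A minimal spark-defaults.conf could look like the following sketch. The property names are real Spark properties; the values are made-up examples, not recommendations:

```
# Example spark-defaults.conf (values are illustrative)
spark.master            spark://master:7077
spark.executor.memory   2g
spark.serializer        org.apache.spark.serializer.KryoSerializer
```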

Note
spark-defaults.conf is loaded by AbstractCommandBuilder’s loadPropertiesFile internal method.

Calculating Path of Default Spark Properties — Utils.getDefaultPropertiesFile method

getDefaultPropertiesFile calculates the absolute path to the spark-defaults.conf properties file, which can live either in the directory specified by the SPARK_CONF_DIR environment variable or in the $SPARK_HOME/conf directory.

Note
getDefaultPropertiesFile is part of private[spark] org.apache.spark.util.Utils object.
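The lookup order can be sketched in shell as follows. This is a simplified illustration, not Spark's actual code: the real Scala method also verifies that the resolved file exists before returning it:

```shell
# Sketch of the default-properties lookup order:
# prefer SPARK_CONF_DIR, fall back to $SPARK_HOME/conf.
conf_dir="${SPARK_CONF_DIR:-$SPARK_HOME/conf}"
props_file="$conf_dir/spark-defaults.conf"
echo "$props_file"
```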