Spark Submit — spark-submit shell script
spark-submit shell script allows you to manage your Spark applications: you can submit a Spark application to a Spark deployment environment for execution, kill an application, or request the status of Spark applications.
You can find the spark-submit script in the bin directory of the Spark distribution.
When executed, the spark-submit script first checks whether the SPARK_HOME environment variable is set and, if not, sets it to the directory that contains the bin/spark-submit shell script. It then executes the spark-class shell script to run the SparkSubmit standalone application.
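For example, a basic submission could look as follows (the master URL, application class, jar name, and submission id below are placeholders, not values from this page):

```
# submit an application for execution (example values)
./bin/spark-submit \
  --master spark://localhost:7077 \
  --class org.example.MyApp \
  my-app.jar

# request the status of a submitted application (Standalone cluster mode)
./bin/spark-submit --master spark://localhost:7077 --status <submission-id>

# kill a submitted application (Standalone cluster mode)
./bin/spark-submit --master spark://localhost:7077 --kill <submission-id>
```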
Caution
|
FIXME Add Cluster Manager and Deploy Mode to the table below (see options value)
|
Command-Line Option | Spark Property | Environment Variable | Description | Internal Property |
---|---|---|---|---|
--deploy-mode | spark.submit.deployMode | DEPLOY_MODE | Deploy mode | deployMode |
--driver-class-path | spark.driver.extraClassPath | | The driver's class path | driverExtraClassPath |
--driver-java-options | spark.driver.extraJavaOptions | | The driver's JVM options | driverExtraJavaOptions |
--driver-library-path | spark.driver.extraLibraryPath | | The driver's native library path | driverExtraLibraryPath |
--driver-memory | spark.driver.memory | SPARK_DRIVER_MEMORY | The driver's memory | driverMemory |
--executor-cores | spark.executor.cores | SPARK_EXECUTOR_CORES | The number of executor CPU cores | executorCores |
--executor-memory | spark.executor.memory | SPARK_EXECUTOR_MEMORY | An executor's memory | executorMemory |
--master | spark.master | MASTER | Master URL. Defaults to local[*] | master |
--properties-file | | | Uses conf/spark-defaults.conf by default | propertiesFile |
Tip
|
Set SPARK_PRINT_LAUNCH_COMMAND environment variable to have the complete Spark command printed out to the standard error output. Refer to Print Launch Command of Spark Scripts (or org.apache.spark.launcher.Main Standalone Application where this environment variable is actually used). |
Tip
|
Avoid using the scala.App trait for a Spark application's main class in Scala. Refer to Executing Main — runMain internal method in this document. |
Preparing Submit Environment — prepareSubmitEnvironment Internal Method
prepareSubmitEnvironment creates a 4-element tuple, i.e. (childArgs, childClasspath, sysProps, childMainClass).
Element | Description |
---|---|
childArgs | Arguments |
childClasspath | Classpath elements |
sysProps | Spark properties |
childMainClass | Main class |
prepareSubmitEnvironment uses options to…
Caution
|
FIXME |
Note
|
prepareSubmitEnvironment is used in SparkSubmit object.
|
Tip
|
See the elements of the return tuple using --verbose command-line option.
|
Custom Spark Properties File — --properties-file command-line option
--properties-file command-line option sets the path to a file FILE from which Spark loads extra Spark properties.
Tip
|
Spark uses conf/spark-defaults.conf by default. |
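For example (the file path and the properties inside it are illustrative only):

```
# /path/to/my-spark.conf could contain, e.g.:
#   spark.master          spark://localhost:7077
#   spark.executor.memory 2g

./bin/spark-submit --properties-file /path/to/my-spark.conf --class org.example.MyApp my-app.jar
```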
Driver Cores in Cluster Deploy Mode — --driver-cores command-line option
--driver-cores command-line option sets the number of cores to NUM for the driver in the cluster deploy mode.
Note
|
--driver-cores switch is only available for cluster mode (for Standalone, Mesos, and YARN).
|
Note
|
It corresponds to the spark.driver.cores setting. |
Note
|
It is printed out to the standard error output in verbose mode. |
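For example (all values are placeholders):

```
# request 2 CPU cores for the driver in cluster deploy mode
./bin/spark-submit \
  --master spark://localhost:7077 \
  --deploy-mode cluster \
  --driver-cores 2 \
  --class org.example.MyApp \
  my-app.jar
```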
Additional JAR Files to Distribute — --jars command-line option
--jars is a comma-separated list of local jars to include on the driver’s and executors’ classpaths.
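For example (the jar paths and the application are placeholders):

```
# make two local jars available on the driver's and executors' classpaths
./bin/spark-submit \
  --jars /path/to/lib-one.jar,/path/to/lib-two.jar \
  --class org.example.MyApp \
  my-app.jar
```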
Caution
|
FIXME |
Specifying YARN Resource Queue — --queue command-line option
With --queue you can choose the YARN resource queue to submit a Spark application to. The default queue name is default.
Caution
|
FIXME What is a queue?
|
Note
|
It corresponds to the spark.yarn.queue setting. |
Tip
|
It is printed out to the standard error output in verbose mode. |
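For example (the queue name and the application are placeholders):

```
# submit to the YARN resource queue "thequeue" instead of "default"
./bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --queue thequeue \
  --class org.example.MyApp \
  my-app.jar
```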
Actions
Submitting Applications for Execution — submit method
The default action of the spark-submit script is to submit a Spark application to a deployment environment for execution.
Tip
|
Use the --verbose command-line switch to know the main class to be executed, arguments, system properties, and classpath (to ensure that the command-line arguments and switches were processed properly). |
When executed, spark-submit executes the submit method.
If proxyUser is set it will…FIXME
Caution
|
FIXME Review why and when to use proxyUser.
|
It passes the execution on to runMain.
Executing Main — runMain internal method
runMain is an internal method to build the execution environment and invoke the main method of the Spark application that has been submitted for execution.
Note
|
It is exclusively used when submitting applications for execution. |
When the verbose input flag is enabled (i.e. true), runMain prints out all the input parameters, i.e. childMainClass, childArgs, sysProps, and childClasspath (in that order).
Note
|
Use spark-submit's --verbose command-line option to enable the verbose flag.
|
runMain builds the context classloader (as loader) depending on the spark.driver.userClassPathFirst flag.
Caution
|
FIXME Describe spark.driver.userClassPathFirst
|
It adds the jars specified in the childClasspath input parameter to the context classloader (that is later responsible for loading the childMainClass main class).
Note
|
childClasspath input parameter corresponds to the --jars command-line option with the primary resource if specified in client deploy mode.
|
It sets all the system properties specified in the sysProps input parameter (using Java’s System.setProperty method).
It creates an instance of the childMainClass main class (as mainClass).
Note
|
childMainClass is the main class spark-submit has been invoked with.
|
Tip
|
Avoid using scala.App trait for a Spark application’s main class in Scala as reported in SPARK-4170 Closure problems when running Scala app that “extends App”.
|
If you use scala.App for the main class, you should see the following warning message in the logs:
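```
Warning: Subclasses of scala.App may not work correctly. Use a main() method instead.
```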
Finally, runMain executes the main method of the Spark application passing in the childArgs arguments.
Any SparkUserAppException exceptions lead to System.exit while the others are simply re-thrown.
Adding Local Jars to ClassLoader — addJarToClasspath internal method
addJarToClasspath is an internal method to add file or local jars (as localJar) to the loader classloader.
Internally, addJarToClasspath resolves the URI of localJar. If the URI is file or local and the file denoted by localJar exists, localJar is added to loader. Otherwise, the following warning is printed out to the logs:
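```
Warning: Local jar /path/to/local.jar does not exist, skipping.
```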
For all other URIs, the following warning is printed out to the logs:
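```
Warning: Skip remote jar hdfs://path/to/remote.jar.
```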
Note
|
addJarToClasspath assumes a file URI when localJar has no URI specified, e.g. /path/to/local.jar.
|
Caution
|
FIXME What is a URI fragment? How does this change re YARN distributed cache? See Utils#resolveURI.
|
Command-line Options
Execute spark-submit --help to know about the command-line options supported.
- --class
- --conf or -c
- --deploy-mode (see Deploy Mode)
- --driver-class-path (see --driver-class-path command-line option)
- --driver-cores (see Driver Cores in Cluster Deploy Mode)
- --driver-java-options
- --driver-library-path
- --driver-memory
- --executor-memory
- --files
- --jars
- --kill for Standalone cluster mode only
- --master
- --name
- --packages
- --exclude-packages
- --properties-file (see Custom Spark Properties File)
- --proxy-user
- --py-files
- --repositories
- --status for Standalone cluster mode only
- --total-executor-cores
List of switches, i.e. command-line options that do not take parameters:
- --help or -h
- --supervise for Standalone cluster mode only
- --usage-error
- --verbose or -v (see Verbose Mode)
- --version (see Version)
YARN-only options:
- --archives
- --executor-cores
- --keytab
- --num-executors
- --principal
- --queue (see Specifying YARN Resource Queue (--queue switch))
--driver-class-path command-line option
--driver-class-path command-line option sets the extra class path entries (e.g. jars and directories) that should be added to a driver’s JVM.
Tip
|
You should use --driver-class-path in client deploy mode (not SparkConf) to ensure that the CLASSPATH is set up with the entries, since client deploy mode uses the same JVM for the driver as spark-submit's.
|
--driver-class-path sets the internal driverExtraClassPath property (when SparkSubmitArguments.handle is called).
It works for all cluster managers and deploy modes.
If driverExtraClassPath is not set on command line, the spark.driver.extraClassPath setting is used.
Note
|
Command-line options (e.g. --driver-class-path ) have higher precedence than their corresponding Spark settings in a Spark properties file (e.g. spark.driver.extraClassPath ). You can therefore control the final settings by overriding Spark settings on command line using the command-line options.
|
Setting / System Property | Command-Line Option | Description |
---|---|---|
spark.driver.extraClassPath | --driver-class-path | Extra class path entries (e.g. jars and directories) to pass to a driver’s JVM. |
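For example (the paths are illustrative), a value from the properties file is overridden on command line:

```
# conf/spark-defaults.conf could contain, e.g.:
#   spark.driver.extraClassPath /opt/jars/default.jar

# the command-line option below takes precedence over the setting above
./bin/spark-submit \
  --driver-class-path /opt/jars/override.jar \
  --class org.example.MyApp \
  my-app.jar
```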
Verbose Mode — --verbose command-line option
When spark-submit is executed with the --verbose command-line option, it enters verbose mode.
In verbose mode, the parsed arguments are printed out to the standard error output.
It also prints out propertiesFile and the properties from the file.
Deploy Mode — --deploy-mode command-line option
You use spark-submit's --deploy-mode command-line option to specify the deploy mode for a Spark application.
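For example (the master URL, class, and jar are placeholders):

```
# run the driver inside the cluster (cluster deploy mode)
./bin/spark-submit --master yarn --deploy-mode cluster --class org.example.MyApp my-app.jar

# run the driver locally, in the same JVM as spark-submit (client deploy mode, the default)
./bin/spark-submit --master yarn --deploy-mode client --class org.example.MyApp my-app.jar
```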
Environment Variables
The following is the list of environment variables that are considered when command-line options are not specified:
- MASTER for --master
- SPARK_DRIVER_MEMORY for --driver-memory
- SPARK_EXECUTOR_MEMORY (see Environment Variables in the SparkContext document)
- SPARK_EXECUTOR_CORES
- DEPLOY_MODE
- SPARK_YARN_APP_NAME
- _SPARK_CMD_USAGE
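For example (the values are placeholders), the two invocations below are equivalent:

```
# environment variable used when the command-line option is not given
MASTER=spark://localhost:7077 ./bin/spark-submit --class org.example.MyApp my-app.jar
./bin/spark-submit --master spark://localhost:7077 --class org.example.MyApp my-app.jar
```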
External packages and custom repositories
The spark-submit utility supports specifying external packages using Maven coordinates (with --packages) and custom repositories (with --repositories).
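For example (the Maven coordinates and repository URL are placeholders):

```
# --packages takes groupId:artifactId:version coordinates;
# --repositories adds extra repositories to resolve them from
./bin/spark-submit \
  --packages com.example:example-lib_2.11:1.0.0 \
  --repositories https://repo.example.com/releases \
  --class org.example.MyApp \
  my-app.jar
```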
FIXME Why should I care?
Launching SparkSubmit Standalone Application — main method
Tip
|
The source code of the script lives in https://github.com/apache/spark/blob/master/bin/spark-submit. |
When executed, the spark-submit script simply passes the call to spark-class with the org.apache.spark.deploy.SparkSubmit class followed by command-line arguments.
It creates an instance of SparkSubmitArguments.
If in verbose mode, it prints out the application arguments.
It then relays the execution to action-specific internal methods (with the application arguments):
- submit (the default action when no action was explicitly given)
- kill (when --kill switch is used)
- requestStatus (when --status switch is used)
Note
|
The action can only have one of the three available values: SUBMIT, KILL, or REQUEST_STATUS.
|
spark-env.sh — load additional environment settings
- spark-env.sh consists of environment settings to configure Spark for your site.
- spark-env.sh is loaded at the startup of Spark's command line scripts.
- SPARK_ENV_LOADED env var is to ensure the spark-env.sh script is loaded once.
- SPARK_CONF_DIR points at the directory with spark-env.sh; otherwise $SPARK_HOME/conf is used.
- spark-env.sh is executed if it exists.
- $SPARK_HOME/conf directory has a spark-env.sh.template file that serves as a template for your own custom configuration.
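For example (all values are site-specific placeholders), conf/spark-env.sh could contain:

```
# example spark-env.sh settings for a Spark Standalone site
SPARK_MASTER_HOST=master.example.com   # hostname the standalone master binds to
SPARK_WORKER_CORES=4                   # total cores a worker offers to applications
SPARK_WORKER_MEMORY=8g                 # total memory a worker offers to applications
```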
Consult Environment Variables in the official documentation.