diff --git a/docs/configuration.md b/docs/configuration.md
index aaaaca05341d1..409f1f521eb52 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -103,8 +103,9 @@ such as `--master`, as shown above. `spark-submit` can accept any Spark property
 flag, but uses special flags for properties that play a part in launching the Spark application.
 Running `./bin/spark-submit --help` will show the entire list of these options.
 
-`bin/spark-submit` will also read configuration options from `conf/spark-defaults.conf`, in which
-each line consists of a key and a value separated by whitespace. For example:
+When configurations are specified via the `--conf/-c` flags, `bin/spark-submit` will also read
+configuration options from `conf/spark-defaults.conf`, in which each line consists of a key and
+a value separated by whitespace. For example:
 
     spark.master            spark://5.6.7.8:7077
     spark.executor.memory   4g
@@ -112,7 +113,8 @@ each line consists of a key and a value separated by whitespace. For example:
     spark.serializer        org.apache.spark.serializer.KryoSerializer
 
 In addition, a property file with Spark configurations can be passed to `bin/spark-submit` via
-the `--properties-file` parameter.
+the `--properties-file` parameter. When this is set, Spark will no longer load configurations from
+`conf/spark-defaults.conf` unless another parameter `--load-spark-defaults` is provided.
 
 Any values specified as flags or in the properties file will be passed on to the application
 and merged with those specified through SparkConf. Properties set directly on the SparkConf
diff --git a/docs/submitting-applications.md b/docs/submitting-applications.md
index 3a99151768a12..071fbf5549398 100644
--- a/docs/submitting-applications.md
+++ b/docs/submitting-applications.md
@@ -178,8 +178,13 @@ The master URL passed to Spark can be in one of the following formats:
 # Loading Configuration from a File
 
 The `spark-submit` script can load default [Spark configuration values](configuration.html) from a
-properties file and pass them on to your application. By default, it will read options
-from `conf/spark-defaults.conf` in the `SPARK_HOME` directory.
+properties file and pass them on to your application. The file can be specified via the `--properties-file`
+parameter. When this is not specified, by default Spark will read options from `conf/spark-defaults.conf`
+in the `SPARK_HOME` directory.
+
+An additional flag `--load-spark-defaults` can be used to tell Spark to load configurations from `conf/spark-defaults.conf`
+even when a property file is provided via `--properties-file`. This is useful, for instance, when users
+want to put system-wide default settings in the former and user/cluster-specific settings in the latter.
 
 Loading default Spark configurations this way can obviate the need for certain flags to
 `spark-submit`. For instance, if the `spark.master` property is set, you can safely omit the
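The split that the second hunk describes can be sketched with a small shell example. The file names, settings, application class, and jar here are hypothetical, and the final `spark-submit` invocation is only printed, not executed:

```shell
# Hypothetical system-wide defaults, kept in conf/spark-defaults.conf
# (each line is a key and a value separated by whitespace).
mkdir -p conf
cat > conf/spark-defaults.conf <<'EOF'
spark.master            spark://5.6.7.8:7077
spark.serializer        org.apache.spark.serializer.KryoSerializer
EOF

# Hypothetical user/cluster-specific settings in a separate properties file.
cat > my-job.conf <<'EOF'
spark.executor.memory   4g
EOF

# With --properties-file alone, conf/spark-defaults.conf would no longer be
# loaded; adding --load-spark-defaults tells Spark to read both files.
echo "./bin/spark-submit --properties-file my-job.conf --load-spark-defaults --class MyApp my-app.jar"
```

Per the changed docs, values from the `--properties-file` file are merged with the defaults, and properties set directly on the SparkConf still take the highest precedence.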