
Commit a1a1361

Kanwaljit Singh authored and Andrew Or committed
SPARK-2641: Passing num executors to spark arguments from properties file
Since we can set the Spark executor memory and executor cores via a properties file, we should also be able to set the number of executor instances there.

Author: Kanwaljit Singh <kanwaljit.singh@guavus.com>

Closes #1657 from kjsingh/branch-1.0 and squashes the following commits:

d8a5a12 [Kanwaljit Singh] SPARK-2641: Fixing how spark arguments are loaded from properties file for num executors

Conflicts:
    core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala
1 parent 4da1039 commit a1a1361

File tree: 1 file changed, +2 −0 lines changed

core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala

Lines changed: 2 additions & 0 deletions

@@ -120,6 +120,8 @@ private[spark] class SparkSubmitArguments(args: Seq[String], env: Map[String, St
     name = Option(name).orElse(sparkProperties.get("spark.app.name")).orNull
     jars = Option(jars).orElse(sparkProperties.get("spark.jars")).orNull
     deployMode = Option(deployMode).orElse(env.get("DEPLOY_MODE")).orNull
+    numExecutors = Option(numExecutors)
+      .getOrElse(defaultProperties.get("spark.executor.instances").orNull)

     // Try to set main class from JAR if no --class argument is given
     if (mainClass == null && !isPython && primaryResource != null) {
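The added lines follow the same fallback pattern the surrounding assignments use: a value given on the command line wins, and only when it is absent (null) does the properties file supply a default. A minimal standalone sketch of that pattern, with hypothetical names (`resolve`, `FallbackDemo`) not taken from the Spark source:

```scala
// Sketch of the Option-based fallback used in SparkSubmitArguments:
// Option(x) wraps a possibly-null value; getOrElse supplies the
// properties-file default when the command-line value is absent.
object FallbackDemo {
  // cliValue: value parsed from a --num-executors flag, or null if unset.
  // props: key/value pairs loaded from the Spark properties file.
  def resolve(cliValue: String, props: Map[String, String]): String =
    Option(cliValue).getOrElse(props.get("spark.executor.instances").orNull)

  def main(args: Array[String]): Unit = {
    val props = Map("spark.executor.instances" -> "4")
    println(resolve(null, props)) // no flag: falls back to the properties file
    println(resolve("8", props))  // flag present: command line takes precedence
  }
}
```

Note the result stays a String (possibly null) rather than an Option, matching the null-based convention the rest of SparkSubmitArguments uses for unset arguments.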

0 commit comments