
[SPARK-49034][CORE] Support server-side sparkProperties replacement in REST Submission API
### What changes were proposed in this pull request?

Like SPARK-49033, this PR aims to support server-side `sparkProperties` replacement in the REST Submission API.

- For example, ephemeral Spark clusters with server-side environment variables can provide backend resources and information without touching client-side applications and configurations.
- The placeholder pattern is the `{{SERVER_ENVIRONMENT_VARIABLE_NAME}}` style, like the following.

https://github.com/apache/spark/blob/163e512c53208301a8511310023d930d8b77db96/docs/configuration.md?plain=1#L694
https://github.com/apache/spark/blob/163e512c53208301a8511310023d930d8b77db96/core/src/main/scala/org/apache/spark/deploy/rest/StandaloneRestServer.scala#L233-L234

### Why are the changes needed?

A user can submit an environment variable placeholder like `{{AWS_ENDPOINT_URL}}` in order to use server-wide environment variables of the Spark Master.

```
$ SPARK_MASTER_OPTS="-Dspark.master.rest.enabled=true" \
  AWS_ENDPOINT_URL=ENDPOINT_FOR_THIS_CLUSTER \
  sbin/start-master.sh

$ sbin/start-worker.sh spark://$(hostname):7077
```

```
curl -s -k -XPOST http://localhost:6066/v1/submissions/create \
  --header "Content-Type:application/json;charset=UTF-8" \
  --data '{
    "appResource": "",
    "sparkProperties": {
      "spark.master": "spark://localhost:7077",
      "spark.app.name": "",
      "spark.submit.deployMode": "cluster",
      "spark.hadoop.fs.s3a.endpoint": "{{AWS_ENDPOINT_URL}}",
      "spark.jars": "/Users/dongjoon/APACHE/spark-merge/examples/target/scala-2.13/jars/spark-examples_2.13-4.0.0-SNAPSHOT.jar"
    },
    "clientSparkVersion": "",
    "mainClass": "org.apache.spark.examples.SparkPi",
    "environmentVariables": {},
    "action": "CreateSubmissionRequest",
    "appArgs": [ "10000" ]
  }'
```

- http://localhost:4040/environment/

![Screenshot 2024-07-26 at 22 00 26](https://github.com/user-attachments/assets/20ea5d98-2503-4969-8cdb-82938c706029)

### Does this PR introduce _any_ user-facing change?

No. This is a new feature and is disabled by default via `spark.master.rest.enabled` (default: `false`).

### How was this patch tested?

Pass the CIs with the newly added test case.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes apache#47511 from dongjoon-hyun/SPARK-49034-2.

Authored-by: Dongjoon Hyun <dhyun@apple.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
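To make the mechanism concrete, here is a minimal sketch of how such a placeholder substitution could be implemented. This is an illustrative assumption only, not the code added by this commit (the actual `replacePlaceHolder` helper lives in `StandaloneRestServer.scala`, linked above); the `PlaceholderSketch.replace` name, the exact token regex, and the leave-unresolved-tokens-as-is fallback are all hypothetical:

```scala
import scala.util.matching.Regex

object PlaceholderSketch {
  // Matches {{SERVER_ENVIRONMENT_VARIABLE_NAME}}-style tokens.
  private val placeholder: Regex = """\{\{([A-Za-z_][A-Za-z0-9_]*)\}\}""".r

  // Replaces each token with the server-side environment variable of the same name,
  // leaving the token unchanged when that variable is not set (assumed behavior).
  def replace(value: String): String =
    placeholder.replaceAllIn(value, m =>
      Regex.quoteReplacement(sys.env.getOrElse(m.group(1), m.matched)))
}
```

With such a helper, a submitted value of `"{{AWS_ENDPOINT_URL}}"` would resolve on the master to whatever `AWS_ENDPOINT_URL` is set to in the server's environment.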
1 parent: 388ca1e · commit: 10849d9

2 files changed: +13, -0 lines

core/src/main/scala/org/apache/spark/deploy/rest/StandaloneRestServer.scala

Lines changed: 1 addition & 0 deletions

@@ -201,6 +201,7 @@ private[rest] class StandaloneSubmitRequestServlet(
 
     // Optional fields
     val sparkProperties = request.sparkProperties
+      .map(x => (x._1, replacePlaceHolder(x._2)))
     val driverMemory = sparkProperties.get(config.DRIVER_MEMORY.key)
     val driverCores = sparkProperties.get(config.DRIVER_CORES.key)
     val driverDefaultJavaOptions = sparkProperties.get(SparkLauncher.DRIVER_DEFAULT_JAVA_OPTIONS)
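Side note on the one-line change above: the replacement is applied with `.map` over each `(key, value)` pair, so only property values are rewritten and the configuration keys are left untouched. As a hypothetical illustration (the endpoint value below is made up), a submitted property map like `submitted` would reach the scheduler as `resolved`, assuming `AWS_ENDPOINT_URL` is set in the master's environment:

```scala
// Hypothetical values, for illustration only.
val submitted = Map("spark.hadoop.fs.s3a.endpoint" -> "{{AWS_ENDPOINT_URL}}")
// With AWS_ENDPOINT_URL=https://s3.example.internal set on the master:
val resolved  = Map("spark.hadoop.fs.s3a.endpoint" -> "https://s3.example.internal")
```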

core/src/test/scala/org/apache/spark/deploy/rest/StandaloneRestSubmitSuite.scala

Lines changed: 12 additions & 0 deletions

@@ -457,6 +457,18 @@ class StandaloneRestSubmitSuite extends SparkFunSuite {
     assert(desc.command.environment.get("AWS_ENDPOINT_URL") === Some("2.13"))
   }
 
+  test("SPARK-49034: Support server-side sparkProperties replacement in REST Submission API") {
+    val request = new CreateSubmissionRequest
+    request.appResource = ""
+    request.mainClass = ""
+    request.appArgs = Array.empty[String]
+    request.sparkProperties = Map("spark.hadoop.fs.s3a.endpoint" -> "{{SPARK_SCALA_VERSION}}")
+    request.environmentVariables = Map.empty[String, String]
+    val servlet = new StandaloneSubmitRequestServlet(null, null, null)
+    val desc = servlet.buildDriverDescription(request, "spark://master:7077", 6066)
+    assert(desc.command.javaOpts.exists(_.contains("-Dspark.hadoop.fs.s3a.endpoint=2.13")))
+  }
+
   test("SPARK-45197: Make StandaloneRestServer add JavaModuleOptions to drivers") {
     val request = new CreateSubmissionRequest
     request.appResource = ""
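A brief reading of the new test: it relies on `SPARK_SCALA_VERSION` being available in the test environment (it resolves to `2.13` in this build, as the existing assertion just above also shows), so the `{{SPARK_SCALA_VERSION}}` placeholder in `spark.hadoop.fs.s3a.endpoint` is expected to be rewritten server-side to `2.13` and to surface in the generated driver `javaOpts` as `-Dspark.hadoop.fs.s3a.endpoint=2.13`.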
