
Commit a61d897

resolve comments on docs and addition of unit test
1 parent ab92913 commit a61d897

File tree

2 files changed (+19, -4 lines)


docs/running-on-kubernetes.md

Lines changed: 2 additions & 4 deletions
@@ -270,7 +270,6 @@ future versions of the spark-kubernetes integration.
 
 Some of these include:
 
-* PySpark
 * R
 * Dynamic Executor Scaling
 * Local File Dependency Management
@@ -628,9 +627,8 @@ specific to Spark on Kubernetes.
   <td><code>spark.kubernetes.memoryOverheadFactor</code></td>
   <td><code>0.1</code></td>
   <td>
-    This sets the Memory Overhead Factor that will allocate memory to non-JVM jobs which in the case of JVM tasks will default to 0.10 and 0.40 for non-JVM jobs.
-    This is done as non-JVM tasks need more non-JVM heap space and such tasks commonly fail with "Memory Overhead Exceeded" errors. This prempts this error with
-    a higher default.
+    This sets the Memory Overhead Factor that will allocate memory to non-JVM memory, which includes off-heap memory allocations, non-JVM tasks, and various system processes. For JVM-based jobs this value defaults to 0.10; for non-JVM jobs it defaults to 0.40.
+    This is done because non-JVM tasks need more non-JVM heap space and commonly fail with "Memory Overhead Exceeded" errors. The higher default preempts this error.
   </td>
 </tr>
 <tr>
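
To make the revised wording concrete, here is a minimal Scala sketch of how an overhead factor of this kind typically turns into a memory request. The helper name overheadMiB and the 384 MiB floor are illustrative assumptions for this sketch, not values taken from the diff.

// Minimal sketch, not Spark's actual implementation: an overhead factor is
// applied to the requested container memory, with a small fixed floor.
object MemoryOverheadSketch {
  // minOverheadMiB is an assumed illustrative floor, not documented in this commit.
  def overheadMiB(requestedMiB: Long, factor: Double, minOverheadMiB: Long = 384L): Long =
    math.max((factor * requestedMiB).toLong, minOverheadMiB)

  def main(args: Array[String]): Unit = {
    // A 4 GiB JVM driver at the 0.10 default vs. a non-JVM (PySpark) driver at 0.40.
    println(overheadMiB(4096L, 0.10)) // 409
    println(overheadMiB(4096L, 0.40)) // 1638
  }
}
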

resource-managers/kubernetes/core/src/test/scala/org/apache/spark/deploy/k8s/KubernetesConfSuite.scala

Lines changed: 17 additions & 0 deletions
@@ -122,6 +122,23 @@ class KubernetesConfSuite extends SparkFunSuite {
       === Array("local:///opt/spark/example4.py", mainResourceFile) ++ inputPyFiles)
   }
 
+  test("Testing explicit setting of memory overhead on non-JVM tasks") {
+    val sparkConf = new SparkConf(false)
+      .set(MEMORY_OVERHEAD_FACTOR, 0.3)
+
+    val mainResourceFile = "local:///opt/spark/main.py"
+    val mainAppResource = Some(PythonMainAppResource(mainResourceFile))
+    val conf = KubernetesConf.createDriverConf(
+      sparkConf,
+      APP_NAME,
+      RESOURCE_NAME_PREFIX,
+      APP_ID,
+      mainAppResource,
+      MAIN_CLASS,
+      APP_ARGS,
+      None)
+    assert(conf.sparkConf.get(MEMORY_OVERHEAD_FACTOR) === 0.3)
+  }
 
   test("Resolve driver labels, annotations, secret mount paths, envs, and memory overhead") {
     val sparkConf = new SparkConf(false)
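
As a usage note, the new test exercises the same override a user can apply from application code. A hedged sketch using the configuration key documented in the first file (the 0.3 value simply mirrors the test above):

import org.apache.spark.SparkConf

object OverrideOverheadFactor {
  def main(args: Array[String]): Unit = {
    // Sketch: explicitly raise the overhead factor for a non-JVM (PySpark) job.
    // The key comes from docs/running-on-kubernetes.md above; 0.3 mirrors the unit test.
    val conf = new SparkConf(false)
      .set("spark.kubernetes.memoryOverheadFactor", "0.3")
    println(conf.get("spark.kubernetes.memoryOverheadFactor")) // 0.3
  }
}
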
