Commit 02f4522

[SPARK-39133][PYTHON][DOC] Mention log level setting in PYSPARK_JVM_STACKTRACE_ENABLED
### What changes were proposed in this pull request?

Mention the log level setting in the documentation of `PYSPARK_JVM_STACKTRACE_ENABLED`.

### Why are the changes needed?

Even if `spark.sql.pyspark.jvmStacktrace.enabled` is set to false, the log level must still be set to FATAL to see Python exceptions only. This PR documents that.

### Does this PR introduce _any_ user-facing change?

No, documentation changes only.

### How was this patch tested?

Manual tests.

Closes #36490 from xinrong-databricks/doc_jvm_stacktrace.

Lead-authored-by: Xinrong Meng <xinrong.meng@databricks.com>
Co-authored-by: Xinrong Meng <47337188+xinrong-databricks@users.noreply.github.com>
Co-authored-by: Hyukjin Kwon <gurwls223@gmail.com>
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
1 parent 0915a66 commit 02f4522

File tree

1 file changed: +3 -2 lines changed
  • sql/catalyst/src/main/scala/org/apache/spark/sql/internal


sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala

Lines changed: 3 additions & 2 deletions
```diff
@@ -2577,8 +2577,9 @@ object SQLConf {
   val PYSPARK_JVM_STACKTRACE_ENABLED =
     buildConf("spark.sql.pyspark.jvmStacktrace.enabled")
       .doc("When true, it shows the JVM stacktrace in the user-facing PySpark exception " +
-        "together with Python stacktrace. By default, it is disabled and hides JVM stacktrace " +
-        "and shows a Python-friendly exception only.")
+        "together with Python stacktrace. By default, it is disabled to hide JVM stacktrace " +
+        "and shows a Python-friendly exception only. Note that this is independent from log " +
+        "level settings.")
       .version("3.0.0")
       .booleanConf
       // show full stacktrace in tests but hide in production by default.
```
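As the updated doc string notes, hiding the JVM stacktrace is independent of log-level settings, so both knobs must be used together to see only the Python-friendly exception. A minimal sketch of that combination (assumes a working PySpark installation and a running Spark session; not runnable without one):

```python
from pyspark.sql import SparkSession

# Hide the JVM stacktrace in user-facing PySpark exceptions.
spark = (
    SparkSession.builder
    .config("spark.sql.pyspark.jvmStacktrace.enabled", "false")
    .getOrCreate()
)

# The setting above is independent of log levels: JVM log output can
# still surface around a failure. Raising the log level to FATAL
# suppresses it, leaving only the Python-friendly exception.
spark.sparkContext.setLogLevel("FATAL")
```

`setLogLevel` overrides any level configured in `log4j`, so this takes effect for the session regardless of cluster defaults.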

0 commit comments
