[SPARK-50909][PYTHON] Setup faulthandler in PythonPlannerRunners
### What changes were proposed in this pull request?
Sets up `faulthandler` in `PythonPlannerRunner`s.
It can be enabled with the same configs as Python UDFs (see the sketch after this list):
- SQL conf: `spark.sql.execution.pyspark.udf.faulthandler.enabled`
- It falls back to the Spark conf `spark.python.worker.faulthandler.enabled` when unset
- `False` by default
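
A minimal sketch of how these configs could be set from PySpark. The config names come from this PR description; the builder-based session setup is generic boilerplate, not part of this change:

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("faulthandler-example")
    # SQL conf consulted by Python UDF workers and, with this change, PythonPlannerRunners.
    .config("spark.sql.execution.pyspark.udf.faulthandler.enabled", "true")
    # Spark conf it falls back to when the SQL conf is not set.
    .config("spark.python.worker.faulthandler.enabled", "true")
    .getOrCreate()
)

# The SQL conf can also be toggled at runtime on an existing session.
spark.conf.set("spark.sql.execution.pyspark.udf.faulthandler.enabled", "true")
```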
### Why are the changes needed?
`faulthandler` is currently not set up in `PythonPlannerRunner`s.
### Does this PR introduce _any_ user-facing change?
Yes. When enabled, if the Python worker crashes, the error message may include a thread dump of the Python process, on a best-effort basis.
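
For reference, this mirrors what Python's standard `faulthandler` module does on its own: once enabled, a fatal signal makes the interpreter write the tracebacks of all threads to stderr before the process dies, when it can. A small standalone sketch; the null-pointer read via `ctypes` is just a deliberate way to crash the process and is not related to this PR:

```python
import ctypes
import faulthandler

# Enable the standard-library fault handler; on a fatal signal (e.g. SIGSEGV)
# it dumps the tracebacks of all Python threads to stderr.
faulthandler.enable()

# Deliberately crash the process by dereferencing a null pointer.
ctypes.string_at(0)
```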
### How was this patch tested?
Added related tests.
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes apache#49592 from ueshin/issues/SPARK-50909/faulthandler.
Authored-by: Takuya Ueshin <ueshin@databricks.com>
Signed-off-by: Takuya Ueshin <ueshin@databricks.com>