[SPARK-52185][CORE] Automate the thread dump collection for Spark applications #50919


Open: wants to merge 1 commit into master
Conversation

@roczei (Contributor) commented May 16, 2025

What changes were proposed in this pull request?

When a Java program runs for a long time without producing any feedback or output, how do you determine what it might be doing and whether it is stuck? Thread dumps can help in such cases: a thread dump shows the state of each thread (running, waiting, or blocked) and which part of the code each thread is executing, which makes it possible to detect deadlocks and see which part of the program is actually running.

The purpose of this pull request is to collect thread dumps at regular intervals. Why? A single thread dump only shows a snapshot of the threads; collecting several allows us to see whether threads are progressing by comparing their states over time.

Collecting thread dump samples from slow Spark executors or drivers can be challenging, especially in YARN or Kubernetes environments.

Current options available for debugging:

  1. Find out where the Java Virtual Machine (JVM) is running, then run the jstack command manually against its process ID.
  2. Download the thread dumps from the Spark UI. For example: http://localhost:4040/executors/threadDump/?executorId=driver
  3. Download the thread dumps via Spark API. For example:
curl "http://localhost:4040/api/v1/applications/local-1747400853731/executors/driver/threads"
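The third workaround can be scripted by polling the REST endpoint at a fixed interval, which is essentially what this PR automates inside Spark itself. A minimal sketch, where the application ID, executor ID, port, and output directory are placeholders taken from the examples above:

```shell
#!/bin/sh
# Sketch of the manual workaround: poll the Spark REST API for thread dumps
# at a fixed interval. The IDs and port below are placeholders.

# Build the thread-dump endpoint URL for a given application and executor.
dump_url() {
  echo "http://localhost:4040/api/v1/applications/$1/executors/$2/threads"
}

# Collect $4 samples, $3 seconds apart, into directory $5.
collect_dumps() {
  app_id="$1"; exec_id="$2"; interval="$3"; count="$4"; outdir="$5"
  mkdir -p "$outdir"
  i=0
  while [ "$i" -lt "$count" ]; do
    ts=$(date +%Y-%m-%d_%H_%M_%S)
    curl -s "$(dump_url "$app_id" "$exec_id")" -o "$outdir/$app_id-$exec_id-$ts.txt"
    sleep "$interval"
    i=$((i + 1))
  done
}

# Example: 4 driver samples, 15 seconds apart.
# collect_dumps local-1747400853731 driver 15 4 ./thread_dumps
```

This works, but requires the UI port to be reachable and a separate process per application, which is exactly the operational burden the new configuration options remove.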

Why are the changes needed?

The purpose of this feature request is to automate thread dump collection at regular intervals. The following new Spark parameters have been introduced:

  • spark.driver.threadDumpCollector.enabled (default value: false)
  • spark.executor.threadDumpCollector.enabled (default value: false)
  • spark.threadDumpCollector.interval
  • spark.threadDumpCollector.dir
  • spark.threadDumpCollector.output.type
  • spark.threadDumpCollector.include.regex
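The same settings can also be placed in spark-defaults.conf instead of being passed per invocation; the values below are illustrative, mirroring the example commands in this description:

```
spark.driver.threadDumpCollector.enabled    true
spark.executor.threadDumpCollector.enabled  true
spark.threadDumpCollector.interval          15s
spark.threadDumpCollector.output.type       FILE
spark.threadDumpCollector.dir               hdfs:///user/example/jstack_test
```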

Example commands

spark-shell  --master local-cluster[2,1,1050] --conf spark.driver.threadDumpCollector.enabled=true --conf spark.executor.threadDumpCollector.enabled=true --conf spark.threadDumpCollector.interval=15s --conf spark.threadDumpCollector.output.type=FILE --conf spark.threadDumpCollector.dir=hdfs:///user/example/jstack_test

The thread dumps will be saved into hdfs:///user/example/jstack_test. Example file names: app-20250516161130-0000-driver-2025-05-16_16_12_50.txt, app-20250516161130-0000-0-2025-05-16_16_12_51.txt

spark-shell  --master local-cluster[2,1,1050] --conf spark.driver.threadDumpCollector.enabled=true --conf spark.executor.threadDumpCollector.enabled=true --conf spark.threadDumpCollector.interval=15s --conf spark.threadDumpCollector.output.type=LOG

The thread dumps will be added to the log messages.

spark-shell  --master local-cluster[2,1,1050] --conf spark.driver.threadDumpCollector.enabled=true --conf spark.executor.threadDumpCollector.enabled=true --conf spark.threadDumpCollector.interval=15s --conf spark.threadDumpCollector.output.type=LOG --conf spark.threadDumpCollector.include.regex=something

Only those thread dumps that match the given regular expression (spark.threadDumpCollector.include.regex) will be captured.
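To illustrate the effect of the regex filter, the sketch below post-filters an already saved jstack-style dump file, keeping only the blank-line-separated thread stanzas that match a pattern. This is only an approximation: the actual config applies the filter at capture time, and the stanza format here is an assumption.

```shell
# Keep only the thread stanzas (blank-line separated, jstack-style) whose
# text matches the given regex, approximating what
# spark.threadDumpCollector.include.regex does at capture time.
filter_dump() {
  pat="$1"; file="$2"
  # RS= puts awk in paragraph mode: each record is one thread stanza.
  awk -v RS= -v ORS='\n\n' -v pat="$pat" '$0 ~ pat' "$file"
}

# Example: keep only threads whose stanza mentions "shuffle".
# filter_dump 'shuffle' app-20250516161130-0000-driver-2025-05-16_16_12_50.txt
```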

Does this PR introduce any user-facing change?

No

How was this patch tested?

New unit tests have been created, and the change has also been tested manually.

Was this patch authored or co-authored using generative AI tooling?

No

@github-actions github-actions bot added the CORE label May 16, 2025
@roczei roczei force-pushed the SPARK-52185 branch 4 times, most recently from 76cdbac to d32d6cc Compare May 17, 2025 18:12
@roczei roczei changed the title [SPARK-52185][CORE] Automate the thread dump collection for Spark applications [WIP][SPARK-52185][CORE] Automate the thread dump collection for Spark applications May 18, 2025
@roczei (Contributor, Author) commented May 18, 2025

This PR is not ready for review because there are some test failures; I am currently investigating them.

@roczei (Contributor, Author) commented May 18, 2025

These were just intermittent issues; I have rerun the failed tests, and all of them passed.

The PR is ready for review.

@roczei roczei changed the title [WIP][SPARK-52185][CORE] Automate the thread dump collection for Spark applications [SPARK-52185][CORE] Automate the thread dump collection for Spark applications May 18, 2025
@roczei
Copy link
Contributor Author

roczei commented May 18, 2025

Hi @yaooqinn,

I noticed that you’ve worked on the ThreadInfo improvements for monitoring APIs (SPARK-44893).

I’ve opened this PR related to thread dumps and I believe your insights would be particularly valuable. Would you mind taking a look when you have a moment?

Thank you in advance!
