[SPARK-6556][Core] Fix wrong parsing logic of executorTimeoutMs and checkTimeoutIntervalMs in HeartbeatReceiver #5209

Closed
zsxwing wants to merge 3 commits

Conversation

@zsxwing (Member) commented Mar 26, 2015

The current parsing logic for executorTimeoutMs is:

private val executorTimeoutMs = sc.conf.getLong("spark.network.timeout", 
    sc.conf.getLong("spark.storage.blockManagerSlaveTimeoutMs", 120)) * 1000

So if spark.storage.blockManagerSlaveTimeoutMs is 10000 and spark.network.timeout is not set, executorTimeoutMs becomes 10000 * 1000 = 10,000,000, but the correct value should be 10000, since that property is already in milliseconds.

checkTimeoutIntervalMs has the same issue.

This PR fixes them.
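
The fix reads spark.network.timeout (which is in seconds) separately from the millisecond-based fallback properties, so the multiplication by 1000 only applies to the former. A minimal sketch of that shape, keeping the existing defaults of 120s and 60s; the exact merged code may differ:

    // "spark.network.timeout" is in seconds, so convert it to milliseconds;
    // "spark.storage.blockManagerSlaveTimeoutMs" is already in milliseconds.
    private val executorTimeoutMs = sc.conf.getOption("spark.network.timeout")
      .map(_.toLong * 1000)
      .getOrElse(sc.conf.getLong("spark.storage.blockManagerSlaveTimeoutMs", 120000))

    // Same pattern for the check interval: "spark.network.timeoutInterval" is in seconds,
    // "spark.storage.blockManagerTimeoutIntervalMs" is already in milliseconds.
    private val checkTimeoutIntervalMs = sc.conf.getOption("spark.network.timeoutInterval")
      .map(_.toLong * 1000)
      .getOrElse(sc.conf.getLong("spark.storage.blockManagerTimeoutIntervalMs", 60000))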

@SparkQA commented Mar 26, 2015

Test build #29226 has started for PR 5209 at commit ccd5147.

  • This patch merges cleanly.

private val checkTimeoutIntervalMs = sc.conf.getLong("spark.network.timeoutInterval",
  sc.conf.getLong("spark.storage.blockManagerTimeoutIntervalMs", 60)) * 1000

private val executorTimeoutMs = sc.conf.getOption("spark.network.timeout").map(_.toLong * 1000).

Member commented:

You might comment that timeout is in seconds and (obviously) blockManagerSlaveTimeoutMs is milliseconds. I agree with the change, though. These properties you're fixing aren't documented, right? The only ref I saw on the mailing list clearly directed people to set values like "60000", which is correctly in ms.
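
As a hypothetical illustration of the unit difference (values chosen for the example, not taken from the PR), both of these settings describe a 120-second timeout:

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
    // "spark.network.timeout" is interpreted in seconds.
    conf.set("spark.network.timeout", "120")
    // "spark.storage.blockManagerSlaveTimeoutMs" is interpreted in milliseconds.
    conf.set("spark.storage.blockManagerSlaveTimeoutMs", "120000")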

@zsxwing (Member Author) commented:

I have not seen them in any docs. However, from the current code here, I think it's trying to maintain compatibility.

@SparkQA commented Mar 26, 2015

Test build #29227 has started for PR 5209 at commit c7d5422.

  • This patch merges cleanly.

@lianhuiwang (Contributor) commented:
LGTM

@SparkQA commented Mar 26, 2015

Test build #29226 has finished for PR 5209 at commit ccd5147.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/29226/

@SparkQA commented Mar 26, 2015

Test build #29227 has finished for PR 5209 at commit c7d5422.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/29227/

private val checkTimeoutIntervalMs = sc.conf.getLong("spark.network.timeoutInterval",
  sc.conf.getLong("spark.storage.blockManagerTimeoutIntervalMs", 60)) * 1000

// `spark.network.timeout` use `seconds`,

Contributor commented:

Minor nits on the comments (a possible rewording is sketched after this list):

  • I wouldn't use backticks here because it's not md-formatted.
  • "use" -> "uses"
  • "while" should be on line above

@sryza (Contributor) commented Mar 27, 2015

Had some nits on the comments. Otherwise this LGTM.

@zsxwing (Member Author) commented Mar 27, 2015

Fixed the docs

@SparkQA commented Mar 27, 2015

Test build #29288 has started for PR 5209 at commit 6a0a411.

  • This patch merges cleanly.

@SparkQA commented Mar 27, 2015

Test build #29288 has finished for PR 5209 at commit 6a0a411.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/29288/

@asfgit asfgit closed this in da546b7 Mar 27, 2015
@zsxwing zsxwing deleted the SPARK-6556 branch March 27, 2015 14:14