[SPARK-26700][Core][FOLLOWUP] Add config spark.network.maxRemoteBlockSizeFetchToMem
#27463
Conversation
Force-pushed 77714a3 to 4201a93
@@ -896,11 +896,7 @@ package object config {

  private[spark] val MAX_REMOTE_BLOCK_SIZE_FETCH_TO_MEM =
Shall we move it to `SparkConf.deprecatedConfigs`?
Thanks, done in 2d2637b.
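The `SparkConf.deprecatedConfigs` approach suggested above keeps a registry from old key to deprecation metadata. A minimal self-contained sketch of that shape (this mirrors the fields of Spark's private `DeprecatedConfig`, but the entry and object name here are illustrative assumptions, not the merged change):

```scala
// Hypothetical mirror of SparkConf's private DeprecatedConfig entry;
// the real class lives in SparkConf.scala.
case class DeprecatedConfig(key: String, version: String, deprecationMessage: String)

object DeprecatedConfigsSketch {
  // Registry keyed by the deprecated name, as deprecatedConfigs is in Spark.
  // This entry is an assumption for illustration only.
  val deprecatedConfigs: Map[String, DeprecatedConfig] = Map(
    "spark.maxRemoteBlockSizeFetchToMem" -> DeprecatedConfig(
      "spark.maxRemoteBlockSizeFetchToMem", "3.0",
      "Please use spark.network.maxRemoteBlockSizeFetchToMem instead."))

  // Returns the warning Spark would log for a deprecated key, if any.
  def deprecationWarning(key: String): Option[String] =
    deprecatedConfigs.get(key).map { cfg =>
      s"The configuration key '$key' has been deprecated as of Spark ${cfg.version} and " +
        s"may be removed in the future. ${cfg.deprecationMessage}"
    }
}
```

A lookup through this map only fires for the old key; the new key produces no warning.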
Test build #117880 has finished for PR 27463 at commit
Test build #117886 has finished for PR 27463 at commit
Test build #117890 has finished for PR 27463 at commit
Test build #117888 has finished for PR 27463 at commit
Do we consider using `configsWithAlternatives` for this case? It seems more convenient here.
Force-pushed 2d2637b to 5c1f201
Makes sense, done in 5c1f201.
Test build #117932 has finished for PR 27463 at commit
Test build #117940 has finished for PR 27463 at commit
Retest this please.
Test build #117946 has finished for PR 27463 at commit
@@ -894,8 +894,8 @@ package object config {
       .checkValue(_ > 0, "The max no. of blocks in flight cannot be non-positive.")
       .createWithDefault(Int.MaxValue)

-  private[spark] val MAX_REMOTE_BLOCK_SIZE_FETCH_TO_MEM =
-    ConfigBuilder("spark.maxRemoteBlockSizeFetchToMem")
+  private[spark] val NETWORK_MAX_REMOTE_BLOCK_SIZE_FETCH_TO_MEM =
We can keep using `MAX_REMOTE_BLOCK_SIZE_FETCH_TO_MEM`.
Thanks, done in 8f7540e.
-    AlternateConfig("spark.reducer.maxReqSizeShuffleToMem", "2.3")),
+  NETWORK_MAX_REMOTE_BLOCK_SIZE_FETCH_TO_MEM.key -> Seq(
+    AlternateConfig("spark.reducer.maxReqSizeShuffleToMem", "2.3"),
+    AlternateConfig("spark.maxRemoteBlockSizeFetchToMem", "3.0")),
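The `configsWithAlternatives` entries above let a read of the new key fall back to any of the old keys, in order. A self-contained sketch of that resolution (not Spark's actual implementation; `AlternateConfig` here carries only the fields shown in the diff, and `settings` stands in for the conf's backing map):

```scala
// Hypothetical mirror of Spark's AlternateConfig(key, version) from the diff.
case class AlternateConfig(key: String, version: String)

object AlternativesSketch {
  // Same shape as the configsWithAlternatives entry in the diff above.
  val configsWithAlternatives: Map[String, Seq[AlternateConfig]] = Map(
    "spark.network.maxRemoteBlockSizeFetchToMem" -> Seq(
      AlternateConfig("spark.reducer.maxReqSizeShuffleToMem", "2.3"),
      AlternateConfig("spark.maxRemoteBlockSizeFetchToMem", "3.0")))

  // Resolve a key: try the new name first, then each old alternative in order.
  def get(settings: Map[String, String], key: String): Option[String] =
    settings.get(key).orElse {
      configsWithAlternatives.getOrElse(key, Seq.empty)
        .flatMap(alt => settings.get(alt.key))
        .headOption
    }
}
```

With this shape, a job that still sets only the old `spark.maxRemoteBlockSizeFetchToMem` resolves correctly through the new key, while an explicit new-key setting wins over any alternative.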
Will Spark log a warning message when these 2 alternatives are used? If not, we still need to put them in `deprecatedConfigs`.
It will when the user uses the old one, since the deprecation warning also covers `configsWithAlternatives`:
spark/core/src/main/scala/org/apache/spark/SparkConf.scala
Lines 753 to 766 in 77510c6
def logDeprecationWarning(key: String): Unit = {
  deprecatedConfigs.get(key).foreach { cfg =>
    logWarning(
      s"The configuration key '$key' has been deprecated as of Spark ${cfg.version} and " +
      s"may be removed in the future. ${cfg.deprecationMessage}")
    return
  }
  allAlternatives.get(key).foreach { case (newKey, cfg) =>
    logWarning(
      s"The configuration key '$key' has been deprecated as of Spark ${cfg.version} and " +
      s"may be removed in the future. Please use the new key '$newKey' instead.")
    return
  }
I tested locally; Spark does log warning messages.
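The precedence in the quoted `logDeprecationWarning` can be sketched in a self-contained way: entries in `deprecatedConfigs` win, and `allAlternatives` (the flattened old-key-to-new-key view of `configsWithAlternatives`) is only consulted if no deprecated entry matched. Object and map names below are illustrative, and `spark.some.removedKey` is a made-up key:

```scala
object WarningPrecedenceSketch {
  // Hypothetical deprecated-only entry (illustrative, not a real Spark key).
  val deprecatedMessages: Map[String, String] =
    Map("spark.some.removedKey" -> "It no longer has any effect.")

  // Old key -> new key, as derived from configsWithAlternatives.
  val allAlternatives: Map[String, String] = Map(
    "spark.reducer.maxReqSizeShuffleToMem" -> "spark.network.maxRemoteBlockSizeFetchToMem",
    "spark.maxRemoteBlockSizeFetchToMem" -> "spark.network.maxRemoteBlockSizeFetchToMem")

  // Deprecated entries take precedence over alternative-key entries.
  def warningFor(key: String): Option[String] =
    deprecatedMessages.get(key)
      .map(msg => s"The configuration key '$key' has been deprecated. $msg")
      .orElse(allAlternatives.get(key).map(newKey =>
        s"The configuration key '$key' has been deprecated. " +
          s"Please use the new key '$newKey' instead."))
}
```

So setting either old key produces a warning pointing at `spark.network.maxRemoteBlockSizeFetchToMem`, while the new key itself stays silent.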
Test build #117973 has finished for PR 27463 at commit
retest this please
Test build #117981 has finished for PR 27463 at commit
retest this please
Test build #117984 has finished for PR 27463 at commit
retest this please
Test build #117986 has finished for PR 27463 at commit
thanks, merging to master/3.0!
…kSizeFetchToMem`

### What changes were proposed in this pull request?
Add new config `spark.network.maxRemoteBlockSizeFetchToMem`, falling back to the old config `spark.maxRemoteBlockSizeFetchToMem`.

### Why are the changes needed?
For naming consistency.

### Does this PR introduce any user-facing change?
No.

### How was this patch tested?
Existing tests.

Closes #27463 from xuanyuanking/SPARK-26700-follow.

Authored-by: Yuanjian Li <xyliyuanjian@gmail.com>
Signed-off-by: Wenchen Fan <wenchen@databricks.com>
(cherry picked from commit d861357)
Signed-off-by: Wenchen Fan <wenchen@databricks.com>
What changes were proposed in this pull request?
Add new config `spark.network.maxRemoteBlockSizeFetchToMem`, falling back to the old config `spark.maxRemoteBlockSizeFetchToMem`.

Why are the changes needed?
For naming consistency.

Does this PR introduce any user-facing change?
No.

How was this patch tested?
Existing tests.