
Commit

[KYUUBI #1018] Set spark.sql.execution.topKSortFallbackThreshold to 10000

<!--
Thanks for sending a pull request!

Here are some tips for you:
  1. If this is your first time, please read our contributor guidelines: https://kyuubi.readthedocs.io/en/latest/community/contributions.html
  2. If the PR is related to an issue in https://github.com/apache/incubator-kyuubi/issues, add '[KYUUBI #XXXX]' in your PR title, e.g., '[KYUUBI #XXXX] Your PR title ...'.
  3. If the PR is unfinished, add '[WIP]' in your PR title, e.g., '[WIP][KYUUBI #XXXX] Your PR title ...'.
-->

### _Why are the changes needed?_
<!--
Please clarify why the changes are needed. For instance,
  1. If you add a feature, you can talk about the use case of it.
  2. If you fix a bug, you can clarify why it is a bug.
-->
To avoid performance issues in the top-K scenario (ORDER BY ... LIMIT queries) and to close #1018.
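
For context, `spark.sql.execution.topKSortFallbackThreshold` roughly controls when Spark replaces an in-memory top-K sort (`TakeOrderedAndProject`) with a regular sort followed by a limit: an `ORDER BY ... LIMIT n` query uses the in-memory top-K only when `n` is below the threshold. The sketch below is illustrative only and is not part of this change; the object name, app name, table `events`, and columns `ts`/`name` are made up.

```scala
import org.apache.spark.sql.SparkSession

object TopKThresholdSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("topk-threshold-sketch") // hypothetical app name
      .master("local[*]")
      // Mirrors the default this commit sets in the engine.
      .config("spark.sql.execution.topKSortFallbackThreshold", "10000")
      .getOrCreate()

    import spark.implicits._

    // Hypothetical data, only here to have something to sort.
    val events = (1 to 100000).map(i => (i, s"event-$i")).toDF("ts", "name")
    events.createOrReplaceTempView("events")

    // LIMIT 50000 is above the 10000 threshold, so Spark is expected to plan
    // a full sort plus limit instead of an in-memory TakeOrderedAndProject.
    spark.sql("SELECT * FROM events ORDER BY ts DESC LIMIT 50000").explain()

    spark.stop()
  }
}
```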

### _How was this patch tested?_
- [ ] Add some test cases that check the changes thoroughly, including negative and positive cases if possible

- [ ] Add screenshots for manual tests if appropriate

- [ ] [Run test](https://kyuubi.readthedocs.io/en/latest/develop_tools/testing.html#running-tests) locally before making a pull request

Closes #1054 from byyue/feature-1018.

Closes #1018

21de1fc [Brian Yue] Set spark.sql.execution.topKSortFallbackThreshold to 10000

Authored-by: Brian Yue <code.byyue@gmail.com>
Signed-off-by: Kent Yao <yao@apache.org>
byyue authored and yaooqinn committed Sep 9, 2021
1 parent d836742 commit 033784c
Showing 1 changed file with 1 addition and 0 deletions.
```diff
@@ -96,6 +96,7 @@ object SparkSQLEngine extends Logging {
 
   def createSpark(): SparkSession = {
     val sparkConf = new SparkConf()
+    sparkConf.setIfMissing("spark.sql.execution.topKSortFallbackThreshold", "10000")
     sparkConf.setIfMissing("spark.sql.legacy.castComplexTypesToString.enabled", "true")
     sparkConf.setIfMissing("spark.master", "local")
     sparkConf.setIfMissing("spark.ui.port", "0")
```
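
Note that `setIfMissing` only supplies a default: if the key is already present in the configuration (for example, set by the user via `--conf`), that value is kept. A minimal sketch of this behavior follows; the `50000` value is just an example, not anything from this commit.

```scala
import org.apache.spark.SparkConf

object SetIfMissingSketch extends App {
  // User-provided setting (e.g. passed with --conf) is applied first.
  val conf = new SparkConf(false)
    .set("spark.sql.execution.topKSortFallbackThreshold", "50000")

  // The engine-side default is only applied when the key is absent,
  // so the user's 50000 is kept.
  conf.setIfMissing("spark.sql.execution.topKSortFallbackThreshold", "10000")

  assert(conf.get("spark.sql.execution.topKSortFallbackThreshold") == "50000")
}
```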
