[SPARK-10198][SQL] Turn off partition verification by default #8404
Conversation
Test build #41491 has finished for PR 8404 at commit
Test build #41493 has finished for PR 8404 at commit
val testData = ctx.sparkContext.parallelize(
  (1 to 10).map(i => TestData(i, i.toString))).toDF()
testData.registerTempTable("testData")
withSQLConf((SQLConf.HIVE_VERIFY_PARTITION_PATH.key, "false")) {
Should be set to true?
Ah yes, good catch.
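For reference, a minimal sketch of what the corrected test body looks like after the fix, with the flag flipped to "true" (the withSQLConf helper, ctx, and TestData come from Spark's own test utilities; the trailing assertion is a placeholder):

```scala
// Sketch: exercise the partition-path verification code path by explicitly
// enabling it, since the new default is "false".
withSQLConf((SQLConf.HIVE_VERIFY_PARTITION_PATH.key, "true")) {
  val testData = ctx.sparkContext.parallelize(
    (1 to 10).map(i => TestData(i, i.toString))).toDF()
  testData.registerTempTable("testData")
  // ... assertions over reads of the partitioned table go here ...
}
```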
Failing test passes locally after the config fix. I'm going to merge so this can be included in the next RC.
Test build #41536 has finished for PR 8404 at commit
@@ -312,7 +312,7 @@ private[spark] object SQLConf {
       doc = "When true, enable filter pushdown for ORC files.")

   val HIVE_VERIFY_PARTITION_PATH = booleanConf("spark.sql.hive.verifyPartitionPath",
-    defaultValue = Some(true),
+    defaultValue = Some(false),
Why does this place use false as the default value?
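With the default now "false", users who still want missing partition directories to be verified can opt back in per session. A minimal sketch, assuming the Spark 1.5-era HiveContext API (the variable names are illustrative):

```scala
import org.apache.spark.sql.hive.HiveContext

// sc is an existing SparkContext; HiveContext was the Hive-aware SQL entry
// point in Spark 1.x.
val sqlContext = new HiveContext(sc)

// Re-enable the partition path check that this PR turns off by default.
sqlContext.setConf("spark.sql.hive.verifyPartitionPath", "true")
```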