[SPARK-27693][SQL] Add default catalog property #24594
Changes to `SQLConf.scala`:

```diff
@@ -1767,6 +1767,11 @@ object SQLConf {
       "with String")
     .booleanConf
     .createWithDefault(false)
+
+  val DEFAULT_V2_CATALOG = buildConf("spark.sql.default.catalog")
+    .doc("Name of the default v2 catalog, used when a catalog is not identified in queries")
+    .stringConf
+    .createOptional
 }

 /**
@@ -2220,6 +2225,8 @@ class SQLConf extends Serializable with Logging {

   def castDatetimeToString: Boolean = getConf(SQLConf.LEGACY_CAST_DATETIME_TO_STRING)

+  def defaultV2Catalog: Option[String] = getConf(DEFAULT_V2_CATALOG)
+
   /** ********************** SQLConf functionality methods ************ */

   /** Set Spark SQL configuration properties. */
```

Review comments on the `.doc(...)` line:

**Review comment:** nit.

**Review comment:** As we discussed in the DS v2 meeting, we should clearly point out which places this default catalog is used. View/function resolution definitely doesn't use this default catalog for now.

**Reply:** I agree, we need to make clear choices and document where this is used.
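For context, here is a minimal sketch of how the new property would interact with the existing per-catalog registration properties. The catalog name `testcat` and the implementation class are hypothetical, and the `SparkSession` setup is illustrative, not part of this PR:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("default-catalog-example")
  .master("local[*]")
  // spark.sql.catalog.(name) registers a v2 catalog implementation under "name"
  // (com.example.TestCatalog is a made-up class).
  .config("spark.sql.catalog.testcat", "com.example.TestCatalog")
  // The new property names which registered catalog is the default.
  .config("spark.sql.default.catalog", "testcat")
  .getOrCreate()

// Because the conf is built with .createOptional, the accessor added in this
// PR (defaultV2Catalog) yields Option[String]: Some("testcat") here, and
// None when the property is unset.
println(spark.conf.get("spark.sql.default.catalog")) // prints: testcat
```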
**Review comment:** nit: `spark.sql.catalog.default` is more consistent with other SQL config names, which follow `spark.sql.componentName.featureName`.

**@rdblue:** The reason for this order is that `spark.sql.catalog.(name)` properties are used to register catalogs, so the name you suggest would create a catalog named "default" under the existing convention. We could change the registration properties to use `catalogs` instead if you'd prefer; then the `catalog.default` name would not conflict.
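To make the conflict concrete, a sketch under the existing registration convention, reusing the `spark` session from the example above (implementation class names are made up): the suffix after `spark.sql.catalog.` is interpreted as a catalog name, so the suggested property would itself register a catalog.

```scala
// Existing convention: spark.sql.catalog.(name) -> implementation class.
spark.conf.set("spark.sql.catalog.prod", "com.example.ProdCatalog")     // registers catalog "prod"
spark.conf.set("spark.sql.catalog.default", "com.example.OtherCatalog") // registers a catalog named "default"

// Under that convention, "spark.sql.catalog.default" already means
// "the catalog named default", so it cannot also mean
// "which registered catalog is the default".
```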
**Reply:** As @rdblue mentioned, `spark.sql.catalog.default` already has a meaning. Also, `spark.sql.default.catalog` was the name already used in our code base before this PR.

**Reply:** Ah, I see the problem now. Since we already have `spark.sql.defaultSizeInBytes`, shall we name it `spark.sql.defaultCatalog`?

**Reply:** That works for me. I'll open a PR to rename.

**Reply:** +1, too~