
[SPARK-27693][SQL] Add default catalog property #24594

Closed
@@ -1767,6 +1767,11 @@ object SQLConf {
"with String")
.booleanConf
.createWithDefault(false)

val DEFAULT_V2_CATALOG = buildConf("spark.sql.default.catalog")
Contributor

nit: spark.sql.catalog.default is more consistent with other SQL config names: spark.sql.componentName.featureName.

Contributor Author

The reason for this order is that spark.sql.catalog.(name) properties are used to register catalogs, so the property name you suggest would create a catalog named "default" under the existing convention. We could change the registration properties to use the prefix spark.sql.catalogs.(name) instead if you'd prefer; then spark.sql.catalog.default would not conflict.
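To illustrate the conflict (a minimal sketch, assuming a SparkSession named spark; the implementation class names are placeholders):

```scala
// Under the existing convention, spark.sql.catalog.<name> registers a
// catalog implementation under <name>:
spark.conf.set("spark.sql.catalog.testcat", "com.example.TestCatalog")

// So the suggested property name would be read as registering a catalog
// named "default", not as selecting which catalog is the default:
spark.conf.set("spark.sql.catalog.default", "com.example.SomeCatalog")
```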

Member

As @rdblue mentioned, spark.sql.catalog.default already has a meaning under the existing convention. Also, spark.sql.default.catalog was the name already used in our code base before this PR.

Contributor

Ah I see the problem now.

Since we already have spark.sql.defaultSizeInBytes, shall we name it spark.sql.defaultCatalog?

Contributor Author

That works for me. I'll open a PR to rename.

Member

+1, too~

.doc("Name of the default v2 catalog, used when an catalog is not identified in queries")
Member

nit. an catalog -> a catalog.

Contributor

As we discussed in the DS v2 meeting, we should clearly point out which places use this default catalog. View/function resolution definitely doesn't use it for now.

Contributor Author

I agree, we need to make clear choices and document where this is used.

.stringConf
.createOptional
}

/**
@@ -2220,6 +2225,8 @@ class SQLConf extends Serializable with Logging {

def castDatetimeToString: Boolean = getConf(SQLConf.LEGACY_CAST_DATETIME_TO_STRING)

def defaultV2Catalog: Option[String] = getConf(DEFAULT_V2_CATALOG)

/** ********************** SQLConf functionality methods ************ */

/** Set Spark SQL configuration properties. */
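Taken together, a session would configure and consume the new property roughly like this (a sketch only; the catalog name and implementation class are placeholders, and the property name is the pre-rename one from this PR):

```scala
// Register a v2 catalog, then point the new property at it.
spark.conf.set("spark.sql.catalog.testcat", "com.example.TestCatalog")
spark.conf.set("spark.sql.default.catalog", "testcat")

// Inside Spark, the getter added above surfaces the setting as
// conf.defaultV2Catalog == Some("testcat"); because the conf is built with
// createOptional, an unset property yields None rather than an empty string.
```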
@@ -43,6 +43,8 @@ case class DataSourceResolution(

override def lookupCatalog: Option[String => CatalogPlugin] = Some(findCatalog)

def defaultCatalog: Option[CatalogPlugin] = conf.defaultV2Catalog.map(findCatalog)

override def apply(plan: LogicalPlan): LogicalPlan = plan resolveOperators {
case CreateTableStatement(
AsTableIdentifier(table), schema, partitionCols, bucketSpec, properties,
@@ -67,7 +69,7 @@ case class DataSourceResolution(
case create: CreateTableAsSelectStatement =>
// the provider was not a v1 source, convert to a v2 plan
val CatalogObjectIdentifier(maybeCatalog, identifier) = create.tableName
- val catalog = maybeCatalog
+ val catalog = maybeCatalog.orElse(defaultCatalog)
.getOrElse(throw new AnalysisException(
s"No catalog specified for table ${identifier.quoted} and no default catalog is set"))
.asTableCatalog
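The resolution order this change establishes, as a self-contained sketch (simplified: String stands in for the real CatalogPlugin, and a plain RuntimeException for AnalysisException):

```scala
// An explicit catalog in the identifier wins; otherwise fall back to the
// configured default; otherwise fail resolution with the error above.
def resolveCatalog(maybeCatalog: Option[String],
                   defaultCatalog: Option[String]): String =
  maybeCatalog
    .orElse(defaultCatalog)
    .getOrElse(throw new RuntimeException(
      "No catalog specified for table and no default catalog is set"))
```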
@@ -329,8 +329,7 @@ class PlanResolutionSuite extends AnalysisTest {
}
}

- // TODO(rblue): enable this test after the default catalog is available
- ignore("Test v2 CTAS with data source v2 provider") {
+ test("Test v2 CTAS with data source v2 provider") {
val sql =
s"""
|CREATE TABLE IF NOT EXISTS mydb.page_view
@@ -66,8 +66,7 @@ class DataSourceV2SQLSuite extends QueryTest with SharedSQLContext with BeforeAn
checkAnswer(spark.internalCreateDataFrame(rdd, table.schema), spark.table("source"))
}

- // TODO(rblue): enable this test after the default catalog is available
- ignore("CreateTableAsSelect: use v2 plan because provider is v2") {
+ test("CreateTableAsSelect: use v2 plan because provider is v2") {
spark.sql(s"CREATE TABLE table_name USING $orc2 AS SELECT id, data FROM source")

val testCatalog = spark.catalog("testcat").asTableCatalog
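For this test to pass, the suite's setup (not shown in this excerpt) has to register testcat and point the default catalog at it, so that the unqualified table_name resolves through the v2 path. A sketch of what the test can then assert, using the TableCatalog API from this line of work (the exact assertions in the suite may differ):

```scala
val testCatalog = spark.catalog("testcat").asTableCatalog
// loadTable takes an Identifier built from (namespace, name).
val table = testCatalog.loadTable(Identifier.of(Array.empty[String], "table_name"))
assert(table.name == "testcat.table_name")
```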