[SPARK-30282][SQL][FOLLOWUP] SHOW TBLPROPERTIES should support views #28375

Status: Closed · 2 commits
2 changes: 1 addition & 1 deletion docs/sql-migration-guide.md
@@ -59,7 +59,7 @@ license: |

- In Spark 3.0, you can use `ADD FILE` to add file directories as well. Earlier you could add only single files using this command. To restore the behavior of earlier versions, set `spark.sql.legacy.addSingleFileInAddFile` to `true`.

-- In Spark 3.0, `SHOW TBLPROPERTIES` throws `AnalysisException` if the table does not exist. In Spark version 2.4 and below, this scenario caused `NoSuchTableException`. Also, `SHOW TBLPROPERTIES` on a temporary view causes `AnalysisException`. In Spark version 2.4 and below, it returned an empty result.
+- In Spark 3.0, `SHOW TBLPROPERTIES` throws `AnalysisException` if the table does not exist. In Spark version 2.4 and below, this scenario caused `NoSuchTableException`.
Contributor:
Maybe we can remove it. An exception class change isn't worth a migration guide item.

Contributor Author:
This exception class change was brought up in this comment: #26921 (comment). Can I still remove it?

Contributor:
I don't think an error message or exception type change is a breaking change. @dongjoon-hyun, what do you think?


- In Spark 3.0, `SHOW CREATE TABLE` always returns Spark DDL, even when the given table is a Hive SerDe table. For generating Hive DDL, use `SHOW CREATE TABLE AS SERDE` command instead.

@@ -3568,7 +3568,7 @@ class AstBuilder(conf: SQLConf) extends SqlBaseBaseVisitor[AnyRef] with Logging
  override def visitShowTblProperties(
      ctx: ShowTblPropertiesContext): LogicalPlan = withOrigin(ctx) {
    ShowTableProperties(
-      UnresolvedTable(visitMultipartIdentifier(ctx.table)),
+      UnresolvedTableOrView(visitMultipartIdentifier(ctx.table)),
      Option(ctx.key).map(visitTablePropertyKey))
  }

@@ -2010,11 +2010,11 @@ class DDLParserSuite extends AnalysisTest {
test("SHOW TBLPROPERTIES table") {
comparePlans(
parsePlan("SHOW TBLPROPERTIES a.b.c"),
ShowTableProperties(UnresolvedTable(Seq("a", "b", "c")), None))
ShowTableProperties(UnresolvedTableOrView(Seq("a", "b", "c")), None))

comparePlans(
parsePlan("SHOW TBLPROPERTIES a.b.c('propKey1')"),
ShowTableProperties(UnresolvedTable(Seq("a", "b", "c")), Some("propKey1")))
ShowTableProperties(UnresolvedTableOrView(Seq("a", "b", "c")), Some("propKey1")))
}

test("DESCRIBE FUNCTION") {
@@ -562,6 +562,9 @@ class ResolveSessionCatalog(
    case ShowTableProperties(r: ResolvedTable, propertyKey) if isSessionCatalog(r.catalog) =>
      ShowTablePropertiesCommand(r.identifier.asTableIdentifier, propertyKey)

+    case ShowTableProperties(r: ResolvedView, propertyKey) =>
+      ShowTablePropertiesCommand(r.identifier.asTableIdentifier, propertyKey)
+
    case DescribeFunctionStatement(nameParts, extended) =>
      val functionIdent =
        parseSessionCatalogFunctionIdentifier(nameParts, "DESCRIBE FUNCTION")
@@ -918,15 +918,20 @@ case class ShowTablePropertiesCommand(table: TableIdentifier, propertyKey: Optio
  }

  override def run(sparkSession: SparkSession): Seq[Row] = {
-    val catalogTable = sparkSession.sessionState.catalog.getTableMetadata(table)
-    propertyKey match {
-      case Some(p) =>
-        val propValue = catalogTable
-          .properties
-          .getOrElse(p, s"Table ${catalogTable.qualifiedName} does not have property: $p")
-        Seq(Row(propValue))
-      case None =>
-        catalogTable.properties.map(p => Row(p._1, p._2)).toSeq
+    val catalog = sparkSession.sessionState.catalog
+    if (catalog.isTemporaryTable(table)) {
+      Seq.empty[Row]
+    } else {
+      val catalogTable = catalog.getTableMetadata(table)
+      propertyKey match {
+        case Some(p) =>
+          val propValue = catalogTable
+            .properties
+            .getOrElse(p, s"Table ${catalogTable.qualifiedName} does not have property: $p")
+          Seq(Row(propValue))
+        case None =>
+          catalogTable.properties.map(p => Row(p._1, p._2)).toSeq
+      }
    }
  }
}
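
As a quick end-to-end illustration of the patched behavior, here is a minimal local sketch; the session setup and the view names used (`v`, `tv`) are assumptions for demonstration only, not part of this change:

import org.apache.spark.sql.SparkSession

// Illustrative demo: a permanent view now resolves for SHOW TBLPROPERTIES,
// while a temporary view yields an empty result instead of failing analysis.
object ShowTblPropertiesDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[1]").appName("demo").getOrCreate()

    // Permanent view: properties (plus view.* metadata) are returned.
    spark.sql("CREATE VIEW v TBLPROPERTIES('p1'='v1') AS SELECT 1 AS c1")
    spark.sql("SHOW TBLPROPERTIES v").show(truncate = false)

    // Temporary view: the command no longer throws; it returns an empty result.
    spark.sql("CREATE TEMPORARY VIEW tv TBLPROPERTIES('p1'='v1') AS SELECT 1 AS c1")
    spark.sql("SHOW TBLPROPERTIES tv").show()

    spark.stop()
  }
}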
@@ -0,0 +1,26 @@
-- create a table with properties
CREATE TABLE tbl (a INT, b STRING, c INT) USING parquet
TBLPROPERTIES('p1'='v1', 'p2'='v2');

SHOW TBLPROPERTIES tbl;
SHOW TBLPROPERTIES tbl("p1");
SHOW TBLPROPERTIES tbl("p3");

DROP TABLE tbl;

-- create a view with properties
CREATE VIEW view TBLPROPERTIES('p1'='v1', 'p2'='v2') AS SELECT 1 AS c1;

SHOW TBLPROPERTIES view;
SHOW TBLPROPERTIES view("p1");
SHOW TBLPROPERTIES view("p3");

DROP VIEW view;

-- create a temporary view with properties
CREATE TEMPORARY VIEW tv TBLPROPERTIES('p1'='v1') AS SELECT 1 AS c1;

-- Properties for a temporary view should be empty
Contributor:
BTW, I'd treat this as a bug. I don't think anyone would expect `SHOW TBLPROPERTIES` to always return an empty result for temp views.

This is an existing issue; we can deal with it later.

Contributor Author:
@cloud-fan I will create a PR for this. This can be misleading, as you pointed out.

Contributor Author:
@cloud-fan There are a few ways to address this:

  1. Throw an exception if `CREATE TEMPORARY VIEW` contains properties, since they are ignored anyway.
  2. Throw an exception when `SHOW TBLPROPERTIES` is run on a temporary view in `ShowTablePropertiesCommand`.
  3. Create `UnresolvedTableOrPermanentView` and use it for commands that don't support temporary views.

WDYT?

Contributor:
I'd vote for option 3. We have many commands that are named `XXX TABLE` but can work on views.
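
For illustration only, a rough sketch of what option 3 could look like, modeled on the existing `UnresolvedTableOrView` node; the class name `UnresolvedTableOrPermanentView` and the resolution behavior described in the comments are assumptions, not code from this PR:

import org.apache.spark.sql.catalyst.expressions.Attribute
import org.apache.spark.sql.catalyst.plans.logical.LeafNode

// Hypothetical leaf node: resolves to a table or a permanent view, and is meant to be
// rejected by the analyzer when the identifier refers to a temporary view.
case class UnresolvedTableOrPermanentView(multipartIdentifier: Seq[String]) extends LeafNode {
  override lazy val resolved: Boolean = false
  override def output: Seq[Attribute] = Nil
}

// Commands such as SHOW TBLPROPERTIES could then be parsed into
//   ShowTableProperties(UnresolvedTableOrPermanentView(ident), key)
// and a resolution rule would fail analysis for temp views instead of
// silently returning an empty result.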

SHOW TBLPROPERTIES tv;

DROP VIEW tv;
@@ -0,0 +1,114 @@
-- Automatically generated by SQLQueryTestSuite
-- Number of queries: 13


-- !query
CREATE TABLE tbl (a INT, b STRING, c INT) USING parquet
TBLPROPERTIES('p1'='v1', 'p2'='v2')
-- !query schema
struct<>
-- !query output



-- !query
SHOW TBLPROPERTIES tbl
-- !query schema
struct<key:string,value:string>
-- !query output
p1 v1
p2 v2


-- !query
SHOW TBLPROPERTIES tbl("p1")
-- !query schema
struct<value:string>
-- !query output
v1


-- !query
SHOW TBLPROPERTIES tbl("p3")
-- !query schema
struct<value:string>
-- !query output
Table default.tbl does not have property: p3


-- !query
DROP TABLE tbl
-- !query schema
struct<>
-- !query output



-- !query
CREATE VIEW view TBLPROPERTIES('p1'='v1', 'p2'='v2') AS SELECT 1 AS c1
-- !query schema
struct<>
-- !query output



-- !query
SHOW TBLPROPERTIES view
-- !query schema
struct<key:string,value:string>
-- !query output
p1 v1
p2 v2
view.catalogAndNamespace.numParts 2
view.catalogAndNamespace.part.0 spark_catalog
view.catalogAndNamespace.part.1 default
view.query.out.col.0 c1
view.query.out.numCols 1


-- !query
SHOW TBLPROPERTIES view("p1")
-- !query schema
struct<value:string>
-- !query output
v1


-- !query
SHOW TBLPROPERTIES view("p3")
-- !query schema
struct<value:string>
-- !query output
Table default.view does not have property: p3


-- !query
DROP VIEW view
-- !query schema
struct<>
-- !query output



-- !query
CREATE TEMPORARY VIEW tv TBLPROPERTIES('p1'='v1') AS SELECT 1 AS c1
-- !query schema
struct<>
-- !query output



-- !query
SHOW TBLPROPERTIES tv
-- !query schema
struct<key:string,value:string>
-- !query output



-- !query
DROP VIEW tv
-- !query schema
struct<>
-- !query output

@@ -179,7 +179,7 @@ class HiveCommandSuite extends QueryTest with SQLTestUtils with TestHiveSingleto
    val message = intercept[AnalysisException] {
      sql("SHOW TBLPROPERTIES badtable")
    }.getMessage
-    assert(message.contains("Table not found: badtable"))
+    assert(message.contains("Table or view not found: badtable"))

    // When key is not found, a row containing the error is returned.
    checkAnswer(
@@ -193,21 +193,6 @@
checkAnswer(sql("SHOW TBLPROPERTIES parquet_tab2('`prop2Key`')"), Row("prop2Val"))
}

test("show tblproperties for spark temporary table - AnalysisException is thrown") {
withTempView("parquet_temp") {
sql(
"""
|CREATE TEMPORARY VIEW parquet_temp (c1 INT, c2 STRING)
|USING org.apache.spark.sql.parquet.DefaultSource
""".stripMargin)

val message = intercept[AnalysisException] {
sql("SHOW TBLPROPERTIES parquet_temp")
}.getMessage
assert(message.contains("parquet_temp is a temp view not table"))
}
}

  Seq(true, false).foreach { local =>
    val loadQuery = if (local) "LOAD DATA LOCAL" else "LOAD DATA"
    test(loadQuery) {