Conversation

@viirya (Member) commented Oct 20, 2019

What changes were proposed in this pull request?

Add CacheTableStatement and make CACHE TABLE go through the same catalog/table resolution framework as other v2 commands.

Why are the changes needed?

It's important that all commands share the same table resolution behavior, to avoid confusing end users. For example:

USE my_catalog
DESC t        // succeeds, describing table t from my_catalog
CACHE TABLE t // reports table not found, as there is no table t in the session catalog

Does this PR introduce any user-facing change?

Yes. When running CACHE TABLE, Spark now fails the command if the current catalog is set to a v2 catalog, or if the table name specifies a v2 catalog.

How was this patch tested?

Unit tests.
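To illustrate the resolution behavior described above, here is a minimal self-contained sketch (not Spark's actual implementation; the field names and the toy resolver are assumptions) of how a parsed statement can carry an unresolved multi-part table name that is later resolved against the current catalog:

```scala
// Hypothetical sketch: a parsed CACHE TABLE statement holding an unresolved
// multi-part name, resolved later against the current catalog. This mirrors
// the idea behind CacheTableStatement but is NOT Spark's actual code.
case class CacheTableStatement(
    tableName: Seq[String], // e.g. Seq("my_catalog", "db", "t")
    isLazy: Boolean)

object ResolutionDemo {
  // Toy resolver: prepend the current catalog when the name carries no
  // explicit catalog part, so DESC and CACHE TABLE see the same table.
  def resolve(currentCatalog: String, name: Seq[String]): Seq[String] =
    if (name.length >= 3) name else currentCatalog +: name

  def main(args: Array[String]): Unit = {
    val stmt = CacheTableStatement(Seq("t"), isLazy = false)
    println(resolve("my_catalog", stmt.tableName).mkString("."))
  }
}
```

With this shape, CACHE TABLE t under USE my_catalog resolves to my_catalog.t, matching the DESC behavior from the example above.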

@SparkQA commented Oct 20, 2019

Test build #112333 has finished for PR 26179 at commit 93e8ec6.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • case class CacheTableStatement(

@SparkQA commented Oct 20, 2019

Test build #112347 has finished for PR 26179 at commit 52e9ee4.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Oct 20, 2019

Test build #112348 has finished for PR 26179 at commit 27e8b39.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@viirya (Member Author) commented Oct 23, 2019

@cloud-fan @imback82 @rdblue

}

/**
<<<<<<< HEAD
Contributor:
remove?

Member Author:
oops

val query = Option(ctx.query).map(plan)
val tableName = visitMultipartIdentifier(ctx.multipartIdentifier)
if (query.isDefined && tableName.length > 1) {
val catalogAndDatabase = tableName.init.mkString(".")
Contributor:
catalogAndNamespace

val query = Option(ctx.query).map(plan)
val tableName = visitMultipartIdentifier(ctx.multipartIdentifier)
if (query.isDefined && tableName.length > 1) {
val catalogAndDatabase = tableName.init.mkString(".")
Contributor:
use MultipartIdentifierHelper.quoted?

val tableName = visitMultipartIdentifier(ctx.multipartIdentifier)
if (query.isDefined && tableName.length > 1) {
val catalogAndDatabase = tableName.init.mkString(".")
throw new ParseException("It is not allowed to add catalog/database " +
Contributor:
namespace here as well
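Putting the review suggestions together (rename catalogAndDatabase to catalogAndNamespace, quote the name parts, and say "namespace" in the message), the parser check might look roughly like the sketch below. It is self-contained for illustration: the local quoted helper is a simplified stand-in for Spark's MultipartIdentifierHelper.quoted, and the function shape is an assumption, not the PR's final code.

```scala
// Sketch of the CACHE TABLE ... AS SELECT name check with the reviewers'
// suggestions applied. Returns the error message instead of throwing a
// ParseException so the logic is easy to exercise standalone.
object ParserCheckDemo {
  // Simplified stand-in for MultipartIdentifierHelper.quoted: backtick-quote
  // any part that contains a dot, then join the parts with dots.
  def quoted(parts: Seq[String]): String =
    parts.map(p => if (p.contains(".")) s"`$p`" else p).mkString(".")

  // A multi-part name is only rejected when a query is present, i.e. for
  // CACHE TABLE ... AS SELECT, mirroring the check in the diff above.
  def checkCacheTableName(hasQuery: Boolean, tableName: Seq[String]): Option[String] =
    if (hasQuery && tableName.length > 1) {
      val catalogAndNamespace = quoted(tableName.init)
      Some(s"It is not allowed to add catalog/namespace prefix $catalogAndNamespace " +
        "to the table name in CACHE TABLE AS SELECT")
    } else {
      None
    }

  def main(args: Array[String]): Unit =
    println(checkCacheTableName(hasQuery = true, Seq("ns", "t")))
}
```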

withTable(t) {
spark.sql(s"CREATE TABLE $t (id bigint, data string) USING foo")

val e1 = intercept[AnalysisException] {
Contributor:
let's use testV1Command.

@imback82 (Contributor) left a comment:
LGTM

@SparkQA commented Oct 24, 2019

Test build #112569 has finished for PR 26179 at commit 1303ea3.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • skeleton_class = type_constructor(name, bases, type_kwargs)
  • enum_class = metacls.__new__(metacls, name, bases, classdict)
  • case class BitAndAgg(child: Expression) extends DeclarativeAggregate with ExpectsInputTypes
  • case class BitOrAgg(child: Expression) extends DeclarativeAggregate with ExpectsInputTypes
  • case class CreateNamedStruct(children: Seq[Expression]) extends Expression
  • abstract class AbstractSqlParser(conf: SQLConf) extends ParserInterface with Logging
  • class CatalystSqlParser(conf: SQLConf) extends AbstractSqlParser(conf)
  • case class CreateNamespaceStatement(
  • case class TruncateTableStatement(
  • case class ShowPartitionsStatement(
  • case class RefreshTableStatement(tableName: Seq[String]) extends ParsedStatement
  • case class CreateNamespace(
  • case class RefreshTable(
  • class SparkSqlParser(conf: SQLConf) extends AbstractSqlParser(conf)
  • case class CreateNamespaceExec(
  • case class RefreshTableExec(

@SparkQA commented Oct 24, 2019

Test build #112577 has finished for PR 26179 at commit 0a44510.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@cloud-fan cloud-fan closed this in 177bf67 Oct 24, 2019
@cloud-fan (Contributor) commented:

thanks, merging to master!

@viirya deleted the SPARK-29522 branch on December 27, 2023.