[SPARK-33074][SQL] Classify dialect exceptions in JDBC v2 Table Catalog #29952


Closed
MaxGekk wants to merge 5 commits

Conversation

MaxGekk
Member

MaxGekk commented Oct 6, 2020

What changes were proposed in this pull request?

  1. Add a new method, classifyException(), to the JdbcDialect class (sketched below). It converts dialect-specific exceptions into Spark's AnalysisException or one of its subclasses.
  2. Replace the H2 exception org.h2.jdbc.JdbcSQLException in JDBCTableCatalogSuite with AnalysisException.
  3. Add H2Dialect.
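
A minimal sketch of item 1, assuming the hook is a method on JdbcDialect with a default implementation that wraps the original error (the exact signature and wording in the merged patch may differ):

package org.apache.spark.sql.jdbc  // sketch placed in-tree so AnalysisException's constructor is accessible

import org.apache.spark.sql.AnalysisException

// Sketch only: the default classification hook added to the JdbcDialect API.
// Concrete dialects can override it to return more specific subclasses.
abstract class JdbcDialect extends Serializable {
  // ... existing dialect API (canHandle, getCatalystType, quoteIdentifier, ...) ...

  /** Wrap a driver-specific error into Spark's AnalysisException, keeping the cause. */
  def classifyException(message: String, e: Throwable): AnalysisException =
    new AnalysisException(message, cause = Some(e))
}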

Why are the changes needed?

Currently, the JDBC v2 Table Catalog implementation throws dialect-specific exceptions and ignores the exceptions defined in the TableCatalog interface. This PR adds a new method for converting dialect-specific exceptions, assuming that follow-up PRs will implement classifyException() for other dialects.
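
For illustration, a dialect-specific implementation could map driver error codes onto the exceptions defined for TableCatalog. The sketch below is H2-flavoured; the error codes (from org.h2.api.ErrorCode) and the chosen exception subclasses are assumptions for the example, not necessarily the merged H2Dialect:

package org.apache.spark.sql.jdbc  // sketch placed next to the other dialects

import java.sql.SQLException

import org.apache.spark.sql.AnalysisException
import org.apache.spark.sql.catalyst.analysis.{NoSuchNamespaceException, NoSuchTableException, TableAlreadyExistsException}

// Illustrative sketch: translate H2 error codes into the catalog exceptions and
// fall back to the default AnalysisException wrapping for everything else.
object H2DialectSketch extends JdbcDialect {
  override def canHandle(url: String): Boolean = url.startsWith("jdbc:h2")

  override def classifyException(message: String, e: Throwable): AnalysisException = e match {
    case ex: SQLException => ex.getErrorCode match {
      case 42101 => new TableAlreadyExistsException(message, cause = Some(ex)) // TABLE_OR_VIEW_ALREADY_EXISTS_1
      case 42102 => new NoSuchTableException(message, cause = Some(ex))        // TABLE_OR_VIEW_NOT_FOUND_1
      case 90079 => new NoSuchNamespaceException(message, cause = Some(ex))    // SCHEMA_NOT_FOUND_1
      case _     => super.classifyException(message, ex)
    }
    case _ => super.classifyException(message, e)
  }
}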

Does this PR introduce any user-facing change?

Yes.

How was this patch tested?

By running the existing test suites JDBCTableCatalogSuite and JDBCV2Suite.

@SparkQA

SparkQA commented Oct 6, 2020

Kubernetes integration test starting
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/34046/

@SparkQA

SparkQA commented Oct 6, 2020

Kubernetes integration test status success
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/34046/

@SparkQA

SparkQA commented Oct 6, 2020

Test build #129439 has finished for PR 29952 at commit b2e7a72.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@MaxGekk
Member Author

MaxGekk commented Oct 6, 2020

@HyukjinKwon @cloud-fan @maropu @huaxingao Could you review this PR, please?

@MaxGekk
Member Author

MaxGekk commented Oct 6, 2020

I'm thinking of continuing with this after #29957.

…eption

# Conflicts:
#	sql/core/src/main/scala/org/apache/spark/sql/jdbc/JdbcDialects.scala
#	sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/v2/jdbc/JDBCTableCatalogSuite.scala
@SparkQA

SparkQA commented Oct 7, 2020

Kubernetes integration test starting
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/34108/

@SparkQA

SparkQA commented Oct 7, 2020

Kubernetes integration test status success
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/34108/

@SparkQA

SparkQA commented Oct 7, 2020

Test build #129502 has finished for PR 29952 at commit 90fcaf3.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • class TableAlreadyExistsException(message: String, cause: Option[Throwable] = None)
  • class NoSuchNamespaceException(message: String, cause: Option[Throwable] = None)
  • class NoSuchTableException(message: String, cause: Option[Throwable] = None)

@SparkQA

SparkQA commented Oct 7, 2020

Kubernetes integration test starting
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/34118/

@SparkQA

SparkQA commented Oct 7, 2020

Kubernetes integration test status success
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/34118/

@MaxGekk
Member Author

MaxGekk commented Oct 7, 2020

@HyukjinKwon @cloud-fan @maropu @huaxingao I think this PR is ready for review. Please have a look at it.

@@ -70,7 +70,9 @@ class JDBCTableCatalog extends TableCatalog with Logging {
     checkNamespace(ident.namespace())
     val writeOptions = new JdbcOptionsInWrite(
       options.parameters + (JDBCOptions.JDBC_TABLE_NAME -> getTableName(ident)))
-    withConnection(JdbcUtils.tableExists(_, writeOptions))
+    classifyException(s"Failed table existence check: $ident") {
Contributor


shall we be consistent and always put classifyException inside withConnection?

Member Author


I don't think that mixing two conceptually different things will make them consistent.
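
For reference, the catalog-side wrapper that the call site above implies could look roughly like this; the helper name comes from the diff, while the exact signature and the `dialect` field are assumptions about JDBCTableCatalog rather than the merged code:

// Sketch of the JDBCTableCatalog helper implied by the call site above: run the
// connection-scoped block and let the resolved JdbcDialect classify any failure.
private def classifyException[T](message: String)(f: => T): T = {
  try {
    f
  } catch {
    case e: Throwable => throw dialect.classifyException(message, e)
  }
}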

@@ -297,6 +308,7 @@ object JdbcDialects {
   registerDialect(DerbyDialect)
   registerDialect(OracleDialect)
   registerDialect(TeradataDialect)
+  registerDialect(H2Dialect)
Contributor


since we already have a testH2Dialect, how about we update testH2Dialect to implement classifyException, and use it in JDBCTableCatalogSuite? Then we don't need to have an official H2Dialect.

Member Author


If you look at testH2Dialect, it has test-specific settings. Why should we have those settings in JDBCTableCatalogSuite?

Then we don't need to have an official H2Dialect.

What is the problem with having a built-in H2Dialect?

Member Author


Probably not related to this, but if we don't want to support H2 officially as a dialect, why do we test it so broadly in Spark? Maybe it makes sense to switch all internal Spark tests to Derby?

@SparkQA

SparkQA commented Oct 7, 2020

Test build #129513 has finished for PR 29952 at commit ac55879.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@cloud-fan
Contributor

thanks, merging to master!
