[MINOR][SQL] Remove toLowerCase(Locale.ROOT) for `CATALOG_IMPLEMENTATION`

### What changes were proposed in this pull request?

This PR aims to remove redundant `toLowerCase(Locale.ROOT)` transforms when checking `CATALOG_IMPLEMENTATION` values.

### Why are the changes needed?

The config already validates its value via `checkValues`:

https://github.com/apache/spark/blob/9d9675922543e3e5c3b01023e5a756462a1fd308/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/StaticSQLConf.scala#L52
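As a rough illustration of why the lower-casing is redundant (a minimal sketch, not Spark's actual `ConfigBuilder` internals; the `StaticConf` class and its fields below are hypothetical stand-ins), a `checkValues`-style check rejects any value outside the allowed set at set-time, so readers can compare the stored value directly:

```scala
object CatalogImplementationSketch {
  // Hypothetical stand-in for a statically validated config entry, modeled
  // after StaticSQLConf.CATALOG_IMPLEMENTATION, which Spark declares with
  // .checkValues(...) so only known catalog names are ever accepted.
  final class StaticConf(val key: String, default: String, allowed: Set[String]) {
    private var value: String = default

    def set(v: String): Unit = {
      // Validation happens here, once, at set-time: "Hive" or "HIVE" is
      // rejected outright instead of being normalized later by every reader.
      require(allowed.contains(v), s"$key must be one of ${allowed.mkString(", ")}, got: $v")
      value = v
    }

    def get: String = value
  }

  val CATALOG_IMPLEMENTATION =
    new StaticConf("spark.sql.catalogImplementation", "in-memory", Set("hive", "in-memory"))

  def main(args: Array[String]): Unit = {
    CATALOG_IMPLEMENTATION.set("hive")
    // A case-mangled value could never have been stored, so no
    // toLowerCase(Locale.ROOT) is needed at the comparison site.
    assert(CATALOG_IMPLEMENTATION.get == "hive")
  }
}
```

Under this assumption, every call site that reads the config can use a plain `== "hive"` comparison, which is exactly the simplification the diff below makes.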

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Pass the CIs.

I also manually checked the following; I believe these are all the occurrences.

```
$ git grep -C1 '.toLowerCase(Locale.ROOT)' | grep '"hive'
repl/src/main/scala/org/apache/spark/repl/Main.scala-            .get(CATALOG_IMPLEMENTATION.key, "hive")
repl/src/main/scala/org/apache/spark/repl/Main.scala:            .toLowerCase(Locale.ROOT) == "hive") {
sql/core/src/main/scala/org/apache/spark/sql/api/r/SQLUtils.scala:          jsc.sc.conf.get(CATALOG_IMPLEMENTATION.key, "hive").toLowerCase(Locale.ROOT) ==
sql/core/src/main/scala/org/apache/spark/sql/api/r/SQLUtils.scala-            "hive" &&
sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveSchemaInferenceSuite.scala-        provider = Option("hive"),
```

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes apache#45184 from dongjoon-hyun/SPARK_CATALOG_IMPLEMENTATION.

Authored-by: Dongjoon Hyun <dhyun@apple.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
dongjoon-hyun committed Feb 21, 2024
1 parent 8ede494 commit 76575ee
Showing 2 changed files with 3 additions and 7 deletions.
5 changes: 1 addition & 4 deletions repl/src/main/scala/org/apache/spark/repl/Main.scala

```diff
@@ -19,7 +19,6 @@ package org.apache.spark.repl

 import java.io.File
 import java.net.URI
-import java.util.Locale

 import scala.tools.nsc.GenericRunnerSettings

@@ -104,9 +103,7 @@ object Main extends Logging {
     }

     val builder = SparkSession.builder().config(conf)
-    if (conf
-        .get(CATALOG_IMPLEMENTATION.key, "hive")
-        .toLowerCase(Locale.ROOT) == "hive") {
+    if (conf.get(CATALOG_IMPLEMENTATION.key, "hive") == "hive") {
       if (SparkSession.hiveClassesArePresent) {
         // In the case that the property is not set at all, builder's config
         // does not have this value set to 'hive' yet. The original default
```
5 changes: 2 additions & 3 deletions sql/core/src/main/scala/org/apache/spark/sql/api/r/SQLUtils.scala

```diff
@@ -18,7 +18,7 @@
 package org.apache.spark.sql.api.r

 import java.io.{ByteArrayInputStream, ByteArrayOutputStream, DataInputStream, DataOutputStream}
-import java.util.{Locale, Map => JMap}
+import java.util.{Map => JMap}

 import scala.jdk.CollectionConverters._
 import scala.util.matching.Regex

@@ -46,8 +46,7 @@ private[sql] object SQLUtils extends Logging {
       enableHiveSupport: Boolean): SparkSession = {
     val spark =
       if (enableHiveSupport &&
-          jsc.sc.conf.get(CATALOG_IMPLEMENTATION.key, "hive").toLowerCase(Locale.ROOT) ==
-            "hive" &&
+          jsc.sc.conf.get(CATALOG_IMPLEMENTATION.key, "hive") == "hive" &&
           // Note that the order of conditions here are on purpose.
           // `SparkSession.hiveClassesArePresent` checks if Hive's `HiveConf` is loadable or not;
           // however, `HiveConf` itself has some static logic to check if Hadoop version is
```