[SPARK-53438][CONNECT][SQL] Use CatalystConverter in LiteralExpressionProtoConverter #52188
cloud-fan reviewed on Sep 2, 2025 (comment on ...er/src/main/scala/org/apache/spark/sql/connect/planner/LiteralExpressionProtoConverter.scala, resolved)
cloud-fan reviewed on Sep 4, 2025 (comment on ...t/common/src/main/scala/org/apache/spark/sql/connect/common/LiteralValueProtoConverter.scala, resolved)
heyihong (Author): @cloud-fan @hvanhovell @zhengruifeng Please take another look
cloud-fan approved these changes on Sep 17, 2025
Contributor: thanks, merging to master!
huangxiaopingRD pushed a commit (…nProtoConverter) to huangxiaopingRD/spark that referenced this pull request on Nov 25, 2025:
### What changes were proposed in this pull request?
This PR refactors the `LiteralExpressionProtoConverter` to use `CatalystTypeConverters` for consistent type conversion, eliminating code duplication and improving maintainability.
**Key changes:**
1. **Simplified `LiteralExpressionProtoConverter.toCatalystExpression()`**: Replaced the large switch statement (86 lines) with a clean 3-line implementation that leverages existing conversion utilities (see the sketch after this list)
2. **Added TIME type support**: Added missing TIME literal type conversion in `LiteralValueProtoConverter.toScalaValue()`
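For context, below is a minimal sketch of the conversion pattern these changes lean on (not the actual PR code; the `dataType` and `scalaValue` values are hard-coded here for illustration): given a plain Scala value and its Catalyst `DataType`, `CatalystTypeConverters` produces the internal representation that `expressions.Literal` accepts, including for nested arrays of structs.
```scala
import org.apache.spark.sql.catalyst.CatalystTypeConverters
import org.apache.spark.sql.catalyst.expressions.Literal
import org.apache.spark.sql.types._

// Hypothetical inputs: in the converter they would be decoded from the proto
// literal (e.g. via LiteralValueProtoConverter.toScalaValue) rather than hard-coded.
val dataType = ArrayType(new StructType().add("a", IntegerType).add("b", StringType))
val scalaValue = Seq((1, "a"))

// CatalystTypeConverters turns the Scala value (a Seq of tuples/case classes) into
// Catalyst's internal form (ArrayData of InternalRows), which Literal then wraps.
val catalystValue = CatalystTypeConverters.createToCatalystConverter(dataType)(scalaValue)
val literal = Literal(catalystValue, dataType)
```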
### Why are the changes needed?
1. **Type conversion issues**: Some complex nested data structures (e.g., arrays of case classes) failed to convert to Catalyst's internal representation when using `expressions.Literal.create(...)`.
2. **Inconsistent behaviors**: Spark Connect and classic Spark behaved differently for the same data types (e.g., Decimal).
### Does this PR introduce _any_ user-facing change?
**Yes** - Users can now use `typedlit` with an array of case classes. Previously, `typedlit(Array(CaseClass(1, "a")))` failed with the error shown in the snippet below; it now works correctly and converts the case classes to proper struct representations.
```scala
import org.apache.spark.sql.functions.typedlit
case class CaseClass(a: Int, b: String)
spark.sql("select 1").select(typedlit(Array(CaseClass(1, "a")))).collect()
// Below is the error message:
"""
org.apache.spark.SparkIllegalArgumentException: requirement failed: Literal must have a corresponding value to array<struct<a:int,b:string>>, but class GenericArrayData found.
scala.Predef$.require(Predef.scala:337)
org.apache.spark.sql.catalyst.expressions.Literal$.validateLiteralValue(literals.scala:306)
org.apache.spark.sql.catalyst.expressions.Literal.<init>(literals.scala:456)
org.apache.spark.sql.catalyst.expressions.Literal$.create(literals.scala:206)
org.apache.spark.sql.connect.planner.LiteralExpressionProtoConverter$.toCatalystExpression(LiteralExpressionProtoConverter.scala:103)
"""
```
In addition, the rendering of some Catalyst values has changed (e.g., Decimal 89.97620 -> 89.976200000000000000). Please see the changes in `explain-results/` for details.
```scala
import org.apache.spark.sql.functions.typedlit
spark.sql("select 1").select(typedlit(BigDecimal(8997620, 5)),typedlit(Array(BigDecimal(8997620, 5), BigDecimal(8997621, 5)))).explain()
// explain() output before this change:
"""
Project [89.97620 AS 89.97620#819, [89.97620,89.97621] AS ARRAY(89.97620BD, 89.97621BD)#820]
"""
// explain() output after this change (i.e., the same as classic mode):
"""
Project [89.976200000000000000 AS 89.976200000000000000#132, [89.976200000000000000,89.976210000000000000] AS ARRAY(89.976200000000000000BD, 89.976210000000000000BD)#133]
"""
```
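The shift to 18 fractional digits matches how classic Spark types `typedlit(BigDecimal(...))`: the encoder-based path uses `DecimalType.SYSTEM_DEFAULT`, i.e. `decimal(38, 18)`, so the literal is rendered with scale 18. A quick way to check this in classic mode (a sketch assuming a running `spark` session; the trailing comment shows the expected schema, not captured output):
```scala
import org.apache.spark.sql.functions.typedlit

// typedlit goes through an encoder, so a Scala BigDecimal is typed as
// DecimalType.SYSTEM_DEFAULT = decimal(38, 18) regardless of the value's own scale.
val df = spark.sql("select 1").select(typedlit(BigDecimal(8997620, 5)))
df.printSchema()
// expected: a single column of type decimal(38,18), hence the 18 fractional digits
// (89.976200000000000000) seen in the explain output above
```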
### How was this patch tested?
`build/sbt "connect-client-jvm/testOnly org.apache.spark.sql.PlanGenerationTestSuite"`
`build/sbt "connect/testOnly org.apache.spark.sql.connect.ProtoToParsedPlanTestSuite"`
### Was this patch authored or co-authored using generative AI tooling?
Generated-by: Cursor 1.4.5
Closes apache#52188 from heyihong/SPARK-53438.
Authored-by: Yihong He <heyihong.cn@gmail.com>
Signed-off-by: Wenchen Fan <wenchen@databricks.com>