Prepare Spark 4.0 shims #372

Open
kazuyukitanimura opened this issue May 2, 2024 · 1 comment
Labels
enhancement New feature or request

Comments

@kazuyukitanimura
Contributor

What is the problem the feature request solves?

Will create a spark-4.0 shim directory (see the sketch after this issue description).

Describe the potential solution

No response

Additional context

No response
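For context, a minimal sketch of the usual shim pattern a spark-4.0 directory would follow. All names below are illustrative assumptions, not taken from the Comet codebase:

```scala
// Hypothetical sketch only: none of these names come from the Comet codebase.
// The usual shim pattern: a shared trait, with one implementation compiled per
// Spark version, and the build (e.g. a Maven profile) selecting the spark-4.0
// source directory when targeting Spark 4.0.
trait ShimExample {
  // Anything whose API differs across Spark versions goes behind methods like this.
  def sparkVersion: String
}

// This implementation would live under the new spark-4.0 shim source directory.
object Spark40Shim extends ShimExample {
  override def sparkVersion: String = "4.0"
}
```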

@kazuyukitanimura kazuyukitanimura added the enhancement New feature or request label May 2, 2024
@kazuyukitanimura
Contributor Author

I am on it

kazuyukitanimura pushed a commit that referenced this issue Jul 19, 2024
## Which issue does this PR close?
Part of #372 and #551

## Rationale for this change
With Spark 4.0, the `SubquerySuite` in Spark fails because Comet scan did not support the scalar subquery feature.

## What changes are included in this PR?
Adds support for scalar subquery pushdown into Comet scan.

## How are these changes tested?
Existing Spark SQL unit tests in `SubquerySuite`.
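For illustration, a minimal sketch of the general idea behind scalar subquery pushdown (this is not the Comet implementation; the object and method names are hypothetical): an executed scalar subquery reduces to a single constant, so it can be rewritten as a `Literal` that a scan can treat like any other pushed-down constant.

```scala
// Hedged sketch, not the Comet implementation: it only illustrates the general idea.
// An executed scalar subquery evaluates to a single constant, so it can be replaced
// with a Literal, which a scan can treat like any other constant in a pushed filter.
import org.apache.spark.sql.catalyst.expressions.{Expression, Literal}
import org.apache.spark.sql.execution.ScalarSubquery

object ScalarSubqueryPushdownSketch {
  def replaceExecutedScalarSubqueries(expr: Expression): Expression = expr.transform {
    // Assumes the subquery has already been executed (updateResult() has run);
    // eval() would fail otherwise.
    case s: ScalarSubquery => Literal.create(s.eval(), s.dataType)
  }
}
```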
kazuyukitanimura added a commit that referenced this issue Jul 20, 2024
## Which issue does this PR close?

Part of #372 and #551

## Rationale for this change

To be ready for Spark 4.0

## What changes are included in this PR?

This PR fixes a test that expects to see a `SparkArithmeticException`.

## How are these changes tested?

Enabled `SPARK-40389: Don't eliminate a cast which can cause overflow`
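For illustration, a minimal standalone sketch (not the PR's test code) of the behavior the re-enabled test checks: with ANSI mode enabled, an overflowing cast must raise a `SparkArithmeticException` rather than silently wrap.

```scala
// Illustrative sketch only, not the PR's test code. Under ANSI mode an overflowing
// cast must raise SparkArithmeticException instead of silently wrapping, which is
// what the re-enabled SPARK-40389 test expects to observe.
import org.apache.spark.SparkArithmeticException
import org.apache.spark.sql.SparkSession

object CastOverflowExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .config("spark.sql.ansi.enabled", "true")
      .getOrCreate()
    try {
      // 2147483648 is one past Int.MaxValue, so this cast cannot succeed under ANSI mode.
      spark.sql("SELECT CAST(2147483648L AS INT)").collect()
    } catch {
      case e: SparkArithmeticException =>
        println(s"Got the expected overflow error: ${e.getMessage}")
    } finally {
      spark.stop()
    }
  }
}
```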
himadripal pushed a commit to himadripal/datafusion-comet that referenced this issue Sep 7, 2024 (same commit messages as above, mirrored in the fork).