
[SPARK-48843][FOLLOWUP] Adding a PySpark test. #47653


Closed
nemanjapetr-db wants to merge 5 commits

Conversation

nemanjapetr-db
Contributor

What changes were proposed in this pull request?

This PR adds a PySpark test that checks parameter binding wrapped within a Limit node, which caused an infinite loop before the #47271 bug fix. The test will fail if the bug is accidentally reintroduced.
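For illustration only (not part of the PR text), a minimal PySpark sketch of the scenario described above, using the parameterized spark.sql API; the query, values, and assertion are assumptions, not the PR's actual test code:

```python
from pyspark.sql import Row, SparkSession

spark = SparkSession.builder.getOrCreate()

# A parameterized query whose plan is wrapped in a Limit node (the LIMIT clause).
# Before the #47271 fix, binding the parameter under Limit could loop indefinitely;
# with the fix the query resolves and returns normally.
df = spark.sql("SELECT * FROM range(10) WHERE id = :id LIMIT 5", args={"id": 7})
assert df.collect() == [Row(id=7)]
```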

Why are the changes needed?

The PySpark test complements the Scala test.

Does this PR introduce any user-facing change?

No

How was this patch tested?

Manually ran the test.

Was this patch authored or co-authored using generative AI tooling?

No.

…s with BindParameters. Executes a query where parameter binding was wrapped in a Limit node.

assert cls.spark is not None
assert cls.spark._jvm.SparkSession.getDefaultSession().isDefined()

def test_wrapping_plan_in_limit_node(self):
Contributor

Any reason why we need a PySpark-specific test for this?

Contributor

spark.sql behaves the same in Scala Spark and Python Spark; otherwise we would need to duplicate a lot of tests in PySpark.

@nemanjapetr-db
Contributor Author

Cancelling this PR.

@nemanjapetr-db nemanjapetr-db deleted the infiniteloop2 branch August 13, 2024 12:35
3 participants