[SPARK-27563][SQL][TEST] automatically get the latest Spark versions in HiveExternalCatalogVersionsSuite #24454
Conversation
.filter(_.contains("""<li><a href="spark-"""))
.map("""<a href="spark-(\d.\d.\d)/">""".r.findFirstMatchIn(_).get.group(1))
.filter(_ < org.apache.spark.SPARK_VERSION)
logInfo(s"Testing ${org.apache.spark.SPARK_VERSION} with ${versions.mkString(", ")}.")
We can check the testing log to make sure we picked the right versions to test.
Nice! @cloud-fan
Cool!
.split("\n")
.filter(_.contains("""<li><a href="spark-"""))
.map("""<a href="spark-(\d.\d.\d)/">""".r.findFirstMatchIn(_).get.group(1))
.filter(_ < org.apache.spark.SPARK_VERSION)
isn't this always true?
This PR should be merged to all the active branches: 2.3, 2.4, and master. Branch 2.3 should not test with Spark 2.4.x.
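Worth noting when reviewing this filter: `_ < org.apache.spark.SPARK_VERSION` is a plain lexicographic `String` comparison, which only coincides with numeric version ordering while every component is a single digit. A minimal illustration (not part of the patch, version strings chosen for the example):

```scala
// Sketch: lexicographic String comparison vs. numeric version ordering.
object VersionCompareSketch {
  def main(args: Array[String]): Unit = {
    // Single-digit components: lexicographic order matches numeric order.
    assert("2.3.3" < "2.4.2")
    // A hypothetical multi-digit component would break it: 2.10.0 is
    // numerically newer than 2.4.0, but sorts before it as a String
    // because '1' < '4' at the third character.
    assert("2.10.0" < "2.4.0")
    println("ok")
  }
}
```

This is harmless for the version strings Spark actually publishes here, but it explains why the regex `(\d.\d.\d)` and the `<` filter go hand in hand.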
val testingVersions = Seq("2.3.3", "2.4.2")
lazy val testingVersions: Seq[String] = {
  import scala.io.Source
  val versions = Source.fromURL("https://dist.apache.org/repos/dist/release/spark/").mkString
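Pulling the hunks above together, the scraping pipeline can be sketched as a self-contained snippet run against a hard-coded sample of the dist page HTML rather than the live URL (the sample lines and the stand-in `currentVersion` are assumptions for illustration):

```scala
// Sketch of the version-scraping logic from the patch, applied to a
// hard-coded sample instead of fetching the live dist page.
object VersionScrapeSketch {
  // Sample lines mimicking https://dist.apache.org/repos/dist/release/spark/
  val sampleHtml: String =
    """<li><a href="spark-2.3.3/">spark-2.3.3/</a></li>
      |<li><a href="spark-2.4.2/">spark-2.4.2/</a></li>
      |<li><a href="KEYS">KEYS</a></li>""".stripMargin

  // Stand-in for org.apache.spark.SPARK_VERSION, assumed for the example.
  val currentVersion = "3.0.0"

  def scrapeVersions(html: String): Seq[String] =
    html.split("\n").toSeq
      .filter(_.contains("""<li><a href="spark-"""))                              // keep release links only
      .map("""<a href="spark-(\d.\d.\d)/">""".r.findFirstMatchIn(_).get.group(1)) // extract "x.y.z"
      .filter(_ < currentVersion)                                                 // older than the build under test

  def main(args: Array[String]): Unit =
    println(scrapeVersions(sampleHtml).mkString(", "))
}
```

Running it prints `2.3.3, 2.4.2`, matching the hard-coded `Seq("2.3.3", "2.4.2")` the patch removes.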
BTW, what if this link is temporarily unavailable?
Even if this is a potential issue, I think we should merge this and see whether the issue is practically ignorable or not.
This assumes that dist has only the latest versions in the active branches, which it should, but that depends on us deleting previous releases diligently; I'm not sure whether that's in the release process. Ideally this would have further logic to pick only the latest version per branch, but I'm not sure whether that's complicated to write. Right now this would actually run an extra set of tests.
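One way to guard against the dist page being temporarily unreachable, as raised above, would be to fall back to a pinned list on failure. This is purely a sketch, not part of the patch; `fetch` is a hypothetical stand-in for the `Source.fromURL(...).mkString` call, and the pinned list mirrors the hard-coded `Seq` the patch removes:

```scala
import scala.util.{Failure, Success, Try}

// Sketch: fall back to pinned versions if the dist page cannot be fetched.
object FallbackSketch {
  // Hypothetical fetcher; in the suite this would be
  // Source.fromURL("https://dist.apache.org/repos/dist/release/spark/").mkString.
  // Here it always fails, to exercise the fallback path.
  def fetch(): String = throw new java.io.IOException("dist.apache.org unreachable")

  // Pinned fallback, mirroring the hard-coded list the patch replaces.
  val fallback: Seq[String] = Seq("2.3.3", "2.4.2")

  def versionsOrFallback(): Seq[String] = Try(fetch()) match {
    case Success(html) => html.split("\n").toSeq // further parsing as in the suite
    case Failure(_)    => fallback
  }

  def main(args: Array[String]): Unit =
    println(versionsOrFallback().mkString(", "))
}
```

With the always-failing fetcher this prints the pinned `2.3.3, 2.4.2`, so the suite would still run even when the network is flaky.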
Test build #104887 has finished for PR 24454 at commit
@srowen yea, we may need to run an extra set of tests for a while during a release. I think it's OK, as the failure window is short and releases don't happen very frequently.
Test build #104899 has finished for PR 24454 at commit
Looks fine; the 2.4.1 release has been removed from dist (thanks).
Merged to master.
…in HiveExternalCatalogVersionsSuite

## What changes were proposed in this pull request?

We can get the latest downloadable Spark versions from https://dist.apache.org/repos/dist/release/spark/

## How was this patch tested?

Manually.

Closes #24454 from cloud-fan/test.

Authored-by: Wenchen Fan <wenchen@databricks.com>
Signed-off-by: HyukjinKwon <gurwls223@apache.org>
I've backported it to 2.4/2.3, so that we can completely ignore it during the release process. I'll update https://spark.apache.org/release-process.html soon.
Oops, thanks
Hi, All.
What changes were proposed in this pull request?
We can get the latest downloadable Spark versions from https://dist.apache.org/repos/dist/release/spark/
How was this patch tested?
Manually.