Run Flink tests on Java 17 too #10477
Conversation
Thanks for raising this PR @findepi. Looks like the exclusion isn't working. I'm open to running against Java 17; my only concern is that the Spark/Flink tests take a very long time to run, and adding another item to the test matrix will consume a lot of CI capacity. For Iceberg 2.0 there seems to be some consensus (I would love to see more folks jumping in there) to bump to Java 11+; maybe that would be the right time to drop Java 8 and add Java 17? Out of curiosity, did you find any issues with JDK 17?
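For reference, a matrix exclusion in GitHub Actions workflow syntax looks roughly like this. This is a minimal hedged sketch with made-up job name and version values, not the actual Iceberg workflow:

```yaml
jobs:
  flink-tests:
    runs-on: ubuntu-22.04
    strategy:
      matrix:
        jvm: [8, 11, 17]
        flink: ['1.16', '1.17', '1.18']
        exclude:
          # Drop one combination to save CI capacity. An exclude entry
          # removes every matrix combination whose keys match all of the
          # listed key/value pairs.
          - jvm: 8
            flink: '1.18'
```

If an exclude entry's keys don't exactly match any generated combination (e.g. a typo in a version string), it silently removes nothing, which may be why an exclusion can appear "not working".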
That's a valid concern. OTOH, if we don't run the tests, how do we know that the code works under newer Java versions?
I thought that Java 17 is already well supported -- the build requires you to use 8, 11 or 17.
flink-scala-2-12-tests (8, 1.17) failed, not sure why.
This time the build was green, including flink-scala-2-12-tests (8, 1.17).
@pvary is your fix applicable to this flakiness?
The fix should be there. So probably not 😢
.github/workflows/spark-ci.yml
@@ -69,7 +69,7 @@ jobs:
     runs-on: ubuntu-22.04
     strategy:
       matrix:
-        jvm: [8, 11]
+        jvm: [8, 11, 17]
Java 17 is already exercised by spark-3x-java-17-tests. It works for both Scala 2.12 and 2.13; if you want, we can merge that last job into this single run.
i think the person writing those jobs didn't know about matrix excludes/includes (or didn't want to use them).
we should be able to merge the jobs and remove the duplication.
i would prefer for such changes to be considered as a follow-up though
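Merging the separate Java 17 job into the main matrix could be sketched with an `include` entry, roughly like this (hypothetical key names and values, not the actual workflow change):

```yaml
strategy:
  matrix:
    jvm: [8, 11, 17]
    scala: ['2.12', '2.13']
    include:
      # An include entry that matches existing combinations extends them
      # rather than adding new ones: here every jvm=17 combination gains
      # an extra (hypothetical) variable, so the dedicated Java 17 job
      # could be folded into this single matrix.
      - jvm: 17
        extra-gradle-args: '-DsparkVersions=3.5'
```

This removes the duplicated job definition while keeping any Java-17-specific configuration attached only to the runs that need it.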
+1 on removing spark-3x-java-17-tests and including 17 into the matrix for testing Spark
done in #10513.
changes in this file are backed out for this PR.
thank you @singhpk234 @nastra !
looks like the changes in this file disappeared. It would actually be good to clean this up in the scope of this PR
yes, it disappeared because it was a result of my misunderstanding.
it is being cleaned up in #10513
I know a lot of Spark and Flink users run on Java 17. This was not an issue, probably because code compiled with an older Java version typically runs fine on newer JVMs; hence we chose the older Java version for the released binaries. Here is a related discussion from a year ago: https://lists.apache.org/thread/obpdrfgg3spzqrnd0rnghpygdhc8vb9q I would also echo @Fokko's concern on CI resources. It would be great if we could drop Java 8 from the CI matrix.
this is valuable feedback!
of course! but apparently there often are issues, as was pointed out in #10477 (comment)
We can use newer JDK and utilize
Re dropping 8 -- i am fully supportive, but not my call.
Project supports building with 8, 11 and 17. In most CI scripts we run the jobs on all supported Java versions; let's do the same here to ensure the build works locally too.
Re dropping 8 -- i am fully supportive, but not my call.
Re resources -- yes, obviously, and I agree. Is this a "blocker concern" or "i wish there were no downsides" type of concern?
I sent out a mail earlier this year on the dev list. There wasn't as much response as I hoped. Probably there are still a lot of folks on Hadoop, which is not easy to update, but I'm happy to start this discussion again for Iceberg 2.0.0.
LGTM, thanks for working on this @findepi
re: dropping 8 -- i started some work on the side for this: #10518
Thanks @singhpk234 @stevenzwu @pvary @nastra @Fokko for your reviews and thank you @nastra @Fokko for approvals.
@Fokko Since Hadoop 3.4.1, the Hadoop community will also drop support for Java 8 (I think support for Java 8 will be removed in 3.5.x at the latest), so it's only a matter of time before Java 8 is abandoned.
Thanks for the context there @BsoBird, good to see that also Hadoop is moving on 👍 |