Upgrade Spark version #152
FYI, we also discovered the version parsing problem when the Java version is a plain major number like "11" or "16" with no decimal places. In that case this stack trace occurs: java.lang.ExceptionInInitializerError. This does not happen with an update version like "11.0.10".
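The failure mode described above comes from assuming the version string always contains a dot. A minimal sketch of parsing that tolerates both "11" and "11.0.10" (class and method names are hypothetical, not the suite's actual code) could look like:

```java
// Hypothetical sketch: extract the major Java version from java.version-style
// strings, handling "11", "11.0.10", and the legacy "1.8.0_292" scheme.
public class JavaVersionParser {

    // Returns the major version, e.g. 11 for "11" and "11.0.10", 8 for "1.8.0_292".
    static int majorVersion(String version) {
        // Legacy scheme ("1.8.0_292"): the major version is the second component.
        if (version.startsWith("1.")) {
            version = version.substring(2);
        }
        // Take everything up to the first dot; if there is no dot (e.g. "11"),
        // the whole string is the major version.
        int dot = version.indexOf('.');
        String major = (dot == -1) ? version : version.substring(0, dot);
        return Integer.parseInt(major);
    }

    public static void main(String[] args) {
        System.out.println(majorVersion("11"));        // 11
        System.out.println(majorVersion("11.0.10"));   // 11
        System.out.println(majorVersion("1.8.0_292")); // 8
    }
}
```

On JDK 9+, Runtime.version().feature() avoids string parsing entirely, but a fallback like the above is still needed when supporting JDK 8.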
Looks like this can be closed then.
The current Spark benchmarks caused an issue with OpenJ9 ( #131 ), they limit the suite's compatibility with the latest JDK versions on macOS ( #127 ), and they crash on ia64 infrastructure ( #150 ).
Those problems increase the motivation to upgrade Spark and any other libraries present in the apache-spark subproject. To avoid changing the existing benchmarks, though, the best approach would be to create a separate subproject and port the benchmarks one by one while ensuring they still work as expected. After confirming that those benchmarks are as good as (or better than) the current ones, we can deprecate or remove the old benchmarks.