JDK 9+ support #127

Closed
farquet opened this issue May 8, 2019 · 10 comments
Labels: compatibility (Relates to platform or system compatibility)
Milestone: 1.0.0

Comments

farquet (Collaborator) commented May 8, 2019

We should make sure that the suite works well with the most recent JDK versions. The Travis gate should also include a JDK 11 test, and possibly newer versions if possible.

I currently see an issue when building the suite with JDK 11:

[info] Compiling 10 Scala sources to <suite-path>/benchmarks/apache-spark/target/scala-2.11/classes ...
[error] <suite-path>/benchmarks/apache-spark/src/main/scala/org/renaissance/apache/spark/PageRank.scala:43:52: value zipWithIndex is not a member of java.util.stream.Stream[String]
[error]       val sublist = for ((line, num) <- text.lines.zipWithIndex if num < MAX_LINE) yield line
[error]                                                    ^
[error] <suite-path>/renaissance/benchmarks/apache-spark/src/main/scala/org/renaissance/apache/spark/PageRank.scala:43:72: value < is not a member of Any
[error]       val sublist = for ((line, num) <- text.lines.zipWithIndex if num < MAX_LINE) yield line
[error]                                                                        ^
[error] two errors found
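
For context, the root cause suggested by these errors is that Java 11 added String.lines(), which returns a java.util.stream.Stream[String]. A real member method takes precedence over Scala's implicit conversion to StringOps, whose lines is an Iterator[String], so zipWithIndex and the Int comparison no longer type-check. A minimal illustrative sketch of the two resolutions (not code from the suite):

val text = "a\nb\nc"

// On JDK 11+, this resolves to the new Java member String.lines():
// a java.util.stream.Stream[String], which has no zipWithIndex.
val streamOnJdk11 = text.lines

// Calling the Scala extension explicitly restores the old behaviour on any
// JDK: Predef.augmentString wraps the String in StringOps, whose lines is
// an Iterator[String], so zipWithIndex works again.
val scalaLines: Iterator[String] = Predef.augmentString(text).lines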
farquet added the compatibility label May 8, 2019
farquet added this to the 1.0.0 milestone May 8, 2019
villazon (Collaborator) commented May 8, 2019

I pushed PR #128 to fix this issue.
The solution is inspired by scala/scala3#5463 and Glavo/dotty@59c49fa
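
For illustration, a hypothetical reconstruction of that style of fix applied to the failing line (the actual diff in PR #128 may differ): instead of the ambiguous text.lines, call the Scala extension method explicitly, so the code compiles identically on JDK 8 and JDK 11+.

val MAX_LINE = 5000                // hypothetical value; the benchmark defines its own
val text: String = "line1\nline2"  // stands in for the file contents loaded by the benchmark

// Predef.augmentString always yields StringOps, whose lines is an
// Iterator[String] with zipWithIndex, regardless of the JDK in use.
val sublist =
  for ((line, num) <- Predef.augmentString(text).lines.zipWithIndex if num < MAX_LINE)
    yield line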

farquet (Collaborator, Author) commented May 8, 2019

Thanks! This fixes the compilation problem, but I still see issues during benchmark execution on JDK 9 and JDK 11. Let's keep this issue open until we have fixed all of these and added the Travis check on JDK 11.

vhotspur (Member) commented

Should we close this now that #129 has also been merged (bc27f1b)?

axel22 (Member) commented May 14, 2019

Sounds good to me to close it.
I'd still let @farquet comment, since he's aware of some discussion about JDK 12 (?) problems.

farquet (Collaborator, Author) commented May 14, 2019

I am very surprised that it works on Travis, because locally (on macOS) I hit an error in a static initializer on JDK 11 for the Spark benchmarks (everything is fine on JDK 8, 9, and 10).
What I hit seems to be this Hadoop bug: https://issues.apache.org/jira/browse/HADOOP-14586
Hadoop is a dependency of Spark, and since we depend on an old Spark version, it does not pull in a recent enough Hadoop with a fix for this.

[error] java.lang.ExceptionInInitializerError
[error] 	at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
[error] 	at org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:362)
[error] 	at org.apache.spark.SparkContext$$anonfun$hadoopFile$1$$anonfun$29.apply(SparkContext.scala:985)
[error] 	at org.apache.spark.SparkContext$$anonfun$hadoopFile$1$$anonfun$29.apply(SparkContext.scala:985)
[error] 	at org.apache.spark.rdd.HadoopRDD$$anonfun$getJobConf$6.apply(HadoopRDD.scala:177)
[error] 	at org.apache.spark.rdd.HadoopRDD$$anonfun$getJobConf$6.apply(HadoopRDD.scala:177)
[error] 	at scala.Option.map(Option.scala:146)
[error] 	at org.apache.spark.rdd.HadoopRDD.getJobConf(HadoopRDD.scala:177)
[error] 	at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:196)
[error] 	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:248)
[error] 	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:246)
[error] 	at scala.Option.getOrElse(Option.scala:121)
[error] 	at org.apache.spark.rdd.RDD.partitions(RDD.scala:246)
[error] 	at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
[error] 	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:248)
[error] 	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:246)
[error] 	at scala.Option.getOrElse(Option.scala:121)
[error] 	at org.apache.spark.rdd.RDD.partitions(RDD.scala:246)
[error] 	at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
[error] 	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:248)
[error] 	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:246)
[error] 	at scala.Option.getOrElse(Option.scala:121)
[error] 	at org.apache.spark.rdd.RDD.partitions(RDD.scala:246)
[error] 	at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
[error] 	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:248)
[error] 	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:246)
[error] 	at scala.Option.getOrElse(Option.scala:121)
[error] 	at org.apache.spark.rdd.RDD.partitions(RDD.scala:246)
[error] 	at org.apache.spark.Partitioner$.defaultPartitioner(Partitioner.scala:65)
[error] 	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$combineByKeyWithClassTag$3.apply(PairRDDFunctions.scala:629)
[error] 	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$combineByKeyWithClassTag$3.apply(PairRDDFunctions.scala:629)
[error] 	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
[error] 	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
[error] 	at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
[error] 	at org.apache.spark.rdd.PairRDDFunctions.combineByKeyWithClassTag(PairRDDFunctions.scala:628)
[error] 	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$combineByKey$3.apply(PairRDDFunctions.scala:616)
[error] 	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$combineByKey$3.apply(PairRDDFunctions.scala:616)
[error] 	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
[error] 	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
[error] 	at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
[error] 	at org.apache.spark.rdd.PairRDDFunctions.combineByKey(PairRDDFunctions.scala:615)
[error] 	at org.apache.spark.mllib.classification.NaiveBayes.run(NaiveBayes.scala:382)
[error] 	at org.renaissance.apache.spark.NaiveBayes.runIteration(NaiveBayes.scala:104)
[error] 	at org.renaissance.RenaissanceBenchmark.runIterationWithBeforeAndAfter(RenaissanceBenchmark.java:125)
[error] 	at org.renaissance.FixedIterationsPolicy.execute(FixedIterationsPolicy.java:48)
[error] 	at org.renaissance.RenaissanceBenchmark.runBenchmark(RenaissanceBenchmark.java:86)
[error] 	at org.renaissance.RenaissanceSuite$.$anonfun$main$2(renaissance-suite.scala:311)
[error] 	at org.renaissance.RenaissanceSuite$.$anonfun$main$2$adapted(renaissance-suite.scala:309)
[error] 	at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
[error] 	at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
[error] 	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
[error] 	at org.renaissance.RenaissanceSuite$.main(renaissance-suite.scala:309)
[error] 	at org.renaissance.RenaissanceSuite.main(renaissance-suite.scala)
[error] 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[error] 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[error] 	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[error] 	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
[error] 	at org.renaissance.Launcher.main(Launcher.java:18)
[error] Caused by: java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2
[error] 	at java.base/java.lang.String.checkBoundsBeginEnd(String.java:3319)
[error] 	at java.base/java.lang.String.substring(String.java:1874)
[error] 	at org.apache.hadoop.util.Shell.<clinit>(Shell.java:48)
[error] 	... 58 more
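
For reference, the crash boils down to a version-string check in Hadoop's Shell static initializer that assumes the pre-JDK 9 java.version format (e.g. "1.8.0_202"). A sketch of the failure mode (assumed from HADOOP-14586, not Hadoop's verbatim source):

// The old Shell.<clinit> extracted the first three characters of
// java.version, which only works for pre-JDK 9 version strings.
def isJava7OrAbove(javaVersion: String): Boolean =
  javaVersion.substring(0, 3).compareTo("1.7") >= 0

isJava7OrAbove("1.8.0_202")  // true: "1.8" >= "1.7"
isJava7OrAbove("10.0.1")     // also fine: the string is long enough
// isJava7OrAbove("11")      // throws StringIndexOutOfBoundsException: begin 0, end 3, length 2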

Are you 100% sure that the Travis gate runs the benchmarks on the correct JDK? It is not sufficient to have java on the PATH, because if you specify a different JAVA_HOME, that is what sbt will use as the JDK. As an example, I test things locally using this convenient one-liner:

JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk-11.jdk/Contents/Home/ tools/sbt/bin/sbt 'runMain org.renaissance.Launcher naive-bayes -r 3'

vhotspur (Member) commented

> I am very surprised that it works on Travis, because locally (on macOS) I hit an error in a static initializer on JDK 11 for the Spark benchmarks (everything is fine on JDK 8, 9, and 10).
> What I hit seems to be this Hadoop bug: https://issues.apache.org/jira/browse/HADOOP-14586
> Hadoop is a dependency of Spark, and since we depend on an old Spark version, it does not pull in a recent enough Hadoop with a fix for this.
>
> Are you 100% sure that the Travis gate runs the benchmarks on the correct JDK? It is not sufficient to have java on the PATH, because if you specify a different JAVA_HOME, that is what sbt will use as the JDK.

I have pushed one more commit (335820a) to the topic/travis-more-jdks branch that also prints $JAVA_HOME. As far as I can tell from the build output, we are running with the correct JDK.

However, we test only the default Java installation on macOS (which seems to be Java HotSpot(TM) 64-Bit Server VM 18.3 (build 10.0.1+10, mixed mode)). We should probably test multiple JDKs there too.

farquet (Collaborator, Author) commented May 15, 2019

Thanks for the extra check.

Indeed, the default Java HotSpot(TM) 64-Bit Server VM 18.3 (build 10.0.1+10, mixed mode) is JDK 10, which works fine on my machine too.

I hit the bug with the following version:

java version "11" 2018-09-25
Java(TM) SE Runtime Environment 18.9 (build 11+28)
Java HotSpot(TM) 64-Bit Server VM 18.9 (build 11+28, mixed mode)

farquet (Collaborator, Author) commented May 15, 2019

I think we could safely close this issue if we are confident that everything works fine, at least on Linux, from JDK 8 up to the latest development JDK. I see that Travis supports EA versions; it would be great if we could add Travis checks for JDK 12 and JDK 13 EA (see the sketch below).
If everything works, I would suggest opening a separate issue for the macOS problem on JDK 11+.
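
A hypothetical sketch of what such a Travis matrix could look like (the jdk identifiers and the script line are assumptions, not the repository's actual .travis.yml, and the supported values would need checking against the Travis docs):

# Hypothetical .travis.yml fragment, not the repository's actual configuration.
language: scala
matrix:
  include:
    - os: linux
      jdk: openjdk8
    - os: linux
      jdk: openjdk11
    - os: linux
      jdk: openjdk-ea   # early-access JDK, if the image provides one
script:
  - tools/sbt/bin/sbt 'runMain org.renaissance.Launcher naive-bayes -r 3'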

lbulej (Member) commented Apr 30, 2021

Related to #244

vhotspur (Member) commented

I think we can close this as it is superseded by #241 (and related to #291).
