[SPARK-2815]: Compilation failed upon the hadoop version 2.0.0-cdh4.5.0 #1754


Closed
witgo wants to merge 2 commits

Conversation

witgo
Contributor

@witgo witgo commented Aug 3, 2014

No description provided.

@SparkQA

SparkQA commented Aug 3, 2014

QA tests have started for PR 1754. This patch merges cleanly.
View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/17815/consoleFull

@srowen
Member

srowen commented Aug 3, 2014

This piece of code in the build is legacy, as the note in the following line says. I don't think it needs to be changed. The OP is building with the wrong command; that's the problem.

@witgo
Contributor Author

witgo commented Aug 3, 2014

This patch makes the behavior consistent with the 1.0 branch.
See #772.

@SparkQA

SparkQA commented Aug 3, 2014

QA results for PR 1754:
- This patch PASSES unit tests.
- This patch merges cleanly.
- This patch adds no public classes.

For more information see test output:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/17815/consoleFull

@pwendell
Contributor

pwendell commented Aug 4, 2014

I think this is an intermediate YARN version that is different from both the yarn-alpha and yarn-stable APIs. @witgo what if you apply the patch here - does it work?

https://github.com/apache/spark/pull/151/files

In terms of merging this - I'm not sure we want to support 3 different YARN APIs in the build, and I think CDH itself said that YARN was not stable/supported here, so I'm not sure we want to merge that patch. I am curious though whether it fixes it for you.

@witgo
Contributor Author

witgo commented Aug 4, 2014

This PR makes sbt's behavior consistent with the description in building-with-maven.md:

| YARN version | Profile required |
|-----------------|------------------|
| 0.23.x to 2.1.x | yarn-alpha |
| 2.2.x and later | yarn |

@@ -71,7 +71,7 @@ object SparkBuild extends PomBuild {
       }
       Properties.envOrNone("SPARK_HADOOP_VERSION") match {
         case Some(v) =>
-          if (v.matches("0.23.*")) isAlphaYarn = true
+          if ("^2\\.[2-9]+".r.findFirstIn(v) == None) isAlphaYarn = true
Member

If there is ever a Hadoop 2.10, this pattern would consider it a YARN alpha version. Better to match against 2\\.[01]\\.[0-9]+ or something. This won't address the actual issue in the OP. It looks like a bit of cleanup, but in a deprecated code path. It would let Hadoop 2.0 / 2.1 be "supported" but they aren't actually.
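
As a minimal, self-contained sketch (an illustration only, not code from this PR or the Spark build), the following shows the point above: with the check proposed in the diff, any version that does not match `^2\.[2-9]+` is treated as yarn-alpha, so a hypothetical Hadoop 2.10.x would be misclassified, while the reviewer's suggested pattern only catches 2.0.x / 2.1.x.

```scala
// Illustration only; these names are not from SparkBuild.scala.
object YarnVersionCheck {
  // Check as proposed in this PR: anything not matching "^2\\.[2-9]+" is treated as alpha.
  def isAlphaYarnProposed(v: String): Boolean =
    "^2\\.[2-9]+".r.findFirstIn(v).isEmpty

  // Check along the lines suggested in the review: only 0.23.x, 2.0.x and 2.1.x are alpha.
  def isAlphaYarnSuggested(v: String): Boolean =
    v.matches("0.23.*") || "^2\\.[01]\\.[0-9]+".r.findFirstIn(v).isDefined

  def main(args: Array[String]): Unit = {
    Seq("2.0.0-cdh4.5.0", "2.2.0", "2.10.0").foreach { v =>
      println(s"$v -> proposed=${isAlphaYarnProposed(v)}, suggested=${isAlphaYarnSuggested(v)}")
    }
    // 2.0.0-cdh4.5.0 -> proposed=true, suggested=true
    // 2.2.0 -> proposed=false, suggested=false
    // 2.10.0 -> proposed=true, suggested=false   (the misclassification flagged above)
  }
}
```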

@witgo
Contributor Author

witgo commented Aug 4, 2014

@pwendell With the #151 patch applied, compilation fails.
There seems to be an infinite loop when running:
SPARK_HADOOP_VERSION=2.0.0-cdh4.5.0 SPARK_YARN=true SPARK_HIVE=true sbt/sbt clean assembly

java.lang.StackOverflowError
    at scala.reflect.internal.HasFlags$class.isSynthetic(HasFlags.scala:115)
    at scala.reflect.internal.Symbols$Symbol.isSynthetic(Symbols.scala:112)
    at xsbt.ExtractUsedNames.eligibleAsUsedName(ExtractUsedNames.scala:121)
    at xsbt.ExtractUsedNames.handleClassicTreeNode$1(ExtractUsedNames.scala:87)
    at xsbt.ExtractUsedNames.xsbt$ExtractUsedNames$$handleTreeNode$1(ExtractUsedNames.scala:97)
    at xsbt.ExtractUsedNames$$anonfun$handleMacroExpansion$1$2.apply(ExtractUsedNames.scala:64)
    at xsbt.ExtractUsedNames$$anonfun$handleMacroExpansion$1$2.apply(ExtractUsedNames.scala:64)
    at scala.reflect.internal.Trees$ForeachTreeTraverser.traverse(Trees.scala:1489)
    at scala.reflect.internal.Trees$ForeachTreeTraverser.traverse(Trees.scala:1487)
    at scala.reflect.internal.Trees$class.itraverse(Trees.scala:1174)
    at scala.reflect.internal.SymbolTable.itraverse(SymbolTable.scala:13)
    at scala.reflect.internal.SymbolTable.itraverse(SymbolTable.scala:13)
    at scala.reflect.api.Trees$Traverser.traverse(Trees.scala:2825)
    at scala.reflect.internal.Trees$ForeachTreeTraverser.traverse(Trees.scala:1490)
    at scala.reflect.internal.Trees$TreeContextApiImpl.foreach(Trees.scala:80)
    at xsbt.ExtractUsedNames.handleMacroExpansion$1(ExtractUsedNames.scala:64)
    at xsbt.ExtractUsedNames.xsbt$ExtractUsedNames$$handleTreeNode$1(ExtractUsedNames.scala:95)
    at xsbt.ExtractUsedNames$$anonfun$handleMacroExpansion$1$2.apply(ExtractUsedNames.scala:64)
    at xsbt.ExtractUsedNames$$anonfun$handleMacroExpansion$1$2.apply(ExtractUsedNames.scala:64)
    at scala.reflect.internal.Trees$ForeachTreeTraverser.traverse(Trees.scala:1489)
    at scala.reflect.internal.Trees$ForeachTreeTraverser.traverse(Trees.scala:1487)
    at scala.reflect.api.Trees$Traverser.traverseTrees(Trees.scala:2829)
    at scala.reflect.internal.Trees$class.itraverse(Trees.scala:1174)
    at scala.reflect.internal.SymbolTable.itraverse(SymbolTable.scala:13)
    at scala.reflect.internal.SymbolTable.itraverse(SymbolTable.scala:13)
    at scala.reflect.api.Trees$Traverser.traverse(Trees.scala:2825)
    at scala.reflect.internal.Trees$ForeachTreeTraverser.traverse(Trees.scala:1490)
    at scala.reflect.internal.Trees$TreeContextApiImpl.foreach(Trees.scala:80)
    at xsbt.ExtractUsedNames.handleMacroExpansion$1(ExtractUsedNames.scala:64)
    at xsbt.ExtractUsedNames.xsbt$ExtractUsedNames$$handleTreeNode$1(ExtractUsedNames.scala:95)
    at xsbt.ExtractUsedNames$$anonfun$handleMacroExpansion$1$2.apply(ExtractUsedNames.scala:64)
    at xsbt.ExtractUsedNames$$anonfun$handleMacroExpansion$1$2.apply(ExtractUsedNames.scala:64)
    at scala.reflect.internal.Trees$ForeachTreeTraverser.traverse(Trees.scala:1489)
    at scala.reflect.internal.Trees$ForeachTreeTraverser.traverse(Trees.scala:1487)
    at scala.reflect.api.Trees$Traverser.traverseTrees(Trees.scala:2829)
    at scala.reflect.internal.Trees$class.itraverse(Trees.scala:1174)
    at scala.reflect.internal.SymbolTable.itraverse(SymbolTable.scala:13)
    at scala.reflect.internal.SymbolTable.itraverse(SymbolTable.scala:13)
    at scala.reflect.api.Trees$Traverser.traverse(Trees.scala:2825)
    at scala.reflect.internal.Trees$ForeachTreeTraverser.traverse(Trees.scala:1490)
    at scala.reflect.internal.Trees$TreeContextApiImpl.foreach(Trees.scala:80)
    at xsbt.ExtractUsedNames.handleMacroExpansion$1(ExtractUsedNames.scala:64)
    at xsbt.ExtractUsedNames.xsbt$ExtractUsedNames$$handleTreeNode$1(ExtractUsedNames.scala:95)
    at xsbt.ExtractUsedNames$$anonfun$handleMacroExpansion$1$2.apply(ExtractUsedNames.scala:64)
    at xsbt.ExtractUsedNames$$anonfun$handleMacroExpansion$1$2.apply(ExtractUsedNames.scala:64)
    at scala.reflect.internal.Trees$ForeachTreeTraverser.traverse(Trees.scala:1489)

@witgo
Contributor Author

witgo commented Aug 4, 2014

Do we need to explicitly point out that Spark does not support YARN 2.0.x and 2.1.x?

@srowen
Member

srowen commented Aug 4, 2014

@witgo I don't think #151 is to be committed, if I understand correctly. It's not 100% clear which versions of YARN 2.0.x actually work with yarn-alpha, and which if any work with yarn. If anything it's worth a note that the pre-stable YARN versions are not guaranteed to work, but might.

@pwendell
Contributor

I don't mind putting this one in (it's simple enough and might lower the bar for anyone trying to go this route). But the regex needs to be fixed; otherwise it will treat 2.10.X as a yarn-alpha version.
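
For illustration, here is one way such a check could be written so that 2.2.x and later (including a hypothetical 2.10.x) are treated as stable YARN. This is a sketch under that assumption, not necessarily what the updated commit in this PR does.

```scala
// Sketch only (an assumption, not taken from the updated commit):
// treat 0.23.x, 2.0.x and 2.1.x as yarn-alpha, and 2.2.x and later as stable,
// without misclassifying a two-digit minor version such as 2.10.x.
def isAlphaYarn(v: String): Boolean =
  "^2\\.([2-9]|[1-9][0-9]+)".r.findFirstIn(v).isEmpty

// isAlphaYarn("0.23.9")         == true
// isAlphaYarn("2.0.0-cdh4.5.0") == true
// isAlphaYarn("2.1.0")          == true
// isAlphaYarn("2.2.0")          == false
// isAlphaYarn("2.10.0")         == false
```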

@witgo
Contributor Author

witgo commented Sep 18, 2014

@pwendell I have updated the regular expression.

@SparkQA

SparkQA commented Sep 18, 2014

QA tests have started for PR 1754 at commit 727062e.

  • This patch merges cleanly.

@SparkQA

SparkQA commented Sep 18, 2014

QA tests have finished for PR 1754 at commit 727062e.

  • This patch passes unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@witgo witgo closed this Oct 14, 2014