[SPARK-1199][REPL] Remove VALId and use the original import style for defined classes. #1179

Closed
7 changes: 5 additions & 2 deletions repl/src/main/scala/org/apache/spark/repl/SparkIMain.scala
@@ -744,7 +744,7 @@ import org.apache.spark.util.Utils
*
* Read! Eval! Print! Some of that not yet centralized here.
*/
class ReadEvalPrint(lineId: Int) {
class ReadEvalPrint(val lineId: Int) {
def this() = this(freshLineId())

private var lastRun: Run = _
@@ -1241,7 +1241,10 @@ import org.apache.spark.util.Utils
// old style
beSilentDuring(parse(code)) foreach { ts =>
ts foreach { t =>
withoutUnwrapping(logDebug(asCompactString(t)))
if (isShow || isShowRaw)
withoutUnwrapping(echo(asCompactString(t)))
else
withoutUnwrapping(logDebug(asCompactString(t)))
}
}
}
23 changes: 14 additions & 9 deletions repl/src/main/scala/org/apache/spark/repl/SparkImports.scala
@@ -182,15 +182,26 @@ trait SparkImports {
// ambiguity errors will not be generated. Also, quote
// the name of the variable, so that we don't need to
// handle quoting keywords separately.
case x: ClassHandler =>
// I am trying to guess if the import is a defined class
// This is an ugly hack, I am not 100% sure of the consequences.
Contributor
What do you think is ugly about this? Is this the original way the REPL used to include classes?

Member Author
This portion of the code only deals with importing; the ugly part is that, ideally, there should be no special-casing for different kinds of definitions. Special cases make it hard to predict what will need one next, and they make it harder to pitch this change back to the original Scala REPL.

The actually ugly part is the next case statement, where we create a val ($VALn = something) and then import through it; that pattern was the original reason for the SPARK-1199 issue. Unfortunately we cannot avoid it, because otherwise the remote executors try to pull in classes they don't need. Since we have been living with that code for a while, it should be okay.

Yes, importing without the intermediate val was the style originally followed; we changed it to solve the problem described above.

Suppose we only used code.append("import " + objName + ".INSTANCE" + req.accessPath + "." + imv + "\n") to import, and no other way; that would be much better than what we are doing now, and there would be no need to migrate the REPL for the next upgrade.
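
For readers following the thread, here is a minimal, self-contained sketch of the two generated-import styles being compared. It is not the actual SparkImports code; the wrapper names such as $line3.$read and .$iw.$iw are illustrative placeholders that follow the usual REPL line-wrapper naming, and the string-building mirrors the code.append calls in the diff above.

```scala
// Sketch only: illustrates the two import styles the REPL wrapper-code
// generator chooses between. All names are assumptions for illustration.
object ImportStyleSketch {

  // Original Scala REPL style, which this PR restores for defined classes:
  //   import $line3.$read.INSTANCE.$iw.$iw.`Sum`
  def directImport(objName: String, accessPath: String, imv: String): String =
    "import " + objName + ".INSTANCE" + accessPath + ".`" + imv + "`\n"

  // Spark-specific style kept for everything else: bind INSTANCE to a $VAL<n>
  // first, then import through it, so that (per the discussion above) remote
  // executors are not forced to pull in classes they do not need.
  //   val $VAL5 = $line3.$read.INSTANCE;
  //   import $VAL5.$iw.$iw.`x`;
  def valBackedImport(objName: String, accessPath: String, imv: String, lineId: Int): String = {
    val valName = "$VAL" + lineId
    "val " + valName + " = " + objName + ".INSTANCE;\n" +
      "import " + valName + accessPath + ".`" + imv + "`;\n"
  }

  def main(args: Array[String]): Unit = {
    print(directImport("$line3.$read", ".$iw.$iw", "Sum"))
    print(valBackedImport("$line3.$read", ".$iw.$iw", "x", 5))
  }
}
```

The ReplSuite test added below (SPARK-1199-simple-reproduce) exercises the defined-class path: a case class created in the shell and then pattern-matched afterwards.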

// Here, we let everything but "defined classes" use the import with val.
// The reason for this is that otherwise the remote executor tries to pull in the
// classes involved and may fail.
for (imv <- x.definedNames) {
val objName = req.lineRep.readPath
code.append("import " + objName + ".INSTANCE" + req.accessPath + ".`" + imv + "`\n")
}

case x =>
for (imv <- x.definedNames) {
if (currentImps contains imv) addWrapper()
val objName = req.lineRep.readPath
val valName = "$VAL" + newValId();
val valName = "$VAL" + req.lineRep.lineId

if (!code.toString.endsWith(".`" + imv + "`;\n")) { // i.e., not already imported
code.append("val " + valName + " = " + objName + ".INSTANCE;\n")
code.append("import " + valName + req.accessPath + ".`" + imv + "`;\n")
code.append("val " + valName + " = " + objName + ".INSTANCE;\n")
code.append("import " + valName + req.accessPath + ".`" + imv + "`;\n")
}
// code.append("val " + valName + " = " + objName + ".INSTANCE;\n")
// code.append("import " + valName + req.accessPath + ".`" + imv + "`;\n")
@@ -211,10 +222,4 @@ trait SparkImports {
private def membersAtPickler(sym: Symbol): List[Symbol] =
beforePickler(sym.info.nonPrivateMembers.toList)

private var curValId = 0

private def newValId(): Int = {
curValId += 1
curValId
}
}
12 changes: 12 additions & 0 deletions repl/src/test/scala/org/apache/spark/repl/ReplSuite.scala
@@ -235,6 +235,18 @@ class ReplSuite extends FunSuite {
assertContains("res4: Array[Int] = Array(0, 0, 0, 0, 0)", output)
}

test("SPARK-1199-simple-reproduce") {
val output = runInterpreter("local-cluster[1,1,512]",
"""
|case class Sum(exp: String, exp2: String)
|val a = Sum("A", "B")
|def b(a: Sum): String = a match { case Sum(_, _) => "Found Sum" }
|b(a)
""".stripMargin)
assertDoesNotContain("error:", output)
assertDoesNotContain("Exception", output)
}

if (System.getenv("MESOS_NATIVE_LIBRARY") != null) {
test("running on Mesos") {
val output = runInterpreter("localquiet",