[SPARK-12365][CORE] Use ShutdownHookManager where Runtime.getRuntime.addShutdownHook() is called #10325

Closed
ted-yu wants to merge 3 commits

Conversation

@ted-yu commented Dec 16, 2015

SPARK-9886 fixed ExternalBlockStore.scala

This PR fixes the remaining references to Runtime.getRuntime.addShutdownHook()
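
Roughly, the change swaps raw JVM shutdown hooks for Spark's own hook manager. Below is a minimal sketch of the before/after pattern, not the actual diff: `org.apache.spark.util.ShutdownHookManager` is private[spark], so the call only compiles inside Spark's own packages, and `cleanUp()` here is a placeholder for whatever a given call site does.

```scala
import org.apache.spark.util.ShutdownHookManager

object ShutdownHookSketch {
  // Stand-in for whatever cleanup a given call site performs.
  def cleanUp(): Unit = { /* stop servers, delete temp dirs, ... */ }

  // Before: a raw JVM hook, which runs in arbitrary order relative to Spark's own hooks.
  Runtime.getRuntime.addShutdownHook(new Thread() {
    override def run(): Unit = cleanUp()
  })

  // After: register through ShutdownHookManager so the hook is ordered by priority
  // and runs together with the rest of Spark's shutdown logic.
  ShutdownHookManager.addShutdownHook { () => cleanUp() }
}
```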

@ted-yu ted-yu changed the title [CORE] Use ShutdownHookManager where Runtime.getRuntime.addShutdownHook() is called [SPARK-12365][CORE] Use ShutdownHookManager where Runtime.getRuntime.addShutdownHook() is called Dec 16, 2015
@ted-yu (Author) commented Dec 16, 2015

Jenkins, test this please

@srowen (Member) commented Dec 16, 2015

Looks pretty good

@ted-yu (Author) commented Dec 16, 2015

@srowen
Thanks for the quick review

@SparkQA commented Dec 16, 2015

Test build #47811 has finished for PR 10325 at commit 8b4b261.

  • This patch fails PySpark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Dec 16, 2015

Test build #47812 has finished for PR 10325 at commit 8b4b261.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@ted-yu (Author) commented Dec 16, 2015

For build 47811:

Had test failures in pyspark.mllib.tests with python2.6; see logs.

I don't think the above was related to this PR.

@SparkQA commented Dec 16, 2015

Test build #47817 has finished for PR 10325 at commit 87e2d0d.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@ted-yu (Author) commented Dec 16, 2015

org.apache.spark.streaming.JavaAPISuite.testStreamingContextTransform

Failing for the past 1 build (since failed build #47817); took 62 ms.
Error Message

org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
  org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2287)
  org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:140)
  org.apache.spark.streaming.StreamingContext$$anonfun$getOrCreate$1.apply(StreamingContext.scala:869)
  org.apache.spark.streaming.StreamingContext$$anonfun$getOrCreate$1.apply(StreamingContext.scala:869)
  scala.Option.map(Option.scala:145)
  org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:869)
  org.apache.spark.streaming.MasterFailureTest$.runStreams(MasterFailureTest.scala:278)
  org.apache.spark.streaming.MasterFailureTest$.testOperation(MasterFailureTest.scala:165)

I don't think the above is caused by this PR.
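
(For reference, the error message points at the spark.driver.allowMultipleContexts setting as an escape hatch for exactly this situation. A hypothetical sketch of setting it is below; the master URL and app name are placeholders, and the usual fix in tests is to stop the leaked SparkContext rather than allow a second one.)

```scala
import org.apache.spark.{SparkConf, SparkContext}

object AllowMultipleContextsSketch {
  // Opt out of the single-SparkContext-per-JVM check mentioned in the error.
  val conf = new SparkConf()
    .setMaster("local[2]")
    .setAppName("allow-multiple-contexts-demo")
    .set("spark.driver.allowMultipleContexts", "true")

  val sc = new SparkContext(conf)
}
```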

@ted-yu (Author) commented Dec 16, 2015

Jenkins, test this please

@SparkQA commented Dec 16, 2015

Test build #47831 has finished for PR 10325 at commit 87e2d0d.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@@ -24,6 +24,7 @@ import org.apache.spark.deploy.rest.mesos.MesosRestServer
import org.apache.spark.scheduler.cluster.mesos._
import org.apache.spark.util.SignalLogger
import org.apache.spark.{Logging, SecurityManager, SparkConf}
import org.apache.spark.util.ShutdownHookManager
A contributor commented on the diff:

can you merge this with the one on L25?
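
For example, the new import could be folded into the existing org.apache.spark.util import that the reviewer points at, something like:

```scala
import org.apache.spark.scheduler.cluster.mesos._
import org.apache.spark.util.{ShutdownHookManager, SignalLogger}
import org.apache.spark.{Logging, SecurityManager, SparkConf}
```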

@ted-yu (Author) commented Dec 16, 2015

 > git fetch --tags --progress https://github.com/apache/spark.git +refs/pull/10325/*:refs/remotes/origin/pr/10325/* # timeout=15
ERROR: Timeout after 15 minutes
ERROR: Error fetching remote repo 'origin'

@ted-yu (Author) commented Dec 16, 2015

Jenkins, test this please

@SparkQA commented Dec 17, 2015

Test build #47858 has finished for PR 10325 at commit a6caf22.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@andrewor14 (Contributor) commented Dec 17, 2015

Merging into master and 1.6.

@asfgit closed this in f590178 on Dec 17, 2015
asfgit pushed a commit that referenced this pull request Dec 17, 2015
[SPARK-12365][CORE] Use ShutdownHookManager where Runtime.getRuntime.addShutdownHook() is called

SPARK-9886 fixed ExternalBlockStore.scala

This PR fixes the remaining references to Runtime.getRuntime.addShutdownHook()

Author: tedyu <yuzhihong@gmail.com>

Closes #10325 from ted-yu/master.

(cherry picked from commit f590178)
Signed-off-by: Andrew Or <andrew@databricks.com>

Conflicts:
	sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala