
SPARK-3470 [CORE] [STREAMING] Add Closeable / close() to Java context objects #2346


Closed · srowen wants to merge 1 commit

Conversation

@srowen (Member) commented Sep 10, 2014

... that expose a stop() lifecycle method. This doesn't add AutoCloseable, which is Java 7+ only. But it should be possible to use try-with-resources on a Closeable in Java 7, as long as the close() does not throw a checked exception, and these don't. Q.E.D.
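For illustration, a minimal sketch (not part of this patch; the app name, master URL, and data are placeholders) of what this enables for a Java 7 caller:

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class CloseableContextExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("closeable-example").setMaster("local[*]");
        // JavaSparkContext now implements Closeable, and its close() throws
        // no checked exception, so try-with-resources needs no catch block.
        // close() delegates to stop() when the block exits, normally or not.
        try (JavaSparkContext jsc = new JavaSparkContext(conf)) {
            long count = jsc.parallelize(Arrays.asList(1, 2, 3)).count();
            System.out.println("count = " + count);
        }
    }
}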

@SparkQA commented Sep 10, 2014

QA tests have started for PR 2346 at commit 612c21d.

  • This patch merges cleanly.

@witgo (Contributor) commented Sep 10, 2014

The relevant PR: #991

@SparkQA commented Sep 10, 2014

QA tests have finished for PR 2346 at commit 612c21d.

  • This patch passes unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • class JavaSparkContext(val sc: SparkContext)
    • class JavaStreamingContext(val ssc: StreamingContext) extends Closeable
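The second class listed above, JavaStreamingContext, gets the same treatment; a minimal usage sketch (assuming local mode and a 1-second batch interval, with the actual computation elided):

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class CloseableStreamingExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("streaming-example").setMaster("local[2]");
        // JavaStreamingContext also implements Closeable, so the same
        // try-with-resources pattern applies; close() stops the context.
        try (JavaStreamingContext jssc = new JavaStreamingContext(conf, new Duration(1000))) {
            // define DStream transformations here, then:
            // jssc.start();
            // jssc.awaitTermination();
        }
    }
}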

@mattf commented Sep 10, 2014

since spark targets java 7 & 8, why not just use the correct AutoCloseable?

@srowen (Member, Author) commented Sep 10, 2014

It doesn't target Java 7: https://github.com/apache/spark/blob/master/pom.xml#L113

@mattf commented Sep 10, 2014

> It doesn't target Java 7: https://github.com/apache/spark/blob/master/pom.xml#L113

you're right. i hope it becomes java 7+ and we can move to AutoCloseable

@srowen (Member, Author) commented Sep 10, 2014

AutoCloseable is a superinterface of Closeable, so this is already true: anything that implements Closeable can be used as an AutoCloseable. Implementing Closeable is, strangely, the stronger condition.
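A standalone illustration of the relationship (not Spark code; the class names are made up):

import java.io.Closeable;

// In Java 7+, Closeable extends AutoCloseable but narrows close() from
// "throws Exception" to "throws IOException". Implementations may narrow
// further, to no checked exception at all, as Spark's contexts do.
class StoppableResource implements Closeable {
    @Override
    public void close() {
        System.out.println("stopped");
    }
}

public class HierarchyDemo {
    public static void main(String[] args) {
        // try-with-resources accepts any AutoCloseable, hence any Closeable.
        try (StoppableResource r = new StoppableResource()) {
            System.out.println("working");
        }
    }
}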

@mattf commented Sep 11, 2014

would like a unit test, but lgtm

@andrewor14 (Contributor) commented:

Yup, LGTM

@@ -40,7 +41,9 @@ import org.apache.spark.rdd.{EmptyRDD, HadoopRDD, NewHadoopRDD, RDD}
  * A Java-friendly version of [[org.apache.spark.SparkContext]] that returns
  * [[org.apache.spark.api.java.JavaRDD]]s and works with Java collections instead of Scala ones.
  */
-class JavaSparkContext(val sc: SparkContext) extends JavaSparkContextVarargsWorkaround {
+class JavaSparkContext(val sc: SparkContext)
+  extends JavaSparkContextVarargsWorkaround with Closeable {
Review comment (Contributor):

note two space indent here, but don't worry i will fix it.

@rxin (Contributor) commented Sep 13, 2014

@srowen any reason you did not add this to the Scala SparkContext?

@asfgit closed this in feaa370 on Sep 13, 2014
@srowen (Member, Author) commented Sep 13, 2014

@rxin (Yeah, I wasn't sure how to handle the continuation indent; feel free to change it.) I didn't add it to SparkContext because the purpose of the change was to support Java's try-with-resources, which only exists in Java, and I don't expect Java users to regularly access SparkContext instead of JavaSparkContext. Implementing Closeable there wouldn't do any harm, but it probably wouldn't help anyone either.

@srowen srowen deleted the SPARK-3470 branch September 15, 2014 14:33
6 participants