
Conversation

@ala
Contributor

@ala ala commented Feb 10, 2017

What changes were proposed in this pull request?

This change adds an optional argument to the SparkContext.cancelStage() and SparkContext.cancelJob() functions, allowing the caller to provide an exact reason for the cancellation.

How was this patch tested?

Adds a unit test.

@ala
Contributor Author

ala commented Feb 10, 2017

@rxin

   */
-  def cancelJob(jobId: Int) {
-    dagScheduler.cancelJob(jobId)
+  def cancelJob(jobId: Int, reason: String = "") {
Contributor

@rxin rxin Feb 10, 2017


actually you need to define a new function with two parameters and keep the old one to maintain binary compatibility for the public APIs.
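A minimal sketch of the overload pattern rxin describes: instead of adding a default parameter (which changes the compiled signature of the existing method), the old one-argument method is kept and delegates to the new two-argument one. `Scheduler` and `ContextLike` here are hypothetical stand-ins for DAGScheduler and SparkContext, not the actual Spark classes.

```scala
// Hypothetical stand-in for the scheduler the context delegates to.
trait Scheduler {
  def cancelJob(jobId: Int, reason: Option[String]): Unit
}

class ContextLike(scheduler: Scheduler) {
  /** Old public method, kept unchanged so previously compiled callers still link. */
  def cancelJob(jobId: Int): Unit =
    scheduler.cancelJob(jobId, None)

  /** New overload carrying an explicit cancellation reason. */
  def cancelJob(jobId: Int, reason: String): Unit =
    scheduler.cancelJob(jobId, Option(reason))
}
```

The key point is that both public entry points keep stable bytecode signatures, while the internal call site is the only place that deals in `Option[String]`.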

   * Cancel a job that is running or waiting in the queue.
   */
-  def cancelJob(jobId: Int): Unit = {
+  def cancelJob(jobId: Int, reason: String = ""): Unit = {
Contributor

internally can we use an Option rather than an empty string?
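A sketch of the suggestion above: keep `""` as the public-facing default, but convert it to `Option[String]` at the boundary so internal code never has to compare against the empty-string sentinel. The helper names (`toInternalReason`, `logMessage`) are illustrative, not from the PR.

```scala
// Convert the public String default into the internal Option representation.
// Both null and "" mean "no reason was given".
def toInternalReason(reason: String): Option[String] =
  Option(reason).filter(_.nonEmpty)

// Internal code can then pattern on presence/absence instead of "" checks.
def logMessage(jobId: Int, reason: Option[String]): String =
  reason.fold(s"Job $jobId cancelled")(r => s"Job $jobId cancelled: $r")
```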

@rxin
Contributor

rxin commented Feb 10, 2017

LGTM pending Jenkins.

-  def cancelJob(jobId: Int) {
-    dagScheduler.cancelJob(jobId)
+  def cancelJob(jobId: Int, reason: String): Unit = {
+    dagScheduler.cancelJob(jobId, Some(reason))
Contributor

Please use Option(reason); Some(reason) will happily wrap a null.
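The distinction the reviewer is pointing at, in two lines: `Option.apply` maps `null` to `None`, while `Some` wraps whatever it is given, including `null`, which then leaks into later pattern matches and `.get` calls.

```scala
val reason: String = null

// Option.apply checks for null and normalizes it away.
val safe: Option[String] = Option(reason)   // None

// Some wraps blindly, producing a "present" value that holds null.
val unsafe: Option[String] = Some(reason)   // Some(null)
```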

@SparkQA

SparkQA commented Feb 10, 2017

Test build #72712 has finished for PR 16887 at commit 8b023d4.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Feb 10, 2017

Test build #72716 has finished for PR 16887 at commit a5f4946.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Feb 10, 2017

Test build #72717 has finished for PR 16887 at commit 50c54b1.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@rxin
Contributor

rxin commented Feb 10, 2017

Merging in master!

@asfgit asfgit closed this in d785217 Feb 10, 2017
@SparkQA

SparkQA commented Feb 10, 2017

Test build #72713 has finished for PR 16887 at commit 81b4243.

  • This patch fails from timeout after a configured wait of `250m`.
  • This patch merges cleanly.
  • This patch adds no public classes.

@hvanhovell
Contributor

retest this please

cmonkey pushed a commit to cmonkey/spark that referenced this pull request Feb 15, 2017
## What changes were proposed in this pull request?

This change adds an optional argument to the `SparkContext.cancelStage()` and `SparkContext.cancelJob()` functions, allowing the caller to provide an exact reason for the cancellation.

## How was this patch tested?

Adds a unit test.

Author: Ala Luszczak <ala@databricks.com>

Closes apache#16887 from ala/cancel.
liancheng pushed a commit to liancheng/spark that referenced this pull request Mar 17, 2017
This change adds an optional argument to the `SparkContext.cancelStage()` and `SparkContext.cancelJob()` functions, allowing the caller to provide an exact reason for the cancellation.

Adds a unit test.

Author: Ala Luszczak <ala@databricks.com>

Closes apache#16887 from ala/cancel.
