[SPARK-28616][INFRA] Improve merge-spark-pr script to warn WIP PRs and strip trailing dots #25356


Closed
dongjoon-hyun wants to merge 1 commit into master from dongjoon-hyun/SPARK-28616

Conversation

dongjoon-hyun
Member

dongjoon-hyun commented Aug 5, 2019

What changes were proposed in this pull request?

This PR aims to improve the merge-spark-pr script in the following two ways.

  1. [WIP] is useful to show that a PR is not yet ready for merge. Apache Spark allows merging WIP PRs, but sometimes we accidentally forget to clean up the title of a completed PR. It is better to warn once more during the merge stage and get confirmation from the committer.
  2. PR titles currently differ in whether they end with a period. This PR removes the trailing dot, since shorter is better in a commit title and titles without a trailing dot already dominate the Apache Spark commit logs (a sketch of both checks follows the statistics below):
```
$ git log --oneline | grep '[.]$' | wc -l
    4090
$ git log --oneline | grep '[^.]$' | wc -l
   20747
```
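
As a rough illustration of the two checks above, here is a minimal Python sketch. It is not the actual dev/merge_spark_pr.py code; the helper names `check_wip` and `strip_trailing_dot` are invented for this example, and the prompts simply mirror the ones shown in the test transcript below.

```python
# Minimal sketch (not the real merge_spark_pr.py implementation) of the two
# interactive checks described above: warn about a lingering [WIP] marker and
# offer to strip a trailing dot from the PR title.
import re
import sys


def check_wip(title):
    """Ask for confirmation if the PR title still contains a [WIP] marker."""
    if "[WIP]" in title:
        answer = input("The PR title has `[WIP]`:\n%s\nContinue? (y/n): " % title)
        if answer.lower() != "y":
            sys.exit("Merge aborted: please finish the work and clean up the title.")


def strip_trailing_dot(title):
    """Offer a version of the title with any trailing period removed."""
    modified = re.sub(r"\.+$", "", title)
    if modified != title:
        print("I've re-written the title as follows to match the standard format:")
        print("Original: %s" % title)
        print("Modified: %s" % modified)
        if input("Would you like to use the modified title? (y/n): ").lower() == "y":
            return modified
    return title
```

Both checks only suggest a change and ask for confirmation, so the committer stays in control of the final commit title.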

How was this patch tested?

Manual.

```
$ dev/merge_spark_pr.py
git rev-parse --abbrev-ref HEAD
Which pull request would you like to merge? (e.g. 34): 25157

The PR title has `[WIP]`:
[WIP][SPARK-28396][SQL] Add PathCatalog for data source V2
Continue? (y/n):
```

```
$ dev/merge_spark_pr.py
git rev-parse --abbrev-ref HEAD
Which pull request would you like to merge? (e.g. 34): 25304
I've re-written the title as follows to match the standard format:
Original: [SPARK-28570][CORE][SHUFFLE] Make UnsafeShuffleWriter use the new API.
Modified: [SPARK-28570][CORE][SHUFFLE] Make UnsafeShuffleWriter use the new API
Would you like to use the modified title? (y/n):
```

@SparkQA

SparkQA commented Aug 5, 2019

Test build #108637 has finished for PR 25356 at commit 0e3c1a2.

  • This patch fails SparkR unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@HyukjinKwon
Member

retest this please

@dongjoon-hyun
Member Author

All tests passed, and the CRAN check failed because it seems to have started enforcing R 3.4 and above.

```
* checking CRAN incoming feasibility ...Error in readRDS(con) : 
  ReadItem: unknown type 238, perhaps written by later version of R
Execution halted
During startup - Warning message:
In .First() :
  Support for R prior to version 3.4 is deprecated since Spark 3.0.0
Loading required package: methods
```

I'll merge this PR since this is not related to the Spark code.

@dongjoon-hyun
Member Author

Thank you, as always, for the review and approval, @HyukjinKwon!
Merged to master.

@dongjoon-hyun
Member Author

Oops. I didn't read your retest comment. Sorry for missing that~

@HyukjinKwon
Member

I think that's fine.

@AmplabJenkins

Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/108640/
Test FAILed.

@dongjoon-hyun
Member Author

I'll file a JIRA for the CRAN failure since we need to upgrade R in Jenkins.

@HyukjinKwon
Member

Thanks.

```
* checking CRAN incoming feasibility ...Error in readRDS(con) : 
  ReadItem: unknown type 238, perhaps written by later version of R
```

This error seems to be a new one. cc @viirya FYI.

@dongjoon-hyun
Member Author

Ya. I'm checking the other commits and found that #24560 passed 4 hours ago without this problem. It might be a transient issue. I'll keep monitoring~

@viirya
Member

viirya commented Aug 6, 2019

  • checking CRAN incoming feasibility ...Error in readRDS(con) :

Looks different from the previous CRAN error. Did it happen again?

@HyukjinKwon
Member

Yes, seems so - #25363 (comment)

@HyukjinKwon
Member

looks intermittent though ..

@dongjoon-hyun
Member Author

I'll backport this to branch-2.4 because I noticed that the branch-2.4 script is also used sometimes.

dongjoon-hyun added a commit that referenced this pull request Sep 18, 2019
[SPARK-28616][INFRA] Improve merge-spark-pr script to warn WIP PRs and strip trailing dots

## What changes were proposed in this pull request?

This PR aims to improve the `merge-spark-pr` script in the following two ways.
1. `[WIP]` is useful to show that a PR is not yet ready for merge. Apache Spark allows merging `WIP` PRs, but sometimes we accidentally forget to clean up the title of a completed PR. It is better to warn once more during the merge stage and get confirmation from the committer.
2. PR titles currently differ in whether they end with a period. This PR removes the trailing `dot`, since shorter is better in a commit title and titles without a trailing `dot` already dominate the Apache Spark commit logs.
```
$ git log --oneline | grep '[.]$' | wc -l
    4090
$ git log --oneline | grep '[^.]$' | wc -l
   20747
```

## How was this patch tested?

Manual.
```
$ dev/merge_spark_pr.py
git rev-parse --abbrev-ref HEAD
Which pull request would you like to merge? (e.g. 34): 25157

The PR title has `[WIP]`:
[WIP][SPARK-28396][SQL] Add PathCatalog for data source V2
Continue? (y/n):
```

```
$ dev/merge_spark_pr.py
git rev-parse --abbrev-ref HEAD
Which pull request would you like to merge? (e.g. 34): 25304
I've re-written the title as follows to match the standard format:
Original: [SPARK-28570][CORE][SHUFFLE] Make UnsafeShuffleWriter use the new API.
Modified: [SPARK-28570][CORE][SHUFFLE] Make UnsafeShuffleWriter use the new API
Would you like to use the modified title? (y/n):
```

Closes #25356 from dongjoon-hyun/SPARK-28616.

Authored-by: Dongjoon Hyun <dhyun@apple.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
(cherry picked from commit ae08387)
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
@dongjoon-hyun dongjoon-hyun deleted the SPARK-28616 branch September 18, 2019 23:02