
Git Notes: Testing Pull Requests


Checking Out the Pull Request

As a project member, you may need to accept or reject a pull request, either from another member or from an outside collaborator. Here we show how to test the changes before merging them into the central repo. The basic process is discussed here. Below we follow an example involving a set of changes to the FDS User Guide, which is fairly easy to test.

Let's follow the procedure for accepting pull request #2809. The first step is to fetch the commit to your local repo. The syntax is

$ git fetch REMOTE pull/ID/head:BRANCHNAME

Here REMOTE is the name of the remote that points to the central repository (more on this below), ID is the number of the pull request (in this case 2809), and BRANCHNAME is whatever you want to call the branch in your local repo. We will call it test_2809.

$ git fetch firemodels pull/2809/head:test_2809
remote: Counting objects: 36, done.
remote: Compressing objects: 100% (36/36), done.
remote: Total 36 (delta 23), reused 0 (delta 0), pack-reused 0
Unpacking objects: 100% (36/36), done.
From git://github.com/firemodels/fds-smv
 * [new branch]      refs/pull/2809/head -> test_2809

Note that here firemodels is the name I gave to the upstream repository (in this case, the fds central repository set up under the firemodels GitHub organization). It is set up for remote tracking in the clone of my forked repository. If I were working directly within a clone of the central repository, I would refer to the repo as origin.
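
If you are not sure what your remotes are called, git remote -v lists them. The output below is only illustrative of a typical fork setup:

$ git remote -v
firemodels  git@github.com:firemodels/fds.git (fetch)
firemodels  git@github.com:firemodels/fds.git (push)
origin      git@github.com:rmcdermo/fds.git (fetch)
origin      git@github.com:rmcdermo/fds.git (push)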

Now do a git branch to see what branches you have locally.

$ git branch
* master
  test_2809

You will see that you have fetched the branch test_2809, but you are still on branch master.

Now checkout the new test branch.

$ git checkout test_2809
... (what you see here depends on what you have modified in your local repo; you may want to stash your changes before starting this process, as in the sketch below)
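
If the checkout complains about uncommitted local changes, a minimal stash sequence looks like this (the stash message is just an example; run your tests where the ellipsis appears, and the final pop restores your edits):

$ git stash push -m "set aside local edits before testing PR 2809"
$ git checkout test_2809
...
$ git checkout master
$ git stash pop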

OK, now run your test to determine whether the pull request does what it is supposed to do. In this case, we go to Manuals/FDS_User_Guide/ and run ./make_guide.sh (see below); we should see FDS User Guide built successfully! and confirm that the intended changes appear in the guide.
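
Concretely, from the top level of the repo:

$ cd Manuals/FDS_User_Guide/
$ ./make_guide.sh
...
FDS User Guide built successfully!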

If successful, then you can simply accept the pull request on GitHub. If there is a problem, you can either ask the author to make changes and "Close and Comment" the conversation (the author will need to submit another pull request), or you can make the changes yourself in your local test branch, push them to YOUR origin, and submit a pull request to the central repo from there. We discuss this option next.

Often the pull request is 99% of the way there but needs a few minor edits in order to meet all the Developer Commit Guidelines. In this case, it is usually easier for the developer to make the changes locally and push them to their own forked repo. If you are the developer and you have followed the instructions to this point, you will not yet have a copy of the new test branch in your GitHub repo. Once you have made the changes in the test branch, add and commit them, and then push the new branch to GitHub.
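
As a minimal sketch (the file name and commit message below are only illustrative):

$ git add Manuals/FDS_User_Guide/FDS_User_Guide.tex
$ git commit -m "User Guide: minor edits to meet commit guidelines"

Then push the branch to your fork: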

$ git push -u origin test_2809
Counting objects: 13, done.
Delta compression using up to 12 threads.
Compressing objects: 100% (9/9), done.
Writing objects: 100% (9/9), 1.04 KiB, done.
Total 9 (delta 7), reused 0 (delta 0)
To git@github.com:rmcdermo/fds-smv.git
 * [new branch]      test_2809 -> test_2809
Branch test_2809 set up to track remote branch test_2809 from origin by rebasing.

You can now submit this branch as a pull request to the central repo. Then accept that request.
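
Alternatively, if you have the GitHub command-line tool gh installed, the pull request can be opened from the terminal rather than the web interface; something along these lines should work (the title is only a placeholder, and gh will prompt for anything you omit):

$ gh pr create --repo firemodels/fds --base master --head rmcdermo:test_2809 --title "Minor edits to PR 2809"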

Once this is done you will want to fetch and merge the changes into your local development branch (see Git Notes: Getting Started).

$ git checkout master
$ git remote update
Fetching origin
Fetching firemodels
remote: Counting objects: 1, done.
remote: Total 1 (delta 0), reused 0 (delta 0), pack-reused 0
Unpacking objects: 100% (1/1), done.
From git://github.com/firemodels/fds
   59ab030..0a17223  master -> firemodels/master
$ git merge firemodels/master
Updating 59ab030..0a17223
Fast-forward
 Manuals/FDS_User_Guide/FDS_User_Guide.tex |   21 +++++++++++++--------
 1 files changed, 13 insertions(+), 8 deletions(-)
$ git push origin master
Counting objects: 49, done.
Delta compression using up to 12 threads.
Compressing objects: 100% (37/37), done.
Writing objects: 100% (37/37), 4.37 KiB, done.
Total 37 (delta 28), reused 0 (delta 0)
To git@github.com:rmcdermo/fds.git
   59ab030..0a17223  master -> master

And finally you need to delete the test branch from your local repo.

$ git branch -d test_2809
Deleted branch test_2809 (was 31be744).

Do a git branch to see that the test branch is gone.

$ git branch
* master

Note that GitHub will usually offer to let you delete the test branch on your forked repo on GitHub. Go ahead and click the delete button (it gives you a restore option, just in case). If you forget to do this, you will eventually want to get rid of the remote test branch in your forked repo. To do so from the command line, run either

$ git push origin :test_2809

or

$ git push origin --delete test_2809
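
If the remote branch was instead deleted through the GitHub web interface, you can still clear the stale remote-tracking reference from your local repo with

$ git remote prune origin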

Running Firebot (complete)

A guide to setting up and running the FDS continuous integration program "Firebot" can be found here. Going through this procedure will take you through the whole process, including building the manuals. However, it is advisable to carry out this process on a clean repository; otherwise you may lose untracked files in your working directories.
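
One quick way to confirm that your working tree is clean before starting is a dry run; the -n flag only lists what would be removed and deletes nothing:

$ git status --short
$ git clean -n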

Manually Running FDS_Cases.sh

A lightweight alternative to running the complete Firebot process is to manually run the verification cases. The list of cases in the FDS verification suite is in Verification/FDS_Cases.sh. There are a few reasons why you may want to use this list for testing your pull request. First, (at the time of writing) if you do your development on a machine other than blaze (I use burn or my laptop most of the time), it is a hassle to migrate your branch over to blaze for a full Firebot session. Second, if your changes pass a build, an FDS run, and the Matlab plotting, then it should be reasonable to merge and let the full Firebot run overnight. Last, you may only want to run a subset of the cases, in which case you can comment out the other cases in FDS_Cases.sh, or modify scripts/Run_FDS_Cases.sh (discussed next) to run your selected cases.
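
For example, skipping a case amounts to commenting out its line with #. The line format and case names below are only illustrative; see the actual FDS_Cases.sh for the real entries:

#$QFDS -d Some_Category skipped_case.fds
$QFDS -d Some_Category case_to_run.fds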

The downside of using Run_FDS_Cases.sh is that there are a few things you must do manually, but these are really not too onerous.

Step 1: Do a clean compile of both MPI debug and MPI release versions of the code. From the top level of the repo do

$ cd Build/impi_intel_linux_64_db
$ rm *.o *.mod
$ ./make_fds.sh
...

If the code does not compile, fix the errors and repeat. In debug mode you may also get unused-variable warnings; make sure you fix these too before going on.

Next, compile the MPI release version. Again, from the top level do

$ cd Build/impi_intel_linux_64
$ rm *.o *.mod
$ ./make_fds.sh
...

This will take a little longer to compile. Notice that here we are compiling with Intel MPI ("impi"), which is the default for Utilities/Scripts/qfds.sh (which I alias simply as qfds.sh). If you are using a different executable, you can change it with the -e option; run qfds.sh -H to see the full list of options. Make sure you can run a single case using qfds.sh before trying to launch the verification suite with Run_FDS_Cases.sh!
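
For example, a single-case submission with an explicit executable might look like the following; the path and case name are placeholders, so check qfds.sh -H for the exact options on your system:

$ qfds.sh -e ~/fds/Build/impi_intel_linux_64/fds_impi_intel_linux_64 my_test_case.fds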

Step 2: Check queue status and launch cases.

To check queue status, do squeue at the terminal command line (assuming you are using SLURM for job scheduling). If the cluster looks reasonably open, go ahead and launch the cases: cd into the Verification/scripts directory and type

$ cd Verification/scripts/
$ ./Run_FDS_Cases.sh   # use -d for the debug version of the code; -m 10 will run for 10 time steps; -q firebot will run in the firebot queue
...

Now do another squeue and you should see all of the cases either pending (PD) or running (R).
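
To limit the listing to your own jobs, do

$ squeue -u $USER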

Step 3: Check for errors and numerical instabilities.

Once all the cases have cleared the queue (this may take an hour or several hours, depending on your system and how full the queue was to start with), you need to check for "forrtl" errors and "Numerical Instability" stops. Assuming you are in the Verification/scripts/ directory, do

$ cd ..
$ grep forrtl */*.err
$ grep Numerical */*.out

Hopefully you get no output from these greps. If you do, fix those cases before you proceed.

Step 4: Run the Matlab verification script

Once the cases have run, you can launch Matlab and run FDS_verification_script.m. I run this from my desktop machine on a drive mapped to burn. Start Matlab, cd to the Utilities/Matlab/ directory, and type

>> FDS_verification_script
...
verification scripts completed successfully!

If you get the above message without any warnings, then proceed to check the error tolerances.
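
As an aside, if you prefer to run Matlab without the desktop interface (for example, over ssh), an invocation like the following should work from the Utilities/Matlab/ directory; the flags may vary with your Matlab version:

$ matlab -nodisplay -nosplash -r "FDS_verification_script; exit"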

Step 5: Check verification error tolerances

The output for each quantitative verification check is documented in a table that gets printed in the FDS Verification Guide. There is a "Yes" or "No" column telling you whether the case passed the error tolerance metric. You can either visually scan the file or grep for "No" in the last column of the table. From the top level of the repo do

$ cd Manuals/FDS_Verification_Guide/SCRIPT_FIGURES/Scatterplots/
$ grep ' No ' ./verification_statistics.tex

If everything has passed, you are OK to merge the pull request!
