
Feature/cam ptype #158

Merged
merged 66 commits into NOAA-EMC:develop from feature/CAM_ptype
Jul 24, 2023

Conversation

MarcelCaron-NOAA
Contributor

Pull Request Testing

  • Describe testing already performed for this Pull Request:

    All jobs have been running in the cron for months, recent runs have been completing cleanly, and output stats and plots have been approved by the CAM verification team leads.
  • Recommend testing for the reviewer(s) to perform, including the location of input datasets, and any additional instructions:

This feature branch includes PTYPE verification as an addition to the CAM/grid2obs verification set. PTYPE verification occurs over each of the three existing grid2obs steps (prep, stats, plots) and I recommend testing each set of jobs.

Prep and stats jobs are timing-sensitive and should be run in the cron, like this:

30 0 * * * qsub -o /lfs/h2/emc/stmp/{USER}/jevs_hireswarw_grid2obs_prep.out {HOMEevs}/dev/drivers/scripts/cam/prep/jevs_hireswarw_grid2obs_prep.sh >> /lfs/h2/emc/stmp/{USER}/cron.out/test_jevs_hireswarw_grid2obs_prep.out 2>&1
30 0 * * * qsub -o /lfs/h2/emc/stmp/{USER}/jevs_hireswarwmem2_grid2obs_prep.out {HOMEevs}/dev/drivers/scripts/cam/prep/jevs_hireswarwmem2_grid2obs_prep.sh >> /lfs/h2/emc/stmp/{USER}/cron.out/test_jevs_hireswarwmem2_grid2obs_prep.out 2>&1
30 0 * * * qsub -o /lfs/h2/emc/stmp/{USER}/jevs_hireswfv3_grid2obs_prep.out {HOMEevs}/dev/drivers/scripts/cam/prep/jevs_hireswfv3_grid2obs_prep.sh >> /lfs/h2/emc/stmp/{USER}/cron.out/test_jevs_hireswfv3_grid2obs_prep.out 2>&1
30 0 * * * qsub -o /lfs/h2/emc/stmp/{USER}/jevs_hrrr_grid2obs_prep.out {HOMEevs}/dev/drivers/scripts/cam/prep/jevs_hrrr_grid2obs_prep.sh >> /lfs/h2/emc/stmp/{USER}/cron.out/test_jevs_hrrr_grid2obs_prep.out 2>&1
30 0 * * * qsub -o /lfs/h2/emc/stmp/{USER}/jevs_namnest_grid2obs_prep.out {HOMEevs}/dev/drivers/scripts/cam/prep/jevs_namnest_grid2obs_prep.sh >> /lfs/h2/emc/stmp/{USER}/cron.out/test_jevs_namnest_grid2obs_prep.out 2>&1
0 2,3,6,9,12,15,18,21 * * * qsub -o /lfs/h2/emc/stmp/{USER}/jevs_hireswarw_grid2obs_stats.out {HOMEevs}/dev/drivers/scripts/cam/stats/jevs_hireswarw_grid2obs_stats.sh >> /lfs/h2/emc/stmp/{USER}/cron.out/test_jevs_hireswarw_grid2obs_stats.out 2>&1
0 2,3,6,9,12,15,18,21 * * * qsub -o /lfs/h2/emc/stmp/{USER}/jevs_hireswarwmem2_grid2obs_stats.out {HOMEevs}/dev/drivers/scripts/cam/stats/jevs_hireswarwmem2_grid2obs_stats.sh >> /lfs/h2/emc/stmp/{USER}/cron.out/test_jevs_hireswarwmem2_grid2obs_stats.out 2>&1
0 2,3,6,9,12,15,18,21 * * * qsub -o /lfs/h2/emc/stmp/{USER}/jevs_hireswfv3_grid2obs_stats.out {HOMEevs}/dev/drivers/scripts/cam/stats/jevs_hireswfv3_grid2obs_stats.sh >> /lfs/h2/emc/stmp/{USER}/cron.out/test_jevs_hireswfv3_grid2obs_stats.out 2>&1
0 2,3,6,9,12,15,18,21 * * * qsub -o /lfs/h2/emc/stmp/{USER}/jevs_hrrr_grid2obs_stats.out {HOMEevs}/dev/drivers/scripts/cam/stats/jevs_hrrr_grid2obs_stats.sh >> /lfs/h2/emc/stmp/{USER}/cron.out/test_jevs_hrrr_grid2obs_stats.out 2>&1
0 2,3,6,9,12,15,18,21 * * * qsub -o /lfs/h2/emc/stmp/{USER}/jevs_namnest_grid2obs_stats.out {HOMEevs}/dev/drivers/scripts/cam/stats/jevs_namnest_grid2obs_stats.sh >> /lfs/h2/emc/stmp/{USER}/cron.out/test_jevs_namnest_grid2obs_stats.out 2>&1

Replace {USER} with your WCOSS2 username and {HOMEevs} with the location of your EVS PR testing directory. For each driver script ({HOMEevs}/dev/drivers/scripts/cam/...), set HOMEevs, DATA, and COMOUT to your desired directories. Additionally, in the stats driver scripts, set COMINmping to your prep COMOUT directory.
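For illustration, the top of a stats driver script might end up with overrides along these lines. The paths below are placeholders for your own test space, not repo defaults, and the COMINmping value assumes prep output lands under $COMOUT/prep/cam, matching the stats and plots output layout elsewhere in this thread:

export HOMEevs=/lfs/h2/emc/vpppg/noscrub/${USER}/EVS        # placeholder: your EVS PR testing clone
export DATA=/lfs/h2/emc/stmp/${USER}/evs_ptype_test         # placeholder: working directory
export COMOUT=/lfs/h2/emc/vpppg/noscrub/${USER}/evs/v1.0    # placeholder: your output directory
export COMINmping=${COMOUT}/prep/cam                        # stats drivers only: your prep COMOUT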

Plots jobs may be run interactively at any time. If you prefer to use the cron, here is how I have my job set up:
0 22 * * * qsub -o /lfs/h2/emc/stmp/{USER}/jevs_cam_grid2obs_plots.out {HOMEevs}/dev/drivers/scripts/cam/plots/jevs_cam_grid2obs_plots.sh >> /lfs/h2/emc/stmp/{USER}/cron.out/test_jevs_cam_grid2obs_plots.out 2>&1

Just be sure to replace {USER} and {HOMEevs} as in the prep/stats cron jobs. Then set HOMEevs, DATA, and COMOUT to your desired directories. Finally, set COMIN to my test directory:
export COMIN=/lfs/h2/emc/vpppg/noscrub/marcel.caron/$NET/$evs_ver/stats/$COMPONENT
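If you'd rather skip the cron entirely, a one-off interactive submission mirroring the cron entry above would be (with the same placeholder substitutions):

qsub -o /lfs/h2/emc/stmp/{USER}/jevs_cam_grid2obs_plots.out {HOMEevs}/dev/drivers/scripts/cam/plots/jevs_cam_grid2obs_plots.sh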

  • Has the code been checked to ensure that no errors occur during the execution? [Yes]

  • Do these updates/additions include sufficient testing updates? [Yes]

  • Please complete this pull request review by [07/21/2023].

Pull Request Checklist

  • Review the source issue metadata (required labels, projects, and milestone).

  • Complete the PR description above.

  • Ensure the PR title matches the feature branch name.

  • Check the following:

  • Instructions provided on how to run

  • Developer's name is replaced by ${user} where necessary throughout the code

  • Check that the ecf file has all the proper definitions of variables

  • Check that the jobs file has all the proper settings of COMIN and COMOUT and other input variables

  • Check to see that the output directory structure is followed

  • Be sure that you are not using MET utilities outside the METplus wrapper structure

  • After submitting the PR, select Development issue with the original issue number.

  • After the PR is approved, merge your changes. If permissions do not allow this, request that the reviewer do the merge.

  • Close the linked issue.

marcel caron and others added 30 commits December 2, 2022 01:15
feature/CAM_grid2obs_priority is the next branch to merge into the
authoritative branch; therefore feature/CAM_plotting needs to stay up to
date with changes in feature/CAM_grid2obs_priority
…clude step where pcp_combine output are copied to COMOUT for all models
…ep where pcpcombine output copies to COMOUT
@PerryShafran-NOAA
Contributor

There was an error in the crontab for hireswarwmem2. Running the prep step again for that model.

Perry

@PerryShafran-NOAA
Contributor

Can you check that you see the hireswarwmem2 prep run?

Thanks!

Perry

@MarcelCaron-NOAA
Contributor Author

I see the hireswarwmem2 run and it completed successfully. Thanks!

@PerryShafran-NOAA
Contributor

Hi, @MarcelCaron-NOAA ,

For valid date 20230717, I see stat files in /lfs/h2/emc/vpppg/noscrub/perry.shafran/evs/v1.0/stats/cam. I also see prep and stats output files in /lfs/h2/emc/stmp/perry.shafran. Will you take a look at these things and let me know if the stat files and output files are what you would expect? If so, maybe we can finish up this particular PR.

Perry

@MarcelCaron-NOAA
Contributor Author

@PerryShafran-NOAA I looked at the output files for each stats job and found no errors; all seem to have completed cleanly at the end of each cycle. I found intermediate stats files successfully written after each cycle, and then final stats files for each model. Each of these final stats files has the PTYPE data that is the motivation for this PR and otherwise looks good. Overall I'm okay with the prep and stats tests.

Another component is the plots job. Were you able to run that as well? It can be run interactively at any time if pointed to my test archive.
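Illustrative only: a quick spot check for the PTYPE records in the final stat files could be a recursive grep along these lines, assuming the variable appears literally as PTYPE in the stat lines:

grep -rl PTYPE /lfs/h2/emc/vpppg/noscrub/perry.shafran/evs/v1.0/stats/cam | head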

@PerryShafran-NOAA
Contributor

I have not run plots yet. Let me run that interactively and let you know what I get.

Perry

@PerryShafran-NOAA
Contributor

Hi, Marcel,

Your plot job completed on Dogwood. I am transferring the results to Cactus now for you to see.

One thing that I noticed is that the tar file wound up in the directory /lfs/h2/emc/vpppg/noscrub/perry.shafran/evs/v1.0/plots/cam rather than /lfs/h2/emc/vpppg/noscrub/perry.shafran/evs/v1.0/plots/cam/atmos.20230718. Also there was another directory written but not used:

/lfs/h2/emc/vpppg/noscrub/perry.shafran/evs/v1.0/plots/cam/plots/cam

Just to note, we should define the following variables:

COMOUT=/lfs/h2/emc/vpppg/noscrub/$USER/evs/v1.0
COMOUTplots=$COMOUT/$STEP/$COMPONENT/$RUN.$VDATE

I think that defining COMOUT as /lfs/h2/emc/vpppg/noscrub/${USER}/$NET/$evs_ver/$STEP/$COMPONENT caused some problems further down in the code. If we could reconfigure these variables, I think we'll be good. Since this is something shared by all the cam jobs, we'll all want to define COMOUT and COMOUTplots the same way.

Perry

@PerryShafran-NOAA
Contributor

The .o file is here: /lfs/h2/emc/vpppg/noscrub/perry.shafran/pr158test/EVS/dev/drivers/scripts/cam/plots/jevs_cam_grid2obs_plots.o83809033

The plot file is here: /lfs/h2/emc/ptmp/perry.shafran/evs.plots.cam.atmos.grid2obs.v20230718.tar

Perry

@MarcelCaron-NOAA
Contributor Author

Hi Perry:

Yes, Shelley noted the same COMOUT directory problem for the plots step in another PR; thanks for bringing it up, as I hadn't yet made the same change here. I'll check the output once it's available on Cactus and then push some changes based on your feedback about COMOUT, hopefully by this afternoon.

-Marcel

marcel caron added 2 commits July 21, 2023 14:37
@MarcelCaron-NOAA
Contributor Author

Hi Perry:

I'm not sure yet where this /lfs/h2/emc/vpppg/noscrub/perry.shafran/evs/v1.0/plots/cam/plots/cam is being written from, so I'll have to dig a little before I can fix it. I did, however, add a COMOUTplots definition to the J-job and changed the lower-level scripts to write to that directory instead of to COMOUT. I didn't add a top-level definition in the driver script because it isn't necessary as long as COMOUT is defined there.

Otherwise, output log looks good to me, and the output tar file includes the PTYPE graphics that I need and that are the motivation for this PR.

I'll update this thread again once I find and fix the part of the code that creates the unused COMOUT subdirectory mentioned above.
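For context, a minimal sketch of what such a J-job addition could look like, following the COMOUT/COMOUTplots pattern Perry suggested above (the exact variable layout in the committed code may differ):

export COMOUT=${COMOUT:-/lfs/h2/emc/vpppg/noscrub/${USER}/evs/v1.0}
export COMOUTplots=${COMOUTplots:-$COMOUT/$STEP/$COMPONENT/$RUN.$VDATE}
mkdir -p $COMOUTplots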

@PerryShafran-NOAA
Contributor

Hi, @MarcelCaron-NOAA ,

The last time I ran the plots was on Dogwood, so you won't be able to see it unless I run again. But it wound up being right under the /lfs/h2/emc/vpppg/noscrub/perry.shafran/evs/v1.0/plots/cam directory. It's not on Cactus. Machine switches make PRs a bit challenging. ;)

I'll need to run with your fixes and then have you confirm. Let me know when I can test.

Thanks!

Perry

@MarcelCaron-NOAA
Contributor Author

Perry:

Yes I don't envy you all having to deal with PR testing across machine switches! Thanks for the help transferring things over.

Strange. I couldn't find where this extra subdirectory is being written, and I couldn't find it mentioned in the output log either. I ran the job myself on Cactus and it ran okay for me, with no extra subdirectories. Perhaps a recent merge removed the problem?

Do you mind running the plots job one more time, on Cactus?

-Marcel

@PerryShafran-NOAA
Contributor

Yes, I can do so.

Perry

@PerryShafran-NOAA
Contributor

I pulled in all your updates; let's see if that helps.

Perry

@ShelleyMelchior-NOAA
Contributor

Is that an old dir that is lingering? Maybe delete it before the re-run and confirm it is (or isn't) created.

@PerryShafran-NOAA
Contributor

We'll see; I'm running it on Cactus now. But when I ran on Dogwood, it had the time stamp of the time it ran.

Marcel made updates; I'm running it now (on his data since it presumably has the ptype in it).

Perry

@PerryShafran-NOAA
Contributor

Hi, @MarcelCaron-NOAA ,

The file looks to now be in the correct location: /lfs/h2/emc/vpppg/noscrub/perry.shafran/evs/v1.0/plots/cam/atmos.20230720

And here's the plot file: evs.plots.cam.atmos.grid2obs.v20230720.tar

Please validate this plot file and then we can approve this PR, pending the code check.

Perry

@MarcelCaron-NOAA
Contributor Author

Hi Perry:

I checked the output tar file and it looks good to me. I also didn't notice any extra cam/plots/cam/... subdirectories this time. Thanks for running again.

-Marcel

@PerryShafran-NOAA
Contributor

Excellent. I'm going to do a code review now, and unless I see something else egregious, I'll merge.

Thanks!

Perry

@PerryShafran-NOAA self-requested a review July 24, 2023 15:24
@PerryShafran-NOAA
Contributor

I think we are good to merge now.

Perry

@PerryShafran-NOAA self-requested a review July 24, 2023 19:23
@PerryShafran-NOAA merged commit b8f6a01 into NOAA-EMC:develop Jul 24, 2023
@MarcelCaron-NOAA deleted the feature/CAM_ptype branch July 24, 2023 20:10