
Set up the snow DA analysis to update the EnKF ensemble members #2033

Closed

Conversation

jiaruidong2017
Contributor

Description

Add/Modify config.* and job files for JEDI-based Land DA analysis to update the EnKF ensemble members.

  • Changes to config.base to turn land DA on/off (add DO_JEDILANDENS).
  • Changes to config.resources for the new task (add landensanl).
  • Addition of new config and job files for land DA related options (see below).

This PR adds initial configuration files to support future global land analysis capabilities for the EnKF ensemble members.

New files:
parm/config/gfs/config.landensanl
jobs/JGLOBAL_LANDENS_ANALYSIS
jobs/rocoto/landensanl.sh
scripts/exglobal_landens_analysis.py

Modified files:
config.resources
env/HERA.env
workflow/rocoto/gfs_tasks.py
workflow/rocoto/tasks.py
workflow/applications/gfs_cycled.py
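
For orientation, below is a minimal, hypothetical sketch of what the new parm/config/gfs/config.landensanl could look like, modeled on the existing per-task config files such as config.landanl; the task name matches this PR, but every path and variable shown is an illustrative assumption rather than the file's actual contents.

#! /usr/bin/env bash

########## config.landensanl ##########
# Ensemble land/snow DA analysis step (hypothetical sketch, not the PR's actual file)

echo "BEGIN: config.landensanl"

# Pull task-specific resources; assumes config.resources gained a 'landensanl' entry
source "${EXPDIR}/config.resources" landensanl

# Illustrative JEDI-related settings, following the pattern of config.landanl
export OBS_YAML_DIR="${HOMEgfs}/sorc/gdas.cd/parm/land/obs/config/"
export JEDIYAML="${HOMEgfs}/sorc/gdas.cd/parm/land/letkfoi/letkfoi.yaml"
export JEDIEXE="${HOMEgfs}/exec/fv3jedi_letkf.x"

echo "END: config.landensanl"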

How Has This Been Tested?

These are placeholders for future experiment setup and XML generation.

Checklist

  • Any dependent changes have been merged and published
  • My code follows the style guidelines of this project
  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas
  • My changes generate no new warnings
  • New and existing tests pass with my changes
  • I have made corresponding changes to the documentation if necessary

Contributor

@WalterKolczynski-NOAA left a comment


There are some issues with the COM variable that we may need to get together and work out.

Comment on lines 4 to 5
export WIPE_DATA="NO"
export DATA=${DATA:-${DATAROOT}/${RUN}landensanl_${cyc}}
Contributor

These are only needed if the job is using the working directory of a previous job. It appears this job is doing initialize/run/finalize all in one job.

Suggested change
export WIPE_DATA="NO"
export DATA=${DATA:-${DATAROOT}/${RUN}landensanl_${cyc}}

Contributor Author

Removed as you suggested. Thanks.

# Begin JOB SPECIFIC work
##############################################
# Generate COM variables from templates
YMD=${PDY} HH=${cyc} generate_com -rx COM_LANDENS_ANALYSIS
Contributor

COM_LANDENS_ANALYSIS_TMPL is not defined anywhere, so this variable cannot be built. Should this be:

Suggested change
YMD=${PDY} HH=${cyc} generate_com -rx COM_LANDENS_ANALYSIS
YMD=${PDY} HH=${cyc} generate_com -rx COM_LAND_ANALYSIS

(COM_LAND_ANALYSIS is already defined in config.com)

I also don't see a MEMDIR definition, which this will need if it is part of RUN=enkf. What RUN(s) is this part of, and if it is part of enkf, is it trying to write stuff for a bunch of different members (either as a metatask or looping over members within a single job)?

Contributor Author

Thanks @WalterKolczynski-NOAA. I removed the whole section because the COM directories for the ensemble members will be defined at runtime, and MEMDIR is part of RUN=enkf.
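
For readers following the COM discussion above, here is a rough sketch of how per-member COM directories are typically generated when a job loops over ensemble members; the template name COM_LAND_ANALYSIS_TMPL, the NMEM_ENS loop bound, and the loop structure are assumptions for illustration only, not what this PR ultimately does (as noted, the author deferred this to runtime).

# Hypothetical sketch: per-member COM generation inside an ensemble land analysis J-job.
# Assumes config.com defines COM_LAND_ANALYSIS_TMPL and that NMEM_ENS is exported upstream.
for imem in $(seq 1 "${NMEM_ENS}"); do
    memchar="mem$(printf %03d "${imem}")"
    # generate_com expands the template using the YMD, HH, and MEMDIR set on the command line
    MEMDIR=${memchar} YMD=${PDY} HH=${cyc} generate_com -rx COM_LAND_ANALYSIS
    # ... stage backgrounds / write increments for this member here ...
done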

@jiaruidong2017 marked this pull request as draft November 7, 2023 15:08
@CoryMartin-NOAA
Contributor

@jiaruidong2017 marked this as draft because we plan to make this a metatask within the workflow in order to process the 80 ensemble members in parallel.

Contributor

@aerorahul left a comment


First look at the addition in parm/config.landensanl.

@jiaruidong2017 marked this pull request as ready for review December 13, 2023 13:35
@aerorahul
Contributor

@jiaruidong2017
We are waiting on the deterministic test before we can review this PR wholly.

@jiaruidong2017
Contributor Author

@jiaruidong2017 We are waiting on the deterministic test before we can review this PR wholly.

@aerorahul Both the deterministic test and the EnKF test at C96C48 resolution succeeded, as shown below:

[screenshot of the completed test cycles]

What do I need to provide, and where, to confirm the above tests? Thanks.

@CoryMartin-NOAA
Contributor

@jiaruidong2017 like we chatted about the other day, @aerorahul would like a test case for the workflow CI. Does his date of Dec 21, 2021 18z work for your purpose? (it looks like it does based on above).

If so, please provide him with the input obs, and all the instructions to run the test (including setup_expt.py options and any changes needed to the config.* files).
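
For reference, a cycled experiment setup of the kind being requested generally looks something like the sketch below; the option names and values are assumptions based on the workflow of that era and are not the instructions that were actually provided, so treat this purely as an illustration.

# Illustrative only: cycled C96/C48 setup with an ensemble; option names/values are assumptions.
./workflow/setup_expt.py gfs cycled \
    --pslot snowda_test \
    --idate 2021122012 --edate 2021122100 \
    --resdet 96 --resens 48 --nens 80 \
    --gfs_cyc 1 \
    --comrot /path/to/comrot --expdir /path/to/expdir \
    --icsdir /path/to/ICSDIR/C96C48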

@jiaruidong2017
Contributor Author

@CoryMartin-NOAA Yes, I used the initial conditions (/scratch1/NCEPDEV/global/glopara/data/ICSDIR/C96C48/) for Dec 21, 2021 at 18z and ran the test for four cycles, through Dec 22, 2021 at 18z, because the IMS assimilation only occurs at the 18z cycle. I made some changes to the initial conditions because of the changes to the COM directory structure.

@aerorahul How do I provide the input obs to you?

@@ -242,6 +243,11 @@ if [[ ${HPSSARCH} = "YES" || ${LOCALARCH} = "YES" ]]; then
targrp_list="${targrp_list} gdasice"
fi

#gdasland
if [ "${DO_JEDILANDDA}" = "YES" ]; then
targrp_list="${targrp_list} gdasland"
Contributor

I don't see an additional file list being created in hpssarch_gen.sh that matches this name.

Contributor Author

Removed the gdasland list and added those files to gdas.txt instead. Thanks for your review.
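
For context on the convention being discussed: each group name appended to targrp_list is expected to have a matching file list written by hpssarch_gen.sh, which the archive job then passes to htar. Below is a rough, hypothetical sketch of that pattern using a gdasland group; in the end the files were folded into the existing gdas.txt list instead.

# Hypothetical sketch of the archiving convention.
# In hpssarch_gen.sh: write the file list for a (hypothetical) gdasland group.
{
    echo "${COM_LAND_ANALYSIS/${ROTDIR}\//}/*snoanl*"
} >> "${DATA}/gdasland.txt"

# In the archive job: every name in targrp_list needs a matching <name>.txt list,
# whose contents are handed to htar to build the tarball on HPSS.
for targrp in ${targrp_list}; do
    htar -P -cvf "${ATARDIR}/${PDY}${cyc}/${targrp}.tar" $(cat "${DATA}/${targrp}.txt")
done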

Comment on lines 13 to 16
##export FIMS_NML_TMPL="${HOMEgfs}/sorc/gdas.cd/parm/land/prep/fims.nml.j2"
##export IMS_OBS_LIST="${HOMEgfs}/sorc/gdas.cd/parm/land/prep/prep_ims.yaml"
##export CALCFIMSEXE="${HOMEgfs}/exec/calcfIMS.exe"
##export IMS2IODACONV="${HOMEgfs}/ush/imsfv3_scf2ioda.py"
Contributor

If these aren't needed anymore, they should just be removed.

Contributor Author

Done

env/HERA.env Outdated
echo "atmanlrun atmensanlrun aeroanlrun landanl"
echo "anal sfcanl fcst post metp"
echo "atmanlrun atmensanlrun aeroanlrun landanl landensanl"
echo "anal sfcanl fcst post vrfy metp"
Contributor

Why is vrfy being added back in?

Contributor Author

Removed vrfy.

@CoryMartin-NOAA
Contributor

@jiaruidong2017 Is it easy enough to provide the ICs for 12z, so that the first DA cycle is at 18z? Then we only need to run 2.5 cycles instead of 4.5.

@jiaruidong2017
Contributor Author

jiaruidong2017 commented Jan 4, 2024

@CoryMartin-NOAA Good suggestion. The ICs I used were provided by you. Do you mean I should generate the ICs at 12z myself? Thanks.

@CoryMartin-NOAA
Contributor

@jiaruidong2017 if you could, that would be great. @aerorahul would prefer to run:

  • 12z half cycle fcst
  • 18z da
  • 18z fcst
  • 00z da
  • 00z fcst

That is, 2 cycles instead of 1, to test with and without IMS snow.

@jiaruidong2017
Contributor Author

@CoryMartin-NOAA Okay, I will generate ICs at 12z and conduct a 2-cycle run.

@jiaruidong2017
Contributor Author

The ICs and OBS are updated below:

The ICs on Dec 20, 2021 at 12z are saved at:

/scratch2/NCEPDEV/stmp1/Jiarui.Dong/gdas.init/landid9/C96C48/output/gdas.20211220
/scratch2/NCEPDEV/stmp1/Jiarui.Dong/gdas.init/landid9/C96C48/output/enkfgdas.20211220

The observation files for the 2 cycles are provided below:

t18z
/scratch1/NCEPDEV/global/Jiarui.Dong/JEDI/GlobalWorkflow/para_gfs/glopara_dump/gdas.20211220/18/atmos/gdas.t18z.adpsfc.tm00.bufr_d
/scratch1/NCEPDEV/global/Jiarui.Dong/JEDI/GlobalWorkflow/para_gfs/glopara_dump/gdas.20211220/18/atmos/gdas.t18z.snocvr_snow.nc4
/scratch1/NCEPDEV/global/Jiarui.Dong/JEDI/GlobalWorkflow/para_gfs/glopara_dump/gdas.20211220/18/atmos/gdas.t18z.ims2021354_4km_v1.3.nc
/scratch1/NCEPDEV/global/Jiarui.Dong/JEDI/GlobalWorkflow/para_gfs/glopara_dump/gdas.20211220/18/atmos/gdas.t18z.IMS4km_to_FV3_mapping.C96_oro_data.nc
/scratch1/NCEPDEV/global/Jiarui.Dong/JEDI/GlobalWorkflow/para_gfs/glopara_dump/gdas.20211220/18/atmos/gdas.t18z.IMS4km_to_FV3_mapping.C48_oro_data.nc
t00z
/scratch1/NCEPDEV/global/Jiarui.Dong/JEDI/GlobalWorkflow/para_gfs/glopara_dump/gdas.20211221/00/atmos/gdas.t00z.adpsfc.tm00.bufr_d
/scratch1/NCEPDEV/global/Jiarui.Dong/JEDI/GlobalWorkflow/para_gfs/glopara_dump/gdas.20211221/00/atmos/gdas.t00z.snocvr_snow.nc4

@CoryMartin-NOAA
Contributor

Thanks @jiaruidong2017! @aerorahul is this all you need now?

@KateFriedman-NOAA
Member

I have copied the new files into the GDA on Hera for testing:

[role.glopara@hfe06 dump]$ pwd
/scratch1/NCEPDEV/global/glopara/dump
[role.glopara@hfe06 dump]$ ll gdas.20211220/18/atmos/gdas.t18z.snocvr_snow.nc4
-rw-r--r-- 1 role.glopara global 987968 Dec 18 16:30 gdas.20211220/18/atmos/gdas.t18z.snocvr_snow.nc4
[role.glopara@hfe06 dump]$ ll gdas.20211220/18/atmos/gdas.t18z.ims2021354_4km_v1.3.nc 
-rw-r--r-- 1 role.glopara global 557852 Aug  8 14:12 gdas.20211220/18/atmos/gdas.t18z.ims2021354_4km_v1.3.nc
[role.glopara@hfe06 dump]$ ll gdas.20211220/18/atmos/gdas.t18z.IMS4km_to_FV3_mapping.C*
-rw-r--r-- 1 role.glopara global 455639880 Dec  2 18:10 gdas.20211220/18/atmos/gdas.t18z.IMS4km_to_FV3_mapping.C192_oro_data.nc
-rw-r--r-- 1 role.glopara global 463602504 Aug  8 14:12 gdas.20211220/18/atmos/gdas.t18z.IMS4km_to_FV3_mapping.C384_oro_data.nc
-rw-r--r-- 1 role.glopara global   4110338 Dec 13 13:00 gdas.20211220/18/atmos/gdas.t18z.IMS4km_to_FV3_mapping.C48_oro_data.nc
-rw-r--r-- 1 role.glopara global 453649224 Dec 13 13:00 gdas.20211220/18/atmos/gdas.t18z.IMS4km_to_FV3_mapping.C96_oro_data.nc
[role.glopara@hfe06 dump]$ ll gdas.20211221/00/atmos/gdas.t00z.snocvr_snow.nc4
-rw-r--r-- 1 role.glopara global 960306 Dec 18 16:30 gdas.20211221/00/atmos/gdas.t00z.snocvr_snow.nc4

The adpsfc files are the same as what currently exists in the GDA so nothing has been done with those files.

@jiaruidong2017
Contributor Author

@aerorahul Please use branch feature/landda_ci from GDASApp for the CI test.

@aerorahul
Contributor

@jiaruidong2017 @CoryMartin-NOAA

We have copied the observations and the ICs to the GDA and the ICSDIR spaces on Hera.
I have created a yaml for the deterministic snowDA test. The yaml is available in the branch feature/snow_test.

I have exercised the experiment setup as a CI test would, as follows:

export LOGGING_LEVEL=INFO
export pslot=snowtest
export SLURM_ACCOUT=fv3-cpu
export RUNTESTS=/scratch1/NCEPDEV/stmp2/Rahul.Mahajan/RUNTESTS
export ICSDIR_ROOT=/scratch1/NCEPDEV/global/glopara/data/ICSDIR

./workflow/create_experiment.py --yaml ci/cases/pr/C96_atmsnowDA.yaml
2024-01-05 11:52:32,137 - INFO     - root        : BEGIN: __main__.input_args
2024-01-05 11:52:32,138 - INFO     - root        :   END: __main__.input_args
2024-01-05 11:52:32,153 - INFO     - root        : Call: setup_expt.main()

****************************************************************************************************
EXPDIR: /scratch1/NCEPDEV/stmp2/Rahul.Mahajan/RUNTESTS/EXPDIR/snowtest
COMROT: /scratch1/NCEPDEV/stmp2/Rahul.Mahajan/RUNTESTS/COMROT/snowtest
****************************************************************************************************
2024-01-05 11:52:32,160 - INFO     - root        : Call: setup_xml.main()
Finalizing initialize
sourcing config.prep
sourcing config.anal
sourcing config.analdiag
sourcing config.sfcanl
sourcing config.analcalc
sourcing config.fcst
sourcing config.upp
sourcing config.atmos_products
sourcing config.arch
sourcing config.cleanup
sourcing config.fit2obs
sourcing config.verfozn
sourcing config.verfrad
sourcing config.vminmon
sourcing config.tracker
sourcing config.genesis
sourcing config.preplandobs
sourcing config.landanl

Please verify the experiment configuration in: /scratch1/NCEPDEV/stmp2/Rahul.Mahajan/RUNTESTS/EXPDIR/snowtest

CoryMartin-NOAA added a commit to NOAA-EMC/GDASApp that referenced this pull request Jan 5, 2024
- The UFO filters are rearranged by using explicit filter ordering with
  the `obs pre filters`, `obs prior filters`, and `obs post filters`
  options.
- Add the `Temporal Thinning filter` to the `obs post filters` section
  and place it after the `Background Check filter` and before the
  `Buddy check filter`.
- Rename the IMS IODA file and copy it into the `rundir/obs` directory
  for later use in the snow analysis in the global workflow
  (NOAA-EMC/global-workflow#2033).

Co-authored-by: Cory Martin <cory.r.martin@noaa.gov>
@jiaruidong2017
Contributor Author

@KateFriedman-NOAA Would you please copy the new observation files below into the GDA on Hera for the gfs_cyc=1 testing? Thanks.

t18z
/scratch1/NCEPDEV/global/Jiarui.Dong/JEDI/GlobalWorkflow/para_gfs/glopara_dump/gfs.20211220/18/atmos/gfs.t18z.adpsfc.tm00.bufr_d
/scratch1/NCEPDEV/global/Jiarui.Dong/JEDI/GlobalWorkflow/para_gfs/glopara_dump/gfs.20211220/18/atmos/gfs.t18z.snocvr_snow.nc4
/scratch1/NCEPDEV/global/Jiarui.Dong/JEDI/GlobalWorkflow/para_gfs/glopara_dump/gfs.20211220/18/atmos/gfs.t18z.ims2021354_4km_v1.3.nc
/scratch1/NCEPDEV/global/Jiarui.Dong/JEDI/GlobalWorkflow/para_gfs/glopara_dump/gfs.20211220/18/atmos/gfs.t18z.IMS4km_to_FV3_mapping.C96_oro_data.nc
/scratch1/NCEPDEV/global/Jiarui.Dong/JEDI/GlobalWorkflow/para_gfs/glopara_dump/gfs.20211220/18/atmos/gfs.t18z.IMS4km_to_FV3_mapping.C48_oro_data.nc
t00z
/scratch1/NCEPDEV/global/Jiarui.Dong/JEDI/GlobalWorkflow/para_gfs/glopara_dump/gfs.20211221/00/atmos/gfs.t00z.adpsfc.tm00.bufr_d
/scratch1/NCEPDEV/global/Jiarui.Dong/JEDI/GlobalWorkflow/para_gfs/glopara_dump/gfs.20211221/00/atmos/gfs.t00z.snocvr_snow.nc4

@KateFriedman-NOAA
Member

@jiaruidong2017 Done:

[role.glopara@hfe02 atmos]$ pwd
/scratch1/NCEPDEV/global/glopara/dump/gfs.20211220/18/atmos
[role.glopara@hfe02 atmos]$ ll *ims2* *IMS* *snocvr*                                                                                                                       
-rw-r--r-- 1 role.glopara global   4110338 Jan  8 01:02 gfs.t18z.IMS4km_to_FV3_mapping.C48_oro_data.nc
-rw-r--r-- 1 role.glopara global 453649224 Jan  8 01:02 gfs.t18z.IMS4km_to_FV3_mapping.C96_oro_data.nc
-rw-r--r-- 1 role.glopara global    557852 Jan  8 01:02 gfs.t18z.ims2021354_4km_v1.3.nc
-rw-r--r-- 1 role.glopara global    987968 Jan  6 20:38 gfs.t18z.snocvr_snow.nc4
[role.glopara@hfe02 atmos]$ pwd
/scratch1/NCEPDEV/global/glopara/dump/gfs.20211221/00/atmos
[role.glopara@hfe02 atmos]$ ll *snocvr*
-rw-r--r-- 1 role.glopara global 960306 Jan  6 20:38 gfs.t00z.snocvr_snow.nc4

The adpsfc files are the same as what is in the GDA so I have not done anything with those files.

@CoryMartin-NOAA
Contributor

This PR is on hold until a series of subsequent PRs are created for the deterministic case. At that point, we will either supersede this PR with a new one, or modify this PR to handle the ensemble case.

@jiaruidong2017 marked this pull request as draft January 25, 2024 20:42