
Adding history variables for use with Newton-Krylov #1457

Draft
wants to merge 8 commits into base: master

Conversation

Contributor

@slevis-lmwg slevis-lmwg commented Aug 9, 2021

Description of changes

decomp_k already had the necessary infrastructure to appear in CTSM history. It's an "inactive" variable and appears in the documentation's master list as K_*. It was dimensioned incorrectly in the code, so I corrected that in my first commit to this PR, thankfully without changing answers.

Next I will add pathfrac_decomp_cascade to history.

Specific notes

Contributors other than yourself, if any:
@wwieder @ekluzek

CTSM Issues Fixed (include github issue #):
#1455 #1825

Are answers expected to change (and if so in what way)?
No. Variable pathfrac_decomp_cascade will appear in history optionally.

Any User Interface Changes (namelist or namelist defaults changes)?
No. Later we may consider interface changes to add these two inactive variables to history when spinning up with Newton-Krylov.

Testing performed, if any:
The following test PASSES with the mods of the first commit to this PR:
ERP_P36x2_D_Ld3.f10_f10_mg37.I1850Clm50BgcCrop.cheyenne_gnu.clm-extra_outputs -c /glade/p/cgd/tss/ctsm_baselines/ctsm5.1.dev051

@@ -708,6 +727,7 @@ subroutine SetValues ( this, num_column, filter_column, value_column)
i = filter_column(fi)
this%decomp_cpools_transport_tendency_col(i,j,k) = value_column
this%decomp_cpools_sourcesink_col(i,j,k) = value_column
this%decomp_k_col(i,j,k) = value_column
Contributor Author

I moved decomp_k from the "transitions" loop to the "pools" loop as part of correcting this variable's 3rd dimension.
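To illustrate the dimensioning point, here is a NumPy stand-in (not the Fortran source; the pool and transition counts are illustrative of the CLM-BGC cascade): decomp_k is a per-pool quantity, while the cascade fractions are per-transition, so their third dimensions index different things and the two belong in different loops.

```python
import numpy as np

ncol, nlevdecomp = 4, 5
ndecomp_pools = 7                  # e.g. CWD + 3 litter + 3 SOM pools
ndecomp_cascade_transitions = 10   # donor -> receiver pathways

# Per-pool rate constants: third dimension indexes the pool,
# so decomp_k belongs in the "pools" loop.
decomp_k = np.zeros((ncol, nlevdecomp, ndecomp_pools))

# Per-transition fractions: third dimension indexes the transition,
# so these belong in the "transitions" loop.
pathfrac_decomp_cascade = np.zeros((ncol, nlevdecomp, ndecomp_cascade_transitions))
```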

integer, private :: i_s2s3
integer, private :: i_s3s1
integer, private :: i_cwdl2
integer, private :: i_cwdl3
Contributor Author

@slevis-lmwg slevis-lmwg Aug 10, 2021

I made these variables available to this entire module to accommodate my changes to pathfrac_decomp_cascade. Similarly in deprecated SoilBiogeochemDecompCascadeCNMod.F90.

Contributor

are the parallel changes needed in CNMod?

Contributor Author

I made the changes in CNMod to avoid errors when building the code. Otherwise no.

if (.not. use_fates) then
pathfrac_decomp_cascade(bounds%begc:bounds%endc,1:nlevdecomp,i_cwdl2) = cwd_fcel
pathfrac_decomp_cascade(bounds%begc:bounds%endc,1:nlevdecomp,i_cwdl3) = cwd_flig
end if
Contributor Author

@slevis-lmwg slevis-lmwg Aug 10, 2021

I now calculate pathfrac_decomp_cascade right after decomp_k in subroutine decomp_rate_constants_bgc (and in the deprecated subroutine decomp_rate_constants_cn), because otherwise the order of initializations did not permit adding pathfrac_decomp_cascade to history.

In particular, leaving pathfrac_decomp_cascade in subroutine init_decompcascade_bgc:

  1. as a soilbiogeochem_state_inst variable (i.e. unchanged), adding it to history resulted in messed-up bounds for cascade_donor_pool in SoilBiogeochemStateType.F90
  2. as a soilbiogeochem_carbonflux_inst variable, adding it to history resulted in messed-up bounds for pathfrac_decomp_cascade in subroutine init_decompcascade_bgc

I'm happy to explain in greater detail if requested.

Contributor

I'll take your word on this :)

@slevis-lmwg slevis-lmwg self-assigned this Aug 10, 2021
@slevis-lmwg
Contributor Author

slevis-lmwg commented Aug 11, 2021

Cheyenne test-suite OK


@@ -39,6 +39,18 @@ module SoilBiogeochemDecompCascadeCNMod
integer, private :: i_soil3 = -9 ! SOM third pool
integer, private :: i_soil4 = -9 ! SOM fourth pool

real(r8), private :: cwd_fcel
Contributor

we don't need this in CNMod, since the CN model is being deprecated.

Contributor Author

True, but I think the case would have stopped in the build phase, so I made the changes anyway.

@@ -110,7 +110,7 @@ subroutine SoilBiogeochemPotential (bounds, num_soilc, filter_soilc, &
initial_cn_ratio => decomp_cascade_con%initial_cn_ratio , & ! Input: [real(r8) (:) ] c:n ratio for initialization of pools

rf_decomp_cascade => soilbiogeochem_state_inst%rf_decomp_cascade_col , & ! Input: [real(r8) (:,:,:) ] respired fraction in decomposition step (frac)
pathfrac_decomp_cascade => soilbiogeochem_state_inst%pathfrac_decomp_cascade_col , & ! Input: [real(r8) (:,:,:) ] what fraction of C leaving a given pool passes through a given transition (frac)
pathfrac_decomp_cascade => soilbiogeochem_carbonflux_inst%pathfrac_decomp_cascade_col , & ! Input: [real(r8) (:,:,:) ] how much C from the donor pool passes through a given transition (frac)
Contributor

The description should still say "what fraction of C", as this is a fraction, not a flux (which "how much C" seems to imply).

Contributor Author

Changing to
what fraction of C passes from donor to receiver pool through a given transition
Does this seem ok?

@@ -34,7 +34,6 @@ module SoilBiogeochemStateType
real(r8) , pointer :: fpi_col (:) ! (no units) fraction of potential immobilization
real(r8), pointer :: fpg_col (:) ! (no units) fraction of potential gpp
real(r8) , pointer :: rf_decomp_cascade_col (:,:,:) ! (frac) respired fraction in decomposition step
real(r8) , pointer :: pathfrac_decomp_cascade_col (:,:,:) ! (frac) what fraction of C leaving a given pool passes through a given transition
Contributor

why is this being removed from this module?
Seeing this code it also makes me wonder if we need to:

  • write out fpi_col (N limitation of decomp), or if it's brought into decomp_k?
  • write out rf_decomp_cascade_col, or is this brought into pathfrac somewhere?

Contributor Author

pathfrac got moved from here to SoilBiogeochemCarbonFluxType for reasons outlined in this comment

FPI is on the master list as an inactive field.

rf_decomp_cascade is not. Let me know if I should add it.

Contributor

Let's go ahead and write out fpi and rf_decomp_cascade to history. I think we'll want these for the preconditioner.

doc/ChangeLog Outdated
---------------------------------------
Substantial timing or memory changes:
[For timing changes, you should at least check the PFS test in the test

Contributor

are these history fields going to be written out for each column or as grid cell averages? I'd assume the former, but wondered whether the N-K solver can handle column-level history files @klindsay28?

@ekluzek I also wondered how we can easily turn on this output for an N-K spinup? Should this feature come into this PR, or be handled more manually at this stage?

Contributor Author

Regarding the second question:
One can activate so-called "inactive" history fields using the namelist. In this testing phase I would rely on that. If after the testing we decide to go with Newton-Krylov, then we can create a corresponding compset. Would you respond differently, @ekluzek ?
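As a sketch, adding inactive fields through user_nl_clm looks like this (the field names are ones quoted elsewhere in this thread; exact names come from the documentation's master list):

```
hist_fincl1 += 'K_ACT_SOM', 'FPI_vr'
```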

Contributor Author

are these history fields going to be written out for each column or as grid cell averages? I'd assume the former, but wondered whether the N-K solver can handle column-level history files @klindsay28?

When added to the primary history file, history fields appear as grid cell average by default. With appropriate namelist selections, they can also be sent to history as column variables.
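As a namelist sketch (hist_fincl2 and hist_dov2xy are the relevant CLM history controls; the field name is illustrative): setting the second hist_dov2xy entry to .false. makes the second history file carry 1-D vector output instead of the default gridded averages.

```
hist_fincl2 = 'K_ACT_SOM'
hist_dov2xy = .true., .false.
```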

Contributor Author

@klindsay28, grid cell average variables are dimensioned by lat, lon, while column variables are written as a column vector. The latter complicates a bit the task of finding where you are on the planet, which I think is why @wwieder asked the question.


The solver will be reading and modifying restart files. Are these column based? (I don't have the CLM/CTSM diagram of grid cell/PFT/column handy.) If so, then the solver will be dealing with column level variables, and column level history output then seems natural. If there is history output needed for the preconditioner that is cell averages, then I think there will be an extra step of mapping the column index to the proper grid cell.
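If that extra mapping step is needed, it could be sketched as follows (a hedged NumPy stand-in; it assumes per-column longitude/latitude index arrays of the kind CTSM writes to column-level files, e.g. cols1d_ixy/cols1d_jxy — treat those names and the helper itself as illustrative, not the actual solver code):

```python
import numpy as np

def columns_to_grid(col_vals, ixy, jxy, nlon, nlat):
    """Scatter a 1-D column vector onto a (lat, lon) grid using 1-based
    per-column lon/lat indices. Where several columns share a grid cell,
    the last one written wins; a real preconditioner would instead
    combine columns with area weights."""
    grid = np.full((nlat, nlon), np.nan)
    grid[jxy - 1, ixy - 1] = col_vals
    return grid

# Three columns scattered onto a 3x4 grid
vals = np.array([1.0, 2.0, 3.0])
ixy = np.array([1, 2, 2])   # 1-based longitude index per column
jxy = np.array([1, 1, 2])   # 1-based latitude index per column
g = columns_to_grid(vals, ixy, jxy, nlon=4, nlat=3)
```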


@slevis-lmwg
Contributor Author

Izumi test-suite OK

slevis-lmwg added a commit to slevis-lmwg/ctsm that referenced this pull request Aug 16, 2021
There are more of these in store; however, I need to merge my work
from ESCOMP#1457 while ESCOMP#1457 is pending, in order to make progress here
@slevis-lmwg
Contributor Author

slevis-lmwg commented Aug 19, 2021

Running tests that compare against dev053:
PASS ERP_P36x2_D_Ld3.f10_f10_mg37.I1850Clm50BgcCrop.cheyenne_gnu.clm-extra_outputs
OK Cheyenne test-suite
OK Izumi test-suite
PASS ERP_P36x2_D_Ld3.f10_f10_mg37.I1850Clm50BgcCrop.cheyenne_gnu.clm-extra_outputs with user_nl_clm modified to include some of the new variables in history:

hist_fincl1   += 'FPI_vr', 'K_ACT_SOM', 'K_CEL_LIT', 'K_CWD',
                 'CWD_PATHFRAC_L2_vr', 'CWD_RESP_FRAC_L2_vr',
                 'L1_PATHFRAC_S1_vr', 'L1_RESP_FRAC_S1_vr',
                 'S1_PATHFRAC_S3_vr', 'S1_RESP_FRAC_S3_vr'

I will update this post as tests complete.

@wwieder
Contributor

wwieder commented Aug 19, 2021

Thanks for these changes @slevisconsulting. I just wanted to confirm: in this PR, are the additional variables still set to inactive by default, with the plan of manually turning them on as we're testing the N-K output?

@slevis-lmwg
Contributor Author

in this PR are the additional variables still set to inactive by default with the plan of manually turning them on as we're testing the N-K output?

@wwieder that is correct. The last test listed a few posts up will confirm that namelist variable fincl1 adds these variables to history. We also have the option of changing these variables to "active" if that's the preferred solution.

@slevis-lmwg
Contributor Author

slevis-lmwg commented Aug 20, 2021

Notes from today's mtg:

  • @wwieder will clone the branch from this PR and perform a single-point cold start for 20 years at a boreal forest location with AD mode off. He will then continue the run for another 20 years or, as @klindsay28 recommended, create a new case that continues for another 20 years.
  • Need column-level history of the necessary variables averaged over the full 20 years.
  • Need restart files from the end of each 20-yr cycle.
  • Before @wwieder starts the runs, @slevisconsulting will add a user_nl_clm to the new directory that Sam will create here: cime_config/usermods_dirs. Sam's file will include the necessary fincl and other hist namelist variables.

@ekluzek
Collaborator

ekluzek commented Aug 20, 2021

@slevisconsulting just add a subdirectory to cime_config/usermods_dirs/ with a name something like newton_krylov_spinup. You can add the user_nl_clm there, as well as a shell_commands file if you need to do any xml settings (you should set it to a cold start, so you'll want that).

@@ -0,0 +1,15 @@
hist_dov2xy = .true.,.false.
hist_nhtfrq = 0,-175200
Contributor Author

@wwieder the above two lines set the auxiliary history file to 20-yr average vector output.

nhtfrq is in hours: 24 hrs/day * 365 days/yr * 20 yrs = 175200

Two lines down is the list of variables, and I think I got all of them.
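The arithmetic above can be double-checked trivially (a sketch; CLM interprets a negative hist_nhtfrq value as an averaging period in hours):

```python
# hours in 20 non-leap years = magnitude of the hist_nhtfrq value above
hours = 24 * 365 * 20
print(hours)  # 175200
```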

@wwieder
Contributor

wwieder commented Aug 23, 2021

Thanks @slevisconsulting. Dumb question here, but what do I need to do when creating a case to get these user_mods to be active? Or will they just be included by default?

@ekluzek
Collaborator

ekluzek commented Aug 23, 2021

@wwieder this is done similarly to the user-mods used for NEON. It's an option to create_newcase; use ./create_newcase --help to remind yourself of the options. Here it would be the "--user-mods-dirs newton_krylov_spinup" option.

@wwieder
Contributor

wwieder commented Aug 23, 2021

OK, questions here are moving outside of this issue / PR, but what's the most effective way to create a case for a random point and populate the datm.streams file correctly?

you can see my initial attempts here /glade/work/wwieder/ctsm/ctsm5.1_N-K_test/cime/scripts/RandomBoreal_0

  <file>/glade/scratch/wwieder/single_point/datmdata/clmforc.GSWP3.c2011.0.5x0.5.Prec.RandomBoreal.1901-01.nc</file>

@slevis-lmwg
Contributor Author

what's the most effective way to create a case for a random point and populate the datm.streams file correctly?

you can see my initial attempts here /glade/work/wwieder/ctsm/ctsm5.1_N-K_test/cime/scripts/RandomBoreal_0

  <file>/glade/scratch/wwieder/single_point/datmdata/clmforc.GSWP3.c2011.0.5x0.5.Prec.RandomBoreal.1901-01.nc</file>

I didn't look at your case directory, but the quickest/simplest approach that I think should work is to point to the existing global atm files rather than generate new ones for this boreal point. The model will run slower due to interpolation calculations, but you won't have to generate a whole new atmosphere dataset. But @ekluzek may have another suggestion.

@wwieder
Contributor

wwieder commented Aug 24, 2021

Sorry to pester here, @ekluzek, but can you suggest how to set up a generic single point (58N, 55E) with less headache than I had here: /glade/work/wwieder/ctsm/ctsm5.1_N-K_test/cime/scripts/RandomBoreal_0

@ekluzek
Collaborator

ekluzek commented Aug 25, 2021

Hey @wwieder, I looked at your case and also created a case to model from. Two issues in your case: first, you need to set CLM_USRDAT_NAME to something that refers to the site you are working on; using your nomenclature, I used "RANDOMBOREAL1". Second, the compset you are using is configured to get tower-site data (that's what the 1Pt in the name refers to), so you want a regular compset name so that you get the global forcing data. You also set the site location with the PTS_LAT and PTS_LON xml variables.

So my example case is here:

/glade/work/erik/ctsm_worktrees/branch1/cime/scripts/cases/RandomBoreal_0

With the create_newcase command:

./create_newcase --case cases/RandomBoreal_0 --compset I2000Clm51Bgc --res CLM_USRDAT --driver nuopc --user-mods-dirs newton_krylov_spinup --mpilib mpi-serial --run-unsupported
(note I used the mpi-serial library for MPI because it's just a single point case)

I did these xml commands (added them to shell_commands)

./xmlchange CLM_USRDAT_NAME="BOREAL1"
./xmlchange PTS_LON=55.
./xmlchange PTS_LAT=57.958115183246
./xmlchange CASESTR="Random Boreal"

NOTE: CASESTR is not required, but makes sure a few things are set with something besides UNSET

Added this to user_nl_clm

fsurdat = '/glade/scratch/wwieder/single_point/surfdata_hist_16pfts_Irrig_CMIP6_simyr2000_RandomBoreal_c210823.nc'

and this to user_nl_mosart

frivinp_rtm = '/dev/null'

(This is required with the latest updates in externals when MOSART_MODE=null, i.e. when the compset has MOSART rather than SROF. It's a little glitch that we should fix in CTSM so you don't have to do this; there's a MOSART issue about it.)

@wwieder
Contributor

wwieder commented Aug 25, 2021

Thanks Erik. After submitting, now I'm getting this: ERROR: model_meshfile UNSET does not exist

Do you have any suggestions here?
my case is here /glade/work/wwieder/ctsm/ctsm5.1_N-K_test/cime/scripts/RandomBoreal_0

looks like this is true in your case too /glade/work/erik/ctsm_worktrees/branch1/cime/scripts/cases/RandomBoreal_0/CaseDocs/datm_in

@ekluzek
Collaborator

ekluzek commented Aug 25, 2021

Hi @wwieder. Hmmm. Actually there's a simple step that I left out, that you need to do.

My case did run even with that being UNSET. I think that's actually normal. Although it probably means we should clean this up and give it some value like "UNUSED" to make it more clear that it's not needed.

From comparing my case to yours, the differences I see are:

PTS_LON, PTS_LAT, and CLM_USRDAT_NAME aren't set in your case; those are the critical things to make sure are set correctly. I see that you added these settings to your ./shell_commands, but they weren't invoked by the case, which means you need to run ./shell_commands manually. That's an important step that I neglected to mention. The case does run shell_commands itself, but only before or with case.setup. So, manually run:

./shell_commands

@wwieder
Contributor

wwieder commented Aug 25, 2021

Thanks @ekluzek that's what I needed to do! We'll have to work on some new documentation for this somewhere. I'll discuss with @danicalombardozzi where this should go.

@wwieder
Contributor

wwieder commented Aug 25, 2021

Last question: how can I share the build from the first case with another nearly identical case?

@slevis-lmwg
Contributor Author

can I share the build from the first case with another nearly identical case?

I haven't used this in a long while, but try:
./create_clone --case new_case_name --clone existing_case_name

@danicalombardozzi
Contributor

Thanks @ekluzek that's what I needed to do! We'll have to work on some new documentation for this somewhere. I'll discuss with @danicalombardozzi where this should go.

Happy to discuss -- just let me know when!

@ekluzek
Collaborator

ekluzek commented Aug 25, 2021

@wwieder the important argument to create_clone is "--keepexe", which will share the built case executable with the new case. There are some notes on it in the help:

./create_clone --help

@slevis-lmwg
Contributor Author

@slevisconsulting was your test for a 10x10 grid? Can you make this run correctly with a single point?

@wwieder the first one was 10x10, but I just tried a single-point successfully:
/glade/u/home/slevis/cases_mimics/newton_krylov_1x1_brazil

@ekluzek
Collaborator

ekluzek commented Nov 29, 2021

Ahh, @wwieder, you don't need a mesh file for the main model in a single-point case. But you do need a mesh file in the datm streams file if you are pointing to datm forcing data. This looks like the problem: the mesh file needs to be set in <stream_info> for your single-point forcing data.

I suggest first making a case where you use the global forcing data and make sure that works. In the meantime I'll get back to you on creating a mesh file for your single point forcing data.

@wwieder
Contributor

wwieder commented Nov 29, 2021

The shell_commands that were created with this case didn't have all the needed information. I copied this from a different case and things seem to be running now.

@wwieder
Contributor

wwieder commented Nov 30, 2021

@klindsay28 you should be able to clone this case for the next step in your testing.
/glade/work/wwieder/ctsm/ctsm5.1_N-K_test/cime/scripts/RandomBoreal_1c

@slevisconsulting and @ekluzek thanks for helping work this out.

@wwieder
Contributor

wwieder commented Nov 30, 2021

@ekluzek I should have been more clear. This is a single point case, but it's just reading in the global GSWP3 datm files.
I forgot that I needed to manually add some details to shell_commands in the new case I created (which is noted above in the thread).

This is something we can work to better document in the updated nuopc workflow for running generic single point cases with global input data.

@slevis-lmwg
Contributor Author

Linking @ekluzek's comment ("You need to add a test for the new newton_krylov testmod.") to this PR so as to not forget it.

Correct perched water table calculation

Modifies the calculation of the frost table and perched water table
layers.

This brings in the answer-changing aspects of the Hillslope Hydrology
branch (ESCOMP#1715).

Specific changes are:

* PerchedWaterTable

1) Frost table depth

a) original frost table determination looped from the top
of the soil column downward to the index of the first layer
whose temperature was <= freezing, and whose neighbor above
had a temperature above freezing.  The frost table depth is
then given by the node of that soil layer, i.e. z(k_frz).

b) in the new method, the same index is found, but the
depth of the frost table is given by the depth of the top
of the frozen layer, i.e. zi(k_frz-1).  Note zi(k_frz) would
be the bottom of layer k_frz.

2) Perched water table depth

a) in the original formulation, a loop from k_frz to layer 1
was used to identify the deepest layer in 1:k_frz whose
volumetric soil moisture was greater than a threshold given
by sat_lev (e.g. sat_lev = 0.9).

b) in the new method, the search is only done if k_frz is
greater than 1.  The rationale is that if k_frz = 1, then
zwt_perched has already been initialized to the frost_table
depth (which is equal to the top of the uppermost soil layer),
and therefore no search is required.

3) Determining perched water table depth within layer
identified by index k_perch in 2)

a) in the original formulation, the perched water table
depth was calculated by linearly interpolating between layers
k_perch and k_perch+1, with no consideration of their relative
values.  In the case where the deeper layer was drier than the
layer above it, this could result in values far outside the
soil layer.

b) in the new formulation, if the deeper layer is drier than the
layer above it (s1 > s2), then the perched water table depth is
simply given by the depth of the upper surface of layer k_perch,
i.e. zi(c,k_perch-1).

* PerchedLateralFlow

1) Removal of icefrac calculation

a) in the original calculation, the frozen layer was included,
so an ice impedance factor was calculated using icefrac.  If
only unfrozen layers are used, no ice impedance factor is needed.

b) in the new formulation, the icefrac variable and loop are
removed.

2) Move loop calculating frost and perched water table depths

a) in the original formulation, the frost and perched water
table depths were calculated in the same loop as the calculation
of the lateral flow from the perched saturated zone.

b) in preparation for the hillslope hydrology branch, this
calculation is moved into its own loop.

3) q_perch calculation

a) in the original formulation, q_perch was calculated by summing
over layers k_perch to k_frost.  However, because k_frost is now
identified as the frozen layer, and its depth the top of the frozen
layer, it should not be included in the calculation; only the
unfrozen layers above it should be included.

b) in the new formulation, the loop is bounded by k_frost-1 instead
of k_frost.

4) Removal of water from perched saturated zone

a) in the original method, the frozen layer (k_frost)
was included in the loop.  Also, the drainage was defined to be
negative, which was confusing.

b) in the new method, the frozen layer is not included in the
loop; water is only removed from the unfrozen layers above k_frost.
Calculate drainage as positive values, which are then subtracted
from the soil moisture in each layer.
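The revised frost-table and perched-water-table logic described above can be sketched in Python (a hedged stand-in for the Fortran, not the CTSM source: layers are 1-based via a dummy element at index 0, s[k] is layer k's saturation ratio, z[k] its node depth, zi[k] its bottom-interface depth with zi[0] the surface; the search range and the >= guard on the interpolation are simplifications):

```python
TFRZ = 273.15  # freezing point (K)

def perched_water_table(t, s, z, zi, sat_lev=0.9):
    """Return (frost_table, zwt_perched), or (None, None) if no layer is frozen."""
    n = len(t) - 1
    k_frz = None
    for k in range(1, n + 1):  # top-down: first frozen layer under an unfrozen one
        if t[k] <= TFRZ and (k == 1 or t[k - 1] > TFRZ):
            k_frz = k
            break
    if k_frz is None:
        return None, None
    frost_table = zi[k_frz - 1]        # new method: top of the frozen layer
    zwt_perched = frost_table          # already the answer when k_frz == 1
    if k_frz > 1:                      # new method: search only when needed
        k_perch = None
        for k in range(k_frz - 1, 0, -1):  # deepest unfrozen layer wetter than sat_lev
            if s[k] > sat_lev:
                k_perch = k
                break
        if k_perch is not None:
            s1, s2 = s[k_perch], s[k_perch + 1]
            if s1 >= s2:               # deeper layer drier: clamp to the layer's top
                zwt_perched = zi[k_perch - 1]
            else:                      # otherwise interpolate to where s == sat_lev
                zwt_perched = z[k_perch] + (sat_lev - s1) / (s2 - s1) \
                    * (z[k_perch + 1] - z[k_perch])
    return frost_table, zwt_perched
```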

Resolved conflicts:
doc/ChangeLog
doc/ChangeSum
@slevis-lmwg
Contributor Author

I'm running a copy of this case
RandomBoreal_1c
to generate output for Wednesday's meeting with @klindsay28

For the same purpose I am also running a global 10x15 case.

Reuse some files generated in initialization when rerunning

A new directory is now in your run directory init_generated_files. These
two run time generated files - finidat_interp_dest.nc and the land
fraction file ctsm_landfrac.nc (which is new in this PR) - are put in
that directory when they are created. Then, if they already exist, they
are reused rather than being regenerated. (Technically, it looks for a
status file that flags successful creation of this file.) This applies
when you rerun a startup (non-restart) run a second or subsequent time,
as is common when doing development and testing.

For tests, the init_generated_files directory is removed before running,
so that rerunning a test will do the same thing the second time.

There are occasional, rare cases when a user would need to manually
remove the init_generated_files directory – e.g., when changing things
about N dominant landunits or PFTs.

Also, output was cleaned up in a few cases so that it did not span
multiple lines.

Resolved conflicts:
doc/ChangeLog
doc/ChangeSum
@ekluzek
Collaborator

ekluzek commented Jan 20, 2024

@slevis-lmwg is this something that was close enough that it could come in as it is? Or is there more work to do on it? We kind of abandoned it a while back, so if it takes too long to get spun up on it again, don't worry about it. But if it's close enough that it could come in as is, it would be worth bringing it in.

@slevis-lmwg
Contributor Author

@wwieder also has a say here.
My thinking is to keep this unmerged as a draft until we feel confident that the Newton-Krylov approach will work for us.

@wwieder
Contributor

wwieder commented Jan 22, 2024

I think this suggestion makes sense, @slevis-lmwg. Hopefully it's not hard to bring this draft up to main if the N-K approach is effectively implemented.

@ekluzek
Collaborator

ekluzek commented Jan 22, 2024

Sounds good. We'll leave it as it is...

@samsrabin samsrabin added simple bfb bit-for-bit and removed simple bfb labels Aug 8, 2024
@samsrabin samsrabin added good first issue simple; good for first-time contributors and removed simple labels Oct 3, 2024
Labels
bfb bit-for-bit good first issue simple; good for first-time contributors
Projects
None yet
Development

Successfully merging this pull request may close these issues.

6 participants