
Add a bunch of C-grid infrastructure #1

Merged: 17 commits from cgridA into cgridDEV, Nov 12, 2021

Conversation

apcraig
Owner

@apcraig apcraig commented Nov 4, 2021

PR checklist

  • Short (1 sentence) summary of your PR:
    Add C-grid infrastructure
  • Developer(s):
    apcraig
  • Suggest PR reviewers from list in the column to the right.
  • Please copy the PR test results link or provide a summary of testing completed below.
    TBD
  • How much do the PR code changes differ from the unmodified code?
    • bit for bit
    • different at roundoff level
    • more substantial
  • Does this PR create or have dependencies on Icepack or any other models?
    • Yes
    • No
  • Does this PR add any new test cases?
    • Yes
    • No
  • Is the documentation being updated? ("Documentation" includes information on the wiki or in the .rst files from doc/source/, which are used to create the online technical docs at https://readthedocs.org/projects/cice-consortium-cice/. A test build of the technical docs will be performed as part of the PR testing.)
    • Yes
    • No, does the documentation need to be updated at a later time?
      • Yes
      • No
  • Please provide any additional information or relevant details below:

This is a first proposal of C-grid infrastructure changes.

So far, this adds

  • N and E grids, lengths, areas, masks
  • Adds new grid variables to history file
  • Adds grid_average_X2Y methods for averaging between all grids (e.g. t2u); see the sketch after this list
  • Adds a gridavgchk unit tester
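
For reference, a minimal sketch of how the new averager is intended to be used, with the halo update now done by the caller (the grid_average_X2Y argument list here is a guess and may not match the final interface):

      use ice_boundary, only: ice_HaloUpdate
      use ice_constants, only: field_loc_center, field_type_scalar
      use ice_domain, only: halo_info
      use ice_grid, only: grid_average_X2Y

      ! the caller is responsible for the halo update before averaging
      call ice_HaloUpdate(work_t, halo_info, field_loc_center, field_type_scalar)
      ! hypothetical call: 'S' (or 'F') transform, mapping a T-grid field to the U grid
      call grid_average_X2Y('S', 't2u', work_t, work_u)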

Still to be done

  • Complete and validate X2Y methods fully
  • Add grid_system namelist
  • Add BC masks
  • Documentation still needs to be updated. (See https://apcraig-cice.readthedocs.io/en/cgrida/index.html)
  • Testing still needs to be done. Seems to be mostly bit-for-bit so far, trying to maintain that. PGI on cheyenne seems to change answers (as usual).

…ta on halos and avoid halo updates to improve halo values when neighbor cells are land block eliminated
Collaborator

@eclare108213 eclare108213 left a comment

Lots of progress already!

@@ -448,6 +449,10 @@ subroutine init_grid2

!-----------------------------------------------------------------
! T-grid cell and U-grid cell quantities
! Fill halo data locally where possible to avoid missing
Collaborator

Is this change BFB when blocks are not eliminated?

Owner Author

I've run the quick suite on cheyenne pgi, intel, and gnu and it is BFB so far. I was worried about this too. By avoiding the halo update for these variables, we can do a little more of the averaging to/from grids for blocks where the neighbor is eliminated. I need to test the full suite and will do that tonight.

tstr3Dz = 'TLON TLAT VGRDi time',& ! vcoord for T cell quantities, 3D
ustr3Dz = 'ULON ULAT VGRDi time',& ! vcoord for U cell quantities, 3D
nstr3Dz = 'NLON NLAT VGRDi time',& ! vcoord for N cell quantities, 3D
Collaborator

I don't think we will need some of these options on the N and E grids, but it's okay to have them available for future use. The B- vs C-grid choice should only impact the dynamics variables, which do not have vertical, ITD (ncat) or FSD dimensions.

Owner Author

These 4D options already exist for ULON and ULAT, which is why I added them for the N and E grids. It could be that we can get rid of some of them at some point.

Collaborator

Let's add a "clean up history" checkbox in the project board, to remind us to look back at this. I'd suggest commenting out whatever is not immediately needed.

Owner Author

I've added a tick box to the project page.

@@ -452,7 +452,9 @@ block equally. This is useful in POP which always has work in
each block and is written with a lot of
array syntax requiring calculations over entire blocks (whether or not
land is present). This option is provided in CICE as well for
direct-communication compatibility with POP. The ‘latitude’ option
direct-communication compatibility with POP. Block that contain 100%
land grid cells are eliminated with 'block'. The 'blockall' option is identical
Collaborator

I don't understand the sentence that begins with 'Block'.

Owner Author

I fixed it, it should read "Blocks that contain 100% land grid cells are eliminated with 'block'. The 'blockall' option is identical to 'block' except it does not do land block elimination." Let me know if that's still not clear.
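
For context, a hedged ice_in sketch of how these distribution options are selected (assuming 'block'/'blockall' are values of the existing distribution_wght variable in domain_nml):

    &domain_nml
      distribution_type = 'cartesian'
      distribution_wght = 'blockall'   ! same as 'block' but without land block elimination
    /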

Collaborator

@dabail10 dabail10 left a comment

This somehow looks familiar. Looking good.

@apcraig
Owner Author

apcraig commented Nov 5, 2021

Documentation can be viewed here, https://apcraig-cice.readthedocs.io/en/cgrida/index.html

shift to only support S and F X2Y transforms for clarity
add uveln, vvele if grid_system=CD.  proposed placeholder for extra velocity variables (but this could change)
extend io to start to support different grids for history variables depending on grid_system
@apcraig
Owner Author

apcraig commented Nov 8, 2021

I think this could be merged. Test results are here: https://github.com/CICE-Consortium/Test-Results/wiki/cice_by_hash_forks#4cd4039ab1d7a25b6a5af991edf1b67eb011a5dc. Everything looks good. Some of the pgi results are not bit-for-bit, but that's typical for pgi. gnu and intel are both bit-for-bit for all tests. The only thing missing is the boundary condition masks; that could be done as a separate PR. Lots of things here may need to be revised as we implement the C/CD grid, including:

  • more grid variables for particular length scales that should be precomputed
  • update the X2Y grid mapping to support particular weighting or masks needed
  • remove/refactor the new "extra" uveln and vvele arrays that are needed for the CD grid
  • update history variables whose underlying grids change in the CD solver

@apcraig apcraig marked this pull request as ready for review November 8, 2021 18:41
Collaborator

@eclare108213 eclare108213 left a comment

Just comments for now. I'm looking at this pretty closely because these changes are fundamental to the new capability, and I want to make sure that we're all on the same page re the assumptions made. This looks great overall -- following responses to my questions/comments, we can probably merge it with some project-board reminders to check/clean up a few things later.

"ice velocity (y)", &
"positive is y direction on U grid", c1, c0, &
"positive is y direction on v grid", c1, c0, &
Collaborator

There's not a "v" grid, is there? I think this line and the uvel one above should remain as "on U grid".

Collaborator

Okay, now I see that you're reusing uvel, vvel for the C-grid. So maybe to be clearer, there could be a conditional statement that sets the comment depending on which grid is in use, e.g. keep original for B-grid but for C-grid something like "positive is X direction at East face of grid cell"?
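
A rough sketch of the kind of conditional being suggested, reusing the define_hist_field pattern shown in the snippets here (the exact wording and variable names are illustrative only):

      character(len=80) :: vcomment

      if (grid_system == 'B') then
         vcomment = "positive is x direction on U grid"
      else                          ! C/CD: uvel lives on the East face
         vcomment = "positive is x direction at East face of grid cell"
      endif
      call define_hist_field(n_uvel, "uvel", "m/s", ustr2D, ucstr, &
           "ice velocity (x)", trim(vcomment), c1, c0, &
           ns1, f_uvel)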

Owner Author

Correct. The U grid is now defined at the corner point. The u and v grid could be the U, N, or E grid. We do need to work on making this clearer in history output, but we do write the history coordinates as part of the output as well, so that helps.

Collaborator

I'd like this one fixed for this PR, please. It's confusing (to me) otherwise. BTW did you spot check the history output from a run to make sure it's working as expected, including existing variables like uvel?

Owner Author

OK, I'll refactor it a bit. I have checked that I can vary the underlying coordinate system for history fields based on grid_system. One other thing: these comments that are created for variables are not actually written to the history file right now. I'm not sure why; they certainly could be. But it's not implemented in the netcdf or pio IO versions.

Collaborator

Weird that they aren't written -- they definitely should be! Another thing to follow up on...

Owner Author

I have updated the history writing and comments are now being written.
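
For reference, in the netCDF layer the per-variable comment comes down to a single attribute call of roughly this form (a sketch; the vcomment component and the error handling are assumptions about the history code, not a quote from it):

      use netcdf

      status = nf90_put_att(ncid, varid, 'comment', trim(avail_hist_fields(n)%vcomment))
      if (status /= nf90_noerr) call abort_ice('error writing history comment attribute')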

"ice velocity (y)", &
"positive is y direction on E grid", c1, c0, &
ns1, f_vvele)

call define_hist_field(n_uatm,"uatm","m/s",ustr2D, ucstr, &
"atm velocity (x)", &
"positive is x direction on U grid", c1, c0, &
Collaborator

Is this comment correct, that uatm, vatm are on the U grid? They might actually be on the T grid here -- need to check (and if wrong, this could be fixed in a subsequent history PR).

Owner Author

This is one of the things we said we needed to review.

@@ -162,6 +163,10 @@ subroutine init_dyn (dt)
! velocity
uvel(i,j,iblk) = c0 ! m/s
vvel(i,j,iblk) = c0 ! m/s
if (grid_system == 'CD') then ! extra velocity variables
Collaborator

Just noting here that you're reusing uvel, vvel on E, N faces for the C-grid. That's not clear in the history output.

Owner Author

Correct. I figured that was the easiest thing to do for now. And I added uveln and vvele as the "extra" arrays for the CD version. This may not be what we want, but it's a starting point.
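
A sketch of what that CD branch presumably contains, using the uveln/vvele names mentioned above (the actual initialization may differ):

      if (grid_system == 'CD') then ! extra velocity variables
         uveln(i,j,iblk) = c0       ! m/s, N-face velocity component
         vvele(i,j,iblk) = c0       ! m/s, E-face velocity component
      endif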

@@ -59,19 +65,29 @@ module ice_grid
dyt , & ! height of T-cell through the middle (m)
dxu , & ! width of U-cell through the middle (m)
dyu , & ! height of U-cell through the middle (m)
dxn , & ! width of N-cell through the middle (m)
dyn , & ! height of N-cell through the middle (m)
dxe , & ! width of E-cell through the middle (m)
Collaborator

Note that dxe (i,j) = HUS (i,j). If I remember correctly, HUS etc are available in the grid files (is that right)? If so, are these calculations consistent with those?

Owner Author

@apcraig apcraig Nov 8, 2021

CICE only reads HTN and HTE. All other grid lengths are derived from those. The current extension to the N and E grids is done the "same way" the U dx,dy values are computed. That method is far from ideal (it does not conserve area locally), but it's what we've had for a long time. I think it's worthwhile looking into whether we can improve the solution at some point down the road.

Collaborator

I'll admit I'm being a little picky here, but here's what's available from the grid file:

[eclare@ba-fe1 gx3]$ ncdump -h grid_gx3.nc
netcdf grid_gx3 {
dimensions:
        y = 116 ;
        x = 100 ;
variables:
        double ulat(y, x) ;
                ulat:units = "rad" ;
        double ulon(y, x) ;
                ulon:units = "rad" ;
        double htn(y, x) ;
                htn:units = "cm" ;
        double hte(y, x) ;
                hte:units = "cm" ;
        double hus(y, x) ;
                hus:units = "cm" ;
        double huw(y, x) ;
                huw:units = "cm" ;
        double angle(y, x) ;
                angle:units = "rad" ;
It would be prudent to read in hus and huw, and then set dxe = hus and dyn = huw, at least for now. How different are the resulting grid lengths from using the averaging method? Hopefully not very much... but does that help the area conservation issue? Using existing grid file variables would save some pretty substantial chunks of code in ice_grid.F90 for calculating dxe and dyn.

Owner Author

We're opening a huge can of worms here. Using HUS and HUW will almost certainly improve the accuracy of the grid and local area conservation. But, if we read them, then dxt and dyt should also probably change. Yes, dxe=HUS. But then dxt should be 0.5*(HUSij + HUSi-1j) and that will change all answers, make CICE inconsistent with POP (and impact POP/CICE coupling conservation?), and possibly break some usage if users have grid files that don't include HUS and HUW. I'm willing to go there, but we should also carefully examine the solutions that are produced. We could run some qc as a starting point. Do we want to do this now or is that a separate task/issue to tackle separately, either before or after CD grid? And why weren't dxt and dyt based on HUS and HUW in prior implementations?

Collaborator

Okay. We are identifying lots of issues to look into later. So POP doesn't use HUS and HUW either? This goes WAY back, farther than my memory serves, but for the infrastructure stuff I copied POP as closely as possible, and there was an effort back then to reduce memory usage wherever we could, so maybe we chose to not read them in. dxt and dyt are calculated from HTN and HTE (e.g. dxt=0.5*(HTNij+HTNij-1)). dxu and dyu would be averaged similarly from HUS and HUW. I'm fine with continuing to do the analogous calculations for now, recognizing that there are some sanity checks we could do using the originally calculated grid lengths from the files.
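
For concreteness, the derivation pattern under discussion looks roughly like this (a simplified sketch; the real loops in ice_grid.F90 work on global scratch arrays and convert HTN/HTE from cm to m):

      do j = jlo, jhi
      do i = ilo, ihi
         ! T-cell lengths averaged from the cell-edge lengths read from the grid file
         dxt(i,j,iblk) = p5*(HTN(i,j) + HTN(i,j-1))
         dyt(i,j,iblk) = p5*(HTE(i,j) + HTE(i-1,j))
         ! in this PR the N- and E-cell lengths (dxn, dyn, dxe, dye) are derived
         ! analogously from HTN/HTE rather than read from HUS/HUW
      enddo
      enddo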

Owner Author

I looked at POP. It computes the dx, dy just like CICE does currently. But it also reads HUS and HUW, stores them internally, and uses them for a few computations not directly related to grid lengths. Very odd. I propose we add a namelist variable that allows us to switch back and forth between "old" and "new" grid computations.
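
Something along these lines, for example (entirely hypothetical: the variable name and values below are placeholders for discussion, not part of this PR):

    &grid_nml
      grid_len_calc = 'average'   ! 'old' behavior: derive lengths from HTN/HTE as above
   !  grid_len_calc = 'file'      ! 'new' behavior: read HUS/HUW from the grid file
    /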

@@ -1819,11 +1949,90 @@ subroutine Tlatlon

! TLAT in radians North
TLAT(i,j,iblk) = asin(tz)

! these two loops should be merged to save cos/sin calculations,
Collaborator

If this routine isn't called during the timestepping, then this bit of savings won't matter much. I don't remember why it's like this, but I do remember that this subroutine is super finicky. Trig calculations are required based on the center of the Earth for accuracy. I'm not sure that simple averaging is good enough - need to look at it more closely. Might be good enough for now, though.

Owner Author

I agree and think you're right. Just frustrating that the trig calcs are not bit-for-bit if called in different orders. I don't understand that one!

!
! author: Elizabeth C. Hunke, LANL

subroutine u2tgrid_vector (work)
Collaborator

The "_vector" suffix would indicate to me that this (old) routine would calculate vector averages, which it does not. I'm guessing that it was called this because vector quantities are on the U grid (for the B-grid case), which needs the extra halo update before shifting. (I looked at the 'blame' function and this one is on me... sometime between 8 and 15 years ago). This PR replaces this subroutine with "grid_average*", which is more appropriately named and should reproduce the current functionality on B grids. As part of the C-grid development and testing, it would be good to check whether/how much doing proper vector averages would make a difference in the solution. I suspect a lot, since the difference between B- and C-/CD-grids is all about the vector quantities.

Owner Author

The only difference between the to_u and t2ugrid_vector methods was a halo update. So now the halo update is required to be done outside the averager, and the 4 old averagers are all gone but reimplemented in a bfb way in grid_average_X2Y (we can change the name if folks don't like that one).

Collaborator

The grid_average_X2Y name is fine -- it's better than the previous subroutine name. My main point here is that I don't think we're doing vector averages correctly in old or new code.

@@ -239,6 +239,9 @@ grid_nml
"``grid_file``", "string", "name of grid file to be read", "'unknown_grid_file'"
"``grid_format``", "``bin``", "read direct access grid and kmt files", "``bin``"
"", "``nc``", "read grid and kmt files", ""
"``grid_system``", "``B``", "use B grid structure with T at center and U at NE corner, "``B``"
"", "``C``", "use C grid structure with T at center, U at E edge, and V at N edge", ""
Collaborator

We might need to be more specific in how we reference the grids. In my current understanding, we will only implement the sea ice equations for the CD grid, but we will allow ocean coupling directly with either a C or CD grid. Maybe we should use CD everywhere, since it's our main grid, and only mention C-grid in reference to coupling?

Owner Author

I think we need to completely separate the solver and the coupling. The solver may have B, C, CD options, whatever. The coupling will have its own set of options; maybe fields come in on the T grid, U grid, N grid, or E grid, and that could be quite a mix and match for different coupled systems, different forcing files, etc. It could even be that fields come in on a grid that is somehow not associated with the B, CD, etc. grids. We want a lot of flexibility for the coupling fields.

Collaborator

Yes, agreed. @JFLemieux73 can correct me on this, but I'm not expecting to have a purely C-grid solver.

| mask (real) | hm | uvm | npm | epm |
+----------------+-------+-------+-------+-------+


In CESM, the sea ice model may exchange coupling fluxes using a
Collaborator

Will we use this namelist variable 'gridcpl_file' for distinguishing between coupling on the C- and CD-grids?

Owner Author

gridcpl_file is something totally different. We have a new namelist, grid_system that can take on values of B, C, CD, etc as we move forward.
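
In ice_in terms, based on the documentation diff above (a sketch; 'B' is the default in this PR, with 'C'/'CD' intended for the new solver):

    &grid_nml
      grid_format = 'bin'
      grid_system = 'B'    ! 'B': T at center, U at NE corner; 'C': U at E edge, V at N edge
    /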

Owner Author

I answered too quickly. gridcpl_file allows a model to couple on a completely different grid. In RASM we use this to couple on an extended version of the internal grid, and we set the ice fraction to 0 at the extended grid points. This helps the regional coupled model significantly. The location of the forcing/coupling fields is a totally different issue. We'll need to create namelist options to handle those choices. None of that is done yet; we still need to make some decisions about how we want to do that, and it's not critical for the initial implementation, but it does need to be done at some point (it could even be after we have the CD implementation on the trunk). For now, coupling/forcing is always assumed to be on the T grid. Extending that to support other grids for subsets of forcing/coupling fields should not be difficult.

@dabail10
Collaborator

dabail10 commented Nov 8, 2021

This is great. I am in an all day meeting M-W this week, but I will try to take a look. I was also going to work on some history stuff this week. Looks like you've got a lot done already.

@dabail10
Collaborator

dabail10 commented Nov 8, 2021

Most groups are moving away from POP. So, I don't think we should necessarily retain the POP-style infrastructure. Just my $0.02.

@apcraig
Owner Author

apcraig commented Nov 8, 2021

I largely agree @dabail10. But for those that still couple with POP, we probably need to keep the backwards compatibility. RASM is certainly in this situation and likely to stay there for at least a little while longer. RASM updates CICE regularly, and continues to use older-ish versions of POP. It could be 5-10 years before all users of those applications (including probably recent versions of CESM) have fully moved on. It could also be that we can upgrade the CICE grid calcs without any problem for POP coupling, but that's something the Consortium may not be able to determine universally very easily.

@dabail10
Collaborator

dabail10 commented Nov 8, 2021

The coupler checks TLAT, TLON and the mask. So, as long as these are consistent I believe we can do whatever we want.

@apcraig
Owner Author

apcraig commented Nov 8, 2021

Could be, but answers will change in CICE: you'd end up with the same TLON, TLAT, and tmask in POP and CICE, while tarea would not be the same in both models. For coupling systems that do direct coupling of fluxes with no area corrections, this will lead to a problem, as the system will no longer conserve between CICE and POP. Maybe there aren't any/many of those applications anymore, but again, we don't really know.

@dabail10
Collaborator

dabail10 commented Nov 9, 2021

I guess I was thinking TAREA would be the same as well. Nonetheless, your point is taken. I do like the idea of adding a flag to maintain backward compatibility.

@eclare108213
Collaborator

Most groups are moving away from POP. So, I don't think we should necessarily retain the POP-style infrastructure. Just my $0.02.

I totally agree. POP frequently comes up in answer to questions like "why is so-and-so that way?" We are definitely no longer tied to POP, but that said, there were really good reasons why things were done the way they were in POP. I'm not sure any ocean model is as computationally efficient now as POP for what it could (and still does) do. But ocean modeling has gotten a lot more complex (and sea ice modeling too).

@phil-blain
Collaborator

I forgot earlier that tomorrow is a holiday; I won't have time to take a look at this before Friday.

@apcraig
Owner Author

apcraig commented Nov 12, 2021

I'll merge now. @phil-blain, feel free to review and provide feedback, I'm sure we'll be creating several PRs over the next week or two. Thanks.

@apcraig apcraig merged commit c6258dc into cgridDEV Nov 12, 2021
Collaborator

@phil-blain phil-blain left a comment

I had a look. Thanks for all this work, it's a major first step.

@@ -6,3 +6,5 @@ unittest gx3 1x1x25x29x16 sumchk
unittest tx1 8x1 sumchk
unittest gx3 4x1 bcstchk
unittest gx3 1x1 bcstchk
unittest gx3 8x2 gridavgchk,dwblockall
unittest gx3 8x2 gridavgchk
Collaborator

General comment about this 'unittest' test suite: we should probably run it as part of the CI, if possible. And maybe instruct developers/contributors to run it as part of the standard testing procedure (i.e. in addition to the base_suite)?

@apcraig apcraig deleted the cgridA branch August 17, 2022 20:56