Update components for v0.3.0 #118

Closed
5 of 6 tasks
micaeljtoliveira opened this issue Mar 13, 2024 · 24 comments

Comments

@micaeljtoliveira
Contributor

micaeljtoliveira commented Mar 13, 2024

This issue is to plan and track the progress of updating the OM3 components to newer versions.

Before starting this, it would be good to tag the codebase and the configurations. It's not strictly necessary, but it's good practice to do it before adding changes that will most likely break backward compatibility.

So here is a first attempt at the list of tasks:

  • Update the Spack environments in /g/data/ik11
  • Decide on which versions we want to update to
  • Update versions in the OM3 codebase
  • Update the configurations as needed
  • Do a release of OM3
  • Do a release of the officially supported configurations (MOM6-CICE6 and MOM6-CICE6-WW3).
@aekiss
Contributor

aekiss commented Mar 13, 2024

Has CESM upgraded to CICE 6.5? If not, we might need to get it from upstream to get C-grid support (#39).

@micaeljtoliveira
Contributor Author

Yes, this is the version used in the development branch of CESM: https://github.com/ESCOMP/CICE/releases/tag/cice6_5_0_20240222

@anton-seaice
Contributor

anton-seaice commented Mar 13, 2024

Re: dependencies / spack

* OpenMPI promised a 4.1.7 release in Q1 2024, hopefully it comes soon. This fixes a bug in parallel reads of symlinks (related to [Parallel IO in CICE #81](https://github.com/COSIMA/access-om3/issues/81)).

> Yes, this is the version used in the development branch of CESM: https://github.com/ESCOMP/CICE/releases/tag/cice6_5_0_20240222

That's only 3 commits behind the current main. One of the missing commits is irrelevant to us, but the other two fix bugs relevant to us: one is the CICE calendar patch and the other fixes a bug in the PIO implementation for CESMCOUPLED.

@micaeljtoliveira
Contributor Author

micaeljtoliveira commented Mar 14, 2024

Here is a summary of the changes from the previous CESM version we used and the latest one:

CICE: 6.4.1_20230620 -> 6.5.0_20240222
CMEPS: 0.14.35 -> 0.14.50
CDEPS: 1.0.19 -> 1.0.28
MOM: dev/ncar_230504 -> dev/ncar_240214
WW3: dev/unified_0.0.7 -> dev/unified_0.0.10
CESM_share: 1.0.17 -> 1.0.18

The following OM3 external libraries are also updated in CESM:

parallelio: 2.5.10 -> 2.6.2
FMS: 2021.03.01 -> dev/ncar_0.0.1

Note that for FMS the git repository has changed and CESM is no longer using an official release. Instead, it's using a fork, probably because it carries some extensive patches.

Let me know if all these are okay, or if we need some newer/different versions. Regarding FMS, I'll investigate what changes were introduced in the fork.
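For context, pinning these versions in a Spack environment would look roughly like the sketch below. This is a minimal illustration assuming the builtin Spack package names `parallelio` and `fms`; the actual environments in /g/data/ik11 are built from ACCESS-NRI's own package recipes, so the real file differs.

```yaml
# Minimal illustrative spack.yaml -- not the actual /g/data/ik11 environment.
spack:
  specs:
    - parallelio@2.6.2   # PIO bumped from 2.5.10
    - fms@2023.02        # hypothetical pin; the NCAR fork (dev/ncar_0.0.1) would need a custom recipe
  concretizer:
    unify: true
```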

@micaeljtoliveira
Contributor Author

> Re: dependencies / spack
>
> * OpenMPI promised a 4.1.7 release in Q1 2024, hopefully it comes soon. This fixes a bug in parallel reads of symlinks (related to [Parallel IO in CICE #81](https://github.com/COSIMA/access-om3/issues/81)).

I'll start updating the Spack environment next week, so this will probably have to wait.

> Yes, this is the version used in the development branch of CESM: https://github.com/ESCOMP/CICE/releases/tag/cice6_5_0_20240222
>
> That's only 3 commits behind the current main. One of the missing commits is irrelevant to us, but the other two fix bugs relevant to us: one is the CICE calendar patch and the other fixes a bug in the PIO implementation for CESMCOUPLED.

Let's then use the current main for CICE.

@aekiss
Contributor

aekiss commented Mar 14, 2024

> CICE: 6.4.1 -> 5.0.7

From the link I guess you meant 6.5.0?

@micaeljtoliveira
Contributor Author

> From the link I guess you meant 6.5.0?

Yes! Fixed now.

@anton-seaice
Contributor

> CICE: 6.4.1 -> 5.0.7
>
> From the link I guess you meant 6.5.0?

Note it's not really 6.5.0... it's main from 22 Feb 2024, which is some time after 6.5.0.

@micaeljtoliveira
Contributor Author

> Note it's not really 6.5.0... it's main from 22 Feb 2024, which is some time after 6.5.0.

I updated the comment to reflect this better. It's really confusing when people add their own versioning to their forks 😢

@aidanheerdegen

ACCESS-OM2 performance was 1.5% better with parallelio 2.5.2 than with 2.5.10. It's a small difference, and maybe lost in the noise of other changes in OM3, but do you know if you see a similar performance drop with the later version?

https://forum.access-hive.org.au/t/testing-spack-model-component-builds-for-access-om2/1567/11

See also ACCESS-NRI/ACCESS-OM2#30

@micaeljtoliveira
Contributor Author

> ACCESS-OM2 performance was 1.5% better with parallelio 2.5.2 than with 2.5.10. It's a small difference, and maybe lost in the noise of other changes in OM3, but do you know if you see a similar performance drop with the later version?

We haven't tested this, as we did not use 2.5.2 and instead started with 2.5.9.

@anton-seaice
Contributor

I think you might have said yesterday, but is doing a 0.3.0 config tag stuck on something?

@micaeljtoliveira
Contributor Author

I need to finish updating the configurations. Working on this, but today gadi hasn't been behaving very well... 😢

@anton-seaice
Contributor

Yes, it's being a bit flaky!

@adele-morrison

> gadi hasn't been behaving very well... 😢

Do we know why yet? A couple of people from our group have reported it to NCI, but have not had helpful responses. NCI didn't acknowledge or seem to know that it is a system-wide problem.

@minghangli-uni
Contributor

Not sure if it's because someone is using the login node to do something...

@micaeljtoliveira
Contributor Author

I've tagged the OM3 codebase. Now waiting to get all the required changes into the configs.

@micaeljtoliveira
Contributor Author

Considering that the configurations are all in different states of development, I'm not sure we should tag a new release for each of them.

@anton-seaice
Contributor

How about tagging MOM6-CICE6 only? What is the downside to tagging a release? (I guess tagging a non-working point in history is useless.)

@micaeljtoliveira
Contributor Author

Assuming the configuration as tagged works, there's no downside, especially because none of them are yet considered "stable" (version < 1.0). I guess the question I'm asking here is whether it's really useful to tag all the configurations.

My original plan was to tag the codebase and all the configurations simultaneously, so that it was clear that version X of a given configuration worked with version X of OM3. Now that we've decided to decouple these, there's no need to tag the configurations when tagging the OM3 codebase.

@anton-seaice
Contributor

What's happening with https://github.com/COSIMA/MOM6-CICE6-WW3/tree/1deg_jra55do_ryf? Did we intentionally not update the binary, or did we forget?

@micaeljtoliveira
Contributor Author

> What's happening with https://github.com/COSIMA/MOM6-CICE6-WW3/tree/1deg_jra55do_ryf? Did we intentionally not update the binary, or did we forget?

IIRC, that configuration was quite behind the IAF one and required some more extensive changes to work with the latest exec. But maybe I'm wrong?

@anton-seaice
Contributor

I think we can close this as completed.

As @micaeljtoliveira points out, the configs are constantly being updated, and you can now tell which binary they were tested and working with from the path in config.yaml, so adding a version tag to the configurations doesn't add any information.
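As an illustration (the path below is hypothetical, not the actual ik11 install), and assuming payu's `exe` field, the executable path recorded in a configuration's `config.yaml` is what identifies the OM3 build it was tested against:

```yaml
# Illustrative config.yaml excerpt; the path is made up, but in the real configs
# the install path encodes which OM3 build the configuration was run with.
exe: /g/data/ik11/spack/opt/access-om3/0.3.0/bin/access-om3-MOM6-CICE6
```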

@micaeljtoliveira
Contributor Author

Okay, closing this then.

Note that once the NRI release team is ready to deploy the OM3 configurations, you should tag them, so it's clear what is to be deployed.
