[develop] Update Build/Run and ConfigWorkflow Chapters (#425)
* update stochastic physics link

* add HPC-Stack Intersphinx links

* remove hpc-stack submodule in favor of intersphinx links

* move tables to 'tables' directory

* remove duplicate Glossary terms; update links

* update Quickstart

* add M. Leukin to mgmt team list; change expt gen command

* fix typo

* fix typo

* add stochastic physics link & troubleshooting tips

* separate Build & Run chapters; update crosslinks

* minor fix

* update file exts in Config Params chapter

* update user/platform sections of Config Params Chapter

* update file name params

* grid_gen, verbose, compiler, etc. params

* ConfigWorkflow 1st draft revision

* add met docs to intersphinx

* fix minor errors

* updated table 4.1

* add default values to ConfigWorkflow, other minor fixes

* add RRFS and GSI to Glossary

* update tables & file exts

* add L1 data locations

* address comments in ConfigWorkflow

* fix table

* minor fixes

* minor updates

* rm reg_wflow references, update tables, minor updates

* switch order of steps so python env is loaded first

* edit and rearrange pyenv and config BR sections

* edit mac/linux/VX config sections

* 1st draft of RunSRW

* add parameter definitions that were accidentally deleted in .sh --> .yaml switch

* remove comments

* fix typo

* fix link

* add info from Mike L.

* update build & workflow images

* update workflow generation image

* update image

* update image

* fix reviewer comments

* update expt gen image

* rm reg_wflow refs, switch config.sh to .yaml, update directory structure in intro

* rm reg_wflow refs, switch config.sh to .yaml

* change templates dir to parm

* update GSI glossary entry

* fixes based on PR review

* remove fixed file mapping params

* remove SFC_CLIMO_FIELDS from config_defaults.yaml/ConfigWorkflow.rst

* update info on data: section of machine file

* minor wording fix

* minor wording fix

* add note about converting old .sh file to .yaml

* update intro forum link, minor wording/details

* update pngs, RRFS note, minor intro fixes

* update wflow gen img

Co-authored-by: gspetro <gillian.s.petro@gmail.com>
gspetro-NOAA and gspetro authored Oct 31, 2022
1 parent ce024c4 commit 0e11b9d
Showing 24 changed files with 3,637 additions and 2,821 deletions.
1,632 changes: 0 additions & 1,632 deletions docs/UsersGuide/source/BuildRunSRW.rst

This file was deleted.

496 changes: 496 additions & 0 deletions docs/UsersGuide/source/BuildSRW.rst

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion docs/UsersGuide/source/Components.rst
@@ -72,7 +72,7 @@ A Python script is provided to create basic visualizations of the model output.
is designed to output graphics in PNG format for 14 standard meteorological variables
when using the pre-defined :term:`CONUS` domain. A difference plotting script is also included to visually compare two runs for the same domain and resolution. These scripts are provided only as an example for users familiar with Python. They may be used to perform a visual check to verify that the application is producing reasonable results.

-After running ``manage_externals/checkout_externals``, the visualization scripts will be available in the ``ufs-srweather-app/regional_workflow/ush/Python`` directory. Usage information and instructions are described in :numref:`Chapter %s <Graphics>` and are also included at the top of the script.
+After running ``manage_externals/checkout_externals``, the visualization scripts will be available in the ``ufs-srweather-app/ush/Python`` directory. Usage information and instructions are described in :numref:`Chapter %s <Graphics>` and are also included at the top of the script.

Build System and Workflow
=========================
2,465 changes: 1,521 additions & 944 deletions docs/UsersGuide/source/ConfigWorkflow.rst

Large diffs are not rendered by default.

12 changes: 6 additions & 6 deletions docs/UsersGuide/source/ContainerQuickstart.rst
@@ -4,14 +4,14 @@
Container-Based Quick Start Guide
====================================

-This Container-Based Quick Start Guide will help users build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a `Singularity <https://sylabs.io/guides/3.5/user-guide/introduction.html>`__ container. The :term:`container` approach provides a uniform environment in which to build and run the SRW App. Normally, the details of building and running the SRW App vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPI`'s, and package versions available. Installation via Singularity container reduces this variability and allows for a smoother SRW App build experience. Normally, containers can only run on a single compute node and are not compatible with the `Rocoto workflow manager <https://github.com/christopherwharrop/rocoto/wiki/Documentation>`__, so users must run each task in the workflow manually. However, the Singularity container described in this chapter has been adapted such that it is able to run across multiple nodes using Rocoto. This makes it an excellent starting point for beginners. The :ref:`non-container approach <BuildRunSRW>` may still be more appropriate for users who desire additional customizability, particularly if they already have experience running the SRW App.
+This Container-Based Quick Start Guide will help users build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a `Singularity <https://sylabs.io/guides/3.5/user-guide/introduction.html>`__ container. The :term:`container` approach provides a uniform environment in which to build and run the SRW App. Normally, the details of building and running the SRW App vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPI`'s, and package versions available. Installation via Singularity container reduces this variability and allows for a smoother SRW App build experience. Normally, containers can only run on a single compute node and are not compatible with the `Rocoto workflow manager <https://github.com/christopherwharrop/rocoto/wiki/Documentation>`__, so users must run each task in the workflow manually. However, the Singularity container described in this chapter has been adapted such that it is able to run across multiple nodes using Rocoto. This makes it an excellent starting point for beginners. The :ref:`non-container build approach <BuildSRW>` may still be more appropriate for users who desire additional customizability, particularly if they already have experience running the SRW App.

The "out-of-the-box" SRW App case described in this User's Guide builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 16 physics suite (FV3_GFS_v16 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization.

.. attention::

* The SRW Application has `four levels of support <https://github.com/ufs-community/ufs-srweather-app/wiki/Supported-Platforms-and-Compilers>`__. The steps described in this chapter will work most smoothly on preconfigured (Level 1) systems. However, this guide can serve as a starting point for running the SRW App on other systems, too.
-* This chapter of the User's Guide should **only** be used for container builds. For non-container builds, see :numref:`Chapter %s <NCQuickstart>` for a Quick Start Guide or :numref:`Chapter %s <BuildRunSRW>` for a detailed guide to building the SRW App **without** a container.
+* This chapter of the User's Guide should **only** be used for container builds. For non-container builds, see :numref:`Chapter %s <NCQuickstart>` for a Quick Start Guide or :numref:`Chapter %s <BuildSRW>` for a detailed guide to building the SRW App **without** a container.

.. _DownloadCodeC:

@@ -247,7 +247,7 @@ To activate the regional workflow, run the following commands:
where:

* ``<path/to/modulefiles>`` is replaced with the actual path to the modulefiles on the user's system (often ``$PWD/modulefiles``), and
-* ``<platform>`` is a valid, lowercased machine/platform name (see the ``MACHINE`` variable in :numref:`Section %s <PlatEnv>`).
+* ``<platform>`` is a valid, lowercased machine/platform name (see the ``MACHINE`` variable in :numref:`Section %s <user>`).
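Concretely, the activation steps above might look like the following sketch, in which ``<path/to/modulefiles>`` and ``<platform>`` are placeholders that depend on the user's system:

```shell
# Sketch only: substitute the actual modulefiles path and a valid
# lowercased machine name (e.g., hera) for the placeholders below.
module use <path/to/modulefiles>
module load wflow_<platform>

# The modulefile then prints activation instructions to follow,
# commonly something like:
conda activate regional_workflow
```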

The ``wflow_<platform>`` modulefile will then output instructions to activate the regional workflow. The user should run the commands specified in the modulefile output. For example, if the output says:

@@ -273,7 +273,7 @@ where:

* ``-c`` indicates the compiler on the user's local machine (e.g., ``intel/2022.1.2``)
* ``-m`` indicates the :term:`MPI` on the user's local machine (e.g., ``impi/2022.1.2``)
-* ``<platform>`` refers to the local machine (e.g., ``hera``, ``jet``, ``noaacloud``, ``mac``). See ``MACHINE`` in :numref:`Section %s <PlatEnv>` for a full list of options.
+* ``<platform>`` refers to the local machine (e.g., ``hera``, ``jet``, ``noaacloud``, ``mac``). See ``MACHINE`` in :numref:`Section %s <user>` for a full list of options.
* ``-i`` indicates the name of the container image that was built in :numref:`Step %s <BuildC>` (``ubuntu20.04-intel-srwapp`` or ``ubuntu20.04-intel-srwapp-develop.img`` by default).

For example, on Hera, the command would be:
@@ -299,7 +299,7 @@ From here, users can follow the steps below to configure the out-of-the-box SRW
The default settings include a predefined 25-km :term:`CONUS` grid (RRFS_CONUS_25km), the :term:`GFS` v16 physics suite (FV3_GFS_v16 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization.

-#. Edit the ``MACHINE`` and ``ACCOUNT`` variables in the ``user:`` section of ``config.yaml``. See :numref:`Section %s <PlatEnv>` for details on valid values.
+#. Edit the ``MACHINE`` and ``ACCOUNT`` variables in the ``user:`` section of ``config.yaml``. See :numref:`Section %s <user>` for details on valid values.
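   As a minimal sketch, the edited ``user:`` section might look like the following, where the machine and account values are placeholders that must be valid for the user's system:

   ```yaml
   user:
     MACHINE: hera        # placeholder: a valid lowercased machine name
     ACCOUNT: an_account  # placeholder: an account valid for job submission
   ```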

.. note::

@@ -326,7 +326,7 @@ From here, users can follow the steps below to configure the out-of-the-box SRW
EXTRN_MDL_FILES_ICS: []
EXTRN_MDL_DATA_STORES: disk
-On other systems, users will need to change the path for ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_FILES_LBCS`` (below) to reflect the location of the system's data. The location of the machine's global data can be viewed :ref:`here <SystemData>` for Level 1 systems. Alternatively, the user can add the path to their local data if they downloaded it as described in :numref:`Section %s <InitialConditions>`.
+On other systems, users will need to change the path for ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_FILES_LBCS`` (below) to reflect the location of the system's data. The location of the machine's global data can be viewed :ref:`here <Data>` for Level 1 systems. Alternatively, the user can add the path to their local data if they downloaded it as described in :numref:`Section %s <InitialConditions>`.
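As an illustrative sketch for locally staged data, the sections might be edited as follows. The paths are placeholders for wherever the data actually resides, and the ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` name is assumed here as the analog of the ``_ICS`` parameter shown above:

```yaml
task_get_extrn_ics:
  EXTRN_MDL_SOURCE_BASEDIR_ICS: /path/to/staged/model/data   # placeholder path
  EXTRN_MDL_FILES_ICS: []
  EXTRN_MDL_DATA_STORES: disk

task_get_extrn_lbcs:
  EXTRN_MDL_SOURCE_BASEDIR_LBCS: /path/to/staged/model/data  # assumed analog parameter
  EXTRN_MDL_FILES_LBCS: []
  EXTRN_MDL_DATA_STORES: disk
```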

#. Edit the ``task_get_extrn_lbcs:`` section of the ``config.yaml`` to include the correct data paths to the lateral boundary conditions files. For example, on Hera, add:

