Merged
10 changes: 5 additions & 5 deletions README.md
@@ -54,8 +54,8 @@ environment):
 ``` bash
 conda config --add channels conda-forge
 conda config --set channel_priority strict
-conda create -y -n mpas_dev --file dev-spec.txt
-conda activate mpas_dev
+conda create -y -n mpas_analysis_dev --file dev-spec.txt
+conda activate mpas_analysis_dev
 python -m pip install --no-deps --no-build-isolation -e .
 ```
@@ -64,16 +64,16 @@ for MPAS-Tools or geometric\_features), you should first comment out the other
 package in `dev-spec.txt`. Then, you can install both packages in the same
 development environment, e.g.:
 ``` bash
-conda create -y -n mpas_dev --file tools/MPAS-Tools/conda_package/dev-spec.txt \
+conda create -y -n mpas_analysis_dev --file tools/MPAS-Tools/conda_package/dev-spec.txt \
 --file analysis/MPAS-Analysis/dev-spec.txt
-conda activate mpas_dev
+conda activate mpas_analysis_dev
 cd tools/MPAS-Tools/conda_package
 python -m pip install --no-deps --no-build-isolation -e .
 cd ../../../analysis/MPAS-Analysis
 python -m pip install --no-deps --no-build-isolation -e .
 ```
 Obviously, the paths to the repos may be different in your local clones. With
-the `mpas_dev` environment as defined above, you can make changes to both
+the `mpas_analysis_dev` environment as defined above, you can make changes to both
 `mpas_tools` and `mpas-analysis` packages in their respective branches, and
 these changes will be reflected when refer to the packages or call their
 respective entry points (command-line tools).
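Since the README's editable-install instructions above are central to this PR's environment rename, here is a quick, hedged way to confirm that `pip install -e .` in the `mpas_analysis_dev` environment took effect: check where Python resolves the package from. The helper name `resolved_path` is ours, and `mpas_analysis` must actually be installed for the final line to print a path; any other installed package name can be substituted.

```python
# Sketch: verify that an editable install ("pip install -e .") resolves to
# your worktree rather than to the environment's site-packages.
import importlib.util


def resolved_path(package):
    """Return the file a package imports from, or None if it is absent."""
    spec = importlib.util.find_spec(package)
    return spec.origin if spec is not None else None


if __name__ == "__main__":
    # With the mpas_analysis_dev environment active, this should print a
    # path inside your clone, not one under .../envs/mpas_analysis_dev/lib/
    print(resolved_path("mpas_analysis"))
```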
4 changes: 2 additions & 2 deletions ci/recipe/meta.yaml
@@ -32,15 +32,15 @@ requirements:
 - lxml
 - mache >=1.11.0
 - matplotlib-base >=3.9.0
-- mpas_tools >=0.34.1,<1.0.0
+- mpas_tools >=1.0.0,<2.0.0
 - nco >=4.8.1,!=5.2.6
 - netcdf4
 - numpy >=2.0,<3.0
 - pandas
 - pillow >=10.0.0,<11.0.0
 - progressbar2
 - pyproj
-- pyremap >=1.2.0,<2.0.0
+- pyremap >=2.0.0,<3.0.0
 - python-dateutil
 - requests
 - scipy >=1.7.0
2 changes: 1 addition & 1 deletion configs/alcf/job_script.cooley.bash
@@ -6,7 +6,7 @@
 source /lus/theta-fs0/projects/ccsm/acme/tools/e3sm-unified/load_latest_e3sm_unified_cooley.sh
 # alternatively, you can load your own development environment
 # source ~/mambaforge/etc/profile.d/conda.sh
-# conda activate mpas_dev
+# conda activate mpas_analysis_dev
 # export E3SMU_MACHINE=cooley
 
 export HDF5_USE_FILE_LOCKING=FALSE
2 changes: 1 addition & 1 deletion configs/compy/job_script.compy.bash
@@ -11,7 +11,7 @@ export OMP_NUM_THREADS=1
 source /share/apps/E3SM/conda_envs/load_latest_e3sm_unified_compy.sh
 # alternatively, you can load your own development environment
 # source ~/mambaforge/etc/profile.d/conda.sh
-# conda activate mpas_dev
+# conda activate mpas_analysis_dev
 # export E3SMU_MACHINE=compy
 
 export HDF5_USE_FILE_LOCKING=FALSE
2 changes: 1 addition & 1 deletion configs/job_script.default.bash
@@ -8,7 +8,7 @@
 export OMP_NUM_THREADS=1
 
 source ~/mambaforge/etc/profile.d/conda.sh
-conda activate mpas_dev
+conda activate mpas_analysis_dev
 # if you are on an E3SM supported machine, you can specify it:
 # export E3SMU_MACHINE=chrysalis
 
2 changes: 1 addition & 1 deletion configs/lanl/job_script.lanl.bash
@@ -8,7 +8,7 @@
 
 source /users/xylar/climate/mambaforge/etc/profile.d/conda.sh
 source /users/xylar/climate/mambaforge/etc/profile.d/mamba.sh
-mamba activate mpas_dev
+mamba activate mpas_analysis_dev
 
 export HDF5_USE_FILE_LOCKING=FALSE
 
2 changes: 1 addition & 1 deletion configs/lcrc/job_script.anvil.bash
@@ -12,7 +12,7 @@ export OMP_NUM_THREADS=1
 source /lcrc/soft/climate/e3sm-unified/load_latest_e3sm_unified_anvil.sh
 # alternatively, you can load your own development environment
 # source ~/mambaforge/etc/profile.d/conda.sh
-# conda activate mpas_dev
+# conda activate mpas_analysis_dev
 # export E3SMU_MACHINE=anvil
 
 export HDF5_USE_FILE_LOCKING=FALSE
2 changes: 1 addition & 1 deletion configs/lcrc/job_script.chrysalis.bash
@@ -10,7 +10,7 @@ export OMP_NUM_THREADS=1
 source /lcrc/soft/climate/e3sm-unified/load_latest_e3sm_unified_chrysalis.sh
 # alternatively, you can load your own development environment
 # source ~/mambaforge/etc/profile.d/conda.sh
-# conda activate mpas_dev
+# conda activate mpas_analysis_dev
 # export E3SMU_MACHINE=chrysalis
 
 export HDF5_USE_FILE_LOCKING=FALSE
2 changes: 1 addition & 1 deletion configs/nersc/job_script.cori-haswell.bash
@@ -20,7 +20,7 @@ export OMP_NUM_THREADS=1
 source /global/common/software/e3sm/anaconda_envs/load_latest_e3sm_unified_cori-haswell.sh
 # alternatively, you can load your own development environment
 # source ~/mambaforge/etc/profile.d/conda.sh
-# conda activate mpas_dev
+# conda activate mpas_analysis_dev
 # export E3SMU_MACHINE=cori-haswell
 
 export HDF5_USE_FILE_LOCKING=FALSE
2 changes: 1 addition & 1 deletion configs/nersc/job_script.cori-knl.bash
@@ -20,7 +20,7 @@ export OMP_NUM_THREADS=1
 source /global/common/software/e3sm/anaconda_envs/load_latest_e3sm_unified_cori-knl.sh
 # alternatively, you can load your own development environment
 # source ~/mambaforge/etc/profile.d/conda.sh
-# conda activate mpas_dev
+# conda activate mpas_analysis_dev
 # export E3SMU_MACHINE=cori-knl
 
 export HDF5_USE_FILE_LOCKING=FALSE
2 changes: 1 addition & 1 deletion configs/nersc/job_script.pm-cpu.bash
@@ -14,7 +14,7 @@ export OMP_NUM_THREADS=1
 source /global/common/software/e3sm/anaconda_envs/load_latest_e3sm_unified_pm-cpu.sh
 # alternatively, you can load your own development environment
 # source ~/mambaforge/etc/profile.d/conda.sh
-# conda activate mpas_dev
+# conda activate mpas_analysis_dev
 # export E3SMU_MACHINE=pm-cpu
 
 export HDF5_USE_FILE_LOCKING=FALSE
2 changes: 1 addition & 1 deletion configs/olcf/job_script.olcf.bash
@@ -10,7 +10,7 @@
 source /gpfs/alpine/proj-shared/cli115/e3sm-unified/load_latest_e3sm_unified_andes.csh
 # alternatively, you can load your own development environment
 # source ~/mambaforge/etc/profile.d/conda.sh
-# conda activate mpas_dev
+# conda activate mpas_analysis_dev
 # export E3SMU_MACHINE=anvil
 
 export HDF5_USE_FILE_LOCKING=FALSE
4 changes: 2 additions & 2 deletions dev-spec.txt
@@ -15,15 +15,15 @@ gsw
 lxml
 mache >=1.11.0
 matplotlib-base>=3.9.0
-mpas_tools>=0.34.1,<1.0.0
+mpas_tools >=1.0.0,<2.0.0
 nco>=4.8.1,!=5.2.6
 netcdf4
 numpy>=2.0,<3.0
 pandas
 pillow >=10.0.0,<11.0.0
 progressbar2
 pyproj
-pyremap>=1.2.0,<2.0.0
+pyremap >=2.0.0,<3.0.0
 python-dateutil
 requests
 scipy >=1.7.0
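The bumped pins above follow the usual semantic-versioning pattern: an inclusive lower bound at the new major release and an exclusive upper bound at the next one. A toy illustration of the window a spec like `>=1.0.0,<2.0.0` selects — real conda/pip matching also handles pre-releases, epochs, and `!=` exclusions, and the helpers `parse` and `in_range` are ours, not part of any packaging library:

```python
# Toy illustration of the ">=1.0.0,<2.0.0"-style bounds used in the pins
# above. Only plain dotted versions are handled here.
def parse(version):
    """Split a dotted version string into a tuple of ints for comparison."""
    return tuple(int(part) for part in version.split("."))


def in_range(version, lower, upper):
    """True if lower <= version < upper (inclusive lower, exclusive upper)."""
    return parse(lower) <= parse(version) < parse(upper)


print(in_range("1.0.0", "1.0.0", "2.0.0"))   # mpas_tools 1.0.0: accepted
print(in_range("0.34.1", "1.0.0", "2.0.0"))  # the old mpas_tools pin: now rejected
print(in_range("2.1.0", "2.0.0", "3.0.0"))   # a pyremap 2.x release: accepted
```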
14 changes: 7 additions & 7 deletions docs/tutorials/dev_add_task.rst
@@ -34,7 +34,7 @@ the code to MPAS-Analysis.
 If one just wishes to add a new field that already exists in MPAS-Ocean or
 MPAS-Seaice output, only a few of the steps below are necessary:
 
-1. Follow step 1 to set up an ```mpas_dev``` environment.
+1. Follow step 1 to set up an ```mpas_analysis_dev``` environment.
 2. Copy an existing `ocean <https://github.com/MPAS-Dev/MPAS-Analysis/tree/develop/mpas_analysis/ocean>`_
 or `sea_ice <https://github.com/MPAS-Dev/MPAS-Analysis/tree/develop/mpas_analysis/sea_ice>`_
 python module to a new name and edit it as needed for the new fields.
@@ -58,7 +58,7 @@ testing your new MPAS-Analysis development, and running MPAS-Analysis.
 Make sure you follow the tutorial for developers, not for users, since the
 tutorial for users installs the latest release of MPAS-Analysis, which you
 cannot modify. Similarly, changes must be tested in your own development
-environment (often called ``mpas_dev``) rather than the in a shared
+environment (often called ``mpas_analysis_dev``) rather than the in a shared
 environment like `E3SM-Unified <https://github.com/E3SM-Project/e3sm-unified>`_.
 
 Then, please follow the :ref:`tutorial_understand_a_task`. This will give
@@ -417,8 +417,8 @@ And here's the one for plotting it:
 
 matplotlib.rc('font', size=14)
 
-x = descriptor.xCorner
-y = descriptor.yCorner
+x = descriptor.x_corner
+y = descriptor.y_corner
 
 extent = [x[0], x[-1], y[0], y[-1]]
 
@@ -550,12 +550,12 @@ whatever editor you like.)
 
 code .
 
-I'll create or recreate my ``mpas_dev`` environment as in
+I'll create or recreate my ``mpas_analysis_dev`` environment as in
 :ref:`tutorial_dev_getting_started`, and then make sure to at least do:
 
 .. code-block:: bash
 
-conda activate mpas_dev
+conda activate mpas_analysis_dev
 python -m pip install --no-deps --no-build-isolation -e .
 
 This last command installs the ``mpas_analysis`` package into the conda
@@ -1138,7 +1138,7 @@ You also need to add the tasks class and public methods to the
 in the developer's guide. Again, the easiest approach is to copy the section
 for a similar task and modify as needed.
 
-With the ``mpas_dev`` environment activated, you can run:
+With the ``mpas_analysis_dev`` environment activated, you can run:
 
 .. code-block:: bash
 
36 changes: 18 additions & 18 deletions docs/tutorials/dev_getting_started.rst
@@ -249,13 +249,13 @@ If you installed Miniforge3, these steps will happen automatically.
 4.3 Create a development environment
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
-You can create a new conda environment called ``mpas_dev`` and install the
+You can create a new conda environment called ``mpas_analysis_dev`` and install the
 dependencies that MPAS-Analysis needs by running the following in the worktree
 where you are doing your development:
 
 .. code-block:: bash
 
-$ conda create -y -n mpas_dev --file dev-spec.txt
+$ conda create -y -n mpas_analysis_dev --file dev-spec.txt
 
 The last argument is only needed on HPC machines because the conda version of
 MPI doesn't work properly on these machines. You can omit it if you're
@@ -266,42 +266,42 @@ mode by running:
 
 .. code-block:: bash
 
-$ conda activate mpas_dev
+$ conda activate mpas_analysis_dev
 $ python -m pip install --no-deps --no-build-isolation -e .
 
 In this mode, any edits you make to the code in the worktree will be available
 in the conda environment. If you run ``mpas_analysis`` on the command line,
 it will know about the changes.
 
-This command only needs to be done once after the ``mpas_dev`` environment is
+This command only needs to be done once after the ``mpas_analysis_dev`` environment is
 built if you are not using worktrees.
 
 .. note::
 
 If you do use worktrees, rerun the ``python -m pip install ...`` command
 each time you switch to developing a new branch, since otherwise the
-version of ``mpas_analysis`` in the ``mpas_dev`` environment will be the
+version of ``mpas_analysis`` in the ``mpas_analysis_dev`` environment will be the
 one you were developing previously.
 
 .. _tutorial_dev_get_started_activ_env:
 
 4.4 Activating the environment
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
-Each time you open a new terminal window, to activate the ``mpas_dev``
+Each time you open a new terminal window, to activate the ``mpas_analysis_dev``
 environment, you will need to run either for ``bash``:
 
 .. code-block:: bash
 
 $ source ~/miniforge3/etc/profile.d/conda.sh
-$ conda activate mpas_dev
+$ conda activate mpas_analysis_dev
 
 or for ``csh``:
 
 .. code-block:: csh
 
 > source ~/miniforge3/etc/profile.d/conda.csh
-> conda activate mpas_dev
+> conda activate mpas_analysis_dev
 
 You can skip the ``source`` command if you chose to initialize Miniforge3 or
 Miniconda3 so it loads automatically. You can also use the ``init_conda``
@@ -311,16 +311,16 @@ alias for this step if you defined one.
 ~~~~~~~~~~~~~~~~~~~~~~~
 
 If you switch to a different worktree, it is safest to rerun the whole
-process for creating the ``mpas_dev`` conda environment. If you know that
-the dependencies are the same as the worktree used to create ``mpas_dev``,
+process for creating the ``mpas_analysis_dev`` conda environment. If you know that
+the dependencies are the same as the worktree used to create ``mpas_analysis_dev``,
 You can just reinstall ``mpas_analysis`` itself by rerunning
 
 .. code-block:: bash
 
 python -m pip install --no-deps --no-build-isolation -e .
 
 in the new worktree. If you forget this step, you will find that changes you
-make in the worktree don't affect the ``mpas_dev`` conda environment you are
+make in the worktree don't affect the ``mpas_analysis_dev`` conda environment you are
 using.
 
 5. Editing code
@@ -348,7 +348,7 @@ need to follow steps 2-6 of the :ref:`tutorial_getting_started` tutorial.
 
 Run ``mpas_analysis`` on a compute node, not on an HPC login nodes (front
 ends), because it uses too many resources to be safely run on a login node.
-When using a compute node interactively, activate the ``mpas_dev``
+When using a compute node interactively, activate the ``mpas_analysis_dev``
 environment, even if it was activated on the login node. Be sure to
 
 7.1 Configuring MPAS-Analysis
@@ -688,7 +688,7 @@ also be displayed over the full 5 years.)
 The hard work is done. Now that we have a config file, we are ready to run.
 
 To run MPAS-Analysis, you should either create a job script or log into
-an interactive session on a compute node. Then, activate the ``mpas_dev``
+an interactive session on a compute node. Then, activate the ``mpas_analysis_dev``
 conda environment as in :ref:`tutorial_dev_get_started_activ_env`.
 
 On many file systems, MPAS-Analysis and other python-based software that used
@@ -724,15 +724,15 @@ Typical output is the analysis is running correctly looks something like:
 Detected E3SM supported machine: anvil
 Using the following config files:
 /gpfs/fs1/home/ac.xylar/code/mpas-analysis/add_my_fancy_task/mpas_analysis/default.cfg
-/gpfs/fs1/home/ac.xylar/anvil/mambaforge/envs/mpas_dev/lib/python3.10/site-packages/mache/machines/anvil.cfg
+/gpfs/fs1/home/ac.xylar/anvil/mambaforge/envs/mpas_analysis_dev/lib/python3.10/site-packages/mache/machines/anvil.cfg
 /gpfs/fs1/home/ac.xylar/code/mpas-analysis/add_my_fancy_task/mpas_analysis/configuration/anvil.cfg
 /gpfs/fs1/home/ac.xylar/code/mpas-analysis/add_my_fancy_task/mpas_analysis/__main__.py
 /gpfs/fs1/home/ac.xylar/code/mpas-analysis/add_my_fancy_task/myrun.cfg
 copying /gpfs/fs1/home/ac.xylar/code/mpas-analysis/add_my_fancy_task/myrun.cfg to HTML dir.
 
-running: /gpfs/fs1/home/ac.xylar/anvil/mambaforge/envs/mpas_dev/bin/ESMF_RegridWeightGen --source /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/tmp76l7of28/src_mesh.nc --destination /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/tmp76l7of28/dst_mesh.nc --weight /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/map_oQU480_to_0.5x0.5degree_bilinear.nc --method bilinear --netcdf4 --no_log --src_loc center --src_regional --ignore_unmapped
-running: /gpfs/fs1/home/ac.xylar/anvil/mambaforge/envs/mpas_dev/bin/ESMF_RegridWeightGen --source /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/tmpj94wpf9y/src_mesh.nc --destination /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/tmpj94wpf9y/dst_mesh.nc --weight /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/map_oQU480_to_6000.0x6000.0km_10.0km_Antarctic_stereo_bilinear.nc --method bilinear --netcdf4 --no_log --src_loc center --src_regional --dst_regional --ignore_unmapped
-running: /gpfs/fs1/home/ac.xylar/anvil/mambaforge/envs/mpas_dev/bin/ESMF_RegridWeightGen --source /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/tmp6zm13a0s/src_mesh.nc --destination /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/tmp6zm13a0s/dst_mesh.nc --weight /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/map_oQU480_to_WOCE_transects_5km_bilinear.nc --method bilinear --netcdf4 --no_log --src_loc center --src_regional --dst_regional --ignore_unmapped
+running: /gpfs/fs1/home/ac.xylar/anvil/mambaforge/envs/mpas_analysis_dev/bin/ESMF_RegridWeightGen --source /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/tmp76l7of28/src_mesh.nc --destination /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/tmp76l7of28/dst_mesh.nc --weight /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/map_oQU480_to_0.5x0.5degree_bilinear.nc --method bilinear --netcdf4 --no_log --src_loc center --src_regional --ignore_unmapped
+running: /gpfs/fs1/home/ac.xylar/anvil/mambaforge/envs/mpas_analysis_dev/bin/ESMF_RegridWeightGen --source /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/tmpj94wpf9y/src_mesh.nc --destination /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/tmpj94wpf9y/dst_mesh.nc --weight /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/map_oQU480_to_6000.0x6000.0km_10.0km_Antarctic_stereo_bilinear.nc --method bilinear --netcdf4 --no_log --src_loc center --src_regional --dst_regional --ignore_unmapped
+running: /gpfs/fs1/home/ac.xylar/anvil/mambaforge/envs/mpas_analysis_dev/bin/ESMF_RegridWeightGen --source /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/tmp6zm13a0s/src_mesh.nc --destination /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/tmp6zm13a0s/dst_mesh.nc --weight /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/map_oQU480_to_WOCE_transects_5km_bilinear.nc --method bilinear --netcdf4 --no_log --src_loc center --src_regional --dst_regional --ignore_unmapped
 Preprocessing SOSE transect data...
 temperature
 salinity
@@ -741,7 +741,7 @@ Typical output is the analysis is running correctly looks something like:
 meridionalVelocity
 velMag
 Done.
-running: /gpfs/fs1/home/ac.xylar/anvil/mambaforge/envs/mpas_dev/bin/ESMF_RegridWeightGen --source /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/tmpe2a9yblb/src_mesh.nc --destination /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/tmpe2a9yblb/dst_mesh.nc --weight /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/map_oQU480_to_SOSE_transects_5km_bilinear.nc --method bilinear --netcdf4 --no_log --src_loc center --src_regional --dst_regional --ignore_unmapped
+running: /gpfs/fs1/home/ac.xylar/anvil/mambaforge/envs/mpas_analysis_dev/bin/ESMF_RegridWeightGen --source /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/tmpe2a9yblb/src_mesh.nc --destination /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/tmpe2a9yblb/dst_mesh.nc --weight /lcrc/group/e3sm/ac.xylar/analysis/A_WCYCL1850.ne4_oQU480.anvil/clim_3-5_ts_1-5/mapping/map_oQU480_to_SOSE_transects_5km_bilinear.nc --method bilinear --netcdf4 --no_log --src_loc center --src_regional --dst_regional --ignore_unmapped
 
 Running tasks: 100% |##########################################| Time: 0:06:42
 
4 changes: 2 additions & 2 deletions mpas_analysis/ocean/climatology_map_antarctic_melt.py
@@ -407,7 +407,7 @@ def get_observation_descriptor(self, fileName):
 # stereographic coordinates
 projection = get_pyproj_projection(comparison_grid_name='antarctic')
 obsDescriptor = ProjectionGridDescriptor.read(
-projection, fileName=fileName, xVarName='x', yVarName='y')
+projection, filename=fileName, x_var_name='x', y_var_name='y')
 
 # update the mesh name to match the format used elsewhere in
 # MPAS-Analysis
@@ -416,7 +416,7 @@ def get_observation_descriptor(self, fileName):
 width = 1e-3 * (x[-1] - x[0])
 height = 1e-3 * (y[-1] - y[0])
 res = 1e-3 * (x[1] - x[0])
-obsDescriptor.meshName = f'{width}x{height}km_{res}km_Antarctic_stereo'
+obsDescriptor.mesh_name = f'{width}x{height}km_{res}km_Antarctic_stereo'
 
 return obsDescriptor
 
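The keyword and attribute renames in this file and the two below all follow pyremap 2.0's move from camelCase to snake_case: `xVarName` becomes `x_var_name`, `latVarName` becomes `lat_var_name`, `meshName` becomes `mesh_name`. (`fileName` → `filename` is the one exception, collapsing to plain lowercase rather than inserting an underscore.) The mechanical part of the rename can be sketched with a small converter; this helper is illustrative only and is not part of pyremap:

```python
# Illustrative camelCase -> snake_case converter matching the pyremap 2.x
# renames applied in this PR (except fileName -> filename).
import re


def camel_to_snake(name):
    """Insert "_" before each interior capital letter, then lowercase."""
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()


print(camel_to_snake("xVarName"))    # x_var_name
print(camel_to_snake("meshName"))    # mesh_name
print(camel_to_snake("latVarName"))  # lat_var_name
```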
4 changes: 2 additions & 2 deletions mpas_analysis/ocean/climatology_map_argo.py
@@ -394,8 +394,8 @@ def get_observation_descriptor(self, fileName):
 # create a descriptor of the observation grid using Lat/Lon
 # coordinates
 obsDescriptor = LatLonGridDescriptor.read(ds=dsObs,
-latVarName='latCoord',
-lonVarName='lonCoord')
+lat_var_name='latCoord',
+lon_var_name='lonCoord')
 dsObs.close()
 return obsDescriptor
 
4 changes: 2 additions & 2 deletions mpas_analysis/ocean/climatology_map_bgc.py
@@ -346,8 +346,8 @@ def get_observation_descriptor(self, fileName):
 # coordinates
 dsObs = self.build_observational_dataset(fileName)
 obsDescriptor = LatLonGridDescriptor.read(ds=dsObs,
-latVarName='lat',
-lonVarName='lon')
+lat_var_name='lat',
+lon_var_name='lon')
 return obsDescriptor
 
 def build_observational_dataset(self, fileName):