From 9924ff4e3f6465a7429dbe52555da84cdb8473f9 Mon Sep 17 00:00:00 2001
From: Matt Graham
Date: Tue, 12 Mar 2024 17:01:13 +0000
Subject: [PATCH] Change GitLab URL to permalink

---
 NektarDriftwave/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/NektarDriftwave/README.md b/NektarDriftwave/README.md
index 35d6d5c..6f028a0 100644
--- a/NektarDriftwave/README.md
+++ b/NektarDriftwave/README.md
@@ -37,7 +37,7 @@ Currently the `NektarDriftwave` module only implements methods for [the subset o
 When using the Cray MPICH (tested with version 8.1.23) MPI implementation on ARCHER2, if Nektar++ (and the `nektar-driftwave` solver) are built with MPI support, running a `ParticleDA` particle filter using MPI hangs when calling the Nektar++ solvers, we believe due to the nested use of MPI at the particle filter and Nektar++ levels. This issue is potentially specific to the Cray MPICH library, as the issue does not occur when building and running using OpenMPI on another system. The implementation here executes the Nektar++ solvers and utilities directly using system calls but similar behaviour was observed when using `MPI_Comm_spawn` to spawn new processes to run the solver instances in within the ParticleDA wrapper.
 
-Building Nektar++ with HDF5 support but without MPI, requires manually commenting out / removing [the lines in the `cmake/ThirdPartyHDF5.cmake` file in the Nektar++ source tree which raise an error if trying to build with HDF5 but no MPI support](https://gitlab.nektar.info/nektar/nektar/-/blob/master/cmake/ThirdPartyHDF5.cmake?ref_type=heads#L13-16). After changing these lines in a local clone of the Nektar++ repository, we successively built Nektar++ with HDF5 but no MPI support on ARCHER2 (using the pre-built Cray HDF5 library on ARCHER2) by running the following from the root of the repository
+Building Nektar++ with HDF5 support but without MPI requires manually commenting out / removing [the lines in the `cmake/ThirdPartyHDF5.cmake` file in the Nektar++ source tree which raise an error if trying to build with HDF5 but no MPI support](https://gitlab.nektar.info/nektar/nektar/-/blob/8bc4b3095361e868b26219eff826d4f1902763df/cmake/ThirdPartyHDF5.cmake#L12-16). After changing these lines in a local clone of the Nektar++ repository, we successfully built Nektar++ with HDF5 but no MPI support on ARCHER2 (using the pre-built Cray HDF5 library on ARCHER2) by running the following from the root of the repository
 
 ```sh
 module load cpe/22.12
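
For readers following the manual edit described in the added (+) line of the hunk above, the shell sketch below outlines one way to pin and inspect the relevant region of `cmake/ThirdPartyHDF5.cmake` before commenting it out. It is a minimal sketch and not part of the patch: the clone URL is assumed from the GitLab project path, the checked-out revision is the one pinned by the new permalink, and the `sed` window is just a convenient view around lines 12-16.

```sh
# Minimal sketch (not part of the patch): fetch the pinned Nektar++ revision and
# inspect the HDF5-without-MPI guard before commenting it out / removing it by hand.
git clone https://gitlab.nektar.info/nektar/nektar.git   # clone URL assumed from the project path
cd nektar
git checkout 8bc4b3095361e868b26219eff826d4f1902763df    # revision pinned by the permalink

# Print a window around lines 12-16, the region the permalink points at.
sed -n '10,20p' cmake/ThirdPartyHDF5.cmake

# Comment out / remove the error-raising lines manually before configuring the build.
"${EDITOR:-vi}" cmake/ThirdPartyHDF5.cmake
```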