Fix line number range in GitLab permalink
matt-graham committed Mar 12, 2024
1 parent efb5266 commit 6b0fcf2
Showing 1 changed file with 1 addition and 1 deletion.
NektarDriftwave/README.md
@@ -41,7 +41,7 @@ Currently the `NektarDriftwave` module only implements methods for [the subset o

When using the Cray MPICH MPI implementation on ARCHER2 (tested with version 8.1.23), if Nektar++ (and the `nektar-driftwave` solver) are built with MPI support, running a `ParticleDA` particle filter using MPI hangs when calling the Nektar++ solvers; we believe this is due to the nested use of MPI at the particle filter and Nektar++ levels. The issue is potentially specific to the Cray MPICH library, as it does not occur when building and running with OpenMPI on another system. The implementation here executes the Nektar++ solvers and utilities directly using system calls, but similar behaviour was observed when using `MPI_Comm_spawn` to spawn new processes to run the solver instances within the ParticleDA wrapper.
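The direct system-call approach described above can be sketched as follows. This is a minimal illustration only, in Python rather than the actual ParticleDA (Julia) wrapper, and the solver executable name `DriftWaveSolver` and its session-file argument are hypothetical stand-ins, not the confirmed interface:

```python
import subprocess

def run_solver(session_file: str, solver_exe: str = "DriftWaveSolver") -> None:
    """Run a (hypothetical) solver executable as a separate OS process.

    Because the solver runs in its own process started via a plain system
    call, it never shares or inherits the particle filter's MPI
    communicator, avoiding the nested-MPI hang described above.
    """
    result = subprocess.run(
        [solver_exe, session_file],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        raise RuntimeError(f"Solver run failed: {result.stderr}")
```

Each particle's model update would invoke `run_solver` with that particle's session / state files, so solver instances are isolated from the filter-level MPI communicator.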

Building Nektar++ with HDF5 support but without MPI requires manually commenting out / removing [the lines in the `cmake/ThirdPartyHDF5.cmake` file in the Nektar++ source tree which raise an error if trying to build with HDF5 but no MPI support](https://gitlab.nektar.info/nektar/nektar/-/blob/8bc4b3095361e868b26219eff826d4f1902763df/cmake/ThirdPartyHDF5.cmake#L12-16). After changing these lines in a local clone of the Nektar++ repository, we successfully built Nektar++ with HDF5 but without MPI support on ARCHER2 (using the pre-built Cray HDF5 library) by running the following from the root of the repository
Building Nektar++ with HDF5 support but without MPI requires manually commenting out / removing [the lines in the `cmake/ThirdPartyHDF5.cmake` file in the Nektar++ source tree which raise an error if trying to build with HDF5 but no MPI support](https://gitlab.nektar.info/nektar/nektar/-/blob/8bc4b3095361e868b26219eff826d4f1902763df/cmake/ThirdPartyHDF5.cmake#L13-16). After changing these lines in a local clone of the Nektar++ repository, we successfully built Nektar++ with HDF5 but without MPI support on ARCHER2 (using the pre-built Cray HDF5 library) by running the following from the root of the repository

```sh
module load cpe/22.12
```
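For context, the guard being commented out is a CMake configure-time error check. The fragment below is an illustrative sketch of that kind of check, not the exact Nektar++ source (the option names and message text are assumptions):

```cmake
# Illustrative sketch of a guard like the one in cmake/ThirdPartyHDF5.cmake:
# the configure step aborts if HDF5 is requested without MPI. Commenting out
# or removing such a check permits an HDF5-but-no-MPI build.
IF(NEKTAR_USE_HDF5 AND NOT NEKTAR_USE_MPI)
    MESSAGE(FATAL_ERROR "HDF5 output requires MPI to be enabled.")
ENDIF()
```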
