add patch to fix Open MPI 4.1.5 with PMIx >= 4.2.3 #18833
Conversation
I needed this patch to run any MPI application, either via srun with PMIx or via plain mpirun; without it I ran into OOB/TCP communication errors even for "mpirun hostname" across two nodes (see open-mpi/ompi#11729 for another user hitting the same problem). The patch is taken from open-mpi/ompi#11472, and with it both srun and mpirun work flawlessly without further ado.
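For anyone who wants to check the fix on their own system: a minimal sanity check is to rerun the two launch paths mentioned above. The hostnames below are placeholders, and the srun line assumes a Slurm allocation spanning two nodes with the PMIx plugin available.

```shell
# Plain mpirun across two nodes (the case that previously failed with OOB/TCP errors):
mpirun -np 2 --host node1,node2 hostname

# Launch through Slurm using the PMIx plugin:
srun --mpi=pmix -N 2 -n 2 hostname
```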
@boegelbot please test @ generoso
@boegel: Request for testing this PR well received on login1

PR test command '

Test results coming soon (I hope)...

- notification for comment with ID 1725636581 processed

Message to humans: this is just bookkeeping information for me,
Test report by @boegelbot

@boegelbot please test @ jsc-zen2
@SebastianAchilles: Request for testing this PR well received on jsczen2l1.int.jsc-zen2.easybuild-test.cluster

PR test command '

Test results coming soon (I hope)...

- notification for comment with ID 1726158115 processed

Message to humans: this is just bookkeeping information for me,
Test report by @boegelbot

Test report by @boegel

Going in, thanks @bartoldeman!