Draft: kokkos-comm initialization/finalization using `MPI_Session`s (#68)
Base: develop
Changes from 15 commits
Don't just terminate. Either throw an exception or call `MPI_Abort` (I've seen applications hang where one process aborted without calling `MPI_Abort`; it's nasty).
Any specific error code you suggest?
Not really, anything non-zero would work. This ties in with #29 so anything you do here is probably temporary anyway :)
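
For illustration, a minimal sketch of what the suggested behavior could look like. The helper name and signature are assumptions for this example, not the PR's actual code:

```cpp
#include <mpi.h>

#include <cstdio>
#include <cstdlib>

// Hypothetical helper (illustrative only): report the error and abort the whole
// job instead of terminating a single process, which can leave the other ranks
// hanging in collectives.
[[noreturn]] void kokkos_comm_fatal(MPI_Comm comm, const char *what, int code = 1) {
  std::fprintf(stderr, "KokkosComm fatal error: %s\n", what);
  MPI_Abort(comm, code);  // any non-zero code works, per the discussion above
  std::abort();           // not reached in practice; keeps [[noreturn]] honest if MPI_Abort returns
}
```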
Is there a reason why this isn't just an overload of `split`?
I wanted to make it explicit that we're splitting from a raw `MPI_Comm` handle, not a wrapped `KokkosComm::Communicator`, but I guess we can also simply overload.
I guess that's a design decision that the Kokkos community should make (to be consistent with the other projects)
You can drop the Raw. We typically are happy with overloads.
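
To make the overload option concrete, a hypothetical sketch; the type layout and signatures are assumptions, not the actual KokkosComm API:

```cpp
#include <mpi.h>

// Assumed shape of the API under discussion. The point is that both entry
// points share the name split; no Raw-suffixed variant is needed.
namespace KokkosComm {

class Communicator {
 public:
  explicit Communicator(MPI_Comm comm) : comm_(comm) {}
  MPI_Comm raw() const { return comm_; }

 private:
  MPI_Comm comm_;
};

// Overload taking the wrapped communicator.
inline Communicator split(const Communicator &comm, int color, int key) {
  MPI_Comm newcomm;
  MPI_Comm_split(comm.raw(), color, key, &newcomm);
  return Communicator(newcomm);
}

// Same name, overloaded on the raw handle.
inline Communicator split(MPI_Comm raw_comm, int color, int key) {
  MPI_Comm newcomm;
  MPI_Comm_split(raw_comm, color, key, &newcomm);
  return Communicator(newcomm);
}

}  // namespace KokkosComm
```

Overload resolution picks the right entry point from the argument type, so callers never need to spell out a separate raw-handle name.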
That doesn't belong in the communicator. Since we use sessions, there should be no use of the world process model. It could even be that no one has called `MPI_Init`.
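
For context, a minimal sketch of the session-based startup being described, assuming an MPI 4.0 implementation; this is an illustration, not the PR's implementation:

```cpp
#include <mpi.h>

// Session-based startup: no MPI_Init and no MPI_COMM_WORLD. The communicator
// is derived from the predefined "mpi://WORLD" process set.
int main() {
  MPI_Session session = MPI_SESSION_NULL;
  MPI_Session_init(MPI_INFO_NULL, MPI_ERRORS_RETURN, &session);

  MPI_Group group = MPI_GROUP_NULL;
  MPI_Group_from_session_pset(session, "mpi://WORLD", &group);

  MPI_Comm comm = MPI_COMM_NULL;
  MPI_Comm_create_from_group(group, "kokkos-comm.example", MPI_INFO_NULL,
                             MPI_ERRORS_RETURN, &comm);
  MPI_Group_free(&group);

  // ... hand comm to the KokkosComm wrapper ...

  MPI_Comm_free(&comm);
  MPI_Session_finalize(&session);
  return 0;
}
```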