Support mpi4py 4.x.x #1618
Conversation
Thank you for the PR!
Codecov Report
Attention: Patch coverage is

Additional details and impacted files

@@ Coverage Diff @@
##             main   #1618   +/- ##
=======================================
  Coverage   92.07%  92.07%
=======================================
  Files          83      83
  Lines       12144   12158    +14
=======================================
+ Hits        11181   11194    +13
- Misses        963     964     +1

Flags with carried forward coverage won't be shown.
Benchmarks results - Sponsored by perun
Grafana Dashboard
heat/core/tests/test_io.py
Outdated
len(TestCase.get_hostnames()) > 1,
"Test only works on single node, file creation is not synchronized across nodes",
Suggested change:
len(TestCase.get_hostnames()) > 1 or not os.environ.get('TMPDIR'),
"Test only works on single node, file creation is not synchronized across nodes. Set the environment variable 'TMPDIR' to a path globally accessible by all nodes otherwise.",
Could this cause problems when TMPDIR is set only on some of the nodes? Can that happen? I'm also not sure if this guarantees that TMPDIR is globally accessible by all the nodes.
The user has to set this variable explicitly. I hope they know what they're doing then. Do I give them too much credit here? 🤔
I guess if it is meant for us, it should be good enough. I don't think there are a lot of people out there running the tests.
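If the concern about TMPDIR being set on only some of the nodes ever needs to be addressed, a minimal sketch of a stricter check could let every rank report its TMPDIR and only run the test when all ranks agree on a non-empty value. This is not part of the PR; the helper name is hypothetical and it only uses mpi4py's pickle-based `allgather`:

```python
import os

from mpi4py import MPI


def tmpdir_consistent_across_ranks() -> bool:
    """Return True only if every MPI rank sees the same, non-empty TMPDIR."""
    local = os.environ.get("TMPDIR", "")
    # allgather exchanges the small string between all ranks,
    # so no explicit buffer handling is needed here
    values = MPI.COMM_WORLD.allgather(local)
    return bool(values[0]) and all(v == values[0] for v in values)
```

Note that this still cannot prove the path lies on a shared filesystem, only that all ranks were launched with the same setting.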
@@ -33,7 +33,7 @@
    "Topic :: Scientific/Engineering",
],
install_requires=[
-    "mpi4py>=3.0.0, <4.0.0",
+    "mpi4py>=3.0.0",
Did you run it on 3.x.x?
Not sure what you mean by that. I installed mpi4py 4.0.0 manually.
Have you checked whether the changes are backwards compatible with the older versions? If not, we have to raise the minimum dependency or add some more lines reflecting that.
You are right, I did not check for backwards compatibility. I think the only problem with that one might be the rename of buffer/memory, but I will run some more tests.
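For the buffer/memory rename specifically, a minimal compatibility sketch (an illustration of the general approach, not necessarily what this PR ends up doing, and assuming the 3.x name MPI.memory and the 4.x name MPI.buffer) could pick whichever name the installed mpi4py exposes:

```python
from mpi4py import MPI

try:
    # mpi4py 4.x exposes the buffer type under the new name
    MPIBuffer = MPI.buffer
except AttributeError:
    # fall back to the mpi4py 3.x name
    MPIBuffer = MPI.memory
```

With a shim like this, the lower bound in install_requires could stay at 3.0.0.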
Looks good 👍
Due Diligence
Description
Not sure why this only appeared now, but MPI was complaining that our count and displacement tuples contained floats when they should hold only int32 values. The changelog doesn't make a specific point about this; maybe mpi4py was doing the casts before, but now it is up to us.
This caused a lot of our tests to fail, as many of the functions using Allgatherv did not properly cast the count and displacement tuples.

Issue/s resolved: #1598
Changes proposed:
In as_buffer in communication.py, added extra lines that explicitly cast the elements of the count and displs tuples to ints, to avoid any unexpected casting errors further down the call stack.

Type of change
Does this change modify the behaviour of other functions? If so, which?
Yes: all the communication functions that make use of as_buffer to send/recv objects through MPI.
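To illustrate the kind of cast described above, here is a minimal standalone sketch; the variable names are illustrative and not heat's actual internals in as_buffer:

```python
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# every rank contributes 3 elements
local = np.full(3, rank, dtype=np.float64)

# counts/displacements may arrive as floats or numpy scalars; newer mpi4py
# complains about those, so cast every element to a plain Python int first
counts = tuple(int(c) for c in [3.0] * size)
displs = tuple(int(d) for d in np.arange(size) * 3)

recvbuf = np.empty(3 * size, dtype=np.float64)
comm.Allgatherv(local, (recvbuf, counts, displs))
```

Run under mpirun, the Allgatherv call succeeds with the cast tuples, whereas passing the raw float counts would trigger the kind of type complaint described above.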