bugfix: add support for global_ordinal, local_ordinal, world_size in xla #20872
base: master
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

Coverage Diff (master vs. #20872):

              master   #20872    +/-
  ======================================
  Coverage       87%      86%    -0%
  ======================================
  Files          268      268
  Lines        23411    23451    +40
  ======================================
  Hits         20360    20278    -82
  Misses        3051     3173   +122
Re: global_ordinal, local_ordinal, world_size in xla
Thank you for this finding!
Could you please add a test to cover this new behaviour?
Can this XLA code path also be run without a TPU?
Thank you for taking the time to review. I implemented a test with mocks, because as far as I understand …
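A minimal sketch of the mocking approach mentioned above, for CI machines without a TPU: inject a stub `torch_xla` module into `sys.modules` so code under test sees the new `torch_xla.runtime` functions named in the PR title. The stub values (`world_size=8`, ordinals of 0) are arbitrary assumptions for illustration, not values from the actual test in this PR.

```python
# Sketch: fake torch_xla.runtime on a CPU-only machine so the new
# global_ordinal / local_ordinal / world_size calls can be exercised.
import sys
import types
from unittest import mock

# Build a stub torch_xla.runtime exposing the new API (values are
# arbitrary for the example).
xr_stub = types.ModuleType("torch_xla.runtime")
xr_stub.global_ordinal = mock.Mock(return_value=0)
xr_stub.local_ordinal = mock.Mock(return_value=0)
xr_stub.world_size = mock.Mock(return_value=8)

torch_xla_stub = types.ModuleType("torch_xla")
torch_xla_stub.runtime = xr_stub

# Registering in sys.modules makes later imports resolve to the stubs.
sys.modules["torch_xla"] = torch_xla_stub
sys.modules["torch_xla.runtime"] = xr_stub

import torch_xla.runtime as xr  # resolves to the stub above

assert xr.world_size() == 8
assert xr.global_ordinal() == 0
```

This keeps the test hermetic: no TPU (or even a real torch_xla install) is required, and the mocks record that the new runtime functions were actually called.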
Co-authored-by: Bhimraj Yadav <bhimrajyadav977@gmail.com>
What does this PR do?
PyTorch XLA deprecated several methods in the 2.7 release (see the torch_xla release notes); this PR switches to the replacement `torch_xla.runtime` functions `global_ordinal`, `local_ordinal`, and `world_size`.
Fixes #20852
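The deprecation described above suggests a compatibility shim: prefer the new `torch_xla.runtime` functions and fall back to the older `torch_xla.core.xla_model` ones where the new module is unavailable. This is a hedged sketch of that pattern, not the PR's actual implementation; the legacy names (`xrt_world_size`, `get_ordinal`) are the commonly documented pre-2.7 API and are an assumption here.

```python
# Sketch: resolve the XLA world size with the new runtime API when
# available, falling back to the deprecated xla_model API otherwise.
def xla_world_size() -> int:
    try:
        import torch_xla.runtime as xr  # new API (torch_xla >= 2.7)
        return xr.world_size()
    except ImportError:
        import torch_xla.core.xla_model as xm  # legacy, deprecated API
        return xm.xrt_world_size()
```

The same try/except-ImportError shape would apply to `global_ordinal` (legacy `get_ordinal`) and `local_ordinal` (legacy `get_local_ordinal`).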
Before submitting
PR review
Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines. In short, see the following bullet-list:
Reviewer checklist
📚 Documentation preview 📚: https://pytorch-lightning--20872.org.readthedocs.build/en/20872/