fix #351: Solution 1: enforce weights.shape = x.shape for tuple axis #953
Conversation
GPU cluster tests are currently disabled on this Pull Request.
@Mystic-Slice thanks a lot for taking this on and welcome to our legacy code 😬

I vaguely remember implementing `ht.average`; it seems so overly complicated now.

I agree with your Solution 1: just like NumPy, we can enforce that `weights.shape == a.shape` when averaging along more than one axis. They must be distributed along the same split axis as well.

The distributed operation itself can be simplified quite a bit, as it's just:

`(a * weights).sum(axis) / weights.sum(axis)`

`a * weights` calls `_operations.__binary_op()` and `ht.sum()` calls `_operations.__reduce_op()`; those will take care of sanitizing, communication when needed, etc.

Please go ahead and make the changes whenever you feel comfortable. Thanks a lot!
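The simplified reduction suggested above can be sketched in plain NumPy. This is a minimal illustration only, not heat's distributed implementation: the `weighted_average` helper is hypothetical, and the sanitization/communication that `__binary_op()` and `__reduce_op()` perform in heat is elided here.

```python
import numpy as np

def weighted_average(a, weights, axis):
    """Hypothetical sketch of the simplified reduction:
    (a * weights).sum(axis) / weights.sum(axis)."""
    # Solution 1's constraint: full-shape weights when axis is a tuple.
    if weights.shape != a.shape:
        raise ValueError("weights.shape must equal a.shape for tuple axis")
    return (a * weights).sum(axis) / weights.sum(axis)

rng = np.random.default_rng(0)
a = rng.random((2, 3, 4))

# Sanity check: with uniform weights, the weighted average over a tuple
# of axes reduces to the plain mean over those axes.
w = np.ones_like(a)
print(np.allclose(weighted_average(a, w, axis=(0, 2)), a.mean(axis=(0, 2))))
```

In heat, `a * weights` and `.sum(axis)` would dispatch through `__binary_op()` and `__reduce_op()` respectively, so the same one-liner handles the distributed case once the shape constraint is enforced.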
Hey @ClaudiaComito!
* wip: initial release draft and changelog updater actions configuration
* doc: PR title style guide in contributing.md
* ci: improved release draft templates
* ci: extra release draft categories
* [pre-commit.ci] auto fixes from pre-commit.com hooks (for more information, see https://pre-commit.ci)
Co-authored-by: Claudia Comito <39374113+ClaudiaComito@users.noreply.github.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
* doc: parallel tutorial note mentioning local and global printing
* doc: extended local print note with ``ht.local_printing()``
* Fix typo
* [pre-commit.ci] auto fixes from pre-commit.com hooks
* Updated the tutorial document:
  1. Corrected the spelling mistake (sigular to single)
  2. Corrected the statement: the number of dimensions is the rank of the array.
  3. Made 2 more small changes.
* [pre-commit.ci] auto fixes from pre-commit.com hooks
* Fix typo
updates:
- [github.com/psf/black: 22.3.0 → 22.6.0](psf/black@22.3.0...22.6.0)
…eatures/994-linspace-zero-samples Non-negative sample size for `linspace`/`logspace`
…ug/996-iinfo-finfo-min Bug/996 `iinfo`/`finfo` `min` value
…re-commit-ci-update-config [pre-commit.ci] pre-commit autoupdate
…alytics#1014)
* Test latest pytorch on both main and release branch
* Move pytorch release record out of workflows directory
* Update paths
* New PyTorch release
* Temporarily remove trigger
* Update pytorch-latest.txt
* Reinstate trigger
* New PyTorch release
* Remove matrix strategy
* Update pytorch-latest.txt
* New PyTorch release
* New PyTorch release
* fix: set cuda rng state on gpu tests for test_random.py
* [pre-commit.ci] auto fixes from pre-commit.com hooks
* Added tests for python 3.9 and pytorch 1.12
updates:
- [github.com/psf/black: 22.6.0 → 22.8.0](psf/black@22.6.0...22.8.0)
Fixes formatting issues
Fixes an issue where the bug label is not set.
Use status badge from a different workflow action
* Check for split in `__reduce_op`
* Check whether x is distributed
* Update CI workflow action
* Update codecov.yml
* Fix `all`
* Fix `any`
* [pre-commit.ci] auto fixes from pre-commit.com hooks
* Add distributed tests
* Expanded tests for combination of axis/split axis
updates:
- [github.com/psf/black: 22.8.0 → 22.10.0](psf/black@22.8.0...22.10.0)
…pre-commit-ci-update-config [pre-commit.ci] pre-commit autoupdate
Changes moved to a new branch. See #1037.
Issue/s resolved: #351
Type of change
- Documentation update
Due Diligence
Does this change modify the behaviour of other functions? If so, which?
yes / no
skip ci