Issues: pytorch/xla
Error when using collective communication via torch_xla.core.xla_model.all_to_all in SPMD mode
#8345 opened Oct 31, 2024 by DarkenStar
Bug - Using Sharding in Flash Attention with segment ids.
#8334 opened Oct 29, 2024 by dudulightricks
Provide debugging and troubleshooting tips to Pallas developers (label: documentation)
#8301 opened Oct 22, 2024 by miladm
Clarify that torch_xla2 is only recommended for inference
#8270 opened Oct 17, 2024 by cloudchrischan
Improve documentation for get_memory_info (label: usability)
#8245 opened Oct 9, 2024 by miladm
XLA2 does not work with jax 0.4.34 (but did work on jax 0.4.33)
#8240 opened Oct 9, 2024 by Chaosruler972
A process in the process pool was terminated abruptly while the future was running or pending.
#8234 opened Oct 8, 2024 by fancy45daddy
A process in the process pool was terminated abruptly while the future was running or pending.
#8233 opened Oct 8, 2024 by fancy45daddy
How to use torch.float16 in a diffusers pipeline with PyTorch/XLA
#8223 opened Oct 6, 2024 by fancy45daddy
Dynamic batch dimension export tutorial (following https://openxla.org/stablehlo/tutorials/pytorch-export) segfaults (torch-xla==2.4.0) (label: stablehlo)
#8200 opened Oct 1, 2024 by optiluca