Fix: no_grad with AMP bug #20921

Open

baskrahmer wants to merge 8 commits into master from fix/no-grad-amp-bug

Conversation

@baskrahmer (Contributor) commented Jun 20, 2025

Fixes #20644

Note, however, that this would affect performance for other users, so the question is whether it is worth optimizing for this edge case, which is fundamentally a torch bug.

cc @Borda


📚 Documentation preview 📚: https://pytorch-lightning--20921.org.readthedocs.build/en/20921/
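For context, the interaction behind the linked issue can be sketched roughly as follows (this is an assumed reproduction shape, not the PR's actual change): under `torch.autocast`, PyTorch caches the low-precision casts of weights, and a forward pass run inside `torch.no_grad()` can populate that cache with tensors carrying no grad history, so a later grad-enabled forward inside the same autocast region builds no computation graph. Passing `cache_enabled=False` (or calling `torch.clear_autocast_cache()` between passes) is one known workaround:

```python
import torch

# Minimal sketch of the no_grad + AMP interaction, using CPU bfloat16
# autocast for a self-contained example. With the autocast weight-cast
# cache disabled, a grad-enabled forward after a no_grad forward still
# records a computation graph.
model = torch.nn.Linear(4, 4)
x = torch.randn(2, 4)

with torch.autocast(device_type="cpu", dtype=torch.bfloat16, cache_enabled=False):
    with torch.no_grad():
        model(x)  # e.g. a sanity-check/eval pass: no graph recorded
    out = model(x)  # grad mode is back on; the graph must be (re)built

assert out.grad_fn is not None  # computation graph was built
```

With the default `cache_enabled=True`, the second forward may reuse the cached, grad-less weight casts from the `no_grad` pass, which is the "computation graph not being built" symptom reported in #20644.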

@github-actions github-actions bot added the pl Generic label for PyTorch Lightning package label Jun 20, 2025
@baskrahmer baskrahmer force-pushed the fix/no-grad-amp-bug branch from 08508b6 to d18fb08 Compare June 20, 2025 13:46
@baskrahmer baskrahmer marked this pull request as ready for review June 20, 2025 15:33
Successfully merging this pull request may close these issues: Computation graph not being built (#20644)