
Add fallback for is_compiling #6663

Merged: 2 commits, Oct 25, 2024
Conversation

tohtana (Contributor) commented Oct 23, 2024

Importing torch.compiler.is_compiling raises an error on older versions of PyTorch.
This PR adds a fallback for is_compiling that uses the equivalent function from older PyTorch versions.

This resolves #6656.
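A minimal sketch of such a fallback follows. The exact import paths for older PyTorch releases are assumptions here (not taken from this PR's diff); the pattern is simply "try the modern API, fall back to an older equivalent, and finally to a no-op stub":

```python
# Sketch of an is_compiling fallback chain, from newest to oldest API.
try:
    # Modern API in recent PyTorch releases.
    from torch.compiler import is_compiling
except ImportError:
    try:
        # Older location, assumed here as an example of a pre-torch.compiler path.
        from torch._dynamo.external_utils import is_compiling
    except ImportError:
        def is_compiling() -> bool:
            # No compiler support available at all: code can never be
            # running under torch.compile, so always report False.
            return False
```

Callers then use `is_compiling()` unconditionally, regardless of the installed PyTorch version.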

@tohtana changed the title from "dd fallback for is_compiling func" to "Add fallback for is_compiling func" Oct 23, 2024
@tohtana changed the title from "Add fallback for is_compiling func" to "Add fallback for is_compiling" Oct 23, 2024
@tohtana marked this pull request as ready for review October 23, 2024 23:56
@tjruwase added this pull request to the merge queue Oct 24, 2024
github-merge-queue bot removed this pull request from the merge queue due to failed status checks Oct 24, 2024
@tohtana added this pull request to the merge queue Oct 24, 2024
github-merge-queue bot removed this pull request from the merge queue due to failed status checks Oct 24, 2024
@loadams added this pull request to the merge queue Oct 25, 2024
github-merge-queue bot removed this pull request from the merge queue due to no response for status checks Oct 25, 2024
@loadams added this pull request to the merge queue Oct 25, 2024
Merged via the queue into master with commit 24285d6 Oct 25, 2024
11 checks passed
@loadams deleted the tohtana/avoid_error_is_compiling branch October 28, 2024 02:03
github-merge-queue bot pushed a commit that referenced this pull request Nov 6, 2024
The parameter coordinator in ZeRO3 throws a "backward pass is invalid
for module in evaluation mode" error when a module is not in the mode it
expects, because it assumes all modules are in training mode during the
backward pass. This restriction is unnecessarily strict.
This PR relaxes it by using a single parameter coordinator (instead of
separate ones for training and evaluation modes) and by resetting the
prefetch state before each forward pass.

Use of `is_compiling` needs to be fixed after #6663 is merged.
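The reset-before-forward idea can be illustrated with a toy sketch. The class and function names below are hypothetical, chosen only to show the pattern, and are not DeepSpeed's actual internals:

```python
# Illustrative sketch only: one coordinator shared by training and
# evaluation, with its prefetch state cleared before every forward pass.
class PrefetchCoordinator:  # hypothetical name, not DeepSpeed's real class
    def __init__(self):
        self.prefetch_queue = []

    def record_fetch(self, param_name):
        # Track a parameter scheduled for prefetch during this pass.
        self.prefetch_queue.append(param_name)

    def reset_prefetch_state(self):
        # Drop bookkeeping from the previous pass so a train -> eval
        # (or eval -> train) switch cannot leave stale state behind.
        self.prefetch_queue.clear()


coordinator = PrefetchCoordinator()

def start_forward():
    # Hypothetical hook invoked once before each forward pass,
    # regardless of whether the model is in train or eval mode.
    coordinator.reset_prefetch_state()
```

Because the single coordinator is reset on every forward pass, there is no per-mode state that could become invalid when the training mode changes between passes.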

---------

Co-authored-by: Olatunji Ruwase <olruwase@microsoft.com>
Co-authored-by: Logan Adams <114770087+loadams@users.noreply.github.com>
Successfully merging this pull request may close these issues.

[BUG] deepspeed.utils.logging: module 'torch.compiler' has no attribute 'is_compiling'
3 participants