
Fix device mismatch error in Whisper model during feature extraction #35866

Merged
6 commits merged into huggingface:main on Feb 4, 2025

Conversation

thedebugger
Contributor

@thedebugger commented Jan 24, 2025

What does this PR do?

Fixes a device mismatch error in Whisper feature extraction. For instance, when the torch default device is set to CUDA (via torch.set_default_device), Whisper fails with "stft input and window must be on the same device but got self on cpu and window on cuda:0". This happens because torch factory functions (like torch.hann_window) allocate on the default device, which can lead to a mismatch with the input waveform on CPU.
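
To illustrate, here is a minimal sketch of the failure mode and the fix pattern (not the exact diff in this PR; it assumes a CUDA build of PyTorch, and the STFT parameters mirror Whisper's n_fft=400 / hop_length=160):

import torch

# With a CUDA default device, factory functions such as torch.hann_window
# allocate on cuda:0, while audio decoded from NumPy stays on CPU.
torch.set_default_device("cuda")

waveform = torch.randn(16000, device="cpu")  # 1 second of 16 kHz audio on CPU
window = torch.hann_window(400)              # lands on cuda:0 via the default device
# torch.stft(waveform, n_fft=400, window=window, ...) now raises the
# "stft input and window must be on the same device" error.

# Fix pattern: create the window explicitly on the waveform's device.
window = torch.hann_window(400, device=waveform.device)
stft = torch.stft(waveform, n_fft=400, hop_length=160, window=window, return_complex=True)
print(stft.shape, stft.device)  # torch.Size([201, 101]) cpu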

@Rocketknight1
Member

cc @eustlb

@@ -298,6 +298,7 @@ def test_torch_integration_batch(self):
)
# fmt: on

torch.set_default_device("cuda")
Contributor Author

I verified that this test fails with this addition and without my patch. Is adding this line okay? If not, can someone recommend a better way to write/update the test case, since it depends on CUDA?

Contributor

To the best of my knowledge, we are not testing set_default_device in Transformers.
I would rather go with:

Suggested change
torch.set_default_device("cuda")
with torch.device("cuda"):

followed by indenting the block:

with torch.device("cuda"):
    input_speech = self._load_datasamples(3)
    feature_extractor = WhisperFeatureExtractor()
    input_features = feature_extractor(input_speech, return_tensors="pt").input_features
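
A quick sketch of why the context manager is preferable in a test (assuming a CUDA build of PyTorch): the default device is restored when the block exits, so it cannot leak into other tests running in the same process, unlike torch.set_default_device.

import torch

with torch.device("cuda"):
    # Factory calls inside the block default to CUDA.
    assert torch.empty(1).device.type == "cuda"

# Outside the block the previous default (CPU) is back in effect.
assert torch.empty(1).device.type == "cpu"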

Contributor

@eustlb left a comment
Great catch, thanks! 🤗 One minor change to clean things up a bit further, but otherwise LGTM.

Contributor

@eustlb left a comment

Thanks for iterating! LGTM

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@thedebugger
Contributor Author

Hi @eustlb, are you waiting on anything before merging these changes? This is blocking my other PR, so I'd like to get it merged as soon as possible.

@thedebugger
Contributor Author

Hi @eustlb, what is the reason for the hold-up?

@eustlb
Contributor

eustlb commented Feb 4, 2025

It got buried in my GitHub notifications, sorry about that!

@eustlb merged commit bc9a6d8 into huggingface:main on Feb 4, 2025
9 checks passed