
[fix] Set the Linear device equal to the main model device in SoftmaxLoss #2378

Merged: 1 commit merged on Dec 13, 2023

Conversation

tomaarsen (Collaborator)

Hello!

Pull Request overview

  • Set the Linear device equal to the main model device in SoftmaxLoss.
  • Prevents crashes if torch is compiled with CUDA.

Details

Since #2351, a model loaded when CUDA is available is immediately placed on CUDA. However, when a SoftmaxLoss instance is initialized, its Linear layer is not moved to any particular device, so it stays on CPU. This results in a device mismatch whenever CUDA is available, which surfaced in the tests.
The CI missed it because it only runs on CPU.
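
For illustration, here is a minimal sketch of the kind of change described: creating the classifier on the same device as the main model instead of the default CPU device. The class and parameter names below (MySoftmaxLoss, embedding_dim) are illustrative, not the exact sentence-transformers implementation.

```python
import torch
from torch import nn


class MySoftmaxLoss(nn.Module):
    def __init__(self, model: nn.Module, embedding_dim: int, num_labels: int):
        super().__init__()
        self.model = model
        # Place the Linear layer on the model's device to avoid a CPU/CUDA
        # mismatch when the main model was already loaded onto CUDA.
        device = next(model.parameters()).device
        self.classifier = nn.Linear(3 * embedding_dim, num_labels, device=device)

    def forward(self, rep_a: torch.Tensor, rep_b: torch.Tensor) -> torch.Tensor:
        # SoftmaxLoss-style feature construction: concatenate u, v, |u - v|
        features = torch.cat([rep_a, rep_b, torch.abs(rep_a - rep_b)], dim=-1)
        return self.classifier(features)
```

Without the explicit `device=` placement, the classifier's weights would live on CPU while the sentence embeddings live on CUDA, producing the runtime error the PR fixes.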

  • Tom Aarsen

@tomaarsen tomaarsen added the bug Something isn't working label Dec 13, 2023
@tomaarsen tomaarsen merged commit 0ba8af3 into UKPLab:master Dec 13, 2023
8 checks passed
@tomaarsen tomaarsen deleted the fix/softmaxloss_device branch December 13, 2023 21:18