
A runtime error reported when executing the loss.backward(torch.ones_like(loss)) line #10

Open
zztian007 opened this issue Jan 26, 2024 · 6 comments

Comments

@zztian007

When I run train.py in the GMN directory, it reports the following runtime error. Do I need to modify any of the Python files you provided? How can I fix it?

Traceback (most recent call last):
  File "E:\Projects\PythonProjects\XFVD\Models\GMN\train.py", line 97, in <module>
    loss.backward(torch.ones_like(loss))  #
  File "D:\WorkSpace\Anaconda3\envs\python38-tf2\lib\site-packages\torch\_tensor.py", line 487, in backward
    torch.autograd.backward(
  File "D:\WorkSpace\Anaconda3\envs\python38-tf2\lib\site-packages\torch\autograd\__init__.py", line 197, in backward
    Variable._execution_engine.run_backward(  # Calls into the C++ engine to run the backward pass
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.FloatTensor [20]], which is output 0 of ReluBackward0, is at version 1; expected version 0 instead. Hint: enable anomaly detection to find the operation that failed to compute its gradient, with torch.autograd.set_detect_anomaly(True).
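For what it's worth, the hint at the end of the traceback points to torch.autograd.set_detect_anomaly(True), which makes the backward error report the forward operation that produced the offending tensor. A minimal, self-contained sketch of the pattern behind this error and of enabling anomaly detection (the tensors below are placeholders, not the repository's model code):

import torch

# Record forward-pass tracebacks so the backward error points at the
# operation whose saved output was later modified in place.
torch.autograd.set_detect_anomaly(True)

x = torch.randn(20, requires_grad=True)
h = torch.relu(x)    # ReLU saves its output for the backward pass

h += 1.0             # in-place add_ bumps the version counter of that saved output

loss = h.sum()
loss.backward()      # RuntimeError: ... output 0 of ReluBackward0 is at version 1 ...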

@Lin-Yijie
Owner

Which PyTorch version are you using? This issue has not come up before.

@zztian007
Author

The PyTorch version is 1.13.
The Python version is 3.8.11.

@Lin-Yijie
Owner

The code was tested with torch = 1.2.0. Could you consider downgrading your PyTorch installation to version 1.2.0 or a version close to it?

@Lin-Yijie
Owner

Alternatively, you could change the line loss.backward(torch.ones_like(loss)) to:
loss = torch.sum(loss)
loss.backward()

This might work
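For context on why this is equivalent (loss below is a stand-in vector, not the model output from train.py): passing torch.ones_like(loss) to backward computes the vector-Jacobian product with an all-ones vector, which is exactly the gradient of torch.sum(loss), so reducing to a scalar first yields the same parameter gradients.

import torch

x = torch.randn(20, requires_grad=True)
loss = x ** 2                  # stand-in per-example loss vector

# Original form: vector-valued loss with an explicit all-ones gradient.
# loss.backward(torch.ones_like(loss))

# Suggested form: reduce to a scalar, then backward() needs no argument.
loss = torch.sum(loss)
loss.backward()

print(x.grad)                  # 2 * x, the same gradients as the original form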

@chen-ysh

chen-ysh commented Oct 9, 2024

Change 'loss +=' on line 93 to 'loss.add'.
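To spell out what that change does (this snippet is illustrative and does not reproduce line 93 of train.py): += on a tensor is the in-place add_, which modifies the ReLU output that autograd saved for the backward pass, whereas loss.add(...) (or loss = loss + ...) builds a new tensor and leaves the saved activation untouched.

import torch

x = torch.randn(20, requires_grad=True)
loss = torch.relu(x)       # ReLU keeps its output for the backward pass

# In-place accumulation would modify that saved output and trigger the
# "modified by an inplace operation" error at backward time:
# loss += 1.0

# Out-of-place accumulation creates a new tensor instead:
loss = loss.add(1.0)       # equivalently: loss = loss + 1.0

loss.sum().backward()      # runs without the version-counter error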

Lin-Yijie added a commit that referenced this issue Oct 9, 2024
@Lin-Yijie
Owner

Change 'loss +=' on line 93 to 'loss.add'.

Updated. Thanks!
