RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation #7

XinyiYang5 opened this issue Mar 24, 2019 · 2 comments

@XinyiYang5

Hi, when I ran the coref.py file, I encountered a RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation. I tried PyTorch 0.4.1 from requirements.txt as well as PyTorch 1.0, but both gave the same error. Could you please look into this? Thanks!

File "coref.py", line 692, in
trainer.train(150)
File "coref.py", line 458, in train
self.train_epoch(epoch, *args, **kwargs)
File "coref.py", line 488, in train_epoch
corefs_found, total_corefs, corefs_chosen = self.train_doc(doc)
File "coref.py", line 555, in train_doc
loss.backward()
File "/opt/conda/envs/mlkit36/lib/python3.6/site-packages/torch/tensor.py", line 93, in backward
torch.autograd.backward(self, gradient, retain_graph, create_graph)
File "/opt/conda/envs/mlkit36/lib/python3.6/site-packages/torch/autograd/init.py", line 90, in backward
allow_unreachable=True) # allow_unreachable flag
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation

@shayneobrien
Owner

The inplace operation is defined here. I think I did it this way because in PyTorch 0.4.1, dropout could only be applied to a packed sequence by first unpacking it, applying dropout, and then repacking it. This seems likely to have been fixed in PyTorch 1.0...
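
For reference, here is a minimal sketch of the unpack-dropout-repack pattern described above, with the dropout applied out of place. This assumes PyTorch 1.x; the shapes and variable names are illustrative, not taken from coref.py:

import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

lstm = nn.LSTM(8, 16, batch_first=True)
dropout = nn.Dropout(0.20)  # out of place, i.e. inplace=False

embeds = torch.randn(3, 5, 8, requires_grad=True)  # (batch, seq, features)
lengths = torch.tensor([5, 4, 2])                   # sorted descending, as 0.4.1 required

packed = pack_padded_sequence(embeds, lengths, batch_first=True)
output, _ = lstm(packed)

# Unpack, apply dropout, repack.
padded, lens = pad_packed_sequence(output, batch_first=True)
padded = dropout(padded)  # returns a new tensor; the LSTM output is left untouched
repacked = pack_padded_sequence(padded, lens, batch_first=True)

repacked.data.sum().backward()  # backward succeeds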

@vaibkumr commented Aug 4, 2019

I was getting the exact same error with torch==1.1.0.
Changing

self.emb_dropout = nn.Dropout(0.50, inplace=True)
self.lstm_dropout = nn.Dropout(0.20, inplace=True)

to

self.emb_dropout = nn.Dropout(0.50)
self.lstm_dropout = nn.Dropout(0.20)

fixed it, just as the author described above.
Thank you.
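
For anyone curious why inplace=True trips autograd here, below is a tiny standalone reproduction (illustrative only, not code from this repo). tanh saves its output tensor for the backward pass, and the in-place dropout then overwrites that saved tensor:

import torch
import torch.nn as nn

x = torch.randn(4, 8, requires_grad=True)
linear = nn.Linear(8, 8)

y = torch.tanh(linear(x))         # tanh saves y for its backward pass
nn.Dropout(0.5, inplace=True)(y)  # overwrites y in place, bumping its version counter

y.sum().backward()  # RuntimeError: ... modified by an inplace operation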
