A strange PyTorch pitfall

One of the variables needed for gradient computation has been modified by an inplace operation

 

The code that triggered the error:

x2 = self.conv(x1)

sr = self.tail(x2)        # sr is later used to compute a loss

x3 = self.relu(x2)        # x3's downstream results also feed a loss

Here self.relu was constructed with inplace=True (note that nn.ReLU actually defaults to inplace=False), so relu overwrote x2 in place. When the loss on sr backpropagates, self.tail needs the original value of x2 that it saved during the forward pass, but that tensor has already been modified, so autograd raises the error.

...
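A minimal standalone repro of the bug, with nn.Linear standing in for the original self.conv and self.tail (hypothetical stand-ins; the real modules are not shown in the post). Linear saves its input for the backward pass, so the in-place relu on x2 trips autograd's version check:

```python
import torch
import torch.nn as nn

conv = nn.Linear(4, 4)          # stands in for self.conv (assumption)
tail = nn.Linear(4, 1)          # stands in for self.tail (assumption)
relu = nn.ReLU(inplace=True)    # the bug requires inplace=True

x1 = torch.randn(2, 4)
x2 = conv(x1)
sr = tail(x2)        # tail saves x2 for its backward pass
x3 = relu(x2)        # overwrites x2 in place, bumping its version counter
loss = sr.sum() + x3.sum()
try:
    loss.backward()
except RuntimeError as e:
    # "one of the variables needed for gradient computation has been
    # modified by an inplace operation"
    print(type(e).__name__, e)
```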

Change it to:

x2 = self.conv(x1)

x3 = self.relu(x2)

sr = self.tail(x3)

...
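The same stand-in modules confirm the reordered version backpropagates cleanly: the in-place relu now runs before any module saves its input, so no saved tensor is mutated. (Another fix, if the ordering must stay, is to use nn.ReLU() with its default inplace=False.)

```python
import torch
import torch.nn as nn

conv = nn.Linear(4, 4)          # stands in for self.conv (assumption)
tail = nn.Linear(4, 1)          # stands in for self.tail (assumption)
relu = nn.ReLU(inplace=True)

x1 = torch.randn(2, 4)
x2 = conv(x1)
x3 = relu(x2)        # in-place relu runs before anyone saves x2
sr = tail(x3)        # tail saves x3, which is not modified afterwards
loss = sr.sum() + x3.sum()
loss.backward()      # succeeds
```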

posted @ 2020-10-09 11:02  SuckChen