Error: RuntimeError: Trying to backward through the graph a second time, but the saved intermediate results have already been freed

This error frequently comes up when training a GAN:

RuntimeError: Trying to backward through the graph a second time, but the saved intermediate results have already been freed. Specify retain_graph=True when calling .backward() or autograd.grad() the first time.

In plain terms: the second time you try to backpropagate through the graph, the saved intermediate results have already been freed. By default, PyTorch frees a graph's intermediate buffers as soon as `.backward()` finishes, so calling `.backward()` a second time through the same graph fails.
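A minimal stand-alone repro of the error (a toy example, not GAN code):

```python
import torch

x = torch.tensor([2.0], requires_grad=True)
y = x * x
y.backward()  # first backward: gradients computed, graph buffers freed

try:
    y.backward()  # second backward through the same graph
except RuntimeError as e:
    print("second backward failed:", e)
```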

In a GAN, this happens because one tensor is shared between the generator and the discriminator: `fake`, the generator's output. When the discriminator loss calls `.backward()`, autograd frees the buffers along the entire graph, including the generator's part. The generator's `.backward()` then hits the already-freed graph and raises the error.

The fix is to call `detach()` on `fake` when feeding it to the discriminator, so the discriminator's backward pass never touches the generator's graph.
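A minimal sketch of the fix, using tiny linear layers as hypothetical stand-ins for the real generator and discriminator:

```python
import torch
import torch.nn as nn

# Toy stand-ins for the generator and discriminator networks
gen = nn.Linear(4, 4)
disc = nn.Linear(4, 1)
opt_g = torch.optim.SGD(gen.parameters(), lr=0.01)
opt_d = torch.optim.SGD(disc.parameters(), lr=0.01)
loss_fn = nn.BCEWithLogitsLoss()

noise = torch.randn(8, 4)
real = torch.randn(8, 4)

fake = gen(noise)  # this tensor is shared between both training steps

# --- Discriminator step ---
# fake.detach() cuts the generator out of this graph, so this backward()
# does not free the generator's saved intermediates.
d_loss = loss_fn(disc(fake.detach()), torch.zeros(8, 1)) \
       + loss_fn(disc(real), torch.ones(8, 1))
opt_d.zero_grad()
d_loss.backward()
opt_d.step()

# --- Generator step ---
# fake (undetached) still carries the generator's intact graph,
# so this backward() succeeds without retain_graph=True.
g_loss = loss_fn(disc(fake), torch.ones(8, 1))
opt_g.zero_grad()
g_loss.backward()
opt_g.step()
```

An alternative is `d_loss.backward(retain_graph=True)`, as the error message suggests, but that keeps the whole graph in memory and also leaks discriminator gradients into the generator update, so `detach()` is the usual choice.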


posted @ 2021-06-09 11:47  Rogn