nn.ReLU(inplace=True)

In nn.ReLU(inplace=True), inplace=True means the output data overwrites the input data. This saves memory, because the input and the output share the same storage.
import torch
from torch import nn

m0 = nn.ReLU(inplace=True)   # in-place ReLU: writes the result back into its input tensor
input = torch.randn(8)
print(input)                 # original values, before applying ReLU
output = m0(input)
print(output)                # rectified values
print(input)                 # input has been overwritten: identical to output

The output is:
tensor([ 1.3625, -0.7952, -0.3733,  0.7265,  0.0785, -0.0033,  0.2307, -0.8054])
tensor([1.3625, 0.0000, 0.0000, 0.7265, 0.0785, 0.0000, 0.2307, 0.0000])
tensor([1.3625, 0.0000, 0.0000, 0.7265, 0.0785, 0.0000, 0.2307, 0.0000])

At this point input and output are identical.
The in-place call is equivalent to executing:
y = x + n
x = y
that is, the result is computed and then written back over the original variable.
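
For contrast, here is a minimal sketch of the default inplace=False behavior (the names m1, x, y are illustrative, not from the original post): the module allocates a new tensor for the output and leaves the input unchanged.

import torch
from torch import nn

m1 = nn.ReLU(inplace=False)   # default: the result goes into a new tensor
x = torch.randn(8)
y = m1(x)
print(x)   # unchanged: still contains the original (possibly negative) values
print(y)   # rectified values are stored in a separate tensor, not shared with x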

posted @ 2020-09-11 14:24  SnailWorks