Notes on commonly used PyTorch APIs
1.torch.nn.GRU
- input_size – dimensionality of each input feature vector (typically the word-embedding dimension)
- hidden_size – number of features in the hidden state (often chosen on the same order as input_size, but it is an independent choice)
- num_layers – number of stacked GRU layers. Default: 1
- bias – whether the layers use bias weights. Default: True
- batch_first – if True, the input and output tensors are provided as (batch, seq, feature) instead of (seq, batch, feature). Note that this does not apply to the hidden state. Default: False
- dropout – if non-zero, introduces a Dropout layer on the outputs of each GRU layer except the last layer, with dropout probability equal to dropout. Default: 0
- bidirectional – if True, becomes a bidirectional GRU. Default: False
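To show how these parameters come together, here is a minimal usage sketch with arbitrary sizes; tensors follow the default seq-first layout since batch_first is left at False:
import torch
import torch.nn as nn

gru = nn.GRU(input_size=10, hidden_size=20, num_layers=2)   # 2 stacked layers
x = torch.randn(5, 3, 10)     # (seq_len, batch, feature) because batch_first=False
h0 = torch.zeros(2, 3, 20)    # (num_layers, batch, hidden_size) initial hidden state
output, hn = gru(x, h0)
print(output.shape)           # torch.Size([5, 3, 20]) – every time step of the last layer
print(hn.shape)               # torch.Size([2, 3, 20]) – final hidden state of each layer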
2.torch.nn.Linear
- in_features – size of each input sample (input feature dimension)
- out_features – size of each output sample (number of output neurons)
- bias – whether to add a learnable bias term. Default: True
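A small sketch (with hypothetical sizes) of how in_features and out_features determine the layer's weight shape, and what setting bias=False changes:
import torch
import torch.nn as nn

fc = nn.Linear(in_features=4, out_features=2)
print(fc.weight.shape)    # torch.Size([2, 4]) -> (out_features, in_features)
print(fc.bias.shape)      # torch.Size([2])    -> one bias per output neuron
fc_no_bias = nn.Linear(4, 2, bias=False)
print(fc_no_bias.bias)    # None – no bias parameters are created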
3.torch.randn()
Returns a tensor of random numbers drawn from the standard normal distribution N(0, 1). (The similarly named torch.rand() instead samples uniformly from [0, 1).)
For example:
m = nn.Linear(20, 30)            # linear layer: 20 input features -> 30 outputs
input = torch.randn(128, 20)     # batch of 128 samples drawn from N(0, 1)
output = m(input)
print(output.size())             # torch.Size([128, 30])
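To make the rand/randn distinction concrete, a quick sketch (shapes chosen arbitrarily):
import torch

u = torch.rand(2, 3)     # uniform on [0, 1): every element lies in [0, 1)
n = torch.randn(2, 3)    # standard normal N(0, 1): elements can be negative or exceed 1
print(u)
print(n)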