Implementing a softmax layer along a specified dimension with the TensorRT API

I'm only writing this up now that the problem is solved, and my mood is better. But before, yesterday, it made no sense at all and I was worn down until I had no fight left in me.
It's just a softmax, after all, which should be simple: exponentiate each element and divide by the sum of the exponentials. There is an official API for it, for example in the sample I found:

    // Add activation layer using the ReLU algorithm.
    IActivationLayer* relu1 = network->addActivation(*ip1->getOutput(0), ActivationType::kRELU);
    assert(relu1);

    // Add second fully connected layer with 20 outputs.
    IFullyConnectedLayer* ip2 = network->addFullyConnected(
        *relu1->getOutput(0), mParams.outputSize, mWeightMap["ip2filter"], mWeightMap["ip2bias"]);
    assert(ip2);

    // Add softmax layer to determine the probability.
    ISoftMaxLayer* prob = network->addSoftMax(*ip2->getOutput(0));
    assert(prob);
    prob->getOutput(0)->setName(mParams.outputTensorNames[0].c_str());
    network->markOutput(*prob->getOutput(0));

This is the official sample for a simple classifier,
https://github.com/NVIDIA/TensorRT/blob/master/samples/opensource/sampleMNISTAPI/sampleMNISTAPI.cpp
which contains ISoftMaxLayer* prob = network->addSoftMax(*ip2->getOutput(0));
What I need to implement is slightly different: I need the softmax taken along a particular dimension, whereas the one above is global. The PyTorch code I need to reproduce is:

        #[1,6375,2]                # arm_conf[1,12750]
        arm_conf_view = arm_conf.view(arm_conf.size(0), -1,2)
        softmax_1 = nn.Softmax(dim=-1)
        m2 = softmax_1(arm_conf_view)  # [1,6375,2]

Part of the corresponding values of m2:

tensor([[[0.9575, 0.0425],
         [0.9326, 0.0674],
         [0.9131, 0.0869],
         ...,
         [0.8707, 0.1293],
         [0.8746, 0.1254],
         [0.8783, 0.1217]]], grad_fn=<SoftmaxBackward>)

As you can see, the softmax is taken along dim 2 (the last dimension), and each row sums to 1.
On the TensorRT side I now also have a one-dimensional array arm_conf[12750], and all I need is to apply the same softmax to it; but mine is not a global softmax. So I went to check whether anyone had done this in the
https://github.com/wang-xinyu/tensorrtx repository, and a search turned up someone who had:
https://github.com/wang-xinyu/tensorrtx/blob/18fa419ae35bfcbd27248b3eb9329f415f604366/retinafaceAntiCov/retinafaceAntiCov.cpp

ILayer* reshapeSoftmax(INetworkDefinition *network, ITensor& input, int c) {
    auto re1 = network->addShuffle(input);
    assert(re1);
    re1->setReshapeDimensions(Dims3(c / 2, -1, 0));

    auto sm = network->addSoftMax(*re1->getOutput(0));
    assert(sm);

    auto re2 = network->addShuffle(*sm->getOutput(0));
    assert(re2);
    re2->setReshapeDimensions(Dims3(c, -1, 0));

    return re2;
}

This looked like exactly the functionality I need: my data is 12750 values, and I likewise need to reshape it to [6375,2] first and reshape it back to [12750] when passing it out. So I quickly wrote similar code:

ILayer* reshapeSoftmax_ok(INetworkDefinition *network, ITensor& input, int ch) {
    // the input is one-dimensional: [12750]
    // first reshape it to [XX, ch]
    auto re1 = network->addShuffle(input);
    assert(re1);
    re1->setReshapeDimensions(Dims3(1, -1, ch)); //[1,6375,2];
//     re1->setReshapeDimensions(Dims2(-1, ch)); //[6375,2];

    Dims dim0 = re1->getOutput(0)->getDimensions();
    std::cout <<"debug  re1 dim==" << dim0.d[0] << " " << dim0.d[1] << " " << dim0.d[2] << " " << dim0.d[3] << std::endl;
    
    auto sm = network->addSoftMax(*re1->getOutput(0));
    sm->setAxes(2);
    assert(sm);

    // reshape back to one dimension, the same shape the input came in with
    Dims dim_;
    dim_.nbDims=1;
    dim_.d[0]=-1;
    auto re2 = network->addShuffle(*sm->getOutput(0));
    assert(re2);
    re2->setReshapeDimensions(dim_);

    return re2;
}

The extra line here is sm->setAxes(2);, because I need the softmax along dim 2. But the result was wrong! No matter what I did the result was wrong, with both the 3-D and the 2-D reshape!
I tried sm->setAxes(0);, sm->setAxes(1); and sm->setAxes(2); and none of them was right: some outputs looked like the result of a global softmax, others were all 1s.
Along the way I also went through the documentation for help:

    //!
    //! \brief Add a SoftMax layer to the network.
    //!
    //! \see ISoftMaxLayer
    //! \warning Int32 tensors are not valid input tensors.
    //!
    //! \return The new SoftMax layer, or nullptr if it could not be created.
    //!
    virtual ISoftMaxLayer* addSoftMax(ITensor& input) TRTNOEXCEPT = 0;
//!
//! \class ISoftMaxLayer
//!
//! \brief A Softmax layer in a network definition.
//!
//! This layer applies a per-channel softmax to its input.
//!
//! The output size is the same as the input size.
//!
//! \warning Do not inherit from this class, as doing so will break forward-compatibility of the API and ABI.
//!
class ISoftMaxLayer : public ILayer
{
protected:
    virtual ~ISoftMaxLayer() {}
public:
    //!
    //! \brief Set the axis along which softmax is computed. Currently, only one axis can be set.
    //!
    //! The axis is specified by setting the bit corresponding to the axis to 1.
    //! Let's say we have an NCHW tensor as input (three non-batch dimensions).
    //!
    //! In implicit mode :
    //! Bit 0 corresponds to the C dimension boolean.
    //! Bit 1 corresponds to the H dimension boolean.
    //! Bit 2 corresponds to the W dimension boolean.
    //! By default, softmax is performed on the axis which is the number of axes minus three. It is 0 if
    //! there are fewer than 3 non-batch axes. For example, if the input is NCHW, the default axis is C. If the input
    //! is NHW, then the default axis is H.
    //!
    //! In explicit mode :
    //! Bit 0 corresponds to the N dimension boolean.
    //! Bit 1 corresponds to the C dimension boolean.
    //! Bit 2 corresponds to the H dimension boolean.
    //! Bit 3 corresponds to the W dimension boolean.
    //! By default, softmax is performed on the axis which is the number of axes minus three. It is 0 if
    //! there are fewer than 3 axes. For example, if the input is NCHW, the default axis is C. If the input
    //! is NHW, then the default axis is N.
    //!
    //! For example, to perform softmax on axis R of a NPQRCHW input, set bit 2 with implicit batch mode,
    //! set bit 3 with explicit batch mode.
    //!
    //! \param axes The axis along which softmax is computed.
    //!        Here axes is a bitmap. For example, when doing softmax along axis 0, bit 0 is set to 1, axes = 1 << axis = 1.
    //!
    virtual void setAxes(uint32_t axes) TRTNOEXCEPT = 0;

    //!
    //! \brief Get the axis along which softmax occurs.
    //!
    //! \see setAxes()
    //!
    virtual uint32_t getAxes() const TRTNOEXCEPT = 0;
};

There are only these two functions for setting the dimension, and it still wasn't right. I stayed late that evening running experiment after experiment, always with the same result as before. Then I started to suspect the input data was different, or that the data after TensorRT's reshape was different:

    // the input is one-dimensional: [12750]
    // first reshape it to [XX, ch]
    auto re1 = network->addShuffle(input);
    assert(re1);
    re1->setReshapeDimensions(Dims3(1, -1, ch)); //[1,6375,2];
//     re1->setReshapeDimensions(Dims2(-1, ch)); //[6375,2];
   return re1;/////////////////////////////////////////

Returning early right after the reshape and comparing with PyTorch showed the data was identical!
Running a single experiment here is a real hassle: you first have to serialize the network into an engine, then deserialize it and run inference to get a result.
I worked overtime until past 10 p.m. without solving it; unable to crack it, I went home gloomy.
At one point I also thought about writing the softmax myself in CUDA, but that felt like too much trouble.
I asked the experts in the group chat, but nobody answered, so I privately messaged Brother Qiu (球哥) from the group. He also thought the code was correct and said he would run an experiment for me the next day.
In the end I summed it up for him like this:
"When you experiment tomorrow, take for example an input tensor of shape [8]; inside the function reshape it to 4x2 and apply softmax; you should get a 4x2 tensor, 4 rows and 2 columns, where each row sums to 1; then reshape it back to a tensor of shape [8] and return it."
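As a side note, this is the kind of CPU reference I had in mind for the check (a minimal standalone sketch of my own, not part of the TensorRT code): a plain row-wise softmax over a flat buffer, so the engine output can be compared row by row against PyTorch.

#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

// Reference row-wise softmax over a flat [rows * ch] buffer,
// matching PyTorch's Softmax(dim=-1) applied to a [rows, ch] view.
std::vector<float> softmaxRows(const std::vector<float>& in, int ch) {
    std::vector<float> out(in.size());
    int rows = static_cast<int>(in.size()) / ch;
    for (int r = 0; r < rows; ++r) {
        const float* x = &in[r * ch];
        float* y = &out[r * ch];
        float m = x[0];
        for (int c = 1; c < ch; ++c) m = std::max(m, x[c]);  // max for numerical stability
        float sum = 0.f;
        for (int c = 0; c < ch; ++c) { y[c] = std::exp(x[c] - m); sum += y[c]; }
        for (int c = 0; c < ch; ++c) y[c] /= sum;             // each row now sums to 1
    }
    return out;
}

int main() {
    std::vector<float> t = {1, 2, 3, 4, 5, 6, 7, 8};  // shape [8], viewed as 4x2
    std::vector<float> r = softmaxRows(t, 2);
    for (int i = 0; i < 4; ++i) std::printf("%.4f %.4f\n", r[2 * i], r[2 * i + 1]);
    return 0;
}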

然后第二天早上又来搞,检查了一下其他环节有没有问题,没有问题的。又实验了几次,现象还是和之前一样。无解。
At noon Brother Qiu contacted me and had me confirm that the input matched the data in the PyTorch code. I was so sleepy that I lay down for a bit with my phone in hand, since any message would wake me. Sure enough, around 1 p.m.:

It works!!!
I realized this was down to how the bit operations are written. Brother Qiu said yes,
and sent me a picture:

Note this sentence in the picture:

reduceAxes: the least significant bit corresponds to the first explicit dimension

In the picture, the mask is 0010 and the bitmap is 0100.
For example, if my shape is [6375,2] and I want the softmax along dim 1, the mask is [0,1] and the corresponding bitmap is [1,0], so it has to be written as 1<<1.
If my shape is [1,6375,2] and I want the softmax along dim 2, the mask is [0,0,1] and the corresponding bitmap is [1,0,0], so it has to be written as 1<<2.
The bitmap is the mask reversed!
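Written out in code, the rule from the picture comes down to this (a small sketch of my own; softmaxAxes is just a helper name I made up, and the shapes are the ones from my network):

#include <cstdint>

// setAxes() takes a bitmask, not an axis index:
// bit 0 is the first explicit dimension, bit 1 the second, and so on,
// so to softmax along dimension dim you set bit dim.
inline uint32_t softmaxAxes(int dim) { return 1u << dim; }

// shape [6375, 2], softmax along dim 1 (the axis of size 2):
//   sm->setAxes(softmaxAxes(1));   // 1 << 1 == 0b010 == 2
// shape [1, 6375, 2], softmax along dim 2 (again the axis of size 2):
//   sm->setAxes(softmaxAxes(2));   // 1 << 2 == 0b100 == 4

This also fits the results from the night before: with the [1,6375,2] shape, setAxes(1) sets bit 0, so the softmax ran over the axis of size 1 and every output was 1, while setAxes(2) sets bit 1, so it ran over the 6375 axis and looked almost like a global softmax.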
Looking back at the official documentation now:

  //! In explicit mode :
    //! Bit 0 corresponds to the N dimension boolean.
    //! Bit 1 corresponds to the C dimension boolean.
    //! Bit 2 corresponds to the H dimension boolean.
    //! Bit 3 corresponds to the W dimension boolean.
    //! By default, softmax is performed on the axis which is the number of axes minus three. It is 0 if
    //! there are fewer than 3 axes. For example, if the input is NCHW, the default axis is C. If the input
    //! is NHW, then the default axis is N.
    //!
    //! For example, to perform softmax on axis R of a NPQRCHW input, set bit 2 with implicit batch mode,
    //! set bit 3 with explicit batch mode.
    //!
    //! \param axes The axis along which softmax is computed.
    //!        Here axes is a bitmap. For example, when doing softmax along axis 0, bit 0 is set to 1, axes = 1 << axis = 1.

I still can't make sense of "axes = 1 << axis = 1". What is that supposed to mean? Couldn't they just write a proper example?
Thanks, Brother Qiu!!

The correct code is as follows:

ILayer* reshapeSoftmax(INetworkDefinition *network, ITensor& input, int ch) {
    // the input is one-dimensional: [12750]
    // first reshape it to [XX, ch]
    auto re1 = network->addShuffle(input);
    assert(re1);
    re1->setReshapeDimensions(Dims3(1, -1, ch)); //[1,6375,2];
//     re1->setReshapeDimensions(Dims2(-1, ch)); //[6375,2];

    Dims dim0 = re1->getOutput(0)->getDimensions();
    std::cout <<"debug  re1 dim==" << dim0.d[0] << " " << dim0.d[1] << " " << dim0.d[2] << " " << dim0.d[3] << std::endl;

//    return re1;/////////////////////////////////////////

    auto sm = network->addSoftMax(*re1->getOutput(0));
    sm->setAxes(1<<2);
    assert(sm);

    // reshape back to one dimension, the same shape the input came in with
    Dims dim_;
    dim_.nbDims=1;
    dim_.d[0]=-1;
    auto re2 = network->addShuffle(*sm->getOutput(0));
    assert(re2);
    re2->setReshapeDimensions(dim_);
    return re2;
}
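
For completeness, this is roughly how the layer gets hooked up when building the network (a sketch; arm_conf_tensor and the output name are placeholders for whatever your network actually uses):

    // arm_conf_tensor: the flat [12750] confidence tensor produced earlier in the network
    ILayer* conf_softmax = reshapeSoftmax(network, *arm_conf_tensor, 2);
    conf_softmax->getOutput(0)->setName("arm_conf_softmax");
    network->markOutput(*conf_softmax->getOutput(0));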

Discussing it with Brother Qiu: if you don't call setAxes, it presumably falls back to a default axis; for NCHW that is C, and for CHW that is H.
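If you want to confirm which axis a softmax layer will actually use, you can read the bitmask back right after adding the layer (a quick sanity check, not something from my original code):

    auto sm = network->addSoftMax(*re1->getOutput(0));
    // prints the axes bitmask; if setAxes() was never called this shows the default
    std::cout << "softmax axes bitmask = " << sm->getAxes() << std::endl;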
Looking back now at that code from the tensorrtx repository, the reason for the reshape is probably that its author discovered the softmax only runs along one fixed axis, so he reshaped the tensor until the axis he wanted lined up with it and the correct result came out.
He may simply not have known that you can specify the dimension by writing ->setAxes(1<<2);.

It feels like Brother Qiu has studied TensorRT in some depth; he also makes use of other advanced APIs.

posted @ 2021-03-05 14:34  无左无右