Ran into a classification problem today with the TensorRT softmax layer.
If you attach a softmax directly after the fc layer, the softmax is computed globally over all elements.
You need a reshape first: for a 2-class output, convert to a shape like [1, 1, 2] and then apply softmax along dimension 2.
After searching around, the first step is an IShuffleLayer to change the dimensions:
```cpp
IShuffleLayer *shuffleLayer = network->addShuffle(input);
assert(shuffleLayer);
shuffleLayer->setReshapeDimensions(Dims3(1, -1, c));
```
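The -1 asks TensorRT to infer that dimension from the total element count, so the flat fc output is grouped into rows of c class scores. A plain-C++ sketch of the same grouping (`reshapeRows` is a made-up helper for illustration, not part of TensorRT):

```cpp
#include <cassert>
#include <vector>

// Illustrative only (no TensorRT): mimic what Dims3(1, -1, c) does.
// The -1 dimension is inferred as flat.size() / c, just as TensorRT
// infers the wildcard dimension from the total element count.
std::vector<std::vector<float>> reshapeRows(const std::vector<float>& flat, int c) {
    assert(c > 0 && flat.size() % c == 0);
    std::vector<std::vector<float>> rows(flat.size() / c);
    for (std::size_t i = 0; i < flat.size(); ++i)
        rows[i / c].push_back(flat[i]);
    return rows;
}
```

With c = 2, a flat {a0, a1, b0, b1} becomes {{a0, a1}, {b0, b1}}, and the softmax in the next step can then normalize each row of c scores independently.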
Then add the softmax layer:
```cpp
ISoftMaxLayer *softmax = network->addSoftMax(*shuffleLayer->getOutput(0));
assert(softmax);
softmax->setAxes(1 << 2);
```
Pay special attention to the softmax->setAxes(1 << 2) here.
If you follow the PyTorch habit and write softmax->setAxes(2), the output is all 1s, which is not what we expect: setAxes takes a bitmask, not an axis index.
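Why all ones: setAxes(2) is 1 << 1, which selects axis 1, and after the reshape that axis has size 1, so every softmax group holds a single element and exp(x)/exp(x) = 1. A plain-C++ sanity check (`softmaxRef` is my own illustrative reference implementation, not TensorRT):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <vector>

// Illustrative reference softmax (not TensorRT):
// normalize the `len` values starting at `p`.
std::vector<float> softmaxRef(const float* p, int len) {
    float maxv = *std::max_element(p, p + len);  // subtract max for stability
    std::vector<float> out(len);
    float sum = 0.f;
    for (int i = 0; i < len; ++i) {
        out[i] = std::exp(p[i] - maxv);
        sum += out[i];
    }
    for (int i = 0; i < len; ++i) out[i] /= sum;
    return out;
}
```

Running it over the size-2 axis gives a meaningful distribution; running it per element (groups of length 1, which is what selecting a size-1 axis amounts to) returns exactly 1 everywhere.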
Take a look at the comment in the official header:
```cpp
//!
//! \brief Set the axis along which softmax is computed. Currently, only one axis can be set.
//!
//! The axis is specified by setting the bit corresponding to the axis to 1.
//! Let's say we have an NCHW tensor as input (three non-batch dimensions).
//!
//! In implicit mode :
//! Bit 0 corresponds to the C dimension boolean.
//! Bit 1 corresponds to the H dimension boolean.
//! Bit 2 corresponds to the W dimension boolean.
//! By default, softmax is performed on the axis which is the number of axes minus three. It is 0 if
//! there are fewer than 3 non-batch axes. For example, if the input is NCHW, the default axis is C. If the input
//! is NHW, then the default axis is H.
//!
//! In explicit mode :
//! Bit 0 corresponds to the N dimension boolean.
//! Bit 1 corresponds to the C dimension boolean.
//! Bit 2 corresponds to the H dimension boolean.
//! Bit 3 corresponds to the W dimension boolean.
//! By default, softmax is performed on the axis which is the number of axes minus three. It is 0 if
//! there are fewer than 3 axes. For example, if the input is NCHW, the default axis is C. If the input
//! is NHW, then the default axis is N.
//!
//! For example, to perform softmax on axis R of a NPQRCHW input, set bit 2 with implicit batch mode,
//! set bit 3 with explicit batch mode.
//!
//! \param axes The axis along which softmax is computed.
//!        Here axes is a bitmap. For example, when doing softmax along axis 0, bit 0 is set to 1, axes = 1 << axis = 1.
//!
```
The explanation found online goes like this:
Take NCHW as an example (explicit batch mode, so the bits count from N). To apply softmax along the H dimension, write a position mask over NCHW: 0010. As a bitmap (bits 3..0) that reads 0100, i.e. the shift 1 << 2.
For the C dimension the position mask is 0100, the bitmap is 0010, i.e. 1 << 1.
In my code the tensor fed into softmax has shape (1, 1, 2).
To softmax along dimension 2: position mask 001, bitmap 100, shift 1 << 2.
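In other words, converting a 0-based axis index into the setAxes argument is just a left shift. A trivial hypothetical helper (the name `axisToAxes` is mine, not part of the TensorRT API):

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical helper (not part of TensorRT): build the bitmask that
// ISoftMaxLayer::setAxes expects from a 0-based axis index.
// Bit i set <=> softmax runs along axis i.
constexpr std::uint32_t axisToAxes(int axis) { return 1u << axis; }
```

So softmax->setAxes(axisToAxes(2)) is the same as the setAxes(1 << 2) above.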
Complete code:
```cpp
// softmax layer
ILayer* reshapeSoftmax(INetworkDefinition *network, ITensor &input, int c) {
    IShuffleLayer *shuffleLayer1 = network->addShuffle(input);
    assert(shuffleLayer1);
    shuffleLayer1->setReshapeDimensions(Dims3(1, -1, c));

    Dims dim0 = shuffleLayer1->getOutput(0)->getDimensions();
    cout << "softmax input dims " << dim0.d[0] << " " << dim0.d[1] << " " << dim0.d[2] << endl;

    ISoftMaxLayer *softmax = network->addSoftMax(*shuffleLayer1->getOutput(0));
    assert(softmax);
    softmax->setAxes(1 << 2);

    // flatten back to a 1-D array
    Dims dim_{};
    dim_.nbDims = 1;
    dim_.d[0] = -1;
    IShuffleLayer *shuffleLayer2 = network->addShuffle(*softmax->getOutput(0));
    assert(shuffleLayer2);
    shuffleLayer2->setReshapeDimensions(dim_);
    return shuffleLayer2;
}
```
For reference:
https://www.cnblogs.com/yanghailin/p/14486077.html
https://github.com/wang-xinyu/tensorrtx/blob/18fa419ae35bfcbd27248b3eb9329f415f604366/retinafaceAntiCov/retinafaceAntiCov.cpp