Symmetry (Mar 2022)

Micro-Attention Branch, a Flexible Plug-In That Enhances Existing 3D ConvNets

  • Yongsheng Li,
  • Tengfei Tu,
  • Huawei Wang,
  • Wei Yin

DOI
https://doi.org/10.3390/sym14040639
Journal volume & issue
Vol. 14, no. 4
p. 639

Abstract


In video action recognition, increasing network depth creates an asymmetry between the number of parameters and the accuracy gained. We propose a micro-attention branch structure and a method for integrating attention branches into multi-branch 3D convolutional networks. The proposed attention branches mitigate this asymmetry: they can be flexibly added to a 3D convolutional network as plug-ins, without changing the overall structure of the original network. With this structure, the newly constructed network fuses the attention features extracted by the attention branches in real time during feature extraction. By adding attention branches, the model focuses more accurately on the action subject, which improves recognition accuracy. Moreover, when a building block of an existing network already contains multiple sub-branches, the 3D micro-attention branches adapt well to that scenario. On the Kinetics dataset, we build a deep network with the proposed micro-attention branch structure and compare it against the original network. The experimental results show that the network with micro-attention branches improves recognition accuracy by 3.6% over the original network, while the number of trainable parameters increases by only 0.6%.
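The abstract does not specify the internal design of the micro-attention branch, so the following is only a minimal sketch of the general idea it describes: a lightweight side branch that computes per-channel attention weights from a 3D (spatio-temporal) feature map and fuses them back into the main branch by rescaling, adding very few parameters. The squeeze-and-excitation-style bottleneck used here, and all function and variable names, are illustrative assumptions, not the authors' exact architecture.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def micro_attention_branch(feat, w1, w2):
    """Hypothetical micro-attention branch for a 3D ConvNet.

    feat: (N, C, T, H, W) feature map produced by a main 3D conv branch.
    w1, w2: weights of a small bottleneck MLP (C -> C//r -> C); this is
    the only trainable state the branch adds, hence the tiny parameter cost.
    """
    # Squeeze: global average pool over the spatio-temporal dimensions.
    desc = feat.mean(axis=(2, 3, 4))               # (N, C)
    # Excitation: bottleneck MLP producing per-channel attention weights.
    hidden = np.maximum(desc @ w1, 0.0)            # ReLU, (N, C//r)
    gate = sigmoid(hidden @ w2)                    # (N, C), values in (0, 1)
    # Fuse: rescale the main-branch channels in real time, leaving the
    # shape (and thus the surrounding network structure) unchanged.
    return feat * gate[:, :, None, None, None]

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 8, 4, 6, 6))           # toy (N, C, T, H, W) features
w1 = 0.1 * rng.standard_normal((8, 2))             # reduction ratio r = 4
w2 = 0.1 * rng.standard_normal((2, 8))
y = micro_attention_branch(x, w1, w2)
```

Because the branch only rescales channels, its output has the same shape as its input, which is what allows it to be dropped into an existing block (including one with multiple sub-branches, by attaching one such branch per sub-branch) without restructuring the network.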

Keywords