Egyptian Journal of Remote Sensing and Space Sciences (Dec 2023)

A double transformer residual super-resolution network for cross-resolution person re-identification

  • Fuzhen Zhu,
  • Ce Sun,
  • Chen Wang,
  • Bing Zhu

Journal volume & issue
Vol. 26, no. 3
pp. 768–776

Abstract

Cross-resolution person re-identification is a challenging variant of the person re-identification problem. To address the resolution mismatch between query and gallery images, many studies introduce super-resolution into the re-identification pipeline. In this work, we propose a cross-resolution person re-identification method based on a double transformer residual super-resolution network (DTRSR), which consists mainly of a super-resolution module and a person re-identification module. In the super-resolution module, we propose a double transformer network as the attention module. We first partition the features extracted by the residual network, then compute the similarity between each local feature and the global features obtained by average pooling and maximum pooling, respectively, which lets the module quickly capture hidden weight information in the spatial domain. In the person re-identification module, we propose an effective fusion method based on key-point features (KPFF). The key-point extraction model not only solves the problem that local features cannot be accurately aligned, but also removes the interference of background noise. To fully exploit the relationships among key-point features, we compute the two-way correlation between each key-point feature and all the others, and then superimpose these correlations on the feature itself to obtain a fused feature that contains both global and local information. Extensive experiments demonstrate the effectiveness of the method: compared with state-of-the-art methods on three datasets, our method improves rank-1 by 1.1%, 3.5%, and 1.7%; rank-5 by 1.3%, 1.7%, and 0.3%; and rank-10 by 0.1%, 0.4%, and 0.1%, respectively.
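The two mechanisms the abstract describes — comparing local features against average- and max-pooled global descriptors for spatial attention, and fusing key-point features through pairwise two-way correlation — can be sketched in NumPy. This is a minimal illustration of the ideas only; the shapes, the dot-product similarity, and the softmax normalization are our assumptions, not the authors' exact design, which would use learned transformer layers.

```python
import numpy as np

def double_pool_attention(feat, num_parts=4):
    """Sketch of the local-vs-global attention idea (assumed form):
    score each local part against average- and max-pooled global
    descriptors, then re-weight the parts by a softmax of the scores.

    feat: (C, H, W) feature map from a residual backbone.
    """
    # Divide the feature map into horizontal local parts.
    parts = np.array_split(feat, num_parts, axis=1)           # list of (C, h_i, W)
    locals_ = np.stack([p.mean(axis=(1, 2)) for p in parts])  # (P, C)

    # Global descriptors from average pooling and maximum pooling.
    g_avg = feat.mean(axis=(1, 2))                            # (C,)
    g_max = feat.max(axis=(1, 2))                             # (C,)

    # Similarity of each local part to both global descriptors.
    sim = locals_ @ g_avg + locals_ @ g_max                   # (P,)
    weights = np.exp(sim - sim.max())
    weights /= weights.sum()                                  # softmax over parts

    # Re-weight each part by its spatial attention weight.
    out = np.concatenate([w * p for w, p in zip(weights, parts)], axis=1)
    return out, weights

def kpff_fuse(kp_feats):
    """Sketch of the KPFF fusion idea (assumed form): compute pairwise
    correlations among key-point features and superimpose the correlated
    context on each feature, so it carries local and global information.

    kp_feats: (K, C) features, one per detected body key point.
    """
    # Two-way correlation; a plain dot product is symmetric, whereas a
    # learned projection (as a transformer would use) need not be.
    corr = kp_feats @ kp_feats.T                              # (K, K)
    np.fill_diagonal(corr, -np.inf)                           # ignore self-pairs
    attn = np.exp(corr - corr.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)                   # row-wise softmax
    context = attn @ kp_feats                                 # (K, C)
    return kp_feats + context                                 # superposition
```

Re-weighting the parts preserves the feature-map shape, so the attention can be dropped between residual blocks; the KPFF output keeps one fused vector per key point, ready for distance-based matching.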

Keywords