Entropy (Mar 2022)

Learning Visible Thermal Person Re-Identification via Spatial Dependence and Dual-Constraint Loss

  • Chuandong Wang,
  • Chi Zhang,
  • Yujian Feng,
  • Yimu Ji,
  • Jianyu Ding

DOI
https://doi.org/10.3390/e24040443
Journal volume & issue
Vol. 24, no. 4
p. 443

Abstract

Visible-thermal person re-identification (VT Re-ID) is the task of matching pedestrian images collected by thermal and visible-light cameras. The two main challenges in VT Re-ID are the intra-class variation between pedestrian images and the cross-modality difference between visible and thermal images. Existing works have principally focused on local representations derived from the cross-modality feature distribution, but largely ignore the internal connections among the local features of pedestrian body parts. Therefore, this paper proposes a dual-path attention network that establishes spatial dependencies among the local features of the pedestrian feature map and thereby enhances feature extraction. In addition, we propose a cross-modality dual-constraint loss, which adds center and boundary constraints to each class distribution in the embedding space to promote intra-class compactness and enhance inter-class separability. Experimental results show that the proposed approach outperforms state-of-the-art methods on the two public datasets SYSU-MM01 and RegDB, achieving Rank-1/mAP of 57.74%/54.35% on SYSU-MM01 and 76.07%/69.43% on RegDB.
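
The abstract describes the dual-constraint loss only at a high level; the snippet below is a minimal, illustrative PyTorch sketch of a center-plus-boundary constraint of that general kind, not the paper's exact formulation. The function name `dual_constraint_loss`, the `margin` value, and the weighting terms are assumptions introduced here for illustration.

```python
import torch
import torch.nn.functional as F


def dual_constraint_loss(embeddings, labels, margin=0.5,
                         center_weight=1.0, boundary_weight=1.0):
    """Illustrative center + boundary constraint loss (hypothetical sketch).

    embeddings: (N, D) features from both visible and thermal samples
    labels:     (N,) identity labels shared across the two modalities
    """
    embeddings = F.normalize(embeddings, dim=1)
    classes = labels.unique()

    # Per-class centers estimated from the current cross-modality batch.
    centers = torch.stack([embeddings[labels == c].mean(dim=0) for c in classes])

    # Center constraint: pull each sample toward its class center (compactness).
    center_loss = torch.stack([
        ((embeddings[labels == c] - centers[i]) ** 2).sum(dim=1).mean()
        for i, c in enumerate(classes)
    ]).mean()

    # Boundary constraint: keep different class centers at least `margin`
    # apart (separability), via a hinge on pairwise center distances.
    if len(classes) > 1:
        dists = torch.cdist(centers, centers)
        off_diag = ~torch.eye(len(classes), dtype=torch.bool, device=dists.device)
        boundary_loss = F.relu(margin - dists[off_diag]).mean()
    else:
        boundary_loss = embeddings.new_zeros(())

    return center_weight * center_loss + boundary_weight * boundary_loss


# Example usage with random tensors standing in for visible/thermal features.
if __name__ == "__main__":
    feats = torch.randn(16, 256)        # e.g., 8 visible + 8 thermal embeddings
    ids = torch.randint(0, 4, (16,))    # 4 identities shared across modalities
    print(dual_constraint_loss(feats, ids))
```

In practice, a loss of this kind would be combined with an identity classification or triplet loss, with each mini-batch containing both visible and thermal samples of the same identities so that the shared class centers bridge the two modalities.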

Keywords