IEEE Access (Jan 2019)

Part-Based Attribute-Aware Network for Person Re-Identification

  • Yan Zhang,
  • Xusheng Gu,
  • Jun Tang,
  • Ke Cheng,
  • Shoubiao Tan

DOI
https://doi.org/10.1109/ACCESS.2019.2912844
Journal volume & issue
Vol. 7
pp. 53585–53595

Abstract

Despite the rapid progress over the past decade, person re-identification (reID) remains a challenging task because discriminative features at different granularities are easily affected by illumination and camera-view variation. Most deep learning-based reID algorithms extract a global embedding from a convolutional neural network as the pedestrian representation. Considering that person attributes are robust and informative cues for identifying pedestrians, this paper proposes a multi-branch model, namely the part-based attribute-aware network (PAAN), to improve both person reID and attribute recognition performance; it exploits not only the ID label attached to the whole image but also attribute information. To learn a discriminative and robust global representation that is invariant to the variations mentioned above, we resort to global and local person attributes to build global and local representations, respectively, using our proposed layered partition strategy. Our goal is to exploit global and local semantic information to guide the optimization of the global representation. In addition, to enhance the global representation, we design a semantic bridge that replenishes mid-level semantic information for the final representation, which contains high-level semantic information. Extensive experiments on two large-scale person re-identification datasets, Market-1501 and DukeMTMC-reID, demonstrate the effectiveness of the proposed approach, which achieves rank-1 accuracy of 92.40% on Market-1501 and 82.59% on DukeMTMC-reID, showing strong competitiveness against the state of the art.
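
As a rough illustration of the multi-branch idea described in the abstract, the sketch below shows a PyTorch model with a shared backbone, a global branch for identity and global-attribute prediction, and part-based branches for local-attribute prediction. It is a minimal sketch, not the authors' code: the backbone choice (ResNet-50), the number of parts, the attribute counts, and the use of simple horizontal pooling in place of the paper's layered partition strategy and semantic bridge are all assumptions made for illustration.

```python
# Hypothetical sketch of a part-based, attribute-aware reID model (not the authors' implementation).
# Assumptions: ResNet-50 backbone, 3 horizontal parts, example class/attribute counts.
import torch
import torch.nn as nn
from torchvision.models import resnet50


class PAANSketch(nn.Module):
    def __init__(self, num_ids=751, num_global_attrs=10, num_local_attrs=8, num_parts=3):
        super().__init__()
        backbone = resnet50(weights=None)
        # Keep everything up to the last residual stage as a shared feature extractor.
        self.backbone = nn.Sequential(*list(backbone.children())[:-2])
        feat_dim = 2048

        # Global branch: identity classification plus global attribute prediction.
        self.global_pool = nn.AdaptiveAvgPool2d(1)
        self.id_classifier = nn.Linear(feat_dim, num_ids)
        self.global_attr_head = nn.Linear(feat_dim, num_global_attrs)

        # Local branches: horizontal stripes of the feature map, each predicting local attributes.
        self.num_parts = num_parts
        self.part_pool = nn.AdaptiveAvgPool2d((num_parts, 1))
        self.local_attr_heads = nn.ModuleList(
            nn.Linear(feat_dim, num_local_attrs) for _ in range(num_parts)
        )

    def forward(self, x):
        feat_map = self.backbone(x)                        # (B, 2048, H, W)
        global_feat = self.global_pool(feat_map).flatten(1)
        id_logits = self.id_classifier(global_feat)
        global_attr_logits = self.global_attr_head(global_feat)

        part_feats = self.part_pool(feat_map).squeeze(-1)  # (B, 2048, num_parts)
        local_attr_logits = [
            head(part_feats[:, :, i]) for i, head in enumerate(self.local_attr_heads)
        ]
        return id_logits, global_attr_logits, local_attr_logits


if __name__ == "__main__":
    model = PAANSketch()
    images = torch.randn(2, 3, 256, 128)  # typical reID input resolution
    id_logits, g_attrs, l_attrs = model(images)
    print(id_logits.shape, g_attrs.shape, [t.shape for t in l_attrs])
```

In such a setup the identity branch would typically be trained with a cross-entropy loss over person IDs and the attribute heads with binary cross-entropy over attribute labels, so that attribute supervision guides the shared representation used for reID.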

Keywords