EURASIP Journal on Wireless Communications and Networking (Jan 2020)

Background perception for correlation filter tracker

  • Yushan Zhang,
  • Jianan Li,
  • Fan Wu,
  • Lingyue Wu,
  • Tingfa Xu

DOI
https://doi.org/10.1186/s13638-019-1630-y
Journal volume & issue
Vol. 2020, no. 1
pp. 1 – 12

Abstract

Visual object tracking is one of the most fundamental tasks in computer vision, with numerous applications in realms such as public surveillance, human-computer interaction, and robotics. Recently, discriminative correlation filter (DCF)-based trackers have achieved promising results on short-term tracking problems. Most of them focus on extracting reliable features from the foreground of input images to construct a robust and informative description of the target. However, it is often ignored that the image background, which contains the surrounding context of the target, is typically similar across consecutive frames and can thus be beneficial to locating the target. In this paper, we propose a background perception regulation term to additionally exploit useful background information. Specifically, by assigning similar importance to both background and foreground, an invalid description of the target can be avoided when either becomes unreliable. Moreover, a novel model update strategy is proposed. Instead of updating the model frame by frame, we introduce an output evaluation score that supervises the tracking process and selects only high-confidence results for model update, thus paving a new way to avoid model corruption. Extensive experiments on the OTB-100 dataset demonstrate the effectiveness of the proposed method, BPCF, which achieves an AUC score of 0.689 and outperforms most state-of-the-art trackers.
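The confidence-gated model update described above can be sketched as follows. The abstract does not give the exact form of the output evaluation score, so this minimal sketch assumes an APCE-style (average peak-to-correlation energy) measure over the filter response map; the `apce` function, the threshold, and the learning rate are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def apce(response):
    """Average peak-to-correlation energy: a common confidence
    measure for correlation-filter response maps (assumed here;
    the paper's actual evaluation score may differ)."""
    peak = response.max()
    valley = response.min()
    energy = np.mean((response - valley) ** 2)
    return (peak - valley) ** 2 / (energy + 1e-8)

def maybe_update(model, new_model, response, threshold=20.0, lr=0.02):
    """Linearly interpolate the filter toward the new estimate only
    when the response is high-confidence; otherwise keep the previous
    model to avoid corruption from unreliable frames."""
    if apce(response) >= threshold:
        return (1 - lr) * model + lr * new_model
    return model

# Toy usage: a sharp (confident) response triggers an update,
# a flat (uncertain) response leaves the model unchanged.
sharp = np.zeros((5, 5)); sharp[2, 2] = 1.0
flat = np.full((5, 5), 0.5)
model = np.ones((5, 5))
new = np.zeros((5, 5))
updated = maybe_update(model, new, sharp)  # interpolated toward `new`
kept = maybe_update(model, new, flat)      # unchanged
```

A sharp, single-peaked response yields a large APCE (the peak dominates the mean energy), while a flat or multi-modal response, which typically signals occlusion or drift, yields a small one, so the update is skipped.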

Keywords