IET Image Processing (Jun 2021)

Learning adaptive spatial–temporal regularized correlation filters for visual tracking

  • Jianwei Zhao,
  • Yangxiao Li,
  • Zhenghua Zhou

DOI
https://doi.org/10.1049/ipr2.12150
Journal volume & issue
Vol. 15, no. 8
pp. 1773 – 1785

Abstract

Recently, there have been many visual tracking methods based on correlation filters. These methods mainly enhance tracking performance by incorporating background, spatial, or temporal information into the appearance model. This paper proposes an effective tracking method, named the adaptive spatial–temporal regularized correlation filter (ASTRCF) tracker, built on the popular adaptive spatially regularized correlation filter (ASRCF) tracker. Specifically, the continuity of the object's motion during tracking is taken into account by introducing a temporal‐regularized term into the appearance model of the ASRCF tracker, and the resulting model is solved by applying the alternating direction method of multipliers (ADMM). The proposed appearance model contains a background‐awareness term, a spatially regularized term, an adaptive‐weight term, and a temporal‐regularized term. Therefore, it not only retains the strengths of the ASRCF tracker, such as adaptively learning background and spatial information to enhance its discriminative ability, but also exploits the relation between the correlation filters of the previous frame and the current frame to address complex cases such as occlusion and fast motion. Extensive experimental results on various challenging benchmarks show that the proposed ASTRCF tracker achieves better tracking performance than several state‐of‐the‐art trackers.
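
The abstract does not state the explicit objective, but a minimal sketch of the per-frame model it describes, assuming the standard ASRCF formulation (a background-awareness term with a cropping matrix P and an adaptive spatial weight w with reference w^r) augmented with an STRCF-style temporal term, might take the following form; the symbols \lambda_1, \lambda_2, \mu, P, and w^r are illustrative assumptions and are not taken from the paper:

E(\mathbf{h}_t, \mathbf{w}) =
  \frac{1}{2}\Big\|\mathbf{y} - \sum_{k=1}^{K}\mathbf{x}_k \ast \big(\mathbf{P}^{\top}\mathbf{h}_{t,k}\big)\Big\|_2^2
  + \frac{\lambda_1}{2}\sum_{k=1}^{K}\big\|\mathbf{w}\odot\mathbf{h}_{t,k}\big\|_2^2
  + \frac{\lambda_2}{2}\big\|\mathbf{w}-\mathbf{w}^{r}\big\|_2^2
  + \frac{\mu}{2}\sum_{k=1}^{K}\big\|\mathbf{h}_{t,k}-\mathbf{h}_{t-1,k}\big\|_2^2,

where the first term is the background-aware ridge-regression loss over K feature channels, the second and third terms realize the adaptive spatial regularization, and the last term is the temporal regularization coupling the current filter to that of the previous frame. Under this assumed form, ADMM would alternate updates over \mathbf{h}_t, \mathbf{w}, and an auxiliary splitting variable.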

Keywords