Mathematics (Nov 2019)

Combining Spatio-Temporal Context and Kalman Filtering for Visual Tracking

  • Haoran Yang,
  • Juanjuan Wang,
  • Yi Miao,
  • Yulu Yang,
  • Zengshun Zhao,
  • Zhigang Wang,
  • Qian Sun,
  • Dapeng Oliver Wu

DOI
https://doi.org/10.3390/math7111059
Journal volume & issue
Vol. 7, no. 11
p. 1059

Abstract

As one of the core components of intelligent monitoring, target tracking is the basis for video content analysis and processing. In visual tracking, occlusion, illumination changes, and pose and scale variation cause large appearance changes of the target object and the background over time, and handling these changes remains the main challenge for robust target tracking. In this paper, we present a new robust algorithm (STC-KF) based on the spatio-temporal context and Kalman filtering. Our approach introduces a novel formulation of the context information that exploits the entire local region around the target, rather than only sparse key-point information, so that important context information related to the target is not lost. The state of the object during tracking is determined by the Euclidean distance between the image intensities of two consecutive frames. When occlusion is detected, the prediction of the Kalman filter is used as the observation of the object position and marked on the next frame. The performance of the proposed STC-KF algorithm is evaluated and compared with the original STC algorithm. Experimental results on benchmark sequences show that the proposed method outperforms the original STC algorithm under heavy occlusion and large appearance changes.
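The Kalman-filtering stage described above can be illustrated with a minimal sketch. This is not the authors' implementation: the class and function names, the constant-velocity motion model, and the specific noise parameters are illustrative assumptions. It shows the two ingredients the abstract names: a Kalman filter over the target's (x, y) position, and an occlusion test based on the Euclidean distance between the image intensities of consecutive frames, which decides whether the measurement from the base tracker is trusted or the filter's prediction is used instead.

```python
import numpy as np

class PositionKalmanFilter:
    """2D constant-velocity Kalman filter over the state [x, y, vx, vy].

    Illustrative sketch; the paper's actual state model and noise
    settings may differ.
    """

    def __init__(self, x0, y0, dt=1.0, process_var=1e-2, meas_var=1.0):
        self.x = np.array([x0, y0, 0.0, 0.0], dtype=float)  # state estimate
        self.P = np.eye(4)                                   # state covariance
        self.F = np.array([[1, 0, dt, 0],                    # transition model
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],                     # observe position only
                           [0, 1, 0, 0]], dtype=float)
        self.Q = process_var * np.eye(4)                     # process noise
        self.R = meas_var * np.eye(2)                        # measurement noise

    def predict(self):
        """Propagate the state one frame forward; returns predicted (x, y)."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        """Correct the state with a position measurement z = (x, y)."""
        y = np.asarray(z, dtype=float) - self.H @ self.x     # innovation
        S = self.H @ self.P @ self.H.T + self.R              # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)             # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]

def is_occluded(patch_prev, patch_curr, threshold):
    """Flag occlusion when the Euclidean distance between the intensities
    of the target patch in two consecutive frames exceeds a threshold
    (illustrative criterion; the threshold is an assumed parameter)."""
    diff = patch_curr.astype(float) - patch_prev.astype(float)
    return np.linalg.norm(diff) > threshold
```

In a tracking loop, one would call `predict()` every frame; if `is_occluded(...)` is false, the base tracker's position feeds `update()`, otherwise the update is skipped and the predicted position is reported, which is how the filter bridges heavy occlusion.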

Keywords