Complex & Intelligent Systems (Sep 2023)
RCFT: re-parameterization convolution and feature filter for object tracking
Abstract
Siamese-based trackers have been widely studied for their high accuracy and speed. Feature extraction and feature fusion are two important components of Siamese-based trackers. Siamese-based trackers obtain fine local features through traditional convolution; however, some important channel information and global information are lost when enhancing local features. In the feature fusion process, cross-correlation-based fusion between the template and search region features ignores the global spatial context and does not make full use of the spatial information. In this paper, to solve the above problems, we design a novel feature extraction sub-network based on batch-free normalization re-parameterization convolution, which scales the features along the channel dimension and increases the receptive field, so that richer channel information is obtained and powerful target features are extracted for feature fusion. Furthermore, we learn a feature fusion network (FFN) based on a feature filter. The FFN fuses the template and search region features in a global spatial context to obtain high-quality fused features by enhancing important features and filtering out redundant ones. By jointly learning the proposed feature extraction sub-network and the FFN, both local and global information are fully exploited. We then propose a novel tracking algorithm built on the designed feature extraction sub-network and FFN, i.e., on re-parameterization convolution and the feature filter, referred to as RCFT. We evaluate the proposed RCFT tracker against recent state-of-the-art (SOTA) trackers on the OTB100, VOT2018, LaSOT, GOT-10k, UAV123 and visual-thermal VOT-RGBT2019 datasets; RCFT achieves superior tracking performance at a speed of 45 FPS.
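To make the re-parameterization idea referenced in the abstract concrete, the following is a minimal, illustrative PyTorch sketch of structural re-parameterization in the general RepVGG style: parallel 3x3 and 1x1 branches used during training are folded into a single 3x3 convolution for inference. It is not the authors' implementation; the class and method names are hypothetical, and the batch-free normalization and channel-scaling components of the proposed sub-network are omitted for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class RepConvBlock(nn.Module):
    """Toy re-parameterization block (illustrative only)."""

    def __init__(self, channels: int):
        super().__init__()
        # Parallel branches used only at training time.
        self.conv3x3 = nn.Conv2d(channels, channels, 3, padding=1, bias=True)
        self.conv1x1 = nn.Conv2d(channels, channels, 1, bias=True)
        self.fused = None  # Single 3x3 conv used after re-parameterization.

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.fused is not None:
            return F.relu(self.fused(x))
        return F.relu(self.conv3x3(x) + self.conv1x1(x))

    @torch.no_grad()
    def reparameterize(self) -> None:
        # Pad the 1x1 kernel to 3x3 and sum kernels and biases, so that one
        # convolution reproduces the two-branch output exactly.
        k3, b3 = self.conv3x3.weight, self.conv3x3.bias
        k1, b1 = self.conv1x1.weight, self.conv1x1.bias
        k1_padded = F.pad(k1, [1, 1, 1, 1])  # pad spatial dims to 3x3
        fused = nn.Conv2d(k3.shape[1], k3.shape[0], 3, padding=1, bias=True)
        fused.weight.copy_(k3 + k1_padded)
        fused.bias.copy_(b3 + b1)
        self.fused = fused


if __name__ == "__main__":
    block = RepConvBlock(8)
    x = torch.randn(1, 8, 32, 32)
    y_train = block(x)          # multi-branch (training-time) path
    block.reparameterize()
    y_infer = block(x)          # single fused convolution
    print(torch.allclose(y_train, y_infer, atol=1e-5))  # True
```

Because convolution is linear, the fused kernel produces outputs identical to the sum of the branches, which is why such blocks can be trained with richer multi-branch structure and deployed as a single fast convolution.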
Keywords