Decontaminate feature for tracking: adaptive tracking via evolutionary feature subset

Liu, Qiaoyuan and Wang, Yuru and Yin, Minghao and Ren, Jinchang and Li, Ruizhi (2017) Decontaminate feature for tracking: adaptive tracking via evolutionary feature subset. Journal of Electronic Imaging, 26 (6). 063025. ISSN 1560-229X (https://doi.org/10.1117/1.JEI.26.6.063025)

Text: Liu_etal_JEI_2017_Decontaminate_feature_for_tracking.pdf (Accepted Author Manuscript)

Abstract

Although numerous visual tracking algorithms have been proposed over the last two to three decades, effective tracking under fast motion, deformation, occlusion, and similar challenges remains a difficult problem. Under complex tracking conditions, most tracking models are not sufficiently discriminative and adaptive. When combined feature vectors are fed into the visual models, redundancy can reduce efficiency and ambiguity can degrade performance. In this paper, an effective tracking algorithm is proposed that decontaminates the features for each video sequence adaptively, treating visual modeling as an optimization problem from an evolutionary perspective. Each feature vector is regarded as a biological individual and decontaminated via classical evolutionary algorithms. With the optimized feature subsets, the "curse of dimensionality" is avoided while the accuracy of the visual model is improved. The proposed algorithm has been tested on several publicly available datasets covering a range of tracking challenges and benchmarked against a number of state-of-the-art approaches. Comprehensive experiments demonstrate the efficacy of the proposed methodology.
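
To make the general idea concrete, the sketch below shows a classical genetic algorithm selecting a binary feature mask, which is one common way to realise "feature subsets as evolving individuals". It is only an illustration of the family of methods the abstract refers to, not the authors' published algorithm: the fitness function (a simple Fisher-like separation score between assumed target and background samples), the GA parameters, and the synthetic data are all assumptions made for demonstration.

```python
# Generic evolutionary feature-subset selection sketch (NOT the paper's exact method).
# Individuals are binary masks over feature dimensions; fitness rewards masks whose
# selected features separate target samples from background samples.
import numpy as np

rng = np.random.default_rng(0)

def fitness(mask, target_feats, background_feats):
    """Fisher-like separation score for the selected features, with a size penalty."""
    if mask.sum() == 0:
        return -np.inf
    t = target_feats[:, mask.astype(bool)]
    b = background_feats[:, mask.astype(bool)]
    between = np.sum((t.mean(0) - b.mean(0)) ** 2)
    within = t.var(0).sum() + b.var(0).sum() + 1e-8
    return between / within - 0.01 * mask.sum()  # penalise overly large subsets

def evolve_subset(target_feats, background_feats, pop_size=30, gens=50,
                  crossover_p=0.7, mutation_p=0.05):
    """Classical GA loop: tournament selection, uniform crossover, bit-flip mutation."""
    dim = target_feats.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, dim))
    for _ in range(gens):
        scores = np.array([fitness(ind, target_feats, background_feats) for ind in pop])
        new_pop = [pop[scores.argmax()].copy()]          # elitism: carry over the best mask
        while len(new_pop) < pop_size:
            i = rng.integers(0, pop_size, 2)             # tournament of two for parent 1
            j = rng.integers(0, pop_size, 2)             # tournament of two for parent 2
            p1 = pop[i[np.argmax(scores[i])]]
            p2 = pop[j[np.argmax(scores[j])]]
            if rng.random() < crossover_p:               # uniform crossover
                child = np.where(rng.random(dim) < 0.5, p1, p2)
            else:
                child = p1.copy()
            flip = rng.random(dim) < mutation_p          # bit-flip mutation
            child = np.where(flip, 1 - child, child)
            new_pop.append(child)
        pop = np.array(new_pop)
    scores = np.array([fitness(ind, target_feats, background_feats) for ind in pop])
    return pop[scores.argmax()]

# Usage with synthetic data: 200-dimensional features, 50 target / 50 background samples.
target = rng.normal(1.0, 1.0, size=(50, 200))
background = rng.normal(0.0, 1.0, size=(50, 200))
best_mask = evolve_subset(target, background)
print("selected", int(best_mask.sum()), "of", best_mask.size, "features")
```

In a tracking setting, such a search would typically be rerun or refined per sequence so that the retained subset adapts to the current appearance conditions, which is the adaptivity the abstract emphasises.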

ORCID iDs

Liu, Qiaoyuan; Wang, Yuru; Yin, Minghao; Ren, Jinchang (ORCID: https://orcid.org/0000-0001-6116-3194); Li, Ruizhi