Robust visual tracking using structurally random projection and weighted least squares

Shengping Zhang, Huiyu Zhou, Feng Jiang, Xuelong Li

Research output: Contribution to journal › Article › peer-review



Sparse representation based visual tracking approaches have attracted increasing interest in the community in recent years. The main idea is to linearly represent each target candidate using a set of target and trivial templates while imposing a sparsity constraint on the representation coefficients. After the coefficients are obtained using L1-norm minimization methods, the candidate with the lowest error, when reconstructed using only the target templates and their associated coefficients, is taken as the tracking result. Despite the promising performance widely reported for such systems, it is unclear whether the performance of these trackers can be maximised. In addition, the computational complexity caused by the dimensionality of the feature space limits the use of these algorithms in real-time applications. In this paper, we propose a real-time visual tracking method based on structurally random projection and weighted least squares techniques. In particular, to enhance the discriminative capability of the tracker, we introduce background templates into the linear representation framework. To handle appearance variations over time, we relax the sparsity constraint using a weighted least squares (WLS) method to obtain the representation coefficients. To further reduce the computational complexity, structurally random projection is used to reduce the dimensionality of the feature space while preserving the pairwise distances between the data points. Experimental results show that the proposed approach outperforms several state-of-the-art tracking methods.
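The pipeline described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: a plain scaled Gaussian matrix stands in for the structurally random projection (it likewise approximately preserves pairwise distances, per the Johnson-Lindenstrauss lemma, but lacks the structured construction's speed), the WLS weight matrix is taken as the identity (the paper adapts the weighting to handle appearance variation over time), and the templates and candidates are random placeholder data. All dimensions and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

d, k = 1024, 64          # original and projected feature dimensions (hypothetical)
n_t, n_b = 10, 10        # numbers of target and background templates

# Stand-in for structurally random projection: a scaled Gaussian matrix
# that approximately preserves pairwise distances between data points.
P = rng.standard_normal((k, d)) / np.sqrt(k)

# Placeholder template sets and target candidates (random data for illustration).
T = rng.standard_normal((d, n_t))        # target templates
B = rng.standard_normal((d, n_b))        # background templates
candidates = rng.standard_normal((d, 5))

D = P @ np.hstack([T, B])                # projected dictionary: targets + backgrounds
W = np.eye(n_t + n_b)                    # WLS weight matrix (identity here;
                                         # the paper adapts weights over time)

def wls_coefficients(y):
    # Weighted least squares replaces the L1 sparsity constraint:
    # c = (D^T D + W)^{-1} D^T y, a closed-form ridge-style solution.
    return np.linalg.solve(D.T @ D + W, D.T @ y)

errors = []
for i in range(candidates.shape[1]):
    y = P @ candidates[:, i]             # project the candidate's features
    c = wls_coefficients(y)
    c_target = c[:n_t]                   # keep only target-template coefficients
    # Reconstruction error using target templates alone scores the candidate.
    errors.append(np.linalg.norm(y - D[:, :n_t] @ c_target))

best = int(np.argmin(errors))            # lowest-error candidate is the tracking result
```

The closed-form WLS solve is the source of the real-time speed-up over L1 minimization: each candidate costs one small linear solve in the projected k-dimensional space rather than an iterative sparse-coding optimization in the original d-dimensional space.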
Original language: English
Pages (from-to): 1749-1760
Number of pages: 12
Journal: IEEE Transactions on Circuits and Systems for Video Technology
Issue number: 11
Early online date: 20 Feb 2015
Publication status: Published - Nov 2015