Long-term correlation tracking via spatial–temporal context

Research paper by Zhi Chen, Peizhong Liu, Yongzhao Du, Yanmin Luo, Jing-Ming Guo

Indexed on: 29 Jan '19
Published on: 28 Jan '19
Published in: The Visual Computer



Abstract

In this paper, we address the problem of long-term visual tracking in which the target object undergoes challenging conditions such as occlusion, out-of-view motion, and scale changes. We employ two discriminative correlation filters (DCFs) to achieve long-term object tracking: a spatial–temporal context correlation filter is learned for translation estimation, and a scale DCF, centered on the estimated target position, is learned to estimate the scale from the most confident results. In addition, we propose an efficient model update and re-detection activation strategy that avoids unrecoverable drift caused by noisy updates and achieves robust long-term tracking in the case of tracking failure. We evaluate our algorithm on the OTB benchmark datasets, and both the qualitative and quantitative results on challenging sequences demonstrate that the proposed algorithm performs favorably against several state-of-the-art DCF-based methods, including methods that follow the deep learning paradigm.
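To make the translation-estimation step concrete, the following is a minimal single-channel correlation-filter sketch in the spirit of MOSSE-style DCF trackers. It is an illustrative assumption, not the authors' spatial–temporal context filter: the filter is learned in the Fourier domain against a Gaussian target response, and the peak of the correlation response on a new patch gives the estimated translation. Scale estimation in the paper is handled analogously by a separate scale DCF; it is omitted here for brevity.

```python
import numpy as np

def gaussian_peak(h, w, sigma=2.0):
    # Desired filter response: a Gaussian centred in the patch.
    ys, xs = np.mgrid[:h, :w]
    cy, cx = h // 2, w // 2
    return np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))

def train_filter(patch, target, lam=1e-2):
    # Closed-form DCF solution in the Fourier domain:
    #   H* = (G . conj(F)) / (F . conj(F) + lambda)
    # where lambda regularises against division by near-zero spectra.
    F = np.fft.fft2(patch)
    G = np.fft.fft2(target)
    return (G * np.conj(F)) / (F * np.conj(F) + lam)

def detect(H, patch):
    # Correlate the learned filter with a new patch; the location of
    # the response peak gives the target's translation relative to
    # the patch centre.
    F = np.fft.fft2(patch)
    response = np.real(np.fft.ifft2(H * F))
    dy, dx = np.unravel_index(np.argmax(response), response.shape)
    h, w = patch.shape
    return dy - h // 2, dx - w // 2

# Toy usage: learn a filter on a synthetic patch, then recover a
# known circular shift of that patch from the response peak.
patch = np.random.default_rng(0).standard_normal((64, 64))
H = train_filter(patch, gaussian_peak(64, 64))
shifted = np.roll(patch, shift=(3, 5), axis=(0, 1))
print(detect(H, shifted))  # estimated (dy, dx) translation
```

A real DCF tracker would additionally apply a cosine window, use multi-channel features (e.g. HOG), and update `H` with a running average across frames; the paper's contribution lies in when and how such updates and re-detections are triggered.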