Robust MIL-Based Feature Template Learning for Object Tracking

Authors

  • Xiangyuan Lan, Hong Kong Baptist University
  • Pong C. Yuen, Hong Kong Baptist University
  • Rama Chellappa, University of Maryland, College Park

DOI:

https://doi.org/10.1609/aaai.v31i1.11220

Keywords:

visual tracking

Abstract

Because of appearance variations of the tracked target, training samples collected by the online tracker are required to update the tracking model. However, this often leads to the tracking drift problem because of potentially corrupted samples: 1) contaminated/outlier samples resulting from large variations (e.g., occlusion, illumination changes), and 2) misaligned samples caused by tracking inaccuracy. Therefore, to reduce tracking drift while maintaining the adaptability of a visual tracker, a key problem is how to alleviate these two issues through an effective model learning (updating) strategy. To address them, this paper proposes a novel model learning (updating) scheme that simultaneously eliminates the negative effects of both issues in a unified robust feature template learning framework. In particular, the proposed feature template learning framework is capable of: 1) adaptively learning uncontaminated feature templates by separating out contaminated samples, and 2) resolving label ambiguities caused by misaligned samples via a probabilistic multiple instance learning (MIL) model. Experiments on challenging video sequences show that the proposed tracker performs favourably against several state-of-the-art trackers.
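The two ingredients the abstract names can be illustrated with a minimal sketch (not the paper's actual formulation): a soft-thresholding step that separates out sparse contamination (e.g., occlusion) when learning templates, and a Noisy-OR bag likelihood, the standard probabilistic MIL model, which makes a bag positive if at least one of its instances is, so misaligned patches in a positive bag do not force wrong instance labels. All function names and parameters here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, lam):
    # Elementwise shrinkage: absorbs sparse corruption (e.g., occlusion)
    # into an error term so the learned feature templates stay clean.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def instance_prob(w, X):
    # Sigmoid score of each instance (candidate patch) in a bag
    # under a linear classifier with weights w.
    return 1.0 / (1.0 + np.exp(-X @ w))

def bag_prob(w, X):
    # Noisy-OR bag likelihood: the bag is positive if at least one
    # instance is, which tolerates misaligned samples in a positive bag.
    return 1.0 - np.prod(1.0 - instance_prob(w, X))
```

With one well-aligned (strongly positive) instance in the bag, `bag_prob` stays high regardless of how many misaligned, low-scoring instances the bag also contains, which is exactly the label-ambiguity behavior MIL is used for here.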

Published

2017-02-12

How to Cite

Lan, X., Yuen, P. C., & Chellappa, R. (2017). Robust MIL-Based Feature Template Learning for Object Tracking. Proceedings of the AAAI Conference on Artificial Intelligence, 31(1). https://doi.org/10.1609/aaai.v31i1.11220