TY - JOUR
AU - Zhou, Jinghao
AU - Wang, Peng
AU - Sun, Haoyang
PY - 2020/04/03
Y2 - 2024/03/29
TI - Discriminative and Robust Online Learning for Siamese Visual Tracking
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 34
IS - 07
SE - AAAI Technical Track: Vision
DO - 10.1609/aaai.v34i07.7002
UR - https://ojs.aaai.org/index.php/AAAI/article/view/7002
SP - 13017-13024
AB - The problem of visual object tracking has traditionally been handled by variant tracking paradigms, either learning a model of the object's appearance exclusively online or matching the object with the target in an offline-trained embedding space. Despite the recent success, each method agonizes over its intrinsic constraint. The online-only approaches suffer from a lack of generalization of the model they learn thus are inferior in target regression, while the offline-only approaches (e.g., convolutional siamese trackers) lack the target-specific context information thus are not discriminative enough to handle distractors, and robust enough to deformation. Therefore, we propose an online module with an attention mechanism for offline siamese networks to extract target-specific features under L2 error. We further propose a filter update strategy adaptive to treacherous background noises for discriminative learning, and a template update strategy to handle large target deformations for robust learning. Effectiveness can be validated in the consistent improvement over three siamese baselines: SiamFC, SiamRPN++, and SiamMask. Beyond that, our model based on SiamRPN++ obtains the best results over six popular tracking benchmarks and can operate beyond real-time.
ER -