Delving into Dynamic Scene Cue-Consistency for Robust 3D Multi-Object Tracking
DOI:
https://doi.org/10.1609/aaai.v40i15.38242
Abstract
3D multi-object tracking is a critical and challenging task in the field of autonomous driving. A common paradigm relies on modeling individual object motion, e.g., Kalman filters, to predict trajectories. While effective in simple scenarios, this approach often struggles in crowded environments or with inaccurate detections, as it overlooks the rich geometric relationships between objects. This highlights the need to leverage spatial cues. However, existing geometry-aware methods can be susceptible to interference from irrelevant objects, leading to ambiguous features and incorrect associations. To address this, we propose focusing on cue-consistency: identifying and matching stable spatial patterns over time. We introduce the Dynamic Scene Cue-Consistency Tracker (DSC-Track) to implement this principle. Firstly, we design a unified spatiotemporal encoder using Point Pair Features (PPF) to learn discriminative trajectory embeddings while suppressing interference. Secondly, our cue-consistency transformer module explicitly aligns consistent feature representations between historical tracks and current detections. Finally, a dynamic update mechanism preserves salient spatiotemporal information for stable online tracking. Extensive experiments on the nuScenes and Waymo Open Datasets validate the effectiveness and robustness of our approach. On the nuScenes benchmark, for instance, our method achieves state-of-the-art performance, reaching 73.2% and 70.3% AMOTA on the validation and test sets, respectively.
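The abstract's spatiotemporal encoder builds on Point Pair Features (PPF), a classic 4-D descriptor of the relative geometry of two oriented points (distance plus three angles involving the connecting vector and the two normals). How the paper integrates PPF into its encoder is not described here; the sketch below shows only the standard PPF computation, with function and variable names chosen for illustration.

```python
import numpy as np

def _angle(u, v):
    # Angle between two vectors, with the cosine clipped for numerical safety.
    c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(c, -1.0, 1.0))

def point_pair_feature(p1, n1, p2, n2):
    """Standard PPF: (||d||, angle(n1, d), angle(n2, d), angle(n1, n2)),
    where d = p2 - p1 connects the two points."""
    d = p2 - p1
    return np.array([
        np.linalg.norm(d),   # pairwise distance
        _angle(n1, d),       # angle between first normal and connecting vector
        _angle(n2, d),       # angle between second normal and connecting vector
        _angle(n1, n2),      # angle between the two normals
    ])
```

Because the descriptor depends only on relative geometry, it is invariant to rigid transformations of the point pair, which is what makes it attractive for matching spatial patterns across frames.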
Published
2026-03-14
How to Cite
Zhang, H., Wang, X., Wu, B., Zheng, T., Yunhua, W., & Yang, Z. (2026). Delving into Dynamic Scene Cue-Consistency for Robust 3D Multi-Object Tracking. Proceedings of the AAAI Conference on Artificial Intelligence, 40(15), 12484–12492. https://doi.org/10.1609/aaai.v40i15.38242
Section
AAAI Technical Track on Computer Vision XII