TY - JOUR
AU - Wang, Yingming
AU - Zhang, Xiangyu
AU - Yang, Tong
AU - Sun, Jian
PY - 2022/06/28
Y2 - 2024/03/28
TI - Anchor DETR: Query Design for Transformer-Based Detector
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 36
IS - 3
SE - AAAI Technical Track on Computer Vision III
DO - 10.1609/aaai.v36i3.20158
UR - https://ojs.aaai.org/index.php/AAAI/article/view/20158
SP - 2567-2575
AB - In this paper, we propose a novel query design for transformer-based object detection. In previous transformer-based detectors, the object queries are a set of learned embeddings. However, each learned embedding has no explicit physical meaning, and we cannot explain where it will focus. Optimization is also difficult because the prediction slot of each object query does not have a specific mode; in other words, each object query does not focus on a specific region. To solve these problems, our query design bases the object queries on anchor points, which are widely used in CNN-based detectors, so each object query focuses on the objects near its anchor point. Moreover, our query design can predict multiple objects at one position, addressing the "one region, multiple objects" difficulty. In addition, we design an attention variant that reduces the memory cost while achieving similar or better performance than the standard attention in DETR. Thanks to the query design and the attention variant, the proposed detector, called Anchor DETR, achieves better performance and runs faster than DETR with 10x fewer training epochs. For example, it achieves 44.2 AP at 19 FPS on the MSCOCO dataset when using the ResNet50-DC5 feature and training for 50 epochs. Extensive experiments on the MSCOCO benchmark prove the effectiveness of the proposed methods. Code is available at https://github.com/megvii-research/AnchorDETR.
ER -