TY - JOUR
AU - Xu, Yonghao
AU - Du, Bo
AU - Zhang, Lefei
AU - Zhang, Qian
AU - Wang, Guoli
AU - Zhang, Liangpei
PY - 2019/07/17
Y2 - 2024/03/29
TI - Self-Ensembling Attention Networks: Addressing Domain Shift for Semantic Segmentation
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 33
IS - 01
SE - AAAI Technical Track: Machine Learning
DO - 10.1609/aaai.v33i01.33015581
UR - https://ojs.aaai.org/index.php/AAAI/article/view/4500
SP - 5581-5588
AB - Recent years have witnessed the great success of deep learning models in semantic segmentation. Nevertheless, these models may not generalize well to unseen image domains due to the phenomenon of domain shift. Since pixel-level annotations are laborious to collect, developing algorithms that can adapt labeled data from the source domain to the target domain is of great significance. To this end, we propose self-ensembling attention networks to reduce the domain gap between different datasets. To the best of our knowledge, the proposed method is the first attempt to introduce a self-ensembling model to domain adaptation for semantic segmentation, which provides a different view on how to learn domain-invariant features. Besides, since different regions in the image usually correspond to different levels of domain gap, we introduce the attention mechanism into the proposed framework to generate attention-aware features, which are further utilized to guide the calculation of the consistency loss in the target domain. Experiments on two benchmark datasets demonstrate that the proposed framework can yield competitive performance compared with state-of-the-art methods.
ER -