Guide Local Feature Matching by Overlap Estimation

Authors

  • Ying Chen, Tencent
  • Dihe Huang, Tsinghua University / Tencent
  • Shang Xu, Tencent
  • Jianlin Liu, Tencent
  • Yong Liu, Tencent

DOI:

https://doi.org/10.1609/aaai.v36i1.19913

Keywords:

Computer Vision (CV)

Abstract

Local image feature matching under large appearance, viewpoint, and distance changes is challenging yet important. Conventional methods detect and match tentative local features across whole images, relying on heuristic consistency checks to guarantee reliable matches. In this paper, we introduce a novel Overlap Estimation method conditioned on image pairs with TRansformer, named OETR, to constrain local feature matching to the commonly visible region. OETR performs overlap estimation in a two-step process: feature correlation followed by overlap regression. As a preprocessing module, OETR can be plugged into any existing local feature detection and matching pipeline to mitigate potential view-angle or scale variance. Extensive experiments show that OETR can boost state-of-the-art local feature matching performance substantially, especially for image pairs with small shared regions. The code will be publicly available at https://github.com/AbyssGaze/OETR.
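As a rough illustration of the pipeline the abstract describes, below is a minimal sketch (in PyTorch) of how an overlap estimator such as OETR could sit in front of an existing matcher. The `oetr` and `matcher` callables and all signatures are assumptions for illustration, not the authors' released API: the estimator is assumed to regress, for each image, the bounding box of the region visible in the other image, and matching then runs only on the corresponding crops.

```python
import torch

def crop_to_overlap(image: torch.Tensor, box_xyxy: torch.Tensor) -> torch.Tensor:
    """Crop a CxHxW image tensor to a predicted overlap box in pixel coords."""
    x1, y1, x2, y2 = box_xyxy.round().long().tolist()
    return image[:, y1:y2, x1:x2]

@torch.no_grad()
def match_with_overlap_prior(oetr, matcher, img0, img1):
    # 1) Overlap estimation (feature correlation + overlap regression inside
    #    the network): each box marks the region of one image that is
    #    visible in the other.
    box0, box1 = oetr(img0.unsqueeze(0), img1.unsqueeze(0))
    box0, box1 = box0.squeeze(0), box1.squeeze(0)

    # 2) Constrain matching to the commonly visible regions; cropping also
    #    shrinks the relative scale gap between the two views.
    crop0 = crop_to_overlap(img0, box0)
    crop1 = crop_to_overlap(img1, box1)

    # 3) Run any off-the-shelf detector/matcher on the crops, then shift
    #    keypoints back into original image coordinates.
    kpts0, kpts1 = matcher(crop0, crop1)  # each of shape (N, 2), (x, y)
    return kpts0 + box0[:2], kpts1 + box1[:2]
```

In this setup the overlap estimator acts purely as a front end: any existing pipeline, whether detector-based (e.g., SuperPoint with SuperGlue) or detector-free, can serve as `matcher` unchanged.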

Published

2022-06-28

How to Cite

Chen, Y., Huang, D., Xu, S., Liu, J., & Liu, Y. (2022). Guide Local Feature Matching by Overlap Estimation. Proceedings of the AAAI Conference on Artificial Intelligence, 36(1), 365-373. https://doi.org/10.1609/aaai.v36i1.19913

Section

AAAI Technical Track on Computer Vision I