Error Aware Monocular Visual Odometry using Vertical Line Pairs for Small Robots in Urban Areas

Authors

  • Ji Zhang, Texas A&M University
  • Dezhen Song, Texas A&M University

DOI:

https://doi.org/10.1609/aaai.v24i1.7723

Keywords:

Vertical line, visual odometry, mobile robots

Abstract

We report a new error-aware monocular visual odometry method that uses only vertical lines, such as the vertical edges of buildings and poles in urban areas, as landmarks. Vertical lines are easy to extract, insensitive to lighting conditions and shadows, and sensitive to robot movements on the ground plane, which makes them more robust features than regular point or line features. We derive a recursive visual odometry method based on vertical line pairs. We analyze how errors are propagated and introduced in the continuous odometry process by deriving a closed-form representation of the covariance matrix. We formulate a minimum-variance ego-motion estimation problem and present a method that assigns weights to different vertical line pairs. The resulting visual odometry method is tested in physical experiments and compared with two existing methods based on point features and line features, respectively. The experimental results show that our method outperforms both counterparts in robustness, accuracy, and speed, with relative errors of less than 2%.
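To make the minimum-variance idea concrete, the following Python sketch is offered as an illustration, not the authors' closed-form derivation: it estimates the planar robot pose from bearing observations of vertical lines via inverse-variance-weighted Gauss-Newton, so noisier lines contribute less, in the spirit of weighting vertical line pairs. All names (`estimate_pose`, `landmarks`, `bearings`, `variances`) are hypothetical, and the ground-plane positions of the lines are assumed known from earlier frames of the recursive odometry.

```python
# Minimal sketch (assumed setup, not the paper's exact method): weighted
# bearing-only pose estimation for a robot moving on the ground plane.
import numpy as np

def estimate_pose(landmarks, bearings, variances, pose0, iters=20):
    """Inverse-variance-weighted Gauss-Newton on bearing residuals.

    landmarks : (N, 2) ground-plane positions of vertical lines
                (assumed triangulated in earlier frames)
    bearings  : (N,)   measured bearings to each line in the robot frame
    variances : (N,)   per-line bearing noise variances (the error-awareness)
    pose0     : (3,)   initial guess (x, y, theta), e.g. the previous pose
    Needs at least three bearings for a unique planar pose.
    """
    pose = np.asarray(pose0, dtype=float).copy()
    W = np.diag(1.0 / np.asarray(variances))  # minimum-variance weights
    for _ in range(iters):
        x, y, th = pose
        dx = landmarks[:, 0] - x
        dy = landmarks[:, 1] - y
        pred = np.arctan2(dy, dx) - th                        # predicted bearings
        r = (bearings - pred + np.pi) % (2 * np.pi) - np.pi   # wrapped residuals
        r2 = dx**2 + dy**2
        # Jacobian of the predicted bearing w.r.t. (x, y, theta)
        J = np.column_stack([dy / r2, -dx / r2, -np.ones_like(dx)])
        delta = np.linalg.solve(J.T @ W @ J, J.T @ W @ r)
        pose += delta
        if np.linalg.norm(delta) < 1e-9:
            break
    cov = np.linalg.inv(J.T @ W @ J)  # first-order pose covariance
    return pose, cov
```

The returned first-order covariance could be carried into the next frame, echoing the recursive error propagation described in the abstract; the paper itself derives the covariance in closed form rather than through the numerical approximation sketched here.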

Published

2010-07-05

How to Cite

Zhang, J., & Song, D. (2010). Error Aware Monocular Visual Odometry using Vertical Line Pairs for Small Robots in Urban Areas. Proceedings of the AAAI Conference on Artificial Intelligence, 24(1), 1645-1650. https://doi.org/10.1609/aaai.v24i1.7723