IDOL: Inertial Deep Orientation-Estimation and Localization


  • Scott Sun Carnegie Mellon University
  • Dennis Melamed Carnegie Mellon University
  • Kris Kitani Carnegie Mellon University



Keywords: Localization, Mapping, and Navigation; Applications


Many smartphone applications use inertial measurement units (IMUs) to sense movement, but the use of these sensors for pedestrian localization can be challenging due to their noise characteristics. Recent data-driven inertial odometry approaches have demonstrated the increasing feasibility of inertial navigation. However, they still rely upon conventional smartphone orientation estimates that they assume to be accurate, while in fact these orientation estimates can be a significant source of error. To address the problem of inaccurate orientation estimates, we present a two-stage, data-driven pipeline using a commodity smartphone that first estimates device orientation and then estimates device position. The orientation module relies on a recurrent neural network and an Extended Kalman Filter to obtain orientation estimates that are then used to rotate raw IMU measurements into the appropriate reference frame. The position module then passes those measurements through another recurrent network architecture to perform localization. Our proposed method outperforms state-of-the-art methods in both orientation and position error on a large dataset we constructed that contains 20 hours of pedestrian motion across 3 buildings and 15 subjects. Code and data are available at
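The abstract's two-stage structure can be sketched as follows. This is a minimal illustration, not the authors' implementation: the `orientation_module` and `position_module` functions are hypothetical placeholders standing in for the paper's learned RNN/EKF components, while the quaternion rotation used to move raw IMU readings into the world frame is standard.

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate 3-vector v by unit quaternion q = [w, x, y, z]."""
    w, x, y, z = q
    u = np.array([x, y, z])
    # v' = v + 2 u x (u x v + w v)  (standard quaternion rotation identity)
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

def orientation_module(imu_window):
    """Placeholder for stage 1: an RNN + EKF would output a smoothed
    unit-quaternion orientation estimate; identity is returned here."""
    return np.array([1.0, 0.0, 0.0, 0.0])

def position_module(world_frame_accel):
    """Placeholder for stage 2: a second recurrent network would regress
    a displacement from world-frame measurements; zeros returned here."""
    return np.zeros(3)

# A window of raw IMU samples: columns are [accel_xyz, gyro_xyz].
imu_window = np.random.randn(200, 6)

# Stage 1: estimate orientation, then rotate raw accelerometer readings
# into the world reference frame.
q = orientation_module(imu_window)
world_accel = np.array([quat_rotate(q, a) for a in imu_window[:, :3]])

# Stage 2: regress position change from the rotated measurements.
delta_p = position_module(world_accel)
```

The key point the sketch captures is the decoupling: position regression never sees device-frame data directly, only measurements already rotated by the orientation estimate, so orientation error propagates directly into position error.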




How to Cite

Sun, S., Melamed, D., & Kitani, K. (2021). IDOL: Inertial Deep Orientation-Estimation and Localization. Proceedings of the AAAI Conference on Artificial Intelligence, 35(7), 6128-6137.



AAAI Technical Track on Intelligent Robots