Robust Depth Completion with Uncertainty-Driven Loss Functions

Authors

  • Yufan Zhu, Xidian University
  • Weisheng Dong, Xidian University
  • Leida Li, Xidian University
  • Jinjian Wu, Xidian University
  • Xin Li, West Virginia University
  • Guangming Shi, Xidian University

DOI:

https://doi.org/10.1609/aaai.v36i3.20275

Keywords:

Computer Vision (CV)

Abstract

Recovering a dense depth image from sparse LiDAR scans is a challenging task. Despite the popularity of color-guided methods for sparse-to-dense depth completion, they treat all pixels equally during optimization, ignoring the uneven distribution of valid measurements in the sparse depth map and the outliers accumulated in the synthesized ground truth. In this work, we introduce uncertainty-driven loss functions to improve the robustness of depth completion and to handle the uncertainty inherent in the task. Specifically, we propose an explicit uncertainty formulation for robust depth completion based on Jeffrey's prior. A parametric uncertainty-driven loss is introduced and translated into new loss functions that are robust to noisy or missing data. Meanwhile, we propose a multiscale joint prediction model that simultaneously predicts depth and uncertainty maps. The estimated uncertainty map is also used to perform adaptive prediction on pixels with high uncertainty, yielding a residual map that refines the completion results. Our method has been tested on the KITTI Depth Completion Benchmark and achieves state-of-the-art robustness in terms of the MAE, IMAE, and IRMSE metrics.
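To make the idea of an uncertainty-driven loss concrete, the sketch below shows the standard heteroscedastic (aleatoric-uncertainty) form that such losses build on: each pixel's residual is down-weighted by a predicted uncertainty, with a log-penalty that discourages the network from declaring everything uncertain. This is a minimal illustration, not the paper's exact Jeffrey's-prior formulation; the function name, arguments, and the use of an L1 residual are all assumptions for the example.

```python
import numpy as np

def uncertainty_weighted_l1(pred, target, log_sigma, valid_mask):
    """Heteroscedastic-style L1 loss over valid pixels (illustrative sketch).

    pred, target : predicted and ground-truth depth maps (same shape)
    log_sigma    : per-pixel log-uncertainty predicted alongside depth
    valid_mask   : boolean mask of pixels with ground-truth depth
    """
    sigma = np.exp(log_sigma)
    # Residuals at high-uncertainty pixels contribute less to the loss,
    # but the log-sigma term penalizes inflating uncertainty everywhere.
    per_pixel = np.abs(pred - target) / sigma + log_sigma
    return per_pixel[valid_mask].mean()

# Toy example: one perfect pixel, one pixel off by 1, unit uncertainty.
pred = np.array([1.0, 2.0])
target = np.array([1.0, 3.0])
log_sigma = np.zeros(2)
mask = np.array([True, True])
loss = uncertainty_weighted_l1(pred, target, log_sigma, mask)  # (0 + 1) / 2
```

Raising `log_sigma` on the erroneous pixel lowers its residual term while adding the log penalty, which is the trade-off that lets such losses tolerate noisy or missing ground truth.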

Published

2022-06-28

How to Cite

Zhu, Y., Dong, W., Li, L., Wu, J., Li, X., & Shi, G. (2022). Robust Depth Completion with Uncertainty-Driven Loss Functions. Proceedings of the AAAI Conference on Artificial Intelligence, 36(3), 3626-3634. https://doi.org/10.1609/aaai.v36i3.20275

Section

AAAI Technical Track on Computer Vision III