Robust Depth Completion with Uncertainty-Driven Loss Functions
Keywords: Computer Vision (CV)
Abstract
Recovering a dense depth image from sparse LiDAR scans is a challenging task. Despite the popularity of color-guided methods for sparse-to-dense depth completion, they treat all pixels equally during optimization, ignoring the uneven distribution of valid measurements in the sparse depth map and the outliers accumulated in the synthesized ground truth. In this work, we introduce uncertainty-driven loss functions to improve the robustness of depth completion and to handle the uncertainty inherent in the task. Specifically, we propose an explicit uncertainty formulation for robust depth completion based on Jeffrey's prior. A parametric uncertainty-driven loss is introduced and translated into new loss functions that are robust to noisy or missing data. Meanwhile, we propose a multiscale joint prediction model that simultaneously predicts depth and uncertainty maps. The estimated uncertainty map is also used to perform adaptive prediction on pixels with high uncertainty, yielding a residual map that refines the completion results. Our method has been evaluated on the KITTI Depth Completion Benchmark and achieves state-of-the-art robustness in terms of the MAE, iMAE, and iRMSE metrics.
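To illustrate the general idea of an uncertainty-driven loss, the sketch below implements a generic heteroscedastic (uncertainty-attenuated) L1 loss in NumPy. This is a common formulation in which the network jointly predicts depth and a per-pixel log-uncertainty; it is only an assumed stand-in, not the paper's exact Jeffrey's-prior loss, and the function name and signature are hypothetical.

```python
import numpy as np

def uncertainty_driven_l1(pred_depth, gt_depth, log_sigma, valid_mask):
    """Generic uncertainty-attenuated L1 loss (assumed form, not the
    paper's exact Jeffrey's-prior formulation).

    Residuals at pixels with high predicted uncertainty are down-weighted
    by exp(-log_sigma); the additive log_sigma term penalizes predicting
    large uncertainty everywhere, so the two terms trade off.
    """
    residual = np.abs(pred_depth - gt_depth)
    per_pixel = np.exp(-log_sigma) * residual + log_sigma
    # Sparse ground truth: average only over pixels with valid measurements.
    return per_pixel[valid_mask].mean()
```

Under this form, a pixel whose ground-truth value is an outlier can be assigned high uncertainty by the network, which reduces that pixel's contribution to the total loss instead of letting it dominate training.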
How to Cite
Zhu, Y., Dong, W., Li, L., Wu, J., Li, X., & Shi, G. (2022). Robust Depth Completion with Uncertainty-Driven Loss Functions. Proceedings of the AAAI Conference on Artificial Intelligence, 36(3), 3626-3634. https://doi.org/10.1609/aaai.v36i3.20275
AAAI Technical Track on Computer Vision III