EvHDR-GS: Event-guided HDR Video Reconstruction with 3D Gaussian Splatting

Authors

  • Zehao Chen, Zhejiang University
  • Zhan Lu, Nanyang Technological University
  • De Ma, Zhejiang University
  • Huajin Tang, Zhejiang University
  • Xudong Jiang, Nanyang Technological University
  • Qian Zheng, Zhejiang University
  • Gang Pan, Zhejiang University

DOI:

https://doi.org/10.1609/aaai.v39i3.32237

Abstract

High Dynamic Range (HDR) video reconstruction seeks to accurately restore the extensive dynamic range present in real-world scenes and is widely employed in downstream applications. Existing methods typically operate on one or a small number of consecutive frames, which often leads to inconsistent brightness across the video due to their limited perspective on the video sequence. Moreover, supervised learning-based approaches are susceptible to data bias, resulting in reduced effectiveness when confronted with test inputs that exhibit a domain gap relative to the training data. To address these limitations, we present an event-guided HDR video reconstruction method that builds a 3D Gaussian Splatting (3DGS) representation, enforcing consistent brightness through 3D consistency. We introduce HDR 3D Gaussians capable of simultaneously representing HDR and low-dynamic-range (LDR) colors. Furthermore, we incorporate a learnable HDR-to-LDR transformation, optimized from the input event streams and LDR frames, to eliminate the data bias. Experimental results on both synthetic and real-world datasets demonstrate that the proposed method achieves state-of-the-art performance.
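The paper's learnable HDR-to-LDR transformation is optimized jointly with the 3DGS scene; the details of its parameterization are not given in this abstract. As a simplified, hypothetical illustration, if the camera response were modeled as an exposure scaling followed by a gamma curve (an assumption of this sketch, not the paper's actual model), the two parameters could be recovered from paired HDR/LDR samples by least squares in log space:

```python
import numpy as np

def tone_map(hdr, exposure, gamma):
    """Hypothetical parametric HDR->LDR curve: exposure scaling then gamma."""
    return (exposure * hdr) ** (1.0 / gamma)

# Synthetic HDR radiance samples and LDR observations from an "unknown" curve.
rng = np.random.default_rng(0)
hdr = rng.uniform(0.01, 2.0, size=1000)
ldr = tone_map(hdr, exposure=0.5, gamma=2.2)

# In log space the model is linear:
#   log(ldr) = (1/gamma) * log(hdr) + (1/gamma) * log(exposure)
# so slope and intercept of a line fit recover both parameters.
slope, intercept = np.polyfit(np.log(hdr), np.log(ldr), 1)
gamma_est = 1.0 / slope
exposure_est = np.exp(intercept / slope)
print(round(exposure_est, 3), round(gamma_est, 3))  # → 0.5 2.2
```

In the actual method the transformation is learned end-to-end from event streams and LDR frames rather than fit in closed form, which is what allows it to adapt to inputs outside any training distribution.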

Published

2025-04-11

How to Cite

Chen, Z., Lu, Z., Ma, D., Tang, H., Jiang, X., Zheng, Q., & Pan, G. (2025). EvHDR-GS: Event-guided HDR Video Reconstruction with 3D Gaussian Splatting. Proceedings of the AAAI Conference on Artificial Intelligence, 39(3), 2367-2375. https://doi.org/10.1609/aaai.v39i3.32237

Issue

Vol. 39 No. 3 (2025)

Section

AAAI Technical Track on Computer Vision II