Edge Consistency for 4D Gaussian Splatting in Dynamic Scene Rendering

Authors

  • Boya Shi (Shanghai Jiaotong University; National University of Defense Technology)
  • Thomas N Guan (National University of Defense Technology)
  • Yi Xiaodong (National University of Defense Technology)

DOI:

https://doi.org/10.1609/aaai.v40i11.37847

Abstract

Existing dynamic scene rendering methods often struggle to preserve sharp edges and maintain temporal consistency. To address these challenges, we introduce Edge 4D Gaussian Splatting (Edge4DGS), a real-time rendering framework that reconstructs fine-grained geometry from sparse monocular inputs in dynamic scenes. Edge4DGS employs a hybrid geometric representation that augments Gaussian primitives with convex hulls, enabling accurate modeling of hard surfaces and complex boundaries. To enhance spatial precision, we introduce an edge consistency regularization leveraging optical flow, guiding Gaussian distributions to align with true object contours. To enforce temporal coherence, we extend the regularization from discrete time steps to continuous unit intervals, enabling accurate motion modeling and reducing flickering artifacts. A two-stage coarse-to-fine optimization further improves geometric fidelity while preserving computational efficiency. Extensive experiments on monocular and multi-view motion datasets demonstrate that Edge4DGS achieves real-time, high-resolution rendering and consistently surpasses state-of-the-art methods, reducing LPIPS by 56.25%.
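To make the idea of flow-guided edge consistency concrete, here is a minimal, hypothetical sketch of such a regularizer: it compares the edge map of a rendered frame at time t1 against the edge map of the frame at t0 backward-warped by optical flow. All function names and the finite-difference edge detector are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def edge_map(img):
    # Finite-difference gradient magnitude as a simple edge detector
    # (a stand-in for whatever edge extractor the paper uses).
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, :-1] = img[:, 1:] - img[:, :-1]
    gy[:-1, :] = img[1:, :] - img[:-1, :]
    return np.sqrt(gx ** 2 + gy ** 2)

def warp(img, flow):
    # Nearest-neighbour backward warp of img by a per-pixel flow field
    # of shape (H, W, 2), where flow[..., 0] is the x displacement.
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip(np.round(xs + flow[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip(np.round(ys + flow[..., 1]).astype(int), 0, h - 1)
    return img[src_y, src_x]

def edge_consistency_loss(rendered_t1, rendered_t0, flow_t0_to_t1):
    # Penalize the L1 mismatch between edges in the frame at t1 and
    # the flow-warped edges from the frame at t0.
    warped_edges = edge_map(warp(rendered_t0, flow_t0_to_t1))
    return float(np.mean(np.abs(edge_map(rendered_t1) - warped_edges)))
```

Under this toy formulation, a static scene (identical frames, zero flow) incurs zero loss, while edges that drift between frames without a matching flow are penalized, which is the flicker-reduction intuition the abstract describes.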

Published

2026-03-14

How to Cite

Shi, B., Guan, T. N., & Xiaodong, Y. (2026). Edge Consistency for 4D Gaussian Splatting in Dynamic Scene Rendering. Proceedings of the AAAI Conference on Artificial Intelligence, 40(11), 8923–8932. https://doi.org/10.1609/aaai.v40i11.37847

Section

AAAI Technical Track on Computer Vision VIII