PATexGS: Perceptual-Adaptive Texture Scheduling for Visual Coherence in Textured Gaussian Splatting
DOI:
https://doi.org/10.1609/aaai.v40i12.37999
Abstract
3D Gaussian Splatting (3DGS) has emerged as a mainstream solution for real-time rendering and high-fidelity novel view synthesis. Building on this foundation, methods based on Textured Gaussians further improve representational capacity by incorporating explicit texture mapping into Gaussians. However, their reliance on a fixed texture resolution often results in noticeable visual incoherence, producing artifacts such as aliasing or inconsistent sharpness across viewpoints. To address these issues, we propose PATexGS, a perceptual-adaptive texture scheduling framework designed to improve visual coherence for Textured Gaussians. Specifically, we introduce an entropy-guided texture allocation strategy that dynamically adjusts texture resolution based on each Gaussian’s spatial gradient and rendering contribution, consistently preserving detail while remaining memory efficient. Furthermore, we incorporate a mipmap-inspired hierarchical scheduling mechanism that adaptively selects texture levels according to view-dependent projection scale, effectively suppressing aliasing and further enhancing perceptual consistency. Extensive experiments on diverse real-world scenes demonstrate that PATexGS significantly improves visual coherence while maintaining high rendering quality, outperforming existing TexturedGS variants in both perceptual fidelity and storage efficiency.
Published
2026-03-14
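The view-dependent, mipmap-inspired level selection described in the abstract can be sketched as below. This is a minimal illustrative assumption, not the authors' implementation: the function name, parameters, and rounding policy are hypothetical, and only the standard mipmap idea of matching texel density to on-screen pixel density is shown.

```python
import math

def select_mip_level(texture_res: int, projected_px: float, num_levels: int) -> int:
    """Pick a mip level so one texel covers roughly one screen pixel.

    texture_res:  base texture resolution of the Gaussian (texels per side)
    projected_px: on-screen extent of the projected splat (pixels per side)
    num_levels:   number of available mip levels (level 0 = finest)
    """
    if projected_px <= 0:
        # Degenerate projection (behind camera / zero area): coarsest level.
        return num_levels - 1
    # log2 of texels-per-pixel; positive values mean the base texture
    # is oversampled for this view and a coarser level suffices.
    lod = math.log2(texture_res / projected_px)
    return max(0, min(num_levels - 1, round(lod)))
```

For example, a Gaussian with a 64-texel texture that projects to only 16 pixels would be sampled from level 2 (a 16-texel mip), avoiding the aliasing that sampling the full-resolution texture would cause.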
How to Cite
Wang, Y., Ma, D., Chen, X., & Guan, T. (2026). PATexGS: Perceptual-Adaptive Texture Scheduling for Visual Coherence in Textured Gaussian Splatting. Proceedings of the AAAI Conference on Artificial Intelligence, 40(12), 10297–10305. https://doi.org/10.1609/aaai.v40i12.37999
Issue
Section
AAAI Technical Track on Computer Vision IX