PNeSM: Arbitrary 3D Scene Stylization via Prompt-Based Neural Style Mapping
DOI:
https://doi.org/10.1609/aaai.v38i2.27870
Keywords:
CV: 3D Computer Vision, CV: Computational Photography, Image & Video Synthesis
Abstract
3D scene stylization refers to transforming the appearance of a 3D scene to match a given style image, ensuring that images rendered from different viewpoints exhibit the same style as the given style image while maintaining the 3D consistency of the stylized scene. Several existing methods have obtained impressive results in stylizing 3D scenes. However, the models proposed by these methods need to be re-trained when applied to a new scene. In other words, their models are coupled with a specific scene and cannot adapt to arbitrary other scenes. To address this issue, we propose a novel 3D scene stylization framework that transfers an arbitrary style to an arbitrary scene, without any style-related or scene-related re-training. Concretely, we first map the appearance of the 3D scene into a 2D style pattern space, which realizes complete disentanglement of the geometry and appearance of the 3D scene and enables our model to generalize to arbitrary 3D scenes. Then we stylize the appearance of the 3D scene in the 2D style pattern space via a prompt-based 2D stylization algorithm. Experimental results demonstrate that our proposed framework is superior to SOTA methods in both visual quality and generalization.
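The abstract describes a pipeline in which scene appearance is unwrapped into a 2D style pattern space, stylized there, and queried back during rendering. The sketch below is not the authors' code; it is a minimal, hedged illustration of that idea, using AdaIN as a stand-in for the paper's prompt-based 2D stylizer. All names (adain, sample_color, pattern_plane, uv) are hypothetical.

```python
# Hedged sketch: restyle a scene's appearance in a 2D "pattern plane" while leaving
# geometry untouched, then look colors up via UV coordinates at render time.
import torch
import torch.nn.functional as F


def adain(content: torch.Tensor, style: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    """Re-normalize content features (C, H, W) to the channel-wise mean/std of style features."""
    c_mean, c_std = content.mean(dim=(1, 2), keepdim=True), content.std(dim=(1, 2), keepdim=True)
    s_mean, s_std = style.mean(dim=(1, 2), keepdim=True), style.std(dim=(1, 2), keepdim=True)
    return (content - c_mean) / (c_std + eps) * s_std + s_mean


def sample_color(pattern_plane: torch.Tensor, uv: torch.Tensor) -> torch.Tensor:
    """Fetch appearance for 3D sample points through their UVs in the 2D pattern plane.

    pattern_plane: (C, H, W); uv: (N, 2) with coordinates in [-1, 1].
    """
    grid = uv.view(1, -1, 1, 2)                            # (1, N, 1, 2) for grid_sample
    plane = pattern_plane.unsqueeze(0)                     # (1, C, H, W)
    out = F.grid_sample(plane, grid, align_corners=True)   # (1, C, N, 1)
    return out[0, :, :, 0].t()                             # (N, C)


if __name__ == "__main__":
    # Toy data: an appearance plane for one scene and a feature map of a style image (assumed).
    appearance_plane = torch.rand(3, 64, 64)
    style_features = torch.rand(3, 64, 64)

    # Stylization happens entirely in the 2D pattern space; scene geometry is not modified.
    stylized_plane = adain(appearance_plane, style_features)

    # At render time, ray sample points query the stylized plane through their UVs.
    uv = torch.rand(1024, 2) * 2 - 1                       # hypothetical UVs of ray samples
    colors = sample_color(stylized_plane, uv)
    print(colors.shape)                                    # torch.Size([1024, 3])
```

Because only the 2D plane is edited, any scene whose appearance can be mapped to such a plane could, in principle, be restyled without per-scene retraining, which is the generalization property the abstract emphasizes.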
Published
2024-03-24
How to Cite
Chen, J., Xing, W., Sun, J., Chu, T., Huang, Y., Ji, B., Zhao, L., Lin, H., Chen, H., & Wang, Z. (2024). PNeSM: Arbitrary 3D Scene Stylization via Prompt-Based Neural Style Mapping. Proceedings of the AAAI Conference on Artificial Intelligence, 38(2), 1091-1099. https://doi.org/10.1609/aaai.v38i2.27870
Issue
Section
AAAI Technical Track on Computer Vision I