Implicit Neural Representation with Multi-Scale Sine Activation
DOI:
https://doi.org/10.1609/aaai.v40i26.39305
Abstract
Implicit Neural Representations (INRs) have become a powerful paradigm for modeling continuous signals in computer vision, graphics, and scientific computing. However, multilayer perceptrons (MLPs) generally suffer from severe spectral bias, which limits their ability to accurately model high-frequency details and multi-scale structures. To address this challenge, we propose a novel Multi-Scale Sine Activation (MSA), which explicitly introduces multi-scale frequency responses by incorporating multiple sets of sine activations with logarithmically spaced frequencies in parallel at each layer. MSA is further combined with an amplitude modulation mechanism to ensure numerical stability and robust optimization across different frequency channels. We conduct extensive experiments on a series of challenging tasks, including 1D multi-scale function fitting, image representation, video representation, 3D shape representation, and PDE solving. Experimental results show that MSA outperforms existing state-of-the-art methods in terms of reconstruction accuracy, detail preservation, and training stability.
Published
2026-03-14
How to Cite
Han, J., Wei, S., Wu, M., Yu, L., Li, W., Sun, L., … Pang, Y. (2026). Implicit Neural Representation with Multi-Scale Sine Activation. Proceedings of the AAAI Conference on Artificial Intelligence, 40(26), 21567–21575. https://doi.org/10.1609/aaai.v40i26.39305
Issue
Section
AAAI Technical Track on Machine Learning III
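The abstract describes MSA as applying, at each layer, several sine activations in parallel with logarithmically spaced frequencies, with per-branch amplitude modulation for stability. The paper's actual implementation is not given here; the following is a minimal NumPy sketch of that idea under stated assumptions: the function name `msa_layer`, the frequency range `[1, 100]`, the four-scale setting, and the `1/f` amplitude rule are all illustrative choices, not the authors' specification.

```python
import numpy as np

def msa_layer(x, weight, bias, freqs, amps):
    """Sketch of a multi-scale sine activation layer.

    Applies a shared linear map, then one sine activation per frequency
    scale in parallel, each scaled by an amplitude factor, and
    concatenates the branch outputs along the feature axis.
    """
    pre = x @ weight + bias                          # shared linear pre-activation
    branches = [a * np.sin(f * pre) for f, a in zip(freqs, amps)]
    return np.concatenate(branches, axis=-1)

# Logarithmically spaced frequencies across 4 scales (hypothetical range).
freqs = np.logspace(0, 2, num=4)                     # 1 ... 100
amps = 1.0 / freqs                                   # illustrative amplitude modulation

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16))                     # batch of 8, 16 input features
W = rng.standard_normal((16, 32)) / np.sqrt(16)      # fan-in-scaled weights
b = np.zeros(32)

out = msa_layer(x, W, b, freqs, amps)
print(out.shape)  # (8, 128): 4 frequency branches x 32 features each
```

With the `1/f` amplitude choice, higher-frequency branches contribute proportionally smaller magnitudes, which is one plausible way to keep the concatenated activations numerically well scaled; the paper's own modulation mechanism may differ.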