Implicit Neural Representation with Multi-Scale Sine Activation

Authors

  • Jufeng Han, AnnLab, Institute of Semiconductors, Chinese Academy of Sciences; School of Integrated Circuits, University of Chinese Academy of Sciences
  • Shu Wei, AnnLab, Institute of Semiconductors, Chinese Academy of Sciences; College of Materials Science and Opto-Electronic Technology, University of Chinese Academy of Sciences
  • Min Wu, AnnLab, Institute of Semiconductors, Chinese Academy of Sciences; School of Integrated Circuits, University of Chinese Academy of Sciences
  • Lina Yu, AnnLab, Institute of Semiconductors, Chinese Academy of Sciences; College of Materials Science and Opto-Electronic Technology, University of Chinese Academy of Sciences
  • Weijun Li, AnnLab, Institute of Semiconductors, Chinese Academy of Sciences; School of Integrated Circuits, University of Chinese Academy of Sciences
  • Linjun Sun, AnnLab, Institute of Semiconductors, Chinese Academy of Sciences; School of Integrated Circuits, University of Chinese Academy of Sciences
  • Hong Qin, AnnLab, Institute of Semiconductors, Chinese Academy of Sciences; School of Integrated Circuits, University of Chinese Academy of Sciences
  • Yan Pang, AnnLab, Institute of Semiconductors, Chinese Academy of Sciences; College of Materials Science and Opto-Electronic Technology, University of Chinese Academy of Sciences

DOI:

https://doi.org/10.1609/aaai.v40i26.39305

Abstract

Implicit Neural Representations (INRs) have become a powerful paradigm for modeling continuous signals in computer vision, graphics, and scientific computing. However, multilayer perceptrons (MLPs) generally suffer from severe spectral bias, which limits their ability to accurately model high-frequency details and multi-scale structures. To address this challenge, we propose a novel Multi-Scale Sine Activation (MSA), which explicitly introduces multi-scale frequency responses by incorporating multiple sets of sine activations with logarithmically spaced frequencies in parallel at each layer. MSA is further combined with an amplitude modulation mechanism to ensure numerical stability and robust optimization across the frequency channels. We conduct extensive experiments on a series of challenging tasks, including 1D multi-scale function fitting, image representation, video representation, 3D shape representation, and PDE solving. Experimental results show that MSA outperforms existing state-of-the-art methods in terms of reconstruction accuracy, detail preservation, and training stability.
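The abstract describes MSA as parallel sine activations at logarithmically spaced frequencies, combined with amplitude modulation for stability. The paper's exact formulation is not reproduced here, so the following is a minimal NumPy sketch of that idea under stated assumptions: the frequency grid, the `1/omega` amplitude damping, and all names (`msa_layer`, `omegas`, `amplitudes`) are illustrative choices, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def msa_layer(x, W, b, omegas, amplitudes):
    """One MSA-style layer (sketch): a shared linear map followed by
    parallel sine activations at different frequencies, each channel
    scaled by an amplitude factor, then summed."""
    z = x @ W + b                      # (batch, hidden) pre-activation
    out = np.zeros_like(z)
    for omega, a in zip(omegas, amplitudes):
        out += a * np.sin(omega * z)   # one parallel frequency channel
    return out

# Logarithmically spaced frequencies, e.g. 1, 10, 100 (an assumption)
omegas = np.logspace(0, 2, num=3)
# Amplitude modulation: damp high-frequency channels so no single
# channel dominates the output scale (also an assumption)
amplitudes = 1.0 / omegas

in_dim, hidden = 2, 16
W = rng.normal(scale=1.0 / np.sqrt(in_dim), size=(in_dim, hidden))
b = np.zeros(hidden)

x = rng.uniform(-1.0, 1.0, size=(4, in_dim))   # e.g. 2D coordinates
y = msa_layer(x, W, b, omegas, amplitudes)
print(y.shape)  # (4, 16)
```

Stacking several such layers and ending with a plain linear head would give an INR in the style of SIREN, but with each hidden layer responding at multiple scales at once rather than a single fixed frequency.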

Published

2026-03-14

How to Cite

Han, J., Wei, S., Wu, M., Yu, L., Li, W., Sun, L., … Pang, Y. (2026). Implicit Neural Representation with Multi-Scale Sine Activation. Proceedings of the AAAI Conference on Artificial Intelligence, 40(26), 21567–21575. https://doi.org/10.1609/aaai.v40i26.39305

Section

AAAI Technical Track on Machine Learning III