SAOT: An Enhanced Locality-Aware Spectral Transformer for Solving PDEs

Authors

  • Chenhong Zhou, Hong Kong Baptist University
  • Jie Chen, Hong Kong Baptist University
  • Zaifeng Yang, Institute of High Performance Computing, A*STAR, Singapore

DOI:

https://doi.org/10.1609/aaai.v40i34.40128

Abstract

Neural operators have shown great potential in solving families of Partial Differential Equations (PDEs) by modeling the mappings between input and output functions. The Fourier Neural Operator (FNO) implements global convolutions by parameterizing integral operators in Fourier space. However, it often produces over-smoothed solutions and fails to capture local details and high-frequency components. To address these limitations, we investigate incorporating the spatial-frequency localization property of wavelet transforms into the Transformer architecture. We propose a novel Wavelet Attention (WA) module with linear computational complexity to efficiently learn locality-aware features. Building upon WA, we further develop the Spectral Attention Operator Transformer (SAOT), a hybrid spectral Transformer framework that integrates WA’s localized focus with the global receptive field of Fourier-based Attention (FA) through a gated fusion block. Experimental results demonstrate that WA significantly mitigates the limitations of FA and outperforms existing wavelet-based neural operators by a large margin. By integrating locality-aware and global spectral representations, SAOT achieves state-of-the-art performance on six operator learning benchmarks and exhibits strong discretization invariance.
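The abstract's gated fusion of the locality-aware (WA) and global (FA) branches can be sketched as a learned per-element gate that interpolates between the two feature streams. The sigmoid-gate form below is a common fusion pattern used here purely for illustration; it is an assumption, not necessarily SAOT's exact formulation, and all shapes and names are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(local_feat, global_feat, W, b):
    """Fuse locality-aware (wavelet) and global (Fourier) attention features.

    A per-element gate g in (0, 1) interpolates between the two branches:
        g   = sigmoid([local; global] @ W + b)
        out = g * local + (1 - g) * global
    Shapes: local_feat, global_feat are (n, d); W is (2d, d); b is (d,).
    NOTE: this gating form is a generic sketch, an assumption for
    illustration rather than the paper's exact fusion block.
    """
    g = sigmoid(np.concatenate([local_feat, global_feat], axis=-1) @ W + b)
    return g * local_feat + (1.0 - g) * global_feat

# Tiny usage example with random features (hypothetical sizes)
rng = np.random.default_rng(0)
n, d = 4, 8
local = rng.standard_normal((n, d))
glob = rng.standard_normal((n, d))
W = rng.standard_normal((2 * d, d)) * 0.1
b = np.zeros(d)
fused = gated_fusion(local, glob, W, b)
print(fused.shape)
```

Because the gate lies strictly in (0, 1), each output element is a convex combination of the corresponding local and global feature values, so neither branch can be entirely discarded.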

Published

2026-03-14

How to Cite

Zhou, C., Chen, J., & Yang, Z. (2026). SAOT: An Enhanced Locality-Aware Spectral Transformer for Solving PDEs. Proceedings of the AAAI Conference on Artificial Intelligence, 40(34), 28928–28936. https://doi.org/10.1609/aaai.v40i34.40128

Section

AAAI Technical Track on Machine Learning XI