Inducing Point Operator Transformer: A Flexible and Scalable Architecture for Solving PDEs

Authors

  • Seungjun Lee, Alsemy, South Korea
  • Taeil Oh, Alsemy, South Korea

DOI:

https://doi.org/10.1609/aaai.v38i1.27766

Keywords:

APP: Natural Sciences, ML: Applications

Abstract

Solving partial differential equations (PDEs) by learning the solution operators has emerged as an attractive alternative to traditional numerical methods. However, implementing such architectures presents two main challenges: flexibility in handling irregular and arbitrary input and output formats, and scalability to large discretizations. Most existing architectures are limited to a prescribed structure or are infeasible to scale to large inputs and outputs. To address these issues, we introduce an attention-based model called the Inducing Point Operator Transformer (IPOT). Inspired by inducing point methods, IPOT is designed to handle any input function and output query while capturing global interactions in a computationally efficient way. By decoupling the input/output discretizations from the processor through a smaller latent bottleneck, IPOT offers flexibility in processing arbitrary discretizations and scales linearly with the size of the inputs/outputs. Our experimental results demonstrate that IPOT achieves strong performance with manageable computational complexity on an extensive range of PDE benchmarks and real-world weather forecasting scenarios, compared with state-of-the-art methods. Our code is publicly available at https://github.com/7tl7qns7ch/IPOT.
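To make the encode-process-decode idea in the abstract concrete, below is a minimal PyTorch sketch of a Perceiver-style operator with learned inducing points: arbitrary input points are cross-attended onto a small latent array, self-attention runs only among the latents, and arbitrary output queries cross-attend back against the latents. This is not the authors' implementation (see the linked repository); the class name IPOTSketch, layer choices, and hyperparameters are illustrative assumptions.

    import torch
    import torch.nn as nn

    class IPOTSketch(nn.Module):
        """Hypothetical sketch of the inducing-point encode-process-decode
        pattern described in the abstract; names and sizes are assumptions."""

        def __init__(self, in_dim=3, out_dim=1, latent_dim=128,
                     num_latents=64, num_heads=4):
            super().__init__()
            # Learned inducing points: a fixed-size latent array whose
            # size is independent of the input/output discretization.
            self.latents = nn.Parameter(torch.randn(num_latents, latent_dim))
            self.in_proj = nn.Linear(in_dim, latent_dim)
            self.query_proj = nn.Linear(2, latent_dim)  # e.g., 2-D query coords
            self.encode_attn = nn.MultiheadAttention(latent_dim, num_heads,
                                                     batch_first=True)
            self.process = nn.TransformerEncoder(
                nn.TransformerEncoderLayer(latent_dim, num_heads,
                                           batch_first=True),
                num_layers=4,
            )
            self.decode_attn = nn.MultiheadAttention(latent_dim, num_heads,
                                                     batch_first=True)
            self.out_proj = nn.Linear(latent_dim, out_dim)

        def forward(self, inputs, queries):
            # inputs:  (B, N_in, in_dim)  sampled input function, any N_in
            # queries: (B, N_out, 2)      arbitrary output locations, any N_out
            B = inputs.shape[0]
            z = self.latents.unsqueeze(0).expand(B, -1, -1)
            x = self.in_proj(inputs)
            # Encoder: latents attend to inputs -> cost O(N_in * num_latents).
            z, _ = self.encode_attn(z, x, x)
            # Processor: self-attention only among the small latent set.
            z = self.process(z)
            # Decoder: queries attend to latents -> cost O(N_out * num_latents).
            q = self.query_proj(queries)
            y, _ = self.decode_attn(q, z, z)
            return self.out_proj(y)

    model = IPOTSketch()
    u = torch.randn(2, 500, 3)   # 500 irregularly sampled input points
    xy = torch.rand(2, 1000, 2)  # 1000 output query coordinates
    print(model(u, xy).shape)    # torch.Size([2, 1000, 1])

In this sketch the quadratic self-attention cost applies only to the fixed number of latents, so the total cost grows linearly in the number of input points and output queries, which is the scaling behavior the abstract claims for IPOT.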

Published

2024-03-25

How to Cite

Lee, S., & Oh, T. (2024). Inducing Point Operator Transformer: A Flexible and Scalable Architecture for Solving PDEs. Proceedings of the AAAI Conference on Artificial Intelligence, 38(1), 153-161. https://doi.org/10.1609/aaai.v38i1.27766

Issue

Vol. 38 No. 1 (2024)

Section

AAAI Technical Track on Application Domains