InParformer: Evolutionary Decomposition Transformers with Interactive Parallel Attention for Long-Term Time Series Forecasting

Authors

  • Haizhou Cao (Computer Network Information Center, Chinese Academy of Sciences; University of Chinese Academy of Sciences)
  • Zhenhao Huang (North China Electric Power University)
  • Tiechui Yao (Computer Network Information Center, Chinese Academy of Sciences; University of Chinese Academy of Sciences)
  • Jue Wang (Computer Network Information Center, Chinese Academy of Sciences; University of Chinese Academy of Sciences)
  • Hui He (North China Electric Power University)
  • Yangang Wang (Computer Network Information Center, Chinese Academy of Sciences; University of Chinese Academy of Sciences)

DOI:

https://doi.org/10.1609/aaai.v37i6.25845

Keywords:

ML: Deep Neural Architectures, ML: Deep Neural Network Algorithms, ML: Time-Series/Data Streams

Abstract

Long-term time series forecasting (LTSF) provides substantial benefits for numerous real-world applications, while placing heavy demands on a model's capacity to capture long-range dependencies. Recent Transformer-based models have significantly improved LTSF performance. It is worth noting that the Transformer, with its self-attention mechanism, was originally proposed to model language sequences, whose tokens (i.e., words) are discrete and highly semantic. Unlike language sequences, however, most time series consist of sequential, continuous numeric points. Time steps with temporal redundancy are weakly semantic, and time-domain tokens alone can hardly depict the overall properties of a series (e.g., its overall trend and periodic variations). To address these problems, we propose a novel Transformer-based forecasting model named InParformer with an Interactive Parallel Attention (InPar Attention) mechanism. InPar Attention is designed to learn long-range dependencies comprehensively in both the frequency and time domains. To improve its learning capacity and efficiency, we further design several mechanisms, including query selection, key-value pair compression, and recombination. Moreover, InParformer is built with evolutionary seasonal-trend decomposition modules to enhance the extraction of intricate temporal patterns. Extensive experiments on six real-world benchmarks show that InParformer outperforms state-of-the-art forecasting Transformers.
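
The sketch below is not taken from the paper; it is a minimal, hypothetical illustration of two ingredients the abstract alludes to: a moving-average seasonal-trend decomposition (a common building block in decomposition-based forecasting Transformers) and an attention score computed over rFFT coefficients, i.e., in the frequency domain rather than over raw time steps. The actual evolutionary decomposition modules and the InPar Attention mechanism (with query selection, key-value compression, and recombination) are not specified on this page, so all function names and hyperparameters here are assumptions.

```python
# Illustrative sketch only; not the InParformer implementation.
import torch
import torch.nn.functional as F


def series_decomp(x: torch.Tensor, kernel_size: int = 25):
    """Split a series of shape (batch, length, channels) into seasonal and
    trend parts with a moving average, padding the ends to keep the length."""
    pad = (kernel_size - 1) // 2
    front = x[:, :1, :].repeat(1, pad, 1)
    back = x[:, -1:, :].repeat(1, kernel_size - 1 - pad, 1)
    padded = torch.cat([front, x, back], dim=1)
    # average-pool along time per channel to obtain the trend component
    trend = F.avg_pool1d(padded.transpose(1, 2), kernel_size, stride=1).transpose(1, 2)
    seasonal = x - trend
    return seasonal, trend


def frequency_attention_scores(q: torch.Tensor, k: torch.Tensor):
    """Dot-product attention scores over rFFT coefficients, i.e., computed in
    the frequency domain instead of over time-domain tokens."""
    q_f = torch.fft.rfft(q, dim=1)   # (batch, freq, channels), complex-valued
    k_f = torch.fft.rfft(k, dim=1)
    scores = torch.einsum("bfc,bgc->bfg", q_f, torch.conj(k_f)).real
    return torch.softmax(scores / q.shape[-1] ** 0.5, dim=-1)


if __name__ == "__main__":
    x = torch.randn(8, 96, 7)              # 8 series, 96 time steps, 7 variables
    seasonal, trend = series_decomp(x)
    attn = frequency_attention_scores(seasonal, seasonal)
    print(seasonal.shape, trend.shape, attn.shape)
```

Decomposing first and attending over the seasonal component is one plausible ordering; the paper's own modules may combine these steps differently.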

Published

2023-06-26

How to Cite

Cao, H., Huang, Z., Yao, T., Wang, J., He, H., & Wang, Y. (2023). InParformer: Evolutionary Decomposition Transformers with Interactive Parallel Attention for Long-Term Time Series Forecasting. Proceedings of the AAAI Conference on Artificial Intelligence, 37(6), 6906-6915. https://doi.org/10.1609/aaai.v37i6.25845

Issue

Vol. 37 No. 6 (2023)

Section

AAAI Technical Track on Machine Learning I