Sparse-Scale Transformer with Bidirectional Awareness for Time Series Forecasting
DOI: https://doi.org/10.1609/aaai.v40i28.39566
Abstract
Time series forecasting (TSF) plays a crucial role in many real-world applications, such as weather prediction and economic planning. While Transformer-based models have shown strong capabilities in modeling long-range dependencies, effectively capturing the multi-scale temporal dynamics inherent in time series remains a major challenge. Existing methods often adopt time windows of varying sizes, which may introduce noisy or irrelevant representations when mismatched with the underlying temporal patterns, potentially leading to overfitting. In this paper, we propose the Sparse-Scale Transformer (SSformer) with Bidirectional Awareness for Time Series Forecasting to enhance multi-scale modeling for time series. Specifically, we propose a novel Sparse-Scale Convolution (SSC) block that imposes sparsity on scales to obtain informative representations by evaluating the intra-scale segment similarity of the time series, and utilizes scale-specific convolutions to extract local patterns. Furthermore, we design a Bidirectional-Scale Interaction (BSI) block to explicitly model scale correlations in both coarse-to-fine and fine-to-coarse directions. Finally, the per-scale predictions are ensembled to fully exploit the complementary forecasting capabilities across scales. Extensive experiments on various real-world datasets demonstrate that SSformer achieves state-of-the-art performance with superior efficiency.
Published
2026-03-14
How to Cite
Liu, Y., Liu, B., Huang, S., Luo, G., Hu, W., Wang, M., & Hong, R. (2026). Sparse-Scale Transformer with Bidirectional Awareness for Time Series Forecasting. Proceedings of the AAAI Conference on Artificial Intelligence, 40(28), 23899–23907. https://doi.org/10.1609/aaai.v40i28.39566
Section
AAAI Technical Track on Machine Learning V
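As a rough illustration only, the three-step pipeline the abstract outlines (scoring scales by intra-scale segment similarity to keep a sparse subset, applying a scale-specific convolution per kept scale, and ensembling the per-scale forecasts) can be sketched in plain NumPy. Every function name, the cosine-similarity scoring rule, the moving-average stand-in for learned convolutions, and the naive per-scale predictor below are illustrative assumptions, not the paper's actual SSformer implementation.

```python
# Hypothetical sketch of the pipeline from the abstract; all details
# here are assumptions, not the paper's actual method.
import numpy as np

def segment_similarity(x, scale):
    # Split the series into non-overlapping segments of length `scale`
    # and score the scale by the mean cosine similarity of adjacent
    # segments (higher = more coherent structure at this scale).
    n = len(x) // scale
    segs = x[: n * scale].reshape(n, scale)
    sims = []
    for a, b in zip(segs[:-1], segs[1:]):
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        sims.append(float(a @ b) / denom if denom > 0 else 0.0)
    return float(np.mean(sims)) if sims else 0.0

def sparse_scale_forecast(x, scales, k=2, horizon=4):
    # 1) Sparse-scale selection: keep only the k most informative scales.
    kept = sorted(scales, key=lambda s: segment_similarity(x, s), reverse=True)[:k]
    preds = []
    for s in kept:
        # 2) Scale-specific local pattern extraction: a moving-average
        #    filter of width s (stand-in for a learned convolution).
        smooth = np.convolve(x, np.ones(s) / s, mode="valid")
        # 3) Naive per-scale forecast: persist the last smoothed value.
        preds.append(np.full(horizon, smooth[-1]))
    # 4) Ensemble the per-scale forecasts by averaging.
    return np.mean(preds, axis=0)

# Toy usage: a clean period-4 signal, so scale 4 scores highest.
t = np.arange(64)
x = np.sin(2 * np.pi * t / 4)
yhat = sparse_scale_forecast(x, scales=[2, 4, 8], k=2, horizon=4)
print(yhat.shape)  # (4,)
```

On this toy signal the period-4 segments are near-identical, so scale 4 scores near 1.0 while the sign-flipping scale-2 segments score near -1.0 and are pruned, mirroring the abstract's point that mismatched scales yield noisy representations.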