HMformer: Unleashing Transformer’s Potential for Time Series Forecasting via Hierarchical Multi-Scale Modeling

Authors

  • Renjun Huang — School of Computer Science and Engineering, Southeast University, Nanjing, China; Zhongguancun Academy, Beijing, China
  • Han Xiao — School of Computer Science and Engineering, Southeast University, Nanjing, China
  • Bingqing Li — School of Computer Science and Engineering, Southeast University, Nanjing, China
  • Baili Zhang — School of Computer Science and Engineering, Southeast University, Nanjing, China; Key Laboratory of New Generation Artificial Intelligence Technology and Its Interdisciplinary Applications (Southeast University), Ministry of Education, Nanjing, China
  • Jianhua Lyu — School of Computer Science and Engineering, Southeast University, Nanjing, China

DOI:

https://doi.org/10.1609/aaai.v40i26.39355

Abstract

Time series forecasting plays a critical role across a wide range of domains. Recently, an increasing number of Transformer-based forecasting models have emerged, achieving remarkably competitive performance. However, real-world time series often exhibit complex multi-scale periodicities that the original Transformer architecture, designed for NLP tasks, is ill-suited to model. To address this limitation, we propose the Hierarchical Multi-scale Time Series Transformer (HMformer), a framework designed specifically for multi-scale time series forecasting. HMformer incorporates a hierarchical cross-scale mixing mechanism that progressively aggregates temporal information from fine to coarse granularities, a scale-adaptive feature expansion design that enhances the extraction of high-level temporal semantics, and a multi-branch complementary prediction strategy that effectively integrates diverse temporal patterns. Together, these components enable HMformer to capture intricate multi-scale temporal dynamics while retaining the Transformer’s inherent strength in modeling long-range dependencies. Extensive experiments on multiple real-world benchmark datasets, covering both long-term and short-term forecasting tasks, demonstrate that HMformer achieves state-of-the-art performance.
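The fine-to-coarse aggregation and multi-branch fusion that the abstract describes can be sketched in plain Python. This is an illustrative toy only: average pooling stands in for hierarchical cross-scale mixing, a naive mean forecast stands in for each Transformer branch, and simple averaging stands in for the complementary fusion. All function names and the choice of scales are assumptions for illustration, not part of HMformer.

```python
def avg_pool(series, scale):
    """Average-pool a 1-D series into non-overlapping windows of length `scale`
    (a stand-in for fine-to-coarse temporal aggregation)."""
    return [sum(series[i:i + scale]) / scale
            for i in range(0, len(series) - scale + 1, scale)]

def branch_forecast(series, horizon):
    """Toy per-branch forecaster: repeat the mean of the last few points.
    In the real model each branch would be a Transformer over one scale."""
    tail = series[-3:] if len(series) >= 3 else series
    level = sum(tail) / len(tail)
    return [level] * horizon

def multi_scale_forecast(series, horizon, scales=(1, 2, 4)):
    """Run one branch per temporal scale, then fuse the complementary
    branch predictions by simple averaging."""
    branches = []
    for s in scales:
        coarse = avg_pool(series, s)        # coarser view of the same series
        branches.append(branch_forecast(coarse, horizon))
    return [sum(vals) / len(vals) for vals in zip(*branches)]

# Example: forecast 2 steps ahead from a short rising series.
forecast = multi_scale_forecast([1, 2, 3, 4, 5, 6, 7, 8], horizon=2)
```

Each scale sees a smoother, lower-frequency version of the input, so the fused prediction blends short-term detail with long-term trend, which is the general intuition behind multi-scale forecasting designs.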

Published

2026-03-14

How to Cite

Huang, R., Xiao, H., Li, B., Zhang, B., & Lyu, J. (2026). HMformer: Unleashing Transformer’s Potential for Time Series Forecasting via Hierarchical Multi-Scale Modeling. Proceedings of the AAAI Conference on Artificial Intelligence, 40(26), 22012–22020. https://doi.org/10.1609/aaai.v40i26.39355

Section

AAAI Technical Track on Machine Learning III