Considering Nonstationary within Multivariate Time Series with Variational Hierarchical Transformer for Forecasting
DOI:
https://doi.org/10.1609/aaai.v38i14.29483
Keywords:
ML: Time-Series/Data Streams, ML: Deep Generative Models & Autoencoders
Abstract
The forecasting of Multivariate Time Series (MTS) has long been an important but challenging task. Due to the non-stationarity across long-distance time steps, previous studies primarily adopt stationarization methods to attenuate the non-stationarity of the original series for better predictability. However, existing methods always operate on the stationarized series, ignoring the inherent non-stationarity, and have difficulty modeling MTS with complex distributions due to the lack of stochasticity. To tackle these problems, we first develop a powerful hierarchical probabilistic generative module to capture the non-stationarity and stochasticity within MTS, and then combine it with a transformer to form a well-defined variational generative dynamic model named Hierarchical Time series Variational Transformer (HTV-Trans), which recovers the intrinsic non-stationary information into temporal dependencies. Being a powerful probabilistic model, HTV-Trans is used to learn expressive representations of MTS and is applied to forecasting tasks. Extensive experiments on diverse datasets show the efficiency of HTV-Trans on MTS forecasting tasks.
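As a rough illustration of the pattern the abstract describes, the following PyTorch-style sketch pairs a variational (reparameterized Gaussian) encoder with a transformer encoder for one-step MTS forecasting. Every module name, dimension, and the single-level (non-hierarchical) wiring here are assumptions for illustration only, not the authors' HTV-Trans implementation.

# Minimal sketch: a variational module infers latent states from a multivariate
# time series, and those sampled latents condition a transformer used for
# forecasting. Illustrative assumptions only; not the HTV-Trans architecture.
import torch
import torch.nn as nn

class VariationalEncoder(nn.Module):
    """Maps each time step to a diagonal-Gaussian latent (mean, log-variance)."""
    def __init__(self, in_dim: int, latent_dim: int):
        super().__init__()
        self.net = nn.Linear(in_dim, 2 * latent_dim)

    def forward(self, x):                       # x: (batch, time, in_dim)
        mean, log_var = self.net(x).chunk(2, dim=-1)
        std = torch.exp(0.5 * log_var)
        z = mean + std * torch.randn_like(std)  # reparameterization trick
        # KL divergence to a standard-normal prior, averaged over the batch
        kl = 0.5 * (mean.pow(2) + log_var.exp() - 1.0 - log_var).sum(-1).mean()
        return z, kl

class VariationalTransformerForecaster(nn.Module):
    """Transformer over [input features ; sampled latents], predicting the next step."""
    def __init__(self, in_dim: int, latent_dim: int = 16, d_model: int = 64):
        super().__init__()
        self.encoder = VariationalEncoder(in_dim, latent_dim)
        self.embed = nn.Linear(in_dim + latent_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, in_dim)

    def forward(self, x):                       # x: (batch, time, in_dim)
        z, kl = self.encoder(x)
        h = self.transformer(self.embed(torch.cat([x, z], dim=-1)))
        return self.head(h[:, -1]), kl          # one-step-ahead forecast + KL term

if __name__ == "__main__":
    model = VariationalTransformerForecaster(in_dim=7)
    x = torch.randn(8, 96, 7)                   # batch of 96-step, 7-variable series
    forecast, kl = model(x)
    # Training would combine a forecasting loss with the KL regularizer (ELBO-style)
    loss = nn.functional.mse_loss(forecast, torch.randn(8, 7)) + 1e-3 * kl
    print(forecast.shape, float(kl))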
Published
2024-03-24
How to Cite
Wang, M., Chen, W., & Chen, B. (2024). Considering Nonstationary within Multivariate Time Series with Variational Hierarchical Transformer for Forecasting. Proceedings of the AAAI Conference on Artificial Intelligence, 38(14), 15563-15570. https://doi.org/10.1609/aaai.v38i14.29483
Issue
Section
AAAI Technical Track on Machine Learning V