HN-MVTS: HyperNetwork-based Multivariate Time Series Forecasting

Authors

  • Andrey Savchenko, Sber AI Lab; HSE University
  • Oleg Kachan, Sber AI Lab; HSE University

DOI:

https://doi.org/10.1609/aaai.v40i30.39711

Abstract

Accurate forecasting of multivariate time series remains a formidable challenge, particularly due to the growing complexity of temporal dependencies in real-world scenarios. While neural network-based models have achieved notable success in this domain, complex channel-dependent models often underperform channel-independent models, which ignore the relationships between components yet remain highly robust thanks to their small capacity. In this work, we propose HN-MVTS, a novel architecture that integrates a hypernetwork-based generative prior with an arbitrary neural network forecasting model. The input of this hypernetwork is a learnable embedding matrix of time-series components. To restrict the number of new parameters, the hypernetwork learns to generate only the weights of the last layer of the target forecasting network, serving as a data-adaptive regularizer that improves generalization and long-range predictive accuracy. The hypernetwork is used only during training, so it does not increase inference time compared to the base forecasting model. Extensive experiments on eight benchmark datasets demonstrate that applying HN-MVTS to state-of-the-art models (DLinear, PatchTST, TSMixer, etc.) typically improves their performance. Our findings suggest that hypernetwork-driven parameterization offers a promising direction for enhancing existing forecasting techniques in complex scenarios.
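To make the core mechanism concrete, below is a minimal NumPy sketch of the weight-generation step the abstract describes: a small hypernetwork maps a learnable per-channel embedding matrix to the last-layer weights of a forecasting backbone. All dimensions, the MLP hypernetwork shape, and the variable names are illustrative assumptions, not the paper's exact design, and the training-time regularization loop is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (hypothetical, not from the paper):
C, d_emb, d_feat, horizon = 7, 16, 32, 4  # channels, embedding dim, backbone feature dim, forecast length

# Learnable embedding matrix: one row per time-series component (channel).
E = rng.normal(size=(C, d_emb))

# Hypernetwork parameters: a tiny two-layer MLP that emits, for each channel,
# that channel's last-layer weight vector of the target forecasting model.
W1 = rng.normal(size=(d_emb, 64)) * 0.1
W2 = rng.normal(size=(64, d_feat * horizon)) * 0.1

def hypernet(E):
    """Map channel embeddings (C, d_emb) to per-channel last-layer weights (C, d_feat, horizon)."""
    h = np.maximum(E @ W1, 0.0)                # ReLU hidden layer, shape (C, 64)
    return (h @ W2).reshape(C, d_feat, horizon)

# Stand-in for the backbone's penultimate features (e.g. from DLinear/PatchTST).
feats = rng.normal(size=(C, d_feat))

W_last = hypernet(E)
# Each channel's forecast uses its own generated weights:
forecast = np.einsum('cd,cdh->ch', feats, W_last)
print(forecast.shape)  # (7, 4)
```

In the paper's scheme the hypernetwork is discarded after training, so at inference the generated last-layer weights can simply be frozen into the base model, which is why inference cost is unchanged.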

Published

2026-03-14

How to Cite

Savchenko, A., & Kachan, O. (2026). HN-MVTS: HyperNetwork-based Multivariate Time Series Forecasting. Proceedings of the AAAI Conference on Artificial Intelligence, 40(30), 25200–25208. https://doi.org/10.1609/aaai.v40i30.39711

Section

AAAI Technical Track on Machine Learning VII