Conditional Loss and Deep Euler Scheme for Time Series Generation

Authors

  • Carl Remlinger, Université Gustave Eiffel, EDF Lab, FiME
  • Joseph Mikael, EDF Lab, FiME
  • Romuald Elie, Université Gustave Eiffel

DOI:

https://doi.org/10.1609/aaai.v36i7.20782

Keywords:

Machine Learning (ML)

Abstract

We introduce three new generative models for time series, based on the Euler discretization of Stochastic Differential Equations (SDEs) and on Wasserstein metrics. Two of these methods adapt generative adversarial networks (GANs) to time series. The third algorithm, called Conditional Euler Generator (CEGEN), minimizes a dedicated distance between the transition probability distributions over all time steps. In the context of Itô processes, we provide theoretical guarantees that minimizing this criterion yields accurate estimations of the drift and volatility parameters. Empirically, CEGEN outperforms state-of-the-art GAN-based methods on both marginal and temporal dynamics metrics. Moreover, correlation structures are accurately identified in high dimension. When few real data points are available, we verify the effectiveness of CEGEN combined with transfer learning on model-based simulations. Finally, we illustrate the robustness of our methods on various real-world data sets.
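For context, the Euler (Euler–Maruyama) discretization of an SDE that the abstract refers to can be sketched as follows. This is a generic illustration, not the paper's code: the drift `mu`, volatility `sigma`, and the Ornstein–Uhlenbeck example below are placeholder choices, and the generators described in the paper replace such hand-written coefficients with learned networks.

```python
import numpy as np

def euler_maruyama(mu, sigma, x0, T, n_steps, n_paths, rng=None):
    """Simulate paths of dX_t = mu(t, X_t) dt + sigma(t, X_t) dW_t
    with the Euler-Maruyama scheme on [0, T]."""
    rng = np.random.default_rng(0) if rng is None else rng
    dt = T / n_steps
    x = np.full(n_paths, float(x0))
    paths = [x.copy()]
    for k in range(n_steps):
        t = k * dt
        # Brownian increment dW ~ N(0, dt), one sample per path
        dw = rng.normal(0.0, np.sqrt(dt), size=n_paths)
        x = x + mu(t, x) * dt + sigma(t, x) * dw
        paths.append(x.copy())
    return np.array(paths)  # shape (n_steps + 1, n_paths)

# Illustrative example: Ornstein-Uhlenbeck process dX = -0.5 X dt + 0.2 dW
paths = euler_maruyama(lambda t, x: -0.5 * x,
                       lambda t, x: 0.2,
                       x0=1.0, T=1.0, n_steps=100, n_paths=1000)
```

A learned generator in this scheme would parameterize `mu` and `sigma` and train them so that the simulated transition distributions match those of the data.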

Published

2022-06-28

How to Cite

Remlinger, C., Mikael, J., & Elie, R. (2022). Conditional Loss and Deep Euler Scheme for Time Series Generation. Proceedings of the AAAI Conference on Artificial Intelligence, 36(7), 8098-8105. https://doi.org/10.1609/aaai.v36i7.20782

Section

AAAI Technical Track on Machine Learning II