Classification of Sparse Time Series via Supervised Matrix Factorization

Authors

  • Josif Grabocka, University of Hildesheim
  • Alexandros Nanopoulos, University of Hildesheim
  • Lars Schmidt-Thieme, University of Hildesheim

DOI:

https://doi.org/10.1609/aaai.v26i1.8271

Keywords:

Time Series, Matrix Factorization, Sparsity, Logistic Regression

Abstract

Data sparsity is an emerging real-world problem observed in various domains ranging from sensor networks to medical diagnosis. Consequently, numerous machine learning methods have been developed to handle missing values. Nevertheless, sparsity, defined as missing segments, has not been thoroughly investigated in the context of time series classification. We propose a novel principle for classifying time series that, in contrast to existing approaches, avoids reconstructing the missing segments and operates solely on the observed ones. Based on the proposed principle, we develop a method that avoids the noise incurred during reconstruction of the original time series. Our method adapts supervised matrix factorization, projecting time series into a latent space through stochastic learning. Furthermore, the projection is built in a supervised fashion via logistic regression. Extensive experiments on a large collection of 37 data sets demonstrate the superiority of our method, which in the majority of cases outperforms a set of baselines that do not follow the proposed principle.

Published

2021-09-20

How to Cite

Grabocka, J., Nanopoulos, A., & Schmidt-Thieme, L. (2021). Classification of Sparse Time Series via Supervised Matrix Factorization. Proceedings of the AAAI Conference on Artificial Intelligence, 26(1), 929–934. https://doi.org/10.1609/aaai.v26i1.8271

Section

AAAI Technical Track: Machine Learning