Learning Compact Representations of Time-Varying Processes

Authors

  • Philip Bachman, McGill University
  • Doina Precup, McGill University

DOI:

https://doi.org/10.1609/aaai.v25i1.8061

Abstract

We seek informative representations of the processes underlying time series data. As a first step, we address problems in which these processes can be approximated by linear models that vary smoothly over time. To facilitate estimation of these linear models, we introduce a method of dimension reduction which significantly reduces error when models are estimated locally for each point in time. This improvement is gained by performing dimension reduction implicitly through the model parameters rather than directly in the observation space.
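To make the abstract's idea concrete, here is a minimal sketch (not the authors' implementation) of the general setup: a linear model whose parameters drift smoothly over time is estimated locally at each time point via kernel-weighted least squares, and a low-dimensional representation is then extracted from the estimated parameter trajectory (here via PCA/SVD) rather than from the raw observations. All data, bandwidths, and dimensions below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y_t = <w_t, x_t> + noise, where w_t drifts smoothly
# within a 1-D subspace of parameter space (an assumed toy setting).
T, d = 400, 6
X = rng.normal(size=(T, d))
w0 = rng.normal(size=d)                    # base parameter vector
v = rng.normal(size=d)                     # direction of smooth drift
phase = np.sin(np.linspace(0.0, 2 * np.pi, T))
W_true = w0 + np.outer(phase, v)           # (T, d) true parameters
y = np.einsum("td,td->t", W_true, X) + 0.05 * rng.normal(size=T)

def local_fit(t, bandwidth=25.0):
    """Kernel-weighted ridge least squares centred at time t."""
    w = np.exp(-0.5 * ((np.arange(T) - t) / bandwidth) ** 2)
    Xw = X * w[:, None]
    A = Xw.T @ X + 1e-6 * np.eye(d)        # small ridge term for stability
    b = Xw.T @ y
    return np.linalg.solve(A, b)

# Locally estimated model parameters at every time point.
W_hat = np.stack([local_fit(t) for t in range(T)])   # (T, d)

# Dimension reduction in parameter space: PCA on the parameter
# trajectory recovers the low-dimensional subspace of the drift.
W_centred = W_hat - W_hat.mean(axis=0)
_, s, Vt = np.linalg.svd(W_centred, full_matrices=False)
explained = s[0] ** 2 / np.sum(s ** 2)
print(f"variance explained by first parameter direction: {explained:.3f}")
```

Because the simulated drift truly lies along a single direction in parameter space, the leading singular vector of the parameter trajectory captures nearly all of its variance, illustrating why reducing dimension through the model parameters can be far more compact than reducing the observations directly.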

Published

2011-08-04

How to Cite

Bachman, P., & Precup, D. (2011). Learning Compact Representations of Time-Varying Processes. Proceedings of the AAAI Conference on Artificial Intelligence, 25(1), 1748-1749. https://doi.org/10.1609/aaai.v25i1.8061