Mixed Observability Predictive State Representations

Authors

  • Sylvie Ong, McGill University
  • Yuri Grinberg, McGill University
  • Joelle Pineau, McGill University

DOI:

https://doi.org/10.1609/aaai.v27i1.8680

Keywords:

Machine Learning, Model Learning, Predictive State Representations, Mixed Observability

Abstract

Learning accurate models of agent behaviours is crucial for controlling systems in which the agents' and environment's dynamics are unknown. This is a challenging problem, but structural assumptions can be leveraged to tackle it effectively. In particular, many systems exhibit mixed observability, where observations of some system components are essentially perfect and noiseless, while observations of other components are imperfect, aliased or noisy. In this paper we present a new model learning framework, the mixed observability predictive state representation (MO-PSR), which extends the previously known predictive state representations to the case of mixed observability systems. We present a learning algorithm that scales to large amounts of data and to large mixed observability domains, and provide a theoretical analysis of its learning consistency and computational complexity. Empirical results demonstrate that our algorithm is capable of learning accurate models, at a larger scale than with the generic predictive state representation, by leveraging the mixed observability properties.
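To make the underlying representation concrete: a predictive state representation tracks a state vector and updates it with observable operators, one per action-observation pair, so that the probability of any observation sequence is a product of operator applications. The sketch below is a minimal, generic PSR predictor in pure Python; it is not the MO-PSR algorithm from the paper, and the toy operator values are hand-built for illustration rather than learned from data. (In the mixed observability setting described above, the perfectly observed component would additionally index separate, smaller operator sets.)

```python
# Minimal sketch of prediction with a PSR's observable operators.
# Parameters: initial state b1, normalizer b_inf, and one operator
# matrix B_{a,o} per action-observation pair. All values are toy
# numbers, not learned parameters from the paper.

def mat_vec(M, v):
    """Multiply matrix M (list of rows) by vector v."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def sequence_prob(b1, b_inf, ops, sequence):
    """P(o_1..o_T | a_1..a_T) = b_inf^T B_{a_T,o_T} ... B_{a_1,o_1} b_1."""
    b = b1
    for action, obs in sequence:
        b = mat_vec(ops[(action, obs)], b)
    return dot(b_inf, b)

# Toy 2-dimensional model with one action 'a' and observations 'x', 'y'.
b1 = [0.5, 0.5]
b_inf = [1.0, 1.0]
ops = {
    ('a', 'x'): [[0.72, 0.03], [0.08, 0.27]],
    ('a', 'y'): [[0.18, 0.07], [0.02, 0.63]],
}

# One-step observation probabilities sum to 1 for a valid model.
p_x = sequence_prob(b1, b_inf, ops, [('a', 'x')])
p_y = sequence_prob(b1, b_inf, ops, [('a', 'y')])
```

With these toy operators, `p_x + p_y` equals 1, as required of a properly normalized model; longer sequences are handled by the same loop.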

Published

2013-06-30

How to Cite

Ong, S., Grinberg, Y., & Pineau, J. (2013). Mixed Observability Predictive State Representations. Proceedings of the AAAI Conference on Artificial Intelligence, 27(1), 746-752. https://doi.org/10.1609/aaai.v27i1.8680