History-Based Controller Design and Optimization for Partially Observable MDPs

Authors

  • Akshat Kumar, Singapore Management University
  • Shlomo Zilberstein, University of Massachusetts Amherst

DOI:

https://doi.org/10.1609/icaps.v25i1.13730

Abstract

Partially observable MDPs provide an elegant framework for sequential decision making. Finite-state controllers (FSCs) are often used to represent policies for infinite-horizon problems as they offer a compact representation, simple-to-execute plans, and an adjustable tradeoff between computational complexity and policy size. We develop novel connections between optimizing FSCs for POMDPs and the dual linear program for MDPs. Building on that, we present a dual mixed integer linear program (MIP) for optimizing FSCs. To assign well-defined meaning to FSC nodes as well as aid in policy search, we show how to associate history-based features with each FSC node. Using this representation, we address another challenging problem, that of iteratively deciding which nodes to add to the FSC to get a better policy. Using an efficient off-the-shelf MIP solver, we show that this new approach can find compact near-optimal FSCs for several large benchmark domains, and is competitive with previous best approaches.
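The dual linear program for MDPs that the paper builds on is the standard occupation-measure formulation. As background only (this is not the paper's FSC MIP), the sketch below solves that dual LP for a toy two-state, two-action discounted MDP with the off-the-shelf solver PuLP; the transition probabilities, rewards, and variable names are invented for illustration.

```python
# Minimal sketch: standard occupation-measure dual LP for a discounted MDP,
# solved with an off-the-shelf LP/MIP solver (PuLP). Toy MDP, not from the paper.
from pulp import LpProblem, LpVariable, LpMaximize, lpSum, value

states = [0, 1]
actions = [0, 1]
gamma = 0.95                      # discount factor
alpha = {0: 0.5, 1: 0.5}          # initial state distribution
# P[s][a][s'] = transition probability, R[s][a] = immediate reward (toy numbers)
P = {0: {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}},
     1: {0: {0: 0.5, 1: 0.5}, 1: {0: 0.1, 1: 0.9}}}
R = {0: {0: 1.0, 1: 0.0}, 1: {0: 0.0, 1: 2.0}}

prob = LpProblem("mdp_dual_lp", LpMaximize)
# x[s, a] = discounted occupation measure of taking action a in state s
x = {(s, a): LpVariable(f"x_{s}_{a}", lowBound=0) for s in states for a in actions}

# Objective: expected discounted reward under the occupation measure
prob += lpSum(x[s, a] * R[s][a] for s in states for a in actions)

# Flow conservation: occupancy of s' equals its initial mass plus the
# discounted inflow from all state-action pairs.
for sp in states:
    prob += (lpSum(x[sp, a] for a in actions)
             == alpha[sp] + gamma * lpSum(x[s, a] * P[s][a][sp]
                                          for s in states for a in actions))

prob.solve()
# An optimal deterministic policy picks, in each state, an action with positive occupancy.
policy = {s: max(actions, key=lambda a: value(x[s, a])) for s in states}
print(policy)
```

The paper's contribution replaces the fully observable state in this LP with FSC nodes characterized by history-based features, and adds integer variables to obtain a dual MIP over controller parameters; that formulation is given in the paper itself.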

Published

2015-04-08

How to Cite

Kumar, A., & Zilberstein, S. (2015). History-Based Controller Design and Optimization for Partially Observable MDPs. Proceedings of the International Conference on Automated Planning and Scheduling, 25(1), 156-164. https://doi.org/10.1609/icaps.v25i1.13730