Linear Dynamic Programs for Resource Management

Authors

  • Marek Petrik, IBM Research
  • Shlomo Zilberstein, University of Massachusetts, Amherst

DOI:

https://doi.org/10.1609/aaai.v25i1.7794

Abstract

Sustainable resource management in many domains presents large continuous stochastic optimization problems, which can often be modeled as Markov decision processes (MDPs). To solve such large MDPs, we identify and leverage linearity in state and action sets that is common in resource management. In particular, we introduce linear dynamic programs (LDPs), which generalize resource management problems and partially observable MDPs (POMDPs). We show that the LDP framework makes it possible to adapt point-based methods, the state of the art in solving POMDPs, to solving LDPs. The experimental results demonstrate the efficiency of this approach in managing the water level of a river reservoir. Finally, we discuss the relationship with dual dynamic programming, a method used to optimize hydroelectric systems.
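For readers unfamiliar with the point-based methods the abstract refers to, the sketch below illustrates the standard point-based Bellman backup for a POMDP: the value function is kept as a set of alpha-vectors, and backups are performed only at sampled belief points. All numbers (the toy transition, observation, and reward matrices) are illustrative placeholders, not the reservoir model from the paper.

```python
import numpy as np

# Toy POMDP: 2 states, 2 actions, 2 observations (illustrative numbers only).
S, A, O = 2, 2, 2
gamma = 0.95
# T[a, s, s'] = P(s' | s, a)
T = np.array([[[0.9, 0.1], [0.2, 0.8]],
              [[0.5, 0.5], [0.5, 0.5]]])
# Z[a, s', o] = P(o | s', a)
Z = np.array([[[0.8, 0.2], [0.3, 0.7]],
              [[0.5, 0.5], [0.5, 0.5]]])
# R[a, s] = immediate reward
R = np.array([[1.0, 0.0],
              [0.2, 0.2]])

def point_based_backup(b, Gamma):
    """One point-based Bellman backup at belief b against alpha-vector set Gamma."""
    best_val, best_alpha = -np.inf, None
    for a in range(A):
        g_a = R[a].copy()
        for o in range(O):
            # g[i, s] = sum_{s'} Z[a, s', o] * T[a, s, s'] * Gamma[i, s']
            g = Gamma @ (T[a] * Z[a, :, o]).T
            # keep the alpha-vector that is best at this belief point
            g_a += gamma * g[np.argmax(g @ b)]
        if g_a @ b > best_val:
            best_val, best_alpha = g_a @ b, g_a
    return best_alpha

# Repeated backups at two sampled belief points approximate the value function.
beliefs = [np.array([0.5, 0.5]), np.array([0.9, 0.1])]
Gamma = np.zeros((1, S))  # initialize with the zero alpha-vector
for _ in range(30):
    Gamma = np.vstack([point_based_backup(b, Gamma) for b in beliefs])
values = [Gamma @ b for b in beliefs]  # per-vector values at each belief
```

The paper's contribution, as the abstract states, is that this kind of backup carries over from belief states to the linear state and action sets of LDPs; the POMDP form shown here is only the familiar special case.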

Published

2011-08-04

How to Cite

Petrik, M., & Zilberstein, S. (2011). Linear Dynamic Programs for Resource Management. Proceedings of the AAAI Conference on Artificial Intelligence, 25(1), 1377-1383. https://doi.org/10.1609/aaai.v25i1.7794

Section

Special Track on Computational Sustainability and AI