On MABs and Separation of Concerns in Monte-Carlo Planning for MDPs

Authors

  • Zohar Feldman, Technion - Israel Institute of Technology
  • Carmel Domshlak, Technion - Israel Institute of Technology

DOI:

https://doi.org/10.1609/icaps.v24i1.13631

Keywords:

Markov Decision Process, Multi-Armed Bandit, Online Planning, Simple Regret, Monte-Carlo Tree Search

Abstract

Linking online planning for MDPs with their special case of stochastic multi-armed bandit problems, we analyze three state-of-the-art Monte-Carlo tree search algorithms: UCT, BRUE, and MaxUCT. Using the outcome, we (i) introduce two new MCTS algorithms, MaxBRUE, which combines uniform sampling with Bellman backups, and MpaUCT, which combines UCB1 with a novel backup procedure, (ii) analyze them formally and empirically, and (iii) show how MCTS algorithms can be further stratified by an exploration control mechanism that improves their empirical performance without harming the formal guarantees.
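The MAB connection the abstract draws on can be illustrated with UCB1, the bandit policy that UCT applies at internal tree nodes. The sketch below is illustrative only and is not taken from the paper: it selects the arm maximizing the empirical mean plus a confidence bonus, on hypothetical Bernoulli-reward arms.

```python
import math
import random

def ucb1_select(counts, means, c=math.sqrt(2)):
    """Pick the arm maximizing mean + c*sqrt(ln(total)/count) (UCB1)."""
    # Play each arm once before the confidence bound is well-defined.
    for arm, n in enumerate(counts):
        if n == 0:
            return arm
    total = sum(counts)
    return max(range(len(counts)),
               key=lambda a: means[a] + c * math.sqrt(math.log(total) / counts[a]))

def run_bandit(arm_probs, steps=5000, seed=0):
    """Simulate hypothetical Bernoulli arms; return pull counts and empirical means."""
    rng = random.Random(seed)
    k = len(arm_probs)
    counts, means = [0] * k, [0.0] * k
    for _ in range(steps):
        a = ucb1_select(counts, means)
        reward = 1.0 if rng.random() < arm_probs[a] else 0.0
        counts[a] += 1
        means[a] += (reward - means[a]) / counts[a]  # incremental mean update
    return counts, means
```

UCB1 minimizes cumulative regret, concentrating pulls on the best arm; the paper's analysis turns on the mismatch between this objective and the simple-regret objective of online planning.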

Published

2014-05-10

How to Cite

Feldman, Z., & Domshlak, C. (2014). On MABs and Separation of Concerns in Monte-Carlo Planning for MDPs. Proceedings of the International Conference on Automated Planning and Scheduling, 24(1), 120-127. https://doi.org/10.1609/icaps.v24i1.13631