TY  - JOUR
AU  - Ontanon, Santiago
PY  - 2021/06/30
Y2  - 2024/03/29
TI  - The Combinatorial Multi-Armed Bandit Problem and Its Application to Real-Time Strategy Games
JF  - Proceedings of the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment
JA  - AIIDE
VL  - 9
IS  - 1
SE  - Research Papers — Oral Presentation
DO  - 10.1609/aiide.v9i1.12681
UR  - https://ojs.aaai.org/index.php/AIIDE/article/view/12681
SP  - 58-64
AB  - Game tree search in games with large branching factors is a notoriously hard problem. In this paper, we address this problem with a new sampling strategy for Monte Carlo Tree Search (MCTS) algorithms, called "Naive Sampling", based on a variant of the Multi-armed Bandit problem called the "Combinatorial Multi-armed Bandit" (CMAB) problem. We present a new MCTS algorithm based on Naive Sampling called NaiveMCTS, and evaluate it in the context of real-time strategy (RTS) games. Our results show that as the branching factor grows, NaiveMCTS performs significantly better than other algorithms.
ER  -