Enhancing Multimodal Goal Recognition in Open-World Games with Natural Language Player Reflections


  • Anisha Gupta, North Carolina State University
  • Dan Carpenter, North Carolina State University
  • Wookhee Min, North Carolina State University
  • Jonathan Rowe, North Carolina State University
  • Roger Azevedo, University of Central Florida
  • James Lester, North Carolina State University




Keywords: Goal Recognition, Reflection, Player Modeling


Open-world games promote engagement by offering players a high degree of autonomy to explore expansive game worlds. Goal recognition has been widely explored for modeling player behavior in open-world games, dynamically inferring players’ goals from observations of in-game actions and locations. In educational open-world games, in-game reflection tools can help students reflect on their learning and plan strategies for future gameplay. The data generated by students’ written reflections can serve as an additional source of evidence for modeling player goals. We present a multimodal goal recognition approach that leverages players’ written reflections alongside game trace log features to predict player goals during gameplay. Results show that deep learning-based multimodal goal recognition models that use both written reflection and gameplay features as input achieve both the highest predictive performance and the best early prediction performance, outperforming unimodal deep learning models as well as a random forest baseline. Multimodal goal recognition using natural language reflection data has significant potential to enhance goal recognition model performance, and player modeling more generally, to support the creation of engaging and adaptive open-world digital games.
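The multimodal fusion idea the abstract describes can be illustrated with a minimal sketch: a text-derived reflection embedding is concatenated with gameplay trace features, and a softmax layer scores candidate goals. All names, dimensions, and the randomly initialized weights here are illustrative assumptions, not the authors' actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

N_GOALS = 5      # number of candidate player goals (assumed)
TEXT_DIM = 16    # reflection embedding size (assumed)
GAME_DIM = 8     # gameplay trace feature size (assumed)

def softmax(z):
    """Numerically stable softmax over goal scores."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def predict_goal(reflection_emb, gameplay_feats, W, b):
    """Fuse the two modalities by concatenation, then score goals."""
    fused = np.concatenate([reflection_emb, gameplay_feats])
    return softmax(W @ fused + b)

# Random weights stand in for a trained model in this sketch.
W = rng.normal(size=(N_GOALS, TEXT_DIM + GAME_DIM))
b = np.zeros(N_GOALS)

probs = predict_goal(rng.normal(size=TEXT_DIM),
                     rng.normal(size=GAME_DIM), W, b)
print("predicted goal:", int(probs.argmax()))
```

In practice a learned text encoder and sequence model over the game trace would replace the fixed vectors, but the late-fusion-by-concatenation pattern is the same.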




How to Cite

Gupta, A., Carpenter, D., Min, W., Rowe, J., Azevedo, R., & Lester, J. (2022). Enhancing Multimodal Goal Recognition in Open-World Games with Natural Language Player Reflections. Proceedings of the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment, 18(1), 37-44. https://doi.org/10.1609/aiide.v18i1.21945