Understanding Speech in Interactive Narratives with Crowdsourced Data
DOI: https://doi.org/10.1609/aiide.v8i1.12507

Keywords: games, speech, natural language, interactive narrative, crowdsourcing

Abstract
Speech recognition failures and limited vocabulary coverage pose challenges for speech interaction with characters in games. We describe an end-to-end system for automating characters from a large corpus of recorded human game logs, and demonstrate that inferring utterance meaning through a combination of plan recognition and surface text similarity compensates for recognition and understanding failures significantly better than relying on surface similarity alone.
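The combination the abstract describes can be illustrated as a scoring scheme: each candidate meaning is ranked by mixing a plan-recognition prior (how plausible the meaning is given game context) with the surface similarity between the recognized text and corpus utterances. The sketch below is a hypothetical illustration under assumed names, weights, and toy data; it is not the paper's implementation, and it uses a simple character-level matcher as a stand-in for the surface-similarity measure.

```python
from difflib import SequenceMatcher

def surface_similarity(recognized: str, example: str) -> float:
    """Character-level similarity in [0, 1] between two utterances
    (a stand-in for whatever surface measure the system actually uses)."""
    return SequenceMatcher(None, recognized.lower(), example.lower()).ratio()

def score_meanings(recognized, candidates, plan_prior, alpha=0.5):
    """Rank candidate meanings for a (possibly misrecognized) utterance.

    candidates:  meaning -> list of example surface forms from the corpus
    plan_prior:  meaning -> probability assigned by plan recognition
    alpha:       assumed linear-mixture weight on the plan-recognition term
    """
    scores = {}
    for meaning, examples in candidates.items():
        # Best surface match against any corpus example of this meaning.
        sim = max(surface_similarity(recognized, e) for e in examples)
        scores[meaning] = alpha * plan_prior.get(meaning, 0.0) + (1 - alpha) * sim
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Toy example: a noisy recognition result ("stake" for "steak").
candidates = {
    "order_food": ["I'd like the steak please", "Can I get the salmon"],
    "request_bill": ["Check please", "Can I have the bill"],
}
plan_prior = {"order_food": 0.8, "request_bill": 0.2}  # context favors ordering
ranked = score_meanings("I'd like the stake", candidates, plan_prior)
print(ranked[0][0])  # the top-ranked meaning
```

Even when the recognized text matches no corpus utterance exactly, the plan-recognition term keeps contextually plausible meanings ranked highly, which is the compensation effect the abstract reports.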
Published
2021-06-30
How to Cite
Orkin, J., & Roy, D. (2021). Understanding Speech in Interactive Narratives with Crowdsourced Data. Proceedings of the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment, 8(1), 57-62. https://doi.org/10.1609/aiide.v8i1.12507
Section
Full Oral Papers