Event Representations for Automated Story Generation with Deep Neural Nets


  • Lara Martin Georgia Institute of Technology
  • Prithviraj Ammanabrolu Georgia Institute of Technology
  • Xinyu Wang Georgia Institute of Technology
  • William Hancock Georgia Institute of Technology
  • Shruti Singh Georgia Institute of Technology
  • Brent Harrison Georgia Institute of Technology
  • Mark Riedl Georgia Institute of Technology




Keywords: automated story generation, event representations, recurrent neural networks


Abstract

Automated story generation is the problem of automatically selecting a sequence of events, actions, or words that can be told as a story. We seek to develop a system that can generate stories by learning everything it needs to know from textual story corpora. To date, recurrent neural networks that learn language models at the character, word, or sentence level have had little success generating coherent stories. We explore the question of event representations that provide an intermediate level of abstraction between words and sentences, retaining the semantic information of the original data while minimizing event sparsity. We present a technique for preprocessing textual story data into event sequences. We then present a technique for automated story generation whereby we decompose the problem into the generation of successive events (event2event) and the generation of natural language sentences from events (event2sentence). We give empirical results comparing different event representations and their effects on event successor generation and the translation of events to natural language.
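The abstract describes preprocessing sentences into event representations that abstract away surface wording to reduce sparsity. As a rough illustrative sketch only (the subject-verb-object tuple fields, the hand-listed verb lexicon, and the naive tokenization below are assumptions for this example, not the paper's actual NLP pipeline), a minimal sentence-to-event mapping might look like:

```python
# Illustrative sketch: map sentences to coarse (subject, verb, object)
# event tuples. The tuple fields, stopword list, and verb lexicon are
# assumptions for this example, not the paper's actual preprocessing.

STOPWORDS = {"the", "a", "an"}

def sentence_to_event(sentence, verb_lexicon):
    """Return a (subject, verb, object) tuple, or None if no verb is found.

    The sentence is lowercased and stripped of articles; the first token
    found in verb_lexicon is taken as the verb, with the preceding token
    as subject and the following token as object (None if absent).
    """
    tokens = [t for t in sentence.lower().rstrip(".!?").split()
              if t not in STOPWORDS]
    for i, tok in enumerate(tokens):
        if tok in verb_lexicon:
            subject = tokens[i - 1] if i > 0 else None
            obj = tokens[i + 1] if i + 1 < len(tokens) else None
            return (subject, tok, obj)
    return None

if __name__ == "__main__":
    verbs = {"chased", "found"}
    story = ["The knight chased the dragon.", "The dragon found a cave."]
    # Both sentences collapse to compact event tuples, so differently
    # worded sentences with the same structure map to the same event.
    print([sentence_to_event(s, verbs) for s in story])
```

Sequences of such tuples would then feed the event2event successor model, while event2sentence would map generated tuples back into natural language.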




How to Cite

Martin, L., Ammanabrolu, P., Wang, X., Hancock, W., Singh, S., Harrison, B., & Riedl, M. (2018). Event Representations for Automated Story Generation with Deep Neural Nets. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.11430



AAAI Technical Track: Game Playing and Interactive Entertainment