Building on Word Animacy to Determine Coreference Chain Animacy in Cultural Narratives

Authors

  • Labiba Jahan, Florida International University
  • Geeticka Chauhan, Florida International University
  • Mark Finlayson, Florida International University

DOI:

https://doi.org/10.1609/aiide.v13i2.12993

Keywords:

animacy, referring expressions, coreference chains, character, narrative

Abstract

Animacy is the characteristic of being able to independently carry out actions in a story world (e.g., movement, communication). It is a necessary property of characters in stories, and so detecting animacy is an important step in automatic story understanding. Prior approaches to animacy detection have conceived of animacy as a word- or phrase-level property, without explicitly connecting it to characters. In this work we compute the animacy of referring expressions using a statistical approach incorporating features such as word embeddings of the referring expression and its head noun, grammatical subjecthood, and semantic roles. We then compute the animacy of coreference chains via a majority vote over the animacy of each chain's constituent referring expressions. We also reimplement prior approaches to word-level animacy to compare performance. We demonstrate these results on a small set of folktales with gold-standard annotations for coreference structure and animacy (15 Russian folktales translated into English). Folktales present an interesting challenge because they often involve characters who are members of traditionally inanimate classes (e.g., stoves that walk, trees that talk). We achieve an F1 measure of 0.90 for the referring expression animacy model, and 0.86 for the coreference chain model. We discuss several ways in which we anticipate these results may be improved in future work.
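The chain-level step described in the abstract (majority vote over the animacy labels of a chain's referring expressions) can be sketched as follows. This is a minimal illustration, not the authors' implementation; in particular, the tie-breaking rule is an assumption, since the abstract does not specify one.

```python
from collections import Counter

def chain_animacy(mention_labels):
    """Assign animacy to a coreference chain by majority vote over
    the per-mention (referring-expression) animacy predictions.

    mention_labels: list of "animate" / "inanimate" labels, one per
    referring expression in the chain.
    """
    counts = Counter(mention_labels)
    # On a tie we default to "animate"; this tie-breaking choice is
    # an assumption, not taken from the paper.
    if counts["animate"] >= counts["inanimate"]:
        return "animate"
    return "inanimate"

# Example: a chain for a talking stove, where one of its three
# mentions was misclassified at the expression level.
print(chain_animacy(["animate", "animate", "inanimate"]))  # animate
```

The vote makes the chain-level label robust to occasional expression-level errors, which is why the chain model can remain accurate even when individual mentions are misclassified.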

Published

2021-06-25

How to Cite

Jahan, L., Chauhan, G., & Finlayson, M. (2021). Building on Word Animacy to Determine Coreference Chain Animacy in Cultural Narratives. Proceedings of the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment, 13(2), 198-203. https://doi.org/10.1609/aiide.v13i2.12993