FEEL: Featured Event Embedding Learning

Authors

  • I-Ta Lee, Purdue University
  • Dan Goldwasser, Purdue University

Keywords

Natural Language Processing, Event Embeddings, Common Sense Inference, Statistical Script Learning, Representation Learning

Abstract

Statistical script learning is an effective way to acquire world knowledge that can be used for commonsense reasoning. It induces this knowledge by observing event sequences generated from texts; the learned model can then predict subsequent events given earlier ones. Recent approaches rely on learning event embeddings that capture script knowledge. In this work, we suggest a general learning model, Featured Event Embedding Learning (FEEL), for injecting event embeddings with fine-grained information. In addition to capturing the dependencies between subsequent events, our model can take into account higher-level abstractions of the input event, which help the model generalize better and account for the global context in which the event appears. We evaluated our model on three narrative cloze tasks and showed that it is competitive with the most recent state of the art. We also show that the resulting embeddings can serve as a strong representation for advanced semantic tasks such as discourse parsing and sentence semantic relatedness.
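The prediction setting the abstract describes can be pictured with a toy sketch. The snippet below is illustrative only, not the paper's model: the event representation (a predicate paired with a typed dependency, e.g. `("eat", "subj")`), the tiny hand-set embedding table, and the mean-of-context cosine scorer are all simplifying assumptions, though cosine-to-context scoring is a common scheme in narrative cloze evaluations.

```python
import numpy as np

# Hypothetical event embeddings: events are (predicate, dependency) pairs.
# In a learned model these vectors would come from training on event
# sequences; here they are fixed by hand purely for illustration.
EMB = {
    ("order", "subj"): np.array([1.0, 0.1]),
    ("eat",   "subj"): np.array([0.9, 0.2]),
    ("pay",   "subj"): np.array([0.8, 0.3]),
    ("fly",   "subj"): np.array([-1.0, 0.9]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def predict_next(context_events, candidates):
    """Narrative-cloze-style prediction: rank candidate events by cosine
    similarity to the mean of the context-event embeddings."""
    ctx = np.mean([EMB[e] for e in context_events], axis=0)
    return max(candidates, key=lambda e: cosine(ctx, EMB[e]))

# Given a restaurant-like context, the scorer prefers the script-consistent
# continuation over an unrelated event.
best = predict_next(
    [("order", "subj"), ("eat", "subj")],
    [("pay", "subj"), ("fly", "subj")],
)
```

In this toy setup, `best` is `("pay", "subj")`: its embedding lies near the context events, while `("fly", "subj")` points away from them.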

Published

2018-04-26

How to Cite

Lee, I.-T., & Goldwasser, D. (2018). FEEL: Featured Event Embedding Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/11936

Section

Main Track: NLP and Knowledge Representation