Learning Word Representations from Relational Graphs

Authors

  • Danushka Bollegala The University of Liverpool
  • Takanori Maehara National Institute of Informatics
  • Yuichi Yoshida National Institute of Informatics
  • Ken-ichi Kawarabayashi National Institute of Informatics

DOI:

https://doi.org/10.1609/aaai.v29i1.9494

Keywords:

Semantic Relations, Word Representations

Abstract

Attributes of words and relations between two words are central to numerous tasks in Artificial Intelligence such as knowledge representation, similarity measurement, and analogy detection. Often, when two words share one or more attributes in common, they are connected by some semantic relations. Conversely, if numerous semantic relations hold between two words, we can expect some of the attributes of one word to be inherited by the other. Motivated by this close connection between attributes and relations, given a relational graph in which words are inter-connected via numerous semantic relations, we propose a method to learn a latent representation for the individual words. The proposed method considers not only the co-occurrences of words, as done by existing approaches for word representation learning, but also the semantic relations in which two words co-occur. To evaluate the accuracy of the word representations learnt using the proposed method, we use the learnt word representations to solve semantic word analogy problems. Our experimental results show that it is possible to learn better word representations by using semantic relations between words.
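The semantic word analogy evaluation mentioned above is commonly carried out with the vector-offset method: given "a is to b as c is to ?", the answer is the word whose vector lies closest to b - a + c. A minimal sketch of this evaluation, using hypothetical toy vectors rather than representations actually learnt by the paper's method:

```python
import numpy as np

# Toy word vectors (hypothetical values for illustration only; the paper
# learns such representations from a relational graph, which is not shown here).
embeddings = {
    "king":  np.array([0.8, 0.9, 0.1]),
    "queen": np.array([0.8, 0.1, 0.9]),
    "man":   np.array([0.7, 0.9, 0.0]),
    "woman": np.array([0.7, 0.1, 0.8]),
}

def solve_analogy(a, b, c, embeddings):
    """Answer 'a is to b as c is to ?' by the vector-offset method:
    return the word whose vector has the highest cosine similarity
    to b - a + c, excluding the three query words themselves."""
    target = embeddings[b] - embeddings[a] + embeddings[c]
    best_word, best_sim = None, -np.inf
    for word, vec in embeddings.items():
        if word in (a, b, c):
            continue
        sim = vec @ target / (np.linalg.norm(vec) * np.linalg.norm(target))
        if sim > best_sim:
            best_word, best_sim = word, sim
    return best_word

print(solve_analogy("man", "woman", "king", embeddings))  # → queen
```

Accuracy on a benchmark of such analogy questions then serves as a proxy for the quality of the learnt representations.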

Published

2015-02-19

How to Cite

Bollegala, D., Maehara, T., Yoshida, Y., & Kawarabayashi, K.-ichi. (2015). Learning Word Representations from Relational Graphs. Proceedings of the AAAI Conference on Artificial Intelligence, 29(1). https://doi.org/10.1609/aaai.v29i1.9494

Section

Main Track: NLP and Knowledge Representation