TransConv: Relationship Embedding in Social Networks

Authors

  • Yi-Yu Lai, Purdue University
  • Jennifer Neville, Purdue University
  • Dan Goldwasser, Purdue University

DOI:

https://doi.org/10.1609/aaai.v33i01.33014130

Abstract

Representation learning (RL) for social networks facilitates real-world tasks such as visualization, link prediction, and friend recommendation. Traditional knowledge graph embedding models learn continuous low-dimensional embeddings of entities and relations. However, when applied to social networks, existing approaches do not consider the rich textual communications between users, which contain valuable information for describing social relationships. In this paper, we propose TransConv, a novel approach that incorporates textual interactions between pairs of users to improve representation learning of both users and relationships. Our experiments on real social network data show that TransConv learns better user and relationship embeddings than other state-of-the-art knowledge graph embedding models. Moreover, the results illustrate that our model is more robust for sparse relationships, where there are fewer examples.
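For readers unfamiliar with translation-based knowledge graph embeddings, the minimal sketch below illustrates the general idea the abstract builds on: a TransE-style plausibility score for a (user, relation, user) triple, plus a hypothetical variant in which the relation vector is blended with a vector summarizing the two users' textual interactions. This is an illustrative assumption only, not the actual TransConv scoring function; the function names, the `text_weight` parameter, and the blending rule are invented for the example.

```python
# Illustrative sketch only: a TransE-style translational score and a hedged
# text-aware variant. This is NOT the paper's TransConv formulation; the
# combination rule and parameter names below are assumptions for illustration.
import numpy as np


def transe_score(head, relation, tail, norm=1):
    """Classic translational score: smaller means the triple is more plausible."""
    return np.linalg.norm(head + relation - tail, ord=norm)


def text_adjusted_score(head, relation, tail, interaction_vec,
                        text_weight=0.5, norm=1):
    """Hypothetical text-aware variant: blend the relation embedding with a
    vector derived from the pair's textual interactions (e.g., an aggregate
    of word or topic embeddings from their conversations)."""
    adjusted_relation = (1 - text_weight) * relation + text_weight * interaction_vec
    return np.linalg.norm(head + adjusted_relation - tail, ord=norm)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 16                            # embedding dimension
    user_a = rng.normal(size=d)       # embedding of user A
    user_b = rng.normal(size=d)       # embedding of user B
    friend_rel = rng.normal(size=d)   # embedding of the "friend" relation
    conv_vec = rng.normal(size=d)     # summary vector of A-B conversations
    print("plain score:", transe_score(user_a, friend_rel, user_b))
    print("text-adjusted:", text_adjusted_score(user_a, friend_rel, user_b, conv_vec))
```

In this toy setup, a lower score marks a more plausible relationship triple; the text-adjusted variant simply shows one way textual interaction features could shift the relation translation, which is the intuition the abstract gestures at.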

Published

2019-07-17

How to Cite

Lai, Y.-Y., Neville, J., & Goldwasser, D. (2019). TransConv: Relationship Embedding in Social Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 4130-4138. https://doi.org/10.1609/aaai.v33i01.33014130

Section

AAAI Technical Track: Machine Learning