TransConv: Relationship Embedding in Social Networks


  • Yi-Yu Lai Purdue University
  • Jennifer Neville Purdue University
  • Dan Goldwasser Purdue University



Representation learning (RL) for social networks facilitates real-world tasks such as visualization, link prediction, and friend recommendation. Traditional knowledge graph embedding models learn continuous low-dimensional embeddings of entities and relations. However, when applied to social networks, existing approaches do not consider the rich textual communications between users, which contain valuable information for describing social relationships. In this paper, we propose TransConv, a novel approach that incorporates textual interactions between pairs of users to improve representation learning of both users and relationships. Our experiments on real social network data show that TransConv learns better user and relationship embeddings than other state-of-the-art knowledge graph embedding models. Moreover, the results illustrate that our model is more robust for sparse relationships with fewer training examples.
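The abstract refers to translation-based knowledge graph embedding models, which TransConv builds on. As background, a minimal sketch of the classic translation-style scoring idea (a TransE-like score, not the paper's actual model) is shown below: a triple (head, relation, tail) is plausible when the head embedding plus the relation embedding lands near the tail embedding. The vectors here are illustrative toy values, not learned parameters.

```python
import numpy as np

def translation_score(h: np.ndarray, r: np.ndarray, t: np.ndarray) -> float:
    """Translation-based plausibility score ||h + r - t||_2.

    Lower scores indicate more plausible (head, relation, tail) triples.
    This is the generic TransE-style objective, shown only as background;
    TransConv additionally conditions on textual interactions between users.
    """
    return float(np.linalg.norm(h + r - t, ord=2))

# Toy 2-d embeddings (hypothetical, for illustration only).
head = np.array([1.0, 0.0])      # e.g. user A
relation = np.array([0.0, 1.0])  # e.g. "friend-of"
tail = np.array([1.0, 1.0])      # e.g. user B

# head + relation exactly equals tail, so the score is 0 (most plausible).
print(translation_score(head, relation, tail))
```

In practice these embeddings are learned by minimizing such a score for observed triples while keeping it high for corrupted (negative) triples.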




How to Cite

Lai, Y.-Y., Neville, J., & Goldwasser, D. (2019). TransConv: Relationship Embedding in Social Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 4130-4138.



AAAI Technical Track: Machine Learning