Dynamic Embedding on Textual Networks via a Gaussian Process


  • Pengyu Cheng Duke University
  • Yitong Li Duke University
  • Xinyuan Zhang Duke University
  • Liqun Chen Duke University
  • David Carlson Duke University
  • Lawrence Carin Duke University




Textual network embedding aims to learn low-dimensional representations of text-annotated nodes in a graph. Prior work in this area has typically focused on fixed graph structures; however, real-world networks are often dynamic. We address this challenge with a novel end-to-end node-embedding model, Dynamic Embedding for Textual Networks with a Gaussian Process (DetGP). After training, DetGP can be applied efficiently to dynamic graphs without retraining or backpropagation. The learned representation of each node combines a textual embedding and a structural embedding. Because the graph structure may change, the structural component uses a Gaussian process, exploiting its non-parametric properties. To capture both local and global graph structure, diffusion is used to model multiple hops between neighbors, and the relative importance of global versus local structure is learned automatically. Owing to the non-parametric nature of the Gaussian process, updating the embeddings after a change in graph structure requires only a forward pass through the learned model. Experiments on link prediction and node classification demonstrate the empirical effectiveness of DetGP compared to baseline approaches. We further show that DetGP can be applied straightforwardly and efficiently to dynamic textual networks.
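To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of the two ingredients the abstract describes: weighted multi-hop diffusion of text features over the graph, followed by a non-parametric GP-style kernel mapping against a set of learned anchor points. All names (`structural_embedding`, `anchors`, `alpha`, `hops`) are illustrative assumptions; because the mapping depends on the graph only through a forward computation, re-running it on a changed adjacency matrix needs no retraining.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Squared-Euclidean RBF kernel between rows of X and rows of Y.
    d = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def structural_embedding(A, T, anchors, anchor_emb, alpha=0.5, hops=3):
    """Hypothetical forward pass: diffuse text features T over graph A,
    then map them through a GP-style kernel smoother over learned anchors.

    A          : (n, n) adjacency matrix (may differ from training graph)
    T          : (n, d) per-node text features
    anchors    : (m, d) learned anchor (inducing-style) points
    anchor_emb : (m, e) learned embeddings attached to the anchors
    alpha      : decay weighting distant hops less than near ones
    """
    # Row-normalized transition matrix (one-hop local structure).
    P = A / np.clip(A.sum(1, keepdims=True), 1e-9, None)
    # Weighted multi-hop diffusion: mixes local and global structure,
    # with alpha**k controlling the influence of k-hop neighbors.
    Z, Pk = T.copy(), np.eye(A.shape[0])
    for k in range(1, hops + 1):
        Pk = Pk @ P
        Z = Z + (alpha ** k) * (Pk @ T)
    # Non-parametric mapping: kernel-weighted average of anchor embeddings.
    K = rbf_kernel(Z, anchors)
    W = K / K.sum(1, keepdims=True)
    return W @ anchor_emb
```

When an edge is added or removed, one simply calls `structural_embedding` again with the new adjacency matrix; only this forward pass is repeated, mirroring the abstract's claim that no backpropagation is needed at update time.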




How to Cite

Cheng, P., Li, Y., Zhang, X., Chen, L., Carlson, D., & Carin, L. (2020). Dynamic Embedding on Textual Networks via a Gaussian Process. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 7562-7569. https://doi.org/10.1609/aaai.v34i05.6255



AAAI Technical Track: Natural Language Processing