SNEQ: Semi-Supervised Attributed Network Embedding with Attention-Based Quantisation

Authors

  • Tao He, Monash University
  • Lianli Gao, University of Electronic Science and Technology of China
  • Jingkuan Song, University of Electronic Science and Technology of China
  • Xin Wang, Tianjin University
  • Kejie Huang, Zhejiang University
  • Yuanfang Li, Monash University

DOI:

https://doi.org/10.1609/aaai.v34i04.5832

Abstract

Learning accurate low-dimensional embeddings for a network is a crucial task, as it facilitates many network analytics tasks. Moreover, trained embeddings often require a significant amount of space to store, making storage and processing a challenge, especially as large-scale networks become more prevalent. In this paper, we present a novel semi-supervised network embedding and compression method, SNEQ, that is competitive with state-of-the-art embedding methods while being far more space- and time-efficient. SNEQ incorporates a novel quantisation method based on a self-attention layer that is trained in an end-to-end fashion and is able to dramatically compress the size of the trained embeddings, thus reducing the storage footprint and accelerating retrieval. Our evaluation on four real-world networks of diverse characteristics shows that SNEQ outperforms a number of state-of-the-art embedding methods in link prediction, node classification and node recommendation. Moreover, the quantised embeddings show a great advantage in terms of storage and time compared with both continuous embeddings and hashing methods.
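
The compression idea described in the abstract can be illustrated in code. Below is a minimal PyTorch sketch of an attention-based product-quantisation layer trained end-to-end: the module name, the codebook layout (num_books subspaces of codes_per_book entries each), and the soft-assignment/hard-lookup split are illustrative assumptions for exposition, not the paper's exact architecture.

```python
# Hypothetical sketch of attention-based quantisation; not the authors' code.
import torch
import torch.nn as nn

class AttentionQuantiser(nn.Module):
    def __init__(self, dim, num_books=4, codes_per_book=256):
        super().__init__()
        assert dim % num_books == 0
        self.num_books = num_books
        self.sub_dim = dim // num_books
        # One learnable codebook per subspace (product-quantisation layout).
        self.codebooks = nn.Parameter(
            torch.randn(num_books, codes_per_book, self.sub_dim))
        self.scale = self.sub_dim ** -0.5

    def forward(self, z, hard=False):
        # z: (batch, dim) continuous node embeddings from the encoder.
        b = z.size(0)
        z = z.view(b, self.num_books, 1, self.sub_dim)
        # Dot-product attention scores of each sub-vector over its codebook.
        scores = (z * self.codebooks.unsqueeze(0)).sum(-1) * self.scale
        if hard:
            # Retrieval time: store only the argmax code indices, i.e.
            # num_books small integers per node instead of a float vector.
            idx = scores.argmax(-1)                       # (batch, num_books)
            q = self.codebooks[torch.arange(self.num_books), idx]
            return q.reshape(b, -1), idx
        # Training time: soft convex combination keeps the layer
        # differentiable, so it can be trained end-to-end with the encoder.
        attn = scores.softmax(-1)                         # (batch, books, codes)
        q = torch.einsum('bmk,mkd->bmd', attn, self.codebooks)
        return q.reshape(b, -1), None
```

Under these assumptions, the soft attention path makes the quantiser differentiable during training, while the hard path compresses each node to num_books byte-sized indices (when codes_per_book is at most 256), which is the source of the storage and retrieval-speed gains the abstract reports.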

Published

2020-04-03

How to Cite

He, T., Gao, L., Song, J., Wang, X., Huang, K., & Li, Y. (2020). SNEQ: Semi-Supervised Attributed Network Embedding with Attention-Based Quantisation. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 4091-4098. https://doi.org/10.1609/aaai.v34i04.5832

Section

AAAI Technical Track: Machine Learning