TY  - JOUR
AU  - He, Tao
AU  - Gao, Lianli
AU  - Song, Jingkuan
AU  - Wang, Xin
AU  - Huang, Kejie
AU  - Li, Yuanfang
PY  - 2020/04/03
Y2  - 2024/03/29
TI  - SNEQ: Semi-Supervised Attributed Network Embedding with Attention-Based Quantisation
JF  - Proceedings of the AAAI Conference on Artificial Intelligence
JA  - AAAI
VL  - 34
IS  - 04
SE  - AAAI Technical Track: Machine Learning
DO  - 10.1609/aaai.v34i04.5832
UR  - https://ojs.aaai.org/index.php/AAAI/article/view/5832
SP  - 4091
EP  - 4098
AB  - Learning accurate low-dimensional embeddings for a network is a crucial task, as it facilitates many network analytics tasks. Moreover, the trained embeddings often require a significant amount of space to store, making storage and processing a challenge, especially as large-scale networks become more prevalent. In this paper, we present a novel semi-supervised network embedding and compression method, SNEQ, that is competitive with state-of-the-art embedding methods while being far more space- and time-efficient. SNEQ incorporates a novel quantisation method based on a self-attention layer that is trained in an end-to-end fashion, which dramatically compresses the size of the trained embeddings, thus reducing the storage footprint and accelerating retrieval. Our evaluation on four real-world networks of diverse characteristics shows that SNEQ outperforms a number of state-of-the-art embedding methods in link prediction, node classification and node recommendation. Moreover, the quantised embedding shows a great advantage in terms of storage and time compared with continuous embeddings as well as hashing methods.
ER  - 