Relation-aware Graph Attention Model with Adaptive Self-adversarial Training

Authors

  • Xiao Qin IBM Almaden Research Center
  • Nasrullah Sheikh IBM Almaden Research Center
  • Berthold Reinwald IBM Almaden Research Center
  • Lingfei Wu IBM Thomas J. Watson Research Center

DOI:

https://doi.org/10.1609/aaai.v35i11.17129

Keywords:

Graph-based Machine Learning

Abstract

This paper describes an end-to-end solution for the relationship prediction task in heterogeneous, multi-relational graphs. We particularly address two building blocks in the pipeline, namely heterogeneous graph representation learning and negative sampling. Existing message passing-based graph neural networks use edges either for graph traversal or for selecting the message encoding functions. Ignoring the edge semantics can severely degrade the quality of the embeddings, especially when two nodes are connected by multiple relations. Furthermore, the expressivity of the learned representation depends on the quality of the negative samples used during training. Although existing hard negative sampling techniques can identify challenging negative relationships for optimization, new techniques are required to control the false negative rate during training, since false negatives can corrupt the learning process. To address these issues, we first propose RelGNN -- a message passing-based heterogeneous graph attention model. In particular, RelGNN generates the states of different relations and leverages them, along with the node states, to weigh the messages. RelGNN also adopts a self-attention mechanism to balance the importance of attribute features and topological features when generating the final entity embeddings. Second, we introduce a parameter-free negative sampling technique -- adaptive self-adversarial (ASA) negative sampling. ASA reduces the false negative rate by leveraging positive relationships to guide the identification of true negative samples. Our experimental evaluation demonstrates that RelGNN optimized by ASA for relationship prediction improves state-of-the-art performance across established benchmarks as well as on a real industrial dataset.
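
As a rough illustration of the ASA idea described above (not the authors' implementation), the sketch below scores candidate negative tails for a positive triple and prefers the candidate whose score is closest to, but not above, the positive's own score, so that hard negatives are kept while candidates likely to be unobserved true relationships (false negatives) are filtered out. The scoring function and all names here are hypothetical placeholders.

```python
import numpy as np

def score(h, r, t):
    # Placeholder plausibility score for a triple (head, relation, tail);
    # a real model would use learned embeddings (e.g., a DistMult-style score).
    return float(np.dot(h * r, t))

def asa_negative(h, r, t_pos, tail_candidates):
    """Pick the negative tail whose score is closest to, but not above,
    the score of the corresponding positive triple (h, r, t_pos)."""
    pos_score = score(h, r, t_pos)
    best, best_gap = None, np.inf
    for t_neg in tail_candidates:
        gap = pos_score - score(h, r, t_neg)
        # Skip candidates scoring at or above the positive: they are likely
        # false negatives (unobserved true relationships).
        if gap <= 0:
            continue
        if gap < best_gap:
            best, best_gap = t_neg, gap
    if best is None:
        # Fallback: if every candidate outscored the positive, take the
        # lowest-scoring one so a negative is always returned.
        best = min(tail_candidates, key=lambda t: score(h, r, t))
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim = 8
    h, r, t_pos = rng.normal(size=dim), rng.normal(size=dim), rng.normal(size=dim)
    candidates = [rng.normal(size=dim) for _ in range(20)]
    t_neg = asa_negative(h, r, t_pos, candidates)
    print("selected hard negative score:", score(h, r, t_neg))
```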

Published

2021-05-18

How to Cite

Qin, X., Sheikh, N., Reinwald, B., & Wu, L. (2021). Relation-aware Graph Attention Model with Adaptive Self-adversarial Training. Proceedings of the AAAI Conference on Artificial Intelligence, 35(11), 9368-9376. https://doi.org/10.1609/aaai.v35i11.17129

Section

AAAI Technical Track on Machine Learning IV