Overcoming Catastrophic Forgetting in Graph Neural Networks with Experience Replay

Authors

  • Fan Zhou University of Electronic Science and Technology of China
  • Chengtai Cao University of Electronic Science and Technology of China

Keywords

Graph Mining, Social Network Analysis & Community

Abstract

Graph Neural Networks (GNNs) have recently received significant research attention due to their superior performance on a variety of graph-related learning tasks. Most current work focuses on either the static or the dynamic graph setting and addresses a single task, e.g., node/graph classification or link prediction. In this work, we investigate the question: can GNNs learn a sequence of tasks continually? To that end, we explore the Continual Graph Learning (CGL) paradigm and present ER-GNN, an experience-replay-based framework for CGL that alleviates the catastrophic forgetting problem in existing GNNs. ER-GNN stores knowledge from previous tasks as experiences and replays them when learning new tasks. We propose three strategies for selecting experience nodes: mean of feature, coverage maximization, and influence maximization. Extensive experiments on three benchmark datasets demonstrate the effectiveness of ER-GNN and shed light on incremental learning over graph (non-Euclidean) structures.
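The mean-of-feature strategy mentioned above can be illustrated with a minimal sketch: for each class in the current task, keep the nodes whose feature vectors lie closest to the class mean (prototype), and replay them alongside later tasks' training nodes. This is an illustrative reconstruction, not the authors' code; the function name `select_experiences` and the per-class budget parameter are assumptions for the example.

```python
import numpy as np

def select_experiences(features, labels, budget_per_class):
    """Mean-of-feature experience selection (illustrative sketch).

    For each class, compute the mean feature vector (prototype) and
    keep the `budget_per_class` nodes closest to it in Euclidean
    distance. Returns the indices of the selected experience nodes.
    """
    buffer = []
    for c in np.unique(labels):
        idx = np.flatnonzero(labels == c)
        prototype = features[idx].mean(axis=0)
        dists = np.linalg.norm(features[idx] - prototype, axis=1)
        keep = idx[np.argsort(dists)[:budget_per_class]]
        buffer.extend(keep.tolist())
    return buffer

# Toy usage: class 0 has three nodes, class 1 has one; with a budget
# of one node per class, the node nearest each class mean is kept.
features = np.array([[0.0, 0.0], [1.0, 1.0], [10.0, 10.0], [5.0, 5.0]])
labels = np.array([0, 0, 0, 1])
experience_buffer = select_experiences(features, labels, budget_per_class=1)
```

When a new task arrives, the buffered node indices would be merged into that task's training set so the replayed loss penalizes forgetting; coverage maximization and influence maximization replace the distance-to-prototype criterion with different node-importance scores.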

Published

2021-05-18

How to Cite

Zhou, F., & Cao, C. (2021). Overcoming Catastrophic Forgetting in Graph Neural Networks with Experience Replay. Proceedings of the AAAI Conference on Artificial Intelligence, 35(5), 4714-4722. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/16602

Section

AAAI Technical Track on Data Mining and Knowledge Management