Federated Graph Condensation with Information Bottleneck Principles

Authors

  • Bo Yan (Beijing University of Posts and Telecommunications; Institute of Science Tokyo)
  • Sihao He (Beijing University of Posts and Telecommunications)
  • Cheng Yang (Beijing University of Posts and Telecommunications)
  • Shang Liu (China University of Mining and Technology)
  • Yang Cao (Institute of Science Tokyo)
  • Chuan Shi (Beijing University of Posts and Telecommunications)

DOI:

https://doi.org/10.1609/aaai.v39i12.33417

Abstract

Graph condensation (GC), which reduces the size of a large-scale graph by synthesizing a small-scale condensed graph as its substitute, has benefited various graph learning tasks. However, existing GC methods rely on centralized data storage, which is infeasible for real-world decentralized data distributions, and overlook data holders' privacy-preserving requirements. To bridge this gap, we propose and study the novel problem of federated graph condensation (FGC) for graph neural networks (GNNs). Specifically, we first propose a general framework for FGC, in which we decouple the typical gradient-matching process for GC into client-side gradient calculation and server-side gradient matching, integrating knowledge from multiple clients' subgraphs into one smaller condensed graph. Nevertheless, our empirical studies show that under the federated setting, the condensed graph consistently leaks data membership privacy, i.e., the condensed graph produced during federated training can be exploited to identify training data via membership inference attacks (MIA). To tackle this issue, we innovatively incorporate information bottleneck principles into FGC, which requires extracting partial node features in only a single local pre-training step; these features are then reused throughout federated training. Theoretical and experimental analyses demonstrate that our framework consistently protects membership privacy during training while achieving performance comparable, and in some cases superior, to existing centralized GC and federated graph learning (FGL) methods.
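The decoupled gradient-matching loop described in the abstract can be made concrete with a short sketch. The following PyTorch snippet is a minimal, self-contained illustration under stated assumptions, not the authors' released implementation: it assumes a one-layer GCN as the shared model, an edgeless condensed graph (identity adjacency), randomly generated toy subgraphs standing in for clients' private data, and a cosine-distance matching loss. The information bottleneck component (locally pre-extracting compressed node features before federated training) is omitted for brevity.

```python
# Minimal sketch of decoupled gradient matching for federated graph
# condensation. All names, sizes, and the one-layer GCN are illustrative
# assumptions, not the paper's actual implementation.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
d, c, n_syn = 16, 4, 20  # feature dim, number of classes, condensed-graph size

def normalize_adj(A):
    """Symmetric normalization with self-loops: D^{-1/2}(A+I)D^{-1/2}."""
    A = A + torch.eye(A.shape[0])
    d_inv_sqrt = A.sum(1).pow(-0.5)
    return d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def gcn_logits(A_hat, X, W):
    """One-layer GCN: normalized adjacency times features times weights."""
    return A_hat @ X @ W

# --- Client side: each client computes model gradients on its own subgraph ---
def client_gradients(W, A_local, X_local, y_local):
    logits = gcn_logits(normalize_adj(A_local), X_local, W)
    loss = F.cross_entropy(logits, y_local)
    return torch.autograd.grad(loss, W)[0]  # only gradients leave the client

# --- Server side: match aggregated client gradients against gradients ---
# --- computed on the learnable condensed graph.                        ---
X_syn = torch.randn(n_syn, d, requires_grad=True)  # learnable synthetic features
y_syn = torch.arange(n_syn) % c                    # fixed, balanced labels
opt = torch.optim.Adam([X_syn], lr=0.01)

# Toy local subgraphs standing in for three clients' private data.
clients = []
for _ in range(3):
    n = 30
    A = (torch.rand(n, n) < 0.1).float()
    clients.append((A, torch.randn(n, d), torch.randint(0, c, (n,))))

for step in range(100):
    W = torch.randn(d, c, requires_grad=True)  # re-sampled shared GNN weights
    g_real = torch.stack([client_gradients(W, *cl) for cl in clients]).mean(0)

    logits = gcn_logits(torch.eye(n_syn), X_syn, W)
    g_syn = torch.autograd.grad(F.cross_entropy(logits, y_syn), W,
                                create_graph=True)[0]
    # Gradient-matching objective: cosine distance between gradient matrices.
    match_loss = 1 - F.cosine_similarity(g_syn.flatten(), g_real.flatten(), dim=0)
    opt.zero_grad()
    match_loss.backward()
    opt.step()
```

Note that even though only model gradients leave the clients in this loop, the abstract's empirical finding is that the resulting condensed graph can still leak membership information, which is precisely what motivates the information bottleneck step of extracting compressed node features in a single local pre-training pass.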

Published

2025-04-11

How to Cite

Yan, B., He, S., Yang, C., Liu, S., Cao, Y., & Shi, C. (2025). Federated Graph Condensation with Information Bottleneck Principles. Proceedings of the AAAI Conference on Artificial Intelligence, 39(12), 12990–12998. https://doi.org/10.1609/aaai.v39i12.33417

Issue

Vol. 39 No. 12 (2025)

Section

AAAI Technical Track on Data Mining & Knowledge Management II