DP-GenG: Differentially Private Dataset Distillation Guided by DP-Generated Data

Authors

  • Shuo Shi, Zhejiang University; Ningbo Global Innovation Center, Zhejiang University
  • Jinghuai Zhang, University of California, Los Angeles
  • Shijie Jiang, Hangzhou Normal University
  • Chunyi Zhou, Zhejiang University
  • Yuyuan Li, Hangzhou Dianzi University
  • Mengying Zhu, Zhejiang University
  • Yangyang Wu, Zhejiang University
  • Tianyu Du, Zhejiang University; Ningbo Global Innovation Center, Zhejiang University

DOI:

https://doi.org/10.1609/aaai.v40i11.37854

Abstract

Dataset distillation (DD) compresses large datasets into smaller ones while preserving the performance of models trained on them. Although DD is often assumed to enhance data privacy by aggregating over individual examples, recent studies reveal that standard DD can still leak sensitive information from the original dataset due to the lack of formal privacy guarantees. Existing differentially private (DP) DD methods attempt to mitigate this risk by injecting noise into the distillation process, but they often fail to fully leverage the original dataset, resulting in degraded realism and utility. This paper introduces DP-GenG, a novel framework that addresses the key limitations of current DP-DD by leveraging DP-generated data. Specifically, DP-GenG initializes the distilled dataset with DP-generated data to enhance realism. The generated data then refines the DP-feature-matching technique so that the original dataset can be distilled under a small privacy budget, and it trains an expert model that aligns the distilled examples with their class distributions. Furthermore, we design a privacy-budget allocation strategy that determines how the budget is consumed across the DP components, and we provide a theoretical analysis of the overall privacy guarantee. Extensive experiments show that DP-GenG significantly outperforms state-of-the-art DP-DD methods in both dataset utility and robustness against membership inference attacks, establishing a new paradigm for privacy-preserving dataset distillation.
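The abstract's pipeline rests on two standard DP ingredients: releasing noisy feature statistics of the original dataset for matching, and splitting a total privacy budget across components. A minimal sketch of both, using the classic Gaussian mechanism, is shown below. All names, the per-row clipping scheme, and the 70/30 split are illustrative assumptions for exposition, not the paper's actual DP-GenG algorithm.

```python
import numpy as np


def gaussian_sigma(epsilon, delta, sensitivity=1.0):
    """Noise scale for the classic Gaussian mechanism (valid for epsilon < 1)."""
    return sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon


def dp_class_mean_features(features, epsilon, delta, clip_norm=1.0, rng=None):
    """Release a DP estimate of one class's mean feature vector.

    Each row of `features` is one example's embedding. Rows are clipped to
    L2 norm `clip_norm`, so the mean over n rows has sensitivity
    clip_norm / n; Gaussian noise calibrated to (epsilon, delta) is added.
    """
    rng = np.random.default_rng(rng)
    norms = np.linalg.norm(features, axis=1, keepdims=True)
    clipped = features * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    mean = clipped.mean(axis=0)
    sensitivity = clip_norm / len(features)
    sigma = gaussian_sigma(epsilon, delta, sensitivity)
    return mean + rng.normal(0.0, sigma, size=mean.shape)


def split_budget(total_epsilon, match_frac=0.7):
    """Hypothetical budget split across the two DP components in the abstract:
    DP-feature matching and expert-model training (basic composition)."""
    eps_match = total_epsilon * match_frac
    eps_expert = total_epsilon - eps_match
    return eps_match, eps_expert
```

In a distillation loop of this shape, the synthetic examples would be optimized so their features approach the noisy class means, so the original data is touched only through these DP releases.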

Published

2026-03-14

How to Cite

Shi, S., Zhang, J., Jiang, S., Zhou, C., Li, Y., Zhu, M., … Du, T. (2026). DP-GenG: Differentially Private Dataset Distillation Guided by DP-Generated Data. Proceedings of the AAAI Conference on Artificial Intelligence, 40(11), 8988–8996. https://doi.org/10.1609/aaai.v40i11.37854

Section

AAAI Technical Track on Computer Vision VIII