Can We Find Strong Lottery Tickets in Generative Models?

Authors

  • Sangyeop Yeo UNIST
  • Yoojin Jang UNIST
  • Jy-yong Sohn University of Wisconsin-Madison
  • Dongyoon Han NAVER AI Lab
  • Jaejun Yoo UNIST

DOI:

https://doi.org/10.1609/aaai.v37i3.25433

Keywords:

CV: Learning & Optimization for CV, CV: Representation Learning for Vision, ML: Deep Generative Models & Autoencoders

Abstract

Yes. In this paper, we investigate strong lottery tickets in generative models, i.e., subnetworks that achieve good generative performance without any weight update. Neural network pruning is considered a cornerstone of model compression for reducing the costs of computation and memory. Unfortunately, pruning generative models has not been extensively explored, and existing pruning algorithms suffer from excessive weight-training costs, performance degradation, limited generalizability, or complicated training. To address these problems, we propose to find a strong lottery ticket via moment-matching scores. Our experimental results show that the discovered subnetwork can perform similarly to or better than the trained dense model even when only 10% of the weights remain. To the best of our knowledge, we are the first to show the existence of strong lottery tickets in generative models and to provide an algorithm for finding them stably. Our code and supplementary materials are publicly available at https://lait-cvlab.github.io/SLT-in-Generative-Models/.
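To make the "no weight update" idea concrete: strong-lottery-ticket methods typically freeze the randomly initialized weights and instead learn a per-weight score, keeping only the top-scored fraction of weights as the subnetwork. The sketch below (assuming PyTorch; all class names are hypothetical, and the paper's moment-matching objective is not reproduced here) illustrates this score-based masking mechanism, not the authors' exact algorithm.

    # Minimal sketch of score-based supermask pruning: weights stay frozen
    # and only per-weight "scores" are trained; the top-k scored weights
    # form the subnetwork. Names are illustrative, not from the paper.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class GetSubnet(torch.autograd.Function):
        """Straight-through top-k mask over the scores."""

        @staticmethod
        def forward(ctx, scores, k):
            out = torch.zeros_like(scores)
            _, idx = scores.flatten().sort()       # ascending by score
            j = int((1 - k) * scores.numel())      # number of weights to drop
            out.flatten()[idx[j:]] = 1.0           # keep the top-k fraction
            return out

        @staticmethod
        def backward(ctx, grad_output):
            return grad_output, None               # straight-through estimator

    class SupermaskConv(nn.Conv2d):
        """Conv layer with frozen weights; only the scores are learned."""

        def __init__(self, *args, sparsity=0.1, **kwargs):
            super().__init__(*args, **kwargs)
            self.sparsity = sparsity               # fraction of weights kept (e.g., 10%)
            self.scores = nn.Parameter(0.01 * torch.randn_like(self.weight))
            self.weight.requires_grad = False      # weights are never updated

        def forward(self, x):
            mask = GetSubnet.apply(self.scores, self.sparsity)
            return F.conv2d(x, self.weight * mask, self.bias,
                            self.stride, self.padding, self.dilation, self.groups)

In such a setup, an optimizer is run only over the score parameters of each layer; the binary mask recovered at the chosen sparsity (e.g., 10% of weights remaining) defines the subnetwork that is evaluated without any weight training.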

Published

2023-06-26

How to Cite

Yeo, S., Jang, Y., Sohn, J.-y., Han, D., & Yoo, J. (2023). Can We Find Strong Lottery Tickets in Generative Models?. Proceedings of the AAAI Conference on Artificial Intelligence, 37(3), 3267-3275. https://doi.org/10.1609/aaai.v37i3.25433

Section

AAAI Technical Track on Computer Vision III