Diffusing Gaussian Mixtures for Generating Categorical Data

Authors

  • Florence Regol, McGill University
  • Mark Coates, McGill University

DOI:

https://doi.org/10.1609/aaai.v37i8.26145

Keywords:

ML: Deep Generative Models & Autoencoders, RU: Uncertainty Representations

Abstract

Learning a categorical distribution comes with its own set of challenges. A successful approach taken by state-of-the-art works is to cast the problem in a continuous domain to take advantage of the impressive performance of generative models for continuous data. Among these are the recently emerging diffusion probabilistic models, which have the observed advantage of generating high-quality samples. Recent advances in categorical generative models have focused on log-likelihood improvements. In this work, we propose a generative model for categorical data based on diffusion models with a focus on high-quality sample generation, and we propose sample-based evaluation methods. The efficacy of our method stems from performing diffusion in the continuous domain while having its parameterization informed by the categorical structure of the target distribution. Our method of evaluation highlights the capabilities and limitations of different generative models for generating categorical data, and includes experiments on synthetic and real-world protein datasets.
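The core idea described in the abstract, lifting categorical data into a continuous space structured as a mixture of Gaussians and running diffusion there, can be illustrated with a toy sketch. Everything below is a hypothetical illustration of that general pattern, not the authors' actual parameterization: the component means, noise scale, and nearest-mean decoder are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: one well-separated Gaussian component per category.
# (The paper learns an informed parameterization; these fixed means are a toy choice.)
K, d = 4, 2
means = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0], [5.0, 5.0]])
sigma = 0.1

def to_continuous(labels):
    """Lift categorical labels into R^d via their component means plus small noise."""
    return means[labels] + sigma * rng.normal(size=(len(labels), d))

def forward_diffuse(x0, t, betas):
    """Standard DDPM forward marginal: q(x_t | x_0) = N(sqrt(a_bar_t) x_0, (1 - a_bar_t) I)."""
    alpha_bar = np.prod(1.0 - betas[: t + 1])
    noise = rng.normal(size=x0.shape)
    return np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * noise

def to_categorical(x):
    """Decode continuous points back to categories by nearest component mean."""
    dists = ((x[:, None, :] - means[None, :, :]) ** 2).sum(axis=-1)
    return dists.argmin(axis=1)

labels = rng.integers(0, K, size=8)
x0 = to_continuous(labels)                  # categorical -> continuous
xt = forward_diffuse(x0, t=10, betas=np.full(100, 1e-3))  # noised sample at step t
decoded = to_categorical(x0)                # with well-separated means, recovers labels
```

In a full model, a learned reverse process would denoise samples drawn from the prior back toward the mixture components, after which the nearest-component decoding yields categorical samples.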

Published

2023-06-26

How to Cite

Regol, F., & Coates, M. (2023). Diffusing Gaussian Mixtures for Generating Categorical Data. Proceedings of the AAAI Conference on Artificial Intelligence, 37(8), 9570-9578. https://doi.org/10.1609/aaai.v37i8.26145

Section

AAAI Technical Track on Machine Learning III