GLAD: Improving Latent Graph Generative Modeling with Simple Quantization
DOI:
https://doi.org/10.1609/aaai.v39i18.34169
Abstract
Learning graph generative models over latent spaces has received less attention than models that operate on the original data space, and has so far demonstrated lacklustre performance. We present GLAD, a latent space graph generative model. Unlike most previous latent space graph generative models, GLAD operates on a discrete latent space that preserves, to a significant extent, the discrete nature of graph structures, making no unnatural assumptions such as latent space continuity. We learn the prior of our discrete latent space by adapting diffusion bridges to its structure. By operating over an appropriately constructed latent space, we avoid relying on the decompositions often used by models that operate in the original data space. We present experiments on a series of graph benchmark datasets which demonstrate that GLAD, the first equivariant latent graph generative method, achieves competitive performance with state-of-the-art baselines.
Published
2025-04-11
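The "simple quantization" idea in the abstract can be illustrated with a minimal sketch: snap continuous node embeddings onto a fixed discrete grid so the latent space stays discrete, mirroring the discrete nature of graphs. The function name, grid size, and the tanh-plus-rounding scheme below are illustrative assumptions, not GLAD's exact formulation.

```python
import numpy as np

def quantize_nodes(z, levels=5):
    """Snap each latent dimension to one of `levels` evenly spaced
    values in [-1, 1]. `z` has shape (num_nodes, latent_dim).
    This is a hypothetical sketch of scalar quantization, not GLAD's code."""
    z = np.tanh(z)                    # squash embeddings into (-1, 1)
    half = (levels - 1) / 2.0
    return np.round(z * half) / half  # round onto the discrete grid

rng = np.random.default_rng(0)
z = rng.normal(size=(4, 3))           # toy latents: 4 nodes, 3 dims
zq = quantize_nodes(z)                # every entry now lies on the grid
```

Because the quantized latents take only a finite set of values per dimension, a discrete prior (in GLAD's case, one learned with diffusion bridges) can be fit directly over them.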
How to Cite
Nguyen, V. K., Boget, Y., Lavda, F., & Kalousis, A. (2025). GLAD: Improving Latent Graph Generative Modeling with Simple Quantization. Proceedings of the AAAI Conference on Artificial Intelligence, 39(18), 19695-19702. https://doi.org/10.1609/aaai.v39i18.34169
Section
AAAI Technical Track on Machine Learning IV