Scalable and Efficient Probabilistic Inference for Bayesian Deep Learning and Generative Modeling

Authors

  • Ruqi Zhang, Purdue University

DOI:

https://doi.org/10.1609/aaai.v39i27.35129

Abstract

Probabilistic inference is a fundamental challenge in machine learning, spanning tasks from approximate Bayesian inference to generative AI. In this talk, I will present theoretically guaranteed, scalable, and efficient probabilistic inference methods with applications in Bayesian deep learning and generative modeling. First, I will introduce a new compute paradigm for probabilistic inference that leverages features of modern accelerators, specifically low-precision arithmetic and sparsity, to significantly speed up inference while preserving accuracy. Next, I will present a new framework for efficient inference in discrete domains that utilizes gradient information—a largely overlooked feature of discrete distributions—to enable more informed and directional exploration. Finally, I will showcase experimental results demonstrating the effectiveness of these methods across various ML tasks, including Bayesian neural networks, energy-based models, and large language models.
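To make the gradient-informed discrete inference idea concrete, the sketch below shows one way such a sampler can work. This is a hypothetical toy illustration, not the talk's exact method: for a binary vector, a first-order Taylor expansion of the log-density estimates how much each single-bit flip would change the probability, and flips are proposed proportionally, with a Metropolis-Hastings correction. The toy pairwise energy model (`W`, `b`) is an assumption for demonstration only.

```python
import numpy as np

def log_p(x, W, b):
    """Unnormalized log-density of a toy binary energy-based model."""
    return 0.5 * x @ W @ x + b @ x

def grad_log_p(x, W, b):
    """Gradient of log_p, treating x as a point in [0, 1]^d."""
    return W @ x + b

def gradient_informed_step(x, W, b, rng):
    """One MH step whose proposal uses gradients to pick which bit to flip."""
    # Estimated change in log p from flipping each coordinate (Taylor expansion).
    d = grad_log_p(x, W, b) * (1.0 - 2.0 * x)
    probs = np.exp(d / 2) / np.exp(d / 2).sum()  # softmax over coordinates
    i = rng.choice(len(x), p=probs)
    x_new = x.copy()
    x_new[i] = 1.0 - x_new[i]
    # Reverse-proposal probability, needed for the MH acceptance ratio.
    d_new = grad_log_p(x_new, W, b) * (1.0 - 2.0 * x_new)
    probs_new = np.exp(d_new / 2) / np.exp(d_new / 2).sum()
    log_accept = (log_p(x_new, W, b) - log_p(x, W, b)
                  + np.log(probs_new[i]) - np.log(probs[i]))
    return x_new if np.log(rng.uniform()) < log_accept else x

rng = np.random.default_rng(0)
W = np.array([[0.0, 1.0], [1.0, 0.0]])  # toy pairwise couplings
b = np.array([0.5, -0.5])
x = rng.integers(0, 2, size=2).astype(float)
for _ in range(100):
    x = gradient_informed_step(x, W, b, rng)
print(x)
```

Unlike a uniform random-flip proposal, the softmax over estimated log-probability changes concentrates proposals on promising coordinates, which is the "more informed and directional exploration" the abstract refers to.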

Published

2025-04-11

How to Cite

Zhang, R. (2025). Scalable and Efficient Probabilistic Inference for Bayesian Deep Learning and Generative Modeling. Proceedings of the AAAI Conference on Artificial Intelligence, 39(27), 28737–28737. https://doi.org/10.1609/aaai.v39i27.35129