A Hybrid Stochastic Gradient Hamiltonian Monte Carlo Method

Authors

  • Chao Zhang, Zhejiang University
  • Zhijian Li, Zhejiang University
  • Zebang Shen, University of Pennsylvania
  • Jiahao Xie, Zhejiang University
  • Hui Qian, Zhejiang University

DOI:

https://doi.org/10.1609/aaai.v35i12.17295

Keywords:

Bayesian Learning

Abstract

Recent theoretical analyses reveal that existing Stochastic Gradient Markov Chain Monte Carlo (SG-MCMC) methods need large mini-batches of samples (with size depending exponentially on the dimension) to reduce the mean square error of gradient estimates and ensure non-asymptotic convergence guarantees when the target distribution has a nonconvex potential function. In this paper, we propose a novel SG-MCMC algorithm, called the Hybrid Stochastic Gradient Hamiltonian Monte Carlo (HSG-HMC) method, which needs merely one sample per iteration and possesses a simple structure with only one hyperparameter. This improvement leverages a hybrid stochastic gradient estimator that exploits historical stochastic gradient information to control the mean square error. Theoretical analyses show that our method achieves the best-known overall sample complexity for reaching ε-accuracy in the 2-Wasserstein distance when sampling from distributions with nonconvex potential functions. Empirical studies on both simulated and real-world datasets demonstrate the advantage of our method.
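The mechanism the abstract describes, a hybrid estimator that blends the current stochastic gradient with a recursively corrected estimate built from historical gradient information, can be sketched compactly. Below is a minimal Python illustration that embeds such a SARAH/SGD-style hybrid estimator in a standard SGHMC (underdamped Langevin) update. The toy potential, function names, and the knobs eta (step size), gamma (friction), and beta (mixing weight) are illustrative assumptions for the sketch, not the paper's exact formulation; the paper itself reports a single hyperparameter.

```python
import numpy as np

def grad_potential(theta, sample):
    """Stochastic gradient of the potential U at theta from one data sample.
    Placeholder: a toy Gaussian-likelihood term so the sketch runs end to end;
    substitute the gradient of your own model's potential."""
    return theta - sample

def hsg_hmc(data, dim, n_iter=1000, eta=1e-2, gamma=1.0, beta=0.1, rng=None):
    """Sketch of SGHMC with a hybrid (SARAH/SGD-style) gradient estimator.

    Each iteration draws a single data sample and forms
        g_t = (1 - beta) * (g_{t-1} + grad(theta_t) - grad(theta_{t-1}))
              + beta * grad(theta_t),
    mixing a recursive variance-reduced correction with a plain
    stochastic gradient. Hyperparameter values here are illustrative.
    """
    rng = rng or np.random.default_rng(0)
    theta = np.zeros(dim)       # position (model parameters)
    momentum = np.zeros(dim)    # auxiliary momentum variable
    sample = data[rng.integers(len(data))]
    g = grad_potential(theta, sample)   # initialize with a plain stochastic gradient
    for _ in range(n_iter):
        # SGHMC / underdamped Langevin discretization with friction gamma.
        noise = np.sqrt(2.0 * gamma * eta) * rng.standard_normal(dim)
        momentum = momentum - eta * gamma * momentum - eta * g + noise
        theta_prev, theta = theta, theta + eta * momentum
        # One fresh sample per iteration, evaluated at BOTH the new and the
        # previous iterate, yields the hybrid recursive update of g.
        sample = data[rng.integers(len(data))]
        g_new = grad_potential(theta, sample)
        g_old = grad_potential(theta_prev, sample)
        g = (1.0 - beta) * (g + g_new - g_old) + beta * g_new
    return theta
```

Note that only one data sample is drawn per iteration; the recursive difference term is what keeps the estimator's mean square error controlled without the large mini-batches the abstract attributes to prior SG-MCMC methods.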

Published

2021-05-18

How to Cite

Zhang, C., Li, Z., Shen, Z., Xie, J., & Qian, H. (2021). A Hybrid Stochastic Gradient Hamiltonian Monte Carlo Method. Proceedings of the AAAI Conference on Artificial Intelligence, 35(12), 10842-10850. https://doi.org/10.1609/aaai.v35i12.17295
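For LaTeX users, the citation above translates into the following BibTeX entry, assembled entirely from the metadata on this page (the entry key is an arbitrary choice):

```bibtex
@article{zhang2021hybrid,
  author  = {Zhang, Chao and Li, Zhijian and Shen, Zebang and Xie, Jiahao and Qian, Hui},
  title   = {A Hybrid Stochastic Gradient Hamiltonian Monte Carlo Method},
  journal = {Proceedings of the AAAI Conference on Artificial Intelligence},
  volume  = {35},
  number  = {12},
  pages   = {10842--10850},
  year    = {2021},
  doi     = {10.1609/aaai.v35i12.17295}
}
```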

Issue

Vol. 35 No. 12 (2021)

Section

AAAI Technical Track on Machine Learning V