Specifying Weight Priors in Bayesian Deep Neural Networks with Empirical Bayes

Authors

  • Ranganath Krishnan, Intel Labs
  • Mahesh Subedar, Intel Labs
  • Omesh Tickoo, Intel Labs

DOI

https://doi.org/10.1609/aaai.v34i04.5875

Abstract

Stochastic variational inference for Bayesian deep neural networks (DNNs) requires specifying priors and approximate posterior distributions over the network weights. Specifying meaningful weight priors is a challenging problem, particularly when scaling variational inference to deeper architectures with high-dimensional weight spaces. We propose MOdel Priors with Empirical Bayes using DNN (MOPED), a method for choosing informed weight priors in Bayesian neural networks. We formulate a two-stage hierarchical modeling approach: first, we find the maximum likelihood estimates of the weights with a deterministic DNN; then, we set the weight priors using an empirical Bayes approach and infer the posterior with variational inference. We empirically evaluate the proposed approach on real-world tasks, including image classification, video activity recognition, and audio classification, with neural network architectures of varying complexity. We also evaluate the proposed approach on a diabetic retinopathy diagnosis task and benchmark it against state-of-the-art Bayesian deep learning techniques. We demonstrate that MOPED enables scalable variational inference and provides reliable uncertainty quantification.
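The two-stage recipe in the abstract (maximum likelihood fit, then empirical-Bayes weight priors, then variational inference) can be sketched on a toy linear model. This is a minimal illustration, not the paper's implementation: ordinary least squares stands in for deterministic DNN training, and the prior scale delta * |w_mle| and all hyperparameter values are assumptions made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data standing in for a real task.
X = rng.normal(size=(200, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=200)

# Stage 1: maximum likelihood estimate of the weights. Ordinary least
# squares plays the role of training a deterministic DNN here.
w_mle = np.linalg.lstsq(X, y, rcond=None)[0]

# Stage 2 (empirical Bayes): center the Gaussian weight prior on the MLE.
# The scale delta * |w_mle| is an illustrative choice for this sketch.
delta = 0.5
prior_mu = w_mle
prior_sigma = np.maximum(delta * np.abs(w_mle), 1e-3)

# Mean-field Gaussian variational posterior, initialized at the prior and
# trained by SGD on the negative ELBO with the reparameterization trick.
mu = prior_mu.copy()
rho = np.log(np.expm1(prior_sigma))     # inverse softplus keeps sigma > 0
lr, noise_var = 1e-6, 0.01              # assumed observation noise variance
for _ in range(2000):
    sigma = np.log1p(np.exp(rho))       # softplus
    eps = rng.normal(size=3)
    w = mu + sigma * eps                # reparameterized weight sample
    grad_w = -(X.T @ (y - X @ w)) / noise_var             # d NLL / d w
    grad_mu = grad_w + (mu - prior_mu) / prior_sigma**2   # + d KL / d mu
    grad_sigma = grad_w * eps + sigma / prior_sigma**2 - 1.0 / sigma
    mu -= lr * grad_mu
    rho -= lr * grad_sigma / (1.0 + np.exp(-rho))  # chain rule: d sigma/d rho
# mu and softplus(rho) now parameterize the approximate weight posterior.
```

Because the prior is centered on the MLE, the variational posterior mean stays close to the deterministic solution while the posterior scale contracts below the prior scale as the data are absorbed.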

Published

2020-04-03

How to Cite

Krishnan, R., Subedar, M., & Tickoo, O. (2020). Specifying Weight Priors in Bayesian Deep Neural Networks with Empirical Bayes. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 4477-4484. https://doi.org/10.1609/aaai.v34i04.5875

Section

AAAI Technical Track: Machine Learning