Vector Quantized Bayesian Neural Network Inference for Data Streams

Authors

  • Namuk Park, Yonsei University
  • Taekyu Lee, Yonsei University
  • Songkuk Kim, Yonsei University

DOI:

https://doi.org/10.1609/aaai.v35i10.17124

Keywords:

Calibration & Uncertainty Quantification, Time-Series/Data Streams

Abstract

Bayesian neural networks (BNNs) can estimate the uncertainty in their predictions, unlike non-Bayesian neural networks (NNs). However, BNNs have been used far less widely than non-Bayesian NNs in practice because they require repeated NN executions to produce a prediction for a single input, which incurs prohibitive computational cost. This computational burden is a critical problem when processing data streams with low latency. To address this problem, we propose VQ-BNN, a novel model that approximates BNN inference for data streams. To reduce the computational burden, VQ-BNN inference executes the NN only once and compensates the result with previously memorized predictions. Specifically, VQ-BNN inference for data streams is given by temporal exponential smoothing of recent predictions. Its computational cost is almost the same as that of non-Bayesian NNs. Experiments including semantic segmentation on real-world data show that this model is significantly faster than BNNs while producing predictive results comparable to or better than those of BNNs.
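The abstract's key computational claim is that stream inference reduces to temporal exponential smoothing of recent NN predictions, so only one NN execution is needed per observation. The following is a minimal sketch of that idea; the function names (`nn_predict`, `vq_bnn_stream_inference`), the `decay` parameter, and the running-average formulation are illustrative assumptions, not the authors' released code.

```python
import numpy as np

def vq_bnn_stream_inference(nn_predict, stream, decay=0.9):
    """Sketch of VQ-BNN-style inference on a data stream.

    At each time step, the NN is executed exactly once on the newest
    observation, and the predictive output is approximated by exponentially
    smoothing the memorized predictions from recent time steps.
    """
    smoothed = None
    for x_t in stream:
        p_t = nn_predict(x_t)  # single NN execution per observation
        if smoothed is None:
            smoothed = p_t
        else:
            # temporal exponential smoothing of recent predictions:
            # older predictions are down-weighted geometrically by `decay`
            smoothed = decay * smoothed + (1.0 - decay) * p_t
        yield smoothed

# Illustrative usage with a dummy "network" producing class probabilities
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dummy_nn = lambda x: rng.dirichlet(np.ones(3))   # stand-in for a trained NN
    frames = [rng.normal(size=(8, 8)) for _ in range(5)]
    for probs in vq_bnn_stream_inference(dummy_nn, frames):
        print(probs)
```

Under this reading, the per-step cost is one forward pass plus a weighted average, which is why the abstract claims a cost close to that of a non-Bayesian NN.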

Published

2021-05-18

How to Cite

Park, N., Lee, T., & Kim, S. (2021). Vector Quantized Bayesian Neural Network Inference for Data Streams. Proceedings of the AAAI Conference on Artificial Intelligence, 35(10), 9322-9330. https://doi.org/10.1609/aaai.v35i10.17124

Section

AAAI Technical Track on Machine Learning III