Normalizing Flow Ensembles for Rich Aleatoric and Epistemic Uncertainty Modeling

Authors

  • Lucas Berry McGill University
  • David Meger McGill University

DOI:

https://doi.org/10.1609/aaai.v37i6.25834

Keywords:

ML: Calibration & Uncertainty Quantification, ML: Active Learning, ML: Ensemble Methods, ML: Multimodal Learning, ML: Probabilistic Methods

Abstract

In this work, we demonstrate how to reliably estimate epistemic uncertainty while maintaining the flexibility needed to capture complicated aleatoric distributions. To this end, we propose an ensemble of Normalizing Flows (NF), which are state-of-the-art in modeling aleatoric uncertainty. The ensembles are created via sets of fixed dropout masks, making them less expensive than creating separate NF models. We demonstrate how to leverage the unique structure of NFs, their base distributions, to estimate aleatoric uncertainty without relying on samples, provide a comprehensive set of baselines, and derive unbiased estimates for differential entropy. The methods were applied to a variety of experiments commonly used to benchmark aleatoric and epistemic uncertainty estimation: 1D sinusoidal data, 2D windy grid-world (Wet Chicken), Pendulum, and Hopper. In these experiments, we set up an active learning framework and evaluate each model's capability at measuring aleatoric and epistemic uncertainty. The results show the advantages of using NF ensembles in capturing complicated aleatoric distributions while maintaining accurate epistemic uncertainty estimates.
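The two mechanisms the abstract names, an ensemble defined by fixed dropout masks over shared parameters, and entropy computed through the base distribution via the change-of-variables identity H(x) = H(z) + E_z[log |det J_f(z)|], can be illustrated with a deliberately tiny sketch. This is not the authors' implementation: the one-dimensional affine flow, the toy masked parameterization, and the `make_masked_flows` / `flow_entropy` names are all illustrative assumptions.

```python
import math
import random

# Hedged sketch (not the paper's code): an "ensemble" of 1-D affine flows
# f(z) = a*z that share a weight vector but differ by a fixed dropout mask,
# plus the base-distribution entropy identity H(x) = H(z) + E_z[log |f'(z)|].

def make_masked_flows(weights, n_members=5, keep_prob=0.8, seed=0):
    """Each member keeps a fixed random subset of weights (its dropout mask)."""
    rng = random.Random(seed)
    members = []
    for _ in range(n_members):
        mask = [1.0 if rng.random() < keep_prob else 0.0 for _ in weights]
        # Toy parameterization: member scale = bias + rescaled masked sum.
        # The 0.1 bias keeps every scale strictly positive.
        a = 0.1 + sum(w * m for w, m in zip(weights, mask)) / keep_prob
        members.append(a)
    return members

def flow_entropy(a):
    # For f(z) = a*z with a standard-normal base, log|f'(z)| = log|a| is
    # constant, so the expectation over z is exact without sampling:
    return 0.5 * math.log(2 * math.pi * math.e) + math.log(abs(a))

scales = make_masked_flows([0.5, 0.7, 0.4, 0.6])
entropies = [flow_entropy(a) for a in scales]
# Disagreement across members is a (toy) proxy for epistemic uncertainty.
epistemic_spread = max(entropies) - min(entropies)
```

For the affine case the identity can be checked against the closed form: `flow_entropy(2.0)` equals the differential entropy of N(0, 4), namely 0.5 log(2 pi e 4). Real NF members would use richer invertible maps, where the log-determinant term does require averaging over base samples.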

Published

2023-06-26

How to Cite

Berry, L., & Meger, D. (2023). Normalizing Flow Ensembles for Rich Aleatoric and Epistemic Uncertainty Modeling. Proceedings of the AAAI Conference on Artificial Intelligence, 37(6), 6806-6814. https://doi.org/10.1609/aaai.v37i6.25834

Section

AAAI Technical Track on Machine Learning I