Evidential Conditional Neural Processes

Authors

  • Deep Shankar Pandey, Rochester Institute of Technology
  • Qi Yu, Rochester Institute of Technology

DOI:

https://doi.org/10.1609/aaai.v37i8.26125

Keywords:

ML: Meta Learning

Abstract

The Conditional Neural Process (CNP) family of models offers a promising direction for tackling few-shot problems, achieving better scalability and competitive predictive performance. However, current CNP models capture only the overall uncertainty of the prediction made on a target data point. They lack a systematic, fine-grained quantification of the distinct sources of uncertainty that are essential for model training and decision-making in the few-shot setting. We propose Evidential Conditional Neural Processes (ECNP), which replace the standard Gaussian distribution used by CNP with a much richer hierarchical Bayesian structure through evidential learning to achieve epistemic-aleatoric uncertainty decomposition. The evidential hierarchical structure also leads to theoretically justified robustness against noisy training tasks. Theoretical analysis of the proposed ECNP establishes its relationship with CNP while offering deeper insights into the roles of the evidential parameters. Extensive experiments conducted on both synthetic and real-world data demonstrate the effectiveness of our proposed model in various few-shot settings.
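The epistemic-aleatoric decomposition described above can be illustrated with the standard Normal-Inverse-Gamma (NIG) identities used in evidential regression, on which this line of work builds; the function below is a minimal sketch with illustrative parameter values, not the authors' implementation:

```python
def nig_uncertainty(gamma, nu, alpha, beta):
    """Decompose uncertainty from Normal-Inverse-Gamma (NIG) evidential
    parameters (gamma, nu, alpha, beta), using the standard evidential
    regression identities. Requires alpha > 1 and nu > 0."""
    prediction = gamma                       # E[mu]: point prediction
    aleatoric = beta / (alpha - 1.0)         # E[sigma^2]: inherent data noise
    epistemic = beta / (nu * (alpha - 1.0))  # Var[mu]: model uncertainty
    return prediction, aleatoric, epistemic

# More observed evidence (larger nu) shrinks the epistemic term while
# leaving the aleatoric estimate unchanged.
pred, alea, epi = nig_uncertainty(gamma=0.5, nu=2.0, alpha=3.0, beta=1.0)
```

Under this parameterization, increasing the evidence parameter nu reduces epistemic uncertainty only, which is what makes the two sources separable at prediction time.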

Published

2023-06-26

How to Cite

Pandey, D. S., & Yu, Q. (2023). Evidential Conditional Neural Processes. Proceedings of the AAAI Conference on Artificial Intelligence, 37(8), 9389-9397. https://doi.org/10.1609/aaai.v37i8.26125

Section

AAAI Technical Track on Machine Learning III