Diffeomorphic Information Neural Estimation

Authors

  • Bao Duong, Deakin University
  • Thin Nguyen, Deakin University

DOI:

https://doi.org/10.1609/aaai.v37i6.25908

Keywords:

ML: Applications, ML: Deep Generative Models & Autoencoders, ML: Other Foundations of Machine Learning

Abstract

Mutual Information (MI) and Conditional Mutual Information (CMI) are multi-purpose tools from information theory that naturally measure statistical dependencies between random variables, and are therefore of central interest in several statistical and machine learning tasks, such as conditional independence testing and representation learning. However, estimating CMI, or even MI, is notoriously challenging due to their intractable formulations. In this study, we introduce DINE (Diffeomorphic Information Neural Estimator), a novel approach for estimating the CMI of continuous random variables, inspired by the invariance of CMI under diffeomorphic maps. We show that the variables of interest can be replaced with appropriate surrogates that follow simpler distributions, allowing the CMI to be evaluated efficiently via analytical solutions. Additionally, we demonstrate the quality of the proposed estimator in comparison with state-of-the-art methods on three important tasks: MI estimation, CMI estimation, and their application to conditional independence testing. The empirical evaluations show that DINE consistently outperforms competitors on all tasks and adapts very well to complex and high-dimensional relationships.
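To make the invariance idea concrete, the sketch below is a minimal, simplified illustration (not the authors' implementation): MI is unchanged by applying diffeomorphic maps to each variable, so the observations can be replaced by Gaussian surrogates for which MI has a closed form, I(X; Y) = -1/2 log(1 - rho^2). Here the surrogate transformation is a simple rank-based Gaussianization; DINE itself learns the transformation with neural networks, and the estimator names below are hypothetical.

```python
# Illustrative sketch of the invariance idea behind DINE (assumptions:
# rank-based Gaussianization in place of learned flows; bivariate case only).
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(0)
n, rho = 100_000, 0.8

# Jointly Gaussian pair with known MI.
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)
true_mi = -0.5 * np.log(1 - rho**2)

# Apply arbitrary elementwise diffeomorphisms; MI is invariant under them.
x_obs, y_obs = np.exp(x), y**3 + y

def gaussianize(v):
    """Map a sample to approximately standard-normal surrogates via ranks."""
    u = rankdata(v) / (len(v) + 1)   # empirical CDF values in (0, 1)
    return norm.ppf(u)               # inverse Gaussian CDF

# Replace observations with Gaussian surrogates, then evaluate MI analytically.
gx, gy = gaussianize(x_obs), gaussianize(y_obs)
rho_hat = np.corrcoef(gx, gy)[0, 1]
est_mi = -0.5 * np.log(1 - rho_hat**2)

print(f"true MI = {true_mi:.3f}, MI from surrogates = {est_mi:.3f}")
```

The surrogates follow a simpler (Gaussian) distribution, so the otherwise intractable MI reduces to a function of a single correlation coefficient; the conditional case extends this by conditioning the transformations on the conditioning variables.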

Published

2023-06-26

How to Cite

Duong, B., & Nguyen, T. (2023). Diffeomorphic Information Neural Estimation. Proceedings of the AAAI Conference on Artificial Intelligence, 37(6), 7468-7475. https://doi.org/10.1609/aaai.v37i6.25908

Section

AAAI Technical Track on Machine Learning I