Measuring Dependence with Matrix-based Entropy Functional

Authors

  • Shujian Yu, NEC Laboratories Europe
  • Francesco Alesiani, NEC Laboratories Europe
  • Xi Yu, University of Florida
  • Robert Jenssen, UiT - The Arctic University of Norway
  • Jose Principe, University of Florida

DOI:

https://doi.org/10.1609/aaai.v35i12.17288

Keywords:

Other Foundations of Machine Learning

Abstract

Measuring the dependence among variables plays a central role in statistics and machine learning. In this work, we summarize and generalize the main idea behind existing information-theoretic dependence measures into a higher-level perspective via Shearer's inequality. Based on this generalization, we propose two measures, the matrix-based normalized total correlation and the matrix-based normalized dual total correlation, which quantify the dependence of multiple variables in arbitrary-dimensional spaces without explicitly estimating the underlying data distributions. We show that our measures are differentiable and statistically more powerful than prevalent alternatives. We also demonstrate their utility, advantages, and implications in four machine learning problems: gene regulatory network inference, robust machine learning under covariate shift and non-Gaussian noise, subspace outlier detection, and understanding the learning dynamics of convolutional neural networks.
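The proposed measures build on the matrix-based Rényi α-order entropy functional, which evaluates an entropy-like quantity directly from the eigenvalues of a trace-normalized Gram matrix, with joint entropies obtained via Hadamard products of the per-variable Gram matrices. The sketch below illustrates the unnormalized total correlation in this framework; it is a minimal illustration, assuming a Gaussian kernel with a hand-picked bandwidth `sigma` and α = 1.01, and it omits the normalization step that the paper uses to map the measure into a fixed range. Function names are illustrative, not the authors' implementation.

```python
import numpy as np

def gram_matrix(x, sigma=1.0):
    # Trace-normalized Gaussian Gram matrix A (trace(A) = 1), as used in
    # the matrix-based Renyi entropy framework. x has shape (n_samples, d).
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    sq_dists = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / (2.0 * sigma ** 2))
    return K / np.trace(K)

def renyi_entropy(A, alpha=1.01):
    # S_alpha(A) = 1/(1 - alpha) * log2( sum_i lambda_i(A)^alpha ),
    # computed from the eigenvalues of the normalized Gram matrix A.
    lam = np.linalg.eigvalsh(A)
    lam = np.clip(lam, 0.0, None)  # guard against tiny negative eigenvalues
    return np.log2(np.sum(lam ** alpha)) / (1.0 - alpha)

def total_correlation(mats, alpha=1.01):
    # Unnormalized total correlation:
    #   T_alpha = sum_i S_alpha(A_i) - S_alpha(A_1 o ... o A_k),
    # where "o" is the Hadamard (element-wise) product, re-normalized
    # to unit trace before the joint entropy is taken.
    joint = mats[0]
    for A in mats[1:]:
        joint = joint * A
    joint = joint / np.trace(joint)
    marginal_sum = sum(renyi_entropy(A, alpha) for A in mats)
    return marginal_sum - renyi_entropy(joint, alpha)
```

As a quick sanity check, the measure should be larger for dependent variables than for independent ones:

```python
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
y = x + 0.1 * rng.normal(size=(200, 1))   # strongly dependent on x
z = rng.normal(size=(200, 1))             # independent noise
mats = [gram_matrix(v) for v in (x, y, z)]
print(total_correlation(mats))
```

Because every quantity is a smooth function of the Gram-matrix eigenvalues, the measure is differentiable with respect to the input samples, which is what enables its use as a training objective in the robust-learning experiments described in the abstract.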

Published

2021-05-18

How to Cite

Yu, S., Alesiani, F., Yu, X., Jenssen, R., & Principe, J. (2021). Measuring Dependence with Matrix-based Entropy Functional. Proceedings of the AAAI Conference on Artificial Intelligence, 35(12), 10781-10789. https://doi.org/10.1609/aaai.v35i12.17288

Section

AAAI Technical Track on Machine Learning V