Improving Evidential Deep Learning via Multi-Task Learning

Authors

  • Dongpin Oh, Deargen Inc.
  • Bonggun Shin, Deargen USA Inc.

DOI:

https://doi.org/10.1609/aaai.v36i7.20759

Keywords:

Machine Learning (ML)

Abstract

The Evidential regression network (ENet) estimates a continuous target and its predictive uncertainty without costly Bayesian model averaging. However, the ENet can predict the target inaccurately because of the gradient shrinkage problem of its original loss function, the negative log marginal likelihood (NLL) loss. This paper aims to improve the prediction accuracy of the ENet while preserving its efficient uncertainty estimation by resolving the gradient shrinkage problem. To this end, a multi-task learning (MTL) framework, referred to as MT-ENet, is proposed. In the MTL, we define a Lipschitz-modified mean squared error (MSE) loss as an auxiliary loss and add it to the existing NLL loss. The Lipschitz-modified MSE loss mitigates gradient conflict with the NLL loss by dynamically adjusting its Lipschitz constant, and therefore does not disturb the uncertainty estimation of the NLL loss. The MT-ENet enhances the predictive accuracy of the ENet without losing uncertainty estimation capability on a synthetic dataset and real-world benchmarks, including drug-target affinity (DTA) regression. Furthermore, the MT-ENet shows remarkable calibration and out-of-distribution detection capability on the DTA benchmarks.
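The two-loss scheme described in the abstract can be sketched as follows. The NIG negative log-likelihood below follows the standard evidential regression formulation; the Lipschitz-modified MSE is rendered as a Huber-style surrogate whose linear region bounds the gradient magnitude. The transition point `U`, the function names, and the weight `lam` are illustrative assumptions, not the paper's exact derivation.

```python
import math

def nig_nll(y, gamma, nu, alpha, beta):
    """NLL of the Normal-Inverse-Gamma evidential distribution
    (standard evidential regression loss)."""
    omega = 2.0 * beta * (1.0 + nu)
    return (0.5 * math.log(math.pi / nu)
            - alpha * math.log(omega)
            + (alpha + 0.5) * math.log(nu * (y - gamma) ** 2 + omega)
            + math.lgamma(alpha) - math.lgamma(alpha + 0.5))

def lipschitz_mse(y, gamma, nu, alpha, beta):
    """Huber-style MSE: quadratic for small errors, linear beyond U,
    so the gradient is capped at 2*sqrt(U). The threshold U here is an
    illustrative function of the predicted NIG parameters, standing in
    for the paper's dynamically adjusted Lipschitz constant."""
    err2 = (y - gamma) ** 2
    U = beta * (1.0 + nu) / (alpha * nu)  # assumed threshold, for illustration
    if err2 <= U:
        return err2
    return 2.0 * math.sqrt(U) * abs(y - gamma) - U  # continuous at err2 == U

def mt_enet_loss(y, gamma, nu, alpha, beta, lam=1.0):
    """MTL objective: NLL plus the gradient-capped MSE auxiliary loss."""
    return nig_nll(y, gamma, nu, alpha, beta) + lam * lipschitz_mse(y, gamma, nu, alpha, beta)
```

Because the auxiliary loss switches to a linear branch once the squared error exceeds `U`, its gradient can never overwhelm the NLL term, which is the intuition behind avoiding gradient conflict with the uncertainty-estimating loss.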

Published

2022-06-28

How to Cite

Oh, D., & Shin, B. (2022). Improving Evidential Deep Learning via Multi-Task Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 36(7), 7895-7903. https://doi.org/10.1609/aaai.v36i7.20759

Section

AAAI Technical Track on Machine Learning II