TY - JOUR
AU - Chen, Bing
AU - Islam, Mazharul
AU - Gao, Jisuo
AU - Wang, Lin
PY - 2022/06/28
Y2 - 2023/12/02
TI - Deconvolutional Density Network: Modeling Free-Form Conditional Distributions
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 36
IS - 6
SE - AAAI Technical Track on Machine Learning I
DO - 10.1609/aaai.v36i6.20567
UR - https://ojs.aaai.org/index.php/AAAI/article/view/20567
SP - 6183-6192
AB - Conditional density estimation (CDE) is the task of estimating the probability of an event conditioned on some inputs. A neural network (NN) can be used to compute the output distribution over a continuous domain, which can be viewed as an extension of the regression task. Nevertheless, it is difficult to explicitly approximate a distribution without knowing its general form a priori. In order to fit an arbitrary conditional distribution, discretizing the continuous domain into bins is an effective strategy, provided that the bins are sufficiently narrow and the data are very large. However, collecting enough data is often difficult, and the available data fall far short of that ideal in many circumstances, especially in multivariate CDE, owing to the curse of dimensionality. In this paper, we demonstrate the benefits of modeling free-form conditional distributions with a deconvolution-based neural network framework, which copes with the data-deficiency problem introduced by discretization. The framework is flexible yet also benefits from the hierarchical smoothness offered by the deconvolution layers. We compare our method to a number of other density-estimation approaches and show that our Deconvolutional Density Network (DDN) outperforms the competing methods on many univariate and multivariate tasks. The code of DDN is available at https://github.com/NBICLAB/DDN
ER -