TY - JOUR
AU - Luo, Simon
AU - Sugiyama, Mahito
PY - 2019/07/17
Y2 - 2021/04/15
TI - Bias-Variance Trade-Off in Hierarchical Probabilistic Models Using Higher-Order Feature Interactions
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 33
IS - 01
SE - AAAI Technical Track: Machine Learning
DO - 10.1609/aaai.v33i01.33014488
UR - https://ojs.aaai.org/index.php/AAAI/article/view/4362
SP - 4488-4495
AB - <p>Hierarchical probabilistic models are able to use a large number of parameters to create a model with high representational power. However, it is well known that increasing the number of parameters also increases the complexity of the model, which leads to a bias-variance trade-off. Although this is a classical problem, the bias-variance trade-off between <em>hidden layers</em> and <em>higher-order interactions</em> has not been well studied. In our study, we propose an efficient inference algorithm for the log-linear formulation of the higher-order Boltzmann machine using a combination of Gibbs sampling and annealed importance sampling. We then perform a bias-variance decomposition to study the differences between <em>hidden layers</em> and <em>higher-order interactions</em>. Our results show that <em>hidden layers</em> and <em>higher-order interactions</em> yield comparable error of a similar order of magnitude, and that <em>higher-order interactions</em> produce less variance for smaller sample sizes.</p>
ER -