Improving Gradient Flow with Unrolled Highway Expectation Maximization
Keywords: (Deep) Neural Network Algorithms, Segmentation
Abstract
Integrating model-based machine learning methods into deep neural architectures allows one to leverage both the expressive power of deep neural nets and the ability of model-based methods to incorporate domain-specific knowledge. In particular, many works have employed the expectation maximization (EM) algorithm in the form of an unrolled layer-wise structure that is jointly trained with a backbone neural network. However, it is difficult to discriminatively train the backbone network by backpropagating through the EM iterations, as they are prone to the vanishing gradient problem. To address this issue, we propose Highway Expectation Maximization Networks (HEMNet), which comprises unrolled iterations of the generalized EM (GEM) algorithm based on the Newton-Raphson method. HEMNet features scaled skip connections, or highways, along the depths of the unrolled architecture, resulting in improved gradient flow during backpropagation while incurring negligible additional computation and memory costs compared to standard unrolled EM. Furthermore, HEMNet preserves the underlying EM procedure, thereby fully retaining the convergence properties of the original EM algorithm. We achieve significant improvement in performance on several semantic segmentation benchmarks and empirically show that HEMNet effectively alleviates gradient decay.
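To make the abstract's central idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of unrolled EM iterations for a 1-D Gaussian mixture in which each mean update is damped, i.e. written as `mu + alpha * (mu_em - mu)`. The identity term in this damped GEM-style step acts as the "scaled skip connection" the abstract describes, giving gradients an additive path through each unrolled iteration. The function name, the choice of updating only the means, and the fixed step size `alpha` are illustrative assumptions.

```python
import numpy as np

def unrolled_highway_em(X, mu, sigma2, pi, n_iters=5, alpha=0.5):
    """Illustrative sketch: unrolled EM for a 1-D Gaussian mixture,
    with a damped (GEM-style) mean update that forms a scaled skip
    connection: mu_next = mu + alpha * (mu_em - mu). Hypothetical
    names and parameters; not the paper's implementation."""
    for _ in range(n_iters):
        # E-step: log joint density of each point under each component
        log_p = (-0.5 * (X[:, None] - mu[None, :]) ** 2 / sigma2
                 - 0.5 * np.log(2 * np.pi * sigma2)
                 + np.log(pi))
        # Responsibilities, computed stably via the log-sum-exp trick
        r = np.exp(log_p - log_p.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        # Standard EM target for the component means
        mu_em = (r * X[:, None]).sum(axis=0) / r.sum(axis=0)
        # Highway-style damped step: identity path plus scaled update,
        # so backpropagation sees an additive skip through each iteration
        mu = mu + alpha * (mu_em - mu)
    return mu
```

When this computation is unrolled as layers and trained end-to-end, the additive identity path in the last line is what lets gradients reach the backbone features without vanishing through repeated E-step normalizations.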
How to Cite
Song, C., Kim, E., & Shim, I. (2021). Improving Gradient Flow with Unrolled Highway Expectation Maximization. Proceedings of the AAAI Conference on Artificial Intelligence, 35(11), 9704-9712. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/17167
AAAI Technical Track on Machine Learning IV