A Scalable and Extensible Framework for Superposition-Structured Models

Authors

  • Shenjian Zhao, Shanghai Jiao Tong University
  • Cong Xie, Shanghai Jiao Tong University
  • Zhihua Zhang, Shanghai Jiao Tong University

DOI:

https://doi.org/10.1609/aaai.v30i1.10216

Keywords:

superposition-structured, proximal, LBFGS

Abstract

In many learning tasks, structured models usually lead to better interpretability and higher generalization performance. In recent years, however, simple structured models such as the lasso have frequently proved insufficient. Accordingly, there has been considerable work on "superposition-structured" models, in which multiple structural constraints are imposed simultaneously. To solve such "superposition-structured" statistical models efficiently, we develop a framework based on a proximal Newton-type method. Combining the smoothed conic dual approach with the LBFGS updating formula, we propose a scalable and extensible proximal quasi-Newton (SEP-QN) framework. Empirical analysis on various datasets shows that our framework is potentially powerful and achieves a super-linear convergence rate when optimizing some popular "superposition-structured" statistical models such as the fused sparse group lasso.
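The abstract does not spell out the iteration, so a minimal sketch may help convey the general shape of a proximal Newton-type step. The Python snippet below applies such a step to a single lasso penalty, substituting a scalar Barzilai-Borwein curvature estimate for the paper's LBFGS Hessian approximation and omitting the smoothed conic dual machinery entirely; the function name prox_quasi_newton_lasso and all parameters are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def soft_threshold(v, t):
        # Proximal operator of t * ||.||_1 (closed form for the lasso penalty).
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def prox_quasi_newton_lasso(A, b, lam, n_iter=200):
        # Sketch: minimize 0.5*||Ax - b||^2 + lam*||x||_1 with a proximal
        # Newton-type iteration. Here the Hessian approximation is the scalar
        # tau * I (Barzilai-Borwein), so the scaled proximal subproblem
        # reduces to a single soft-thresholding step; the paper's SEP-QN
        # framework uses an LBFGS approximation instead.
        x = np.zeros(A.shape[1])
        grad = A.T @ (A @ x - b)
        tau = np.linalg.norm(A, ord=2) ** 2  # crude initial curvature estimate
        for _ in range(n_iter):
            x_new = soft_threshold(x - grad / tau, lam / tau)
            grad_new = A.T @ (A @ x_new - b)
            s, y = x_new - x, grad_new - grad
            if s @ y > 1e-12:                # keep the curvature estimate positive
                tau = (y @ y) / (s @ y)      # Barzilai-Borwein update
            x, grad = x_new, grad_new
        return x

    # Hypothetical usage on synthetic data with a 5-sparse ground truth.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((100, 50))
    x_true = np.zeros(50)
    x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(100)
    print(prox_quasi_newton_lasso(A, b, lam=1.0)[:8])

A superposition-structured model would replace the single l1 term with a sum of structured penalties, whose combined proximal subproblem generally has no closed form; that is the case the paper's smoothed conic dual approach addresses.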

Published

2016-03-02

How to Cite

Zhao, S., Xie, C., & Zhang, Z. (2016). A Scalable and Extensible Framework for Superposition-Structured Models. Proceedings of the AAAI Conference on Artificial Intelligence, 30(1). https://doi.org/10.1609/aaai.v30i1.10216

Section

Technical Papers: Machine Learning Methods