Robust Partially-Compressed Least-Squares

Authors

  • Stephen Becker, University of Colorado Boulder
  • Ban Kawas, IBM Research
  • Marek Petrik, University of New Hampshire

DOI:

https://doi.org/10.1609/aaai.v31i1.10938

Keywords:

least-squares regression, sketching, randomized projections, robust optimization

Abstract

Randomized matrix compression techniques, such as the Johnson-Lindenstrauss transform, have emerged as an effective and practical way to solve large-scale problems efficiently. This computational efficiency, however, comes at the cost of solution quality and accuracy. In this paper, we investigate compressed least-squares problems and propose new models and algorithms that address the error and noise introduced by compression. While maintaining computational efficiency, our models provide robust solutions that are more accurate than those of classical compressed variants. We combine tools from robust optimization with a form of partial compression to improve the error-time trade-offs of compressed least-squares solvers. We develop an efficient solution algorithm for our Robust Partially-Compressed (RPC) model based on a reduction to a one-dimensional search.
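
For illustration only, the sketch below contrasts fully-compressed least squares with a partially-compressed variant that sketches only the quadratic term while keeping the A^T b term exact. This is a minimal reading of "partial compression" and does not reproduce the robust (RPC) formulation or its one-dimensional search; the problem sizes, Gaussian sketch, and NumPy solver calls are illustrative assumptions rather than the paper's setup.

```python
# Minimal sketch (not the paper's code): fully- vs. partially-compressed
# least squares with a Gaussian (Johnson-Lindenstrauss-style) sketch.
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 10_000, 50, 500            # tall problem, sketch size m << n

A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)

S = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian sketching matrix
SA, Sb = S @ A, S @ b

# Fully compressed: min_x ||S(Ax - b)||^2
x_fc, *_ = np.linalg.lstsq(SA, Sb, rcond=None)

# Partially compressed (one reading): sketch only the quadratic term,
# i.e. solve (SA)^T (SA) x = A^T b, keeping the linear term exact.
x_pc = np.linalg.solve(SA.T @ SA, A.T @ b)

# Exact least-squares solution for reference
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)

for name, x in [("fully compressed", x_fc), ("partially compressed", x_pc)]:
    rel_err = np.linalg.norm(x - x_ls) / np.linalg.norm(x_ls)
    print(f"{name:>20s}: relative error vs. exact LS = {rel_err:.3e}")
```

In this toy setting, keeping the A^T b term uncompressed typically reduces the error introduced by sketching at little extra cost, which is the trade-off the abstract refers to.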

Published

2017-02-13

How to Cite

Becker, S., Kawas, B., & Petrik, M. (2017). Robust Partially-Compressed Least-Squares. Proceedings of the AAAI Conference on Artificial Intelligence, 31(1). https://doi.org/10.1609/aaai.v31i1.10938