Random Features for Shift-Invariant Kernels with Moment Matching

Authors

  • Weiwei Shen, GE Global Research Center
  • Zhihui Yang, GE Global Research Center
  • Jun Wang, East China Normal University

DOI:

https://doi.org/10.1609/aaai.v31i1.10825

Keywords:

Variance Reduction, Moment Matching, Random Feature Maps

Abstract

To grapple with the scalability challenges of kernel-based learning algorithms, approximating nonlinear kernels via random feature maps has attracted wide attention in large-scale learning systems. The associated sampling procedure is a critical component that dictates the quality of the random feature maps. However, for high-dimensional features, the standard Monte Carlo sampling method has been shown to be less effective at producing low-variance random samples; consequently, a large number of features must be constructed to attain the desired accuracy for downstream use. In this paper, we present a novel sampling algorithm powered by moment matching techniques to reduce the variance of random features. Our extensive empirical studies and comparisons with several highly competitive peer methods verify the superiority of the proposed algorithm in Gram matrix approximation and in regression generalization error. Our rigorous theoretical proofs show that the proposed algorithm is guaranteed to achieve lower variance than the standard Monte Carlo method in high-dimensional settings.
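For a concrete illustration of the setting the abstract describes, the sketch below builds random Fourier features for a Gaussian kernel and applies a simple first- and second-moment-matching transform to the sampled frequencies as a variance-reduction step. The function name rff_features, its parameters, and this particular moment-matching transform are illustrative assumptions for exposition; they are not the authors' published algorithm.

```python
import numpy as np

def rff_features(X, n_features, sigma, moment_match=True, rng=None):
    """Random Fourier features for the Gaussian (RBF) kernel
    k(x, y) = exp(-||x - y||^2 / (2 * sigma^2)).

    If moment_match=True, the sampled frequencies are affinely transformed
    so that their empirical mean is zero and their empirical covariance is
    exactly sigma^{-2} * I (first- and second-moment matching). This is an
    illustrative variance-reduction sketch, not the paper's exact method.
    n_features should exceed the input dimension d for the sample
    covariance to be well conditioned.
    """
    rng = np.random.default_rng(rng)
    d = X.shape[1]

    # Standard Monte Carlo draw: rows of W ~ N(0, sigma^{-2} I), shape (n_features, d).
    W = rng.standard_normal((n_features, d)) / sigma

    if moment_match:
        # Match the first moment: center the sampled frequencies.
        W = W - W.mean(axis=0, keepdims=True)
        # Match the second moment: whiten with the sample covariance,
        # then rescale so the covariance equals sigma^{-2} I.
        cov = np.cov(W, rowvar=False)
        L = np.linalg.cholesky(cov + 1e-12 * np.eye(d))
        W = np.linalg.solve(L, W.T).T / sigma

    # Random phases and the cosine feature map z(x) = sqrt(2/D) cos(Wx + b).
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W.T + b)
```

With features Z = rff_features(X, D, sigma), the Gram matrix of the Gaussian kernel is approximated by Z @ Z.T; matching the sample moments of the frequencies typically tightens this approximation for a fixed number of features D, which is the kind of variance reduction the paper targets.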

Published

2017-02-13

How to Cite

Shen, W., Yang, Z., & Wang, J. (2017). Random Features for Shift-Invariant Kernels with Moment Matching. Proceedings of the AAAI Conference on Artificial Intelligence, 31(1). https://doi.org/10.1609/aaai.v31i1.10825