Robust Formulation for PCA: Avoiding Mean Calculation With L<sub>2,p</sub>-norm Maximization

Authors

  • Shuangli Liao, Xidian University
  • Jin Li, Xidian University
  • Yang Liu, Xidian University
  • Quanxue Gao, Xidian University
  • Xinbo Gao, Xidian University

Abstract

Most existing robust principal component analysis (PCA) methods involve mean estimation when extracting a low-dimensional representation. However, for real data containing outliers, they do not obtain the optimal mean under robust distance metrics such as the L1-norm and L2,1-norm, which degrades the robustness of these algorithms. Motivated by the fact that the variance of data can be characterized by the variation between each pair of samples, we propose a novel robust formulation for PCA that avoids computing the mean of the data in the criterion function. Our method employs the L2,p-norm as the distance metric to measure this variation and seeks the projection matrix that maximizes the sum of the variations between all pairs of projected data points. Both theoretical analysis and experimental results demonstrate that our method is efficient and superior to most existing robust methods for data reconstruction.
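The abstract's key observation is that total variance can be expressed through pairwise differences, so no mean is needed. The sketch below is a minimal illustration of that idea, not the authors' algorithm: for the special case p = 2, the pairwise scatter matrix S = Σ<sub>i,j</sub>(x<sub>i</sub> − x<sub>j</sub>)(x<sub>i</sub> − x<sub>j</sub>)<sup>T</sup> equals 2n times the centered scatter matrix, so maximizing the pairwise objective recovers classic PCA without ever computing a mean. The function names (`pairwise_l2p_objective`, `pairwise_scatter_pca`) are ours for illustration; the general p ≠ 2 case requires the iterative optimization developed in the paper.

```python
import numpy as np

def pairwise_l2p_objective(X, W, p):
    """Sum over unordered pairs (i, j) of ||W^T (x_i - x_j)||_2^p.

    This is the mean-free criterion the paper maximizes; here we only
    evaluate it for a given projection W (columns = directions).
    """
    proj = X @ W                                   # (n, k) projected data
    diffs = proj[:, None, :] - proj[None, :, :]    # all pairwise differences
    dists = np.linalg.norm(diffs, axis=2)          # L2 length of each difference
    return 0.5 * np.sum(dists ** p)                # 0.5: count each pair once

def pairwise_scatter_pca(X, k):
    """Top-k directions of the pairwise scatter matrix (p = 2 case).

    S = sum_{i,j} (x_i - x_j)(x_i - x_j)^T = 2n * Xc^T Xc, where Xc is the
    centered data -- so the eigenvectors coincide with classic PCA's,
    even though no mean is subtracted anywhere in this function.
    """
    n, d = X.shape
    S = np.zeros((d, d))
    for i in range(n):
        for j in range(n):
            diff = X[i] - X[j]
            S += np.outer(diff, diff)
    vals, vecs = np.linalg.eigh(S)                 # ascending eigenvalues
    return vecs[:, ::-1][:, :k]                    # top-k eigenvectors
```

For p < 2 the objective down-weights large pairwise distances, which is what gives the formulation its robustness to outliers; the closed-form eigendecomposition above no longer applies in that regime.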

Published

2018-04-29

How to Cite

Liao, S., Li, J., Liu, Y., Gao, Q., & Gao, X. (2018). Robust Formulation for PCA: Avoiding Mean Calculation With L<sub>2,p</sub>-norm Maximization. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/11679