Robust Non-Negative Dictionary Learning

Authors

  • Qihe Pan, Beihang University
  • Deguang Kong, University of Texas Arlington
  • Chris Ding, University of Texas Arlington
  • Bin Luo, Anhui University

DOI:

https://doi.org/10.1609/aaai.v28i1.9017

Keywords:

dictionary, NMF, robust

Abstract

Dictionary learning plays an important role in machine learning, where data vectors are modeled as sparse linear combinations of basis factors (i.e., a dictionary). However, how to conduct dictionary learning in noisy environments has not been well studied. Moreover, in practice the dictionary (i.e., the low-rank approximation of the data matrix) and the sparse representations are often required to be non-negative, as in applications such as image annotation, document summarization, and microarray analysis. In this paper, we propose a new formulation for non-negative dictionary learning in noisy environments, in which structured sparsity is enforced on the sparse representation. The proposed formulation is also robust to data with noise and outliers, owing to the robust loss function used. We derive an efficient multiplicative updating algorithm to solve the optimization problem, in which the dictionary and the sparse representation are updated iteratively. We rigorously prove the convergence and correctness of the proposed algorithm. We show how the learned dictionary differs at different levels of the sparsity constraint. The proposed algorithm can be adapted for clustering and semi-supervised learning.
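
The paper's precise objective and update rules appear in the full text; as a rough illustration of the ingredients the abstract names (non-negative factors, a robust loss, sparse codes, multiplicative updates), the sketch below assumes an L2,1-norm reconstruction loss and stands in for the structured-sparsity term with a plain L1 penalty. The function robust_nndl and its parameters are hypothetical, not taken from the paper.

import numpy as np

def robust_nndl(X, k, lam=0.1, n_iter=200, eps=1e-10, seed=0):
    """Sketch of robust non-negative dictionary learning (assumed L2,1 loss,
    L1 sparsity). X: (d, n) non-negative data, k: dictionary size.
    Returns dictionary W (d, k) and non-negative sparse codes H (k, n)."""
    rng = np.random.default_rng(seed)
    d, n = X.shape
    W = rng.random((d, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(n_iter):
        # Per-sample weights 1 / ||x_i - W h_i||_2: outlier columns get small
        # weights, which is how an L2,1-type loss provides robustness.
        w = 1.0 / (np.linalg.norm(X - W @ H, axis=0) + eps)
        XD, HD = X * w, H * w                  # column-wise reweighting
        # Multiplicative update for the dictionary W (keeps W non-negative).
        W *= (XD @ H.T) / (W @ (HD @ H.T) + eps)
        # Recompute weights, then update the codes H; the sparsity weight lam
        # enters the denominator because the L1 penalty adds lam to the gradient.
        w = 1.0 / (np.linalg.norm(X - W @ H, axis=0) + eps)
        XD, HD = X * w, H * w
        H *= (W.T @ XD) / (W.T @ W @ HD + lam + eps)
    return W, H

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = np.abs(rng.normal(size=(50, 200)))
    X[:, :5] += 10.0                           # a few outlier samples
    W, H = robust_nndl(X, k=8, lam=0.5)
    # Outlier columns keep large residuals instead of distorting the dictionary.
    print(np.round(np.linalg.norm(X - W @ H, axis=0)[:8], 2))

Increasing lam drives more entries of H to zero, which mirrors the abstract's point that the learned dictionary changes with the level of the sparsity constraint.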

Published

2014-06-21

How to Cite

Pan, Q., Kong, D., Ding, C., & Luo, B. (2014). Robust Non-Negative Dictionary Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 28(1). https://doi.org/10.1609/aaai.v28i1.9017

Issue

Vol. 28 No. 1 (2014)

Section

Main Track: Novel Machine Learning Algorithms