Maximum Margin Dirichlet Process Mixtures for Clustering

Authors

  • Gang Chen, State University of New York at Buffalo
  • Haiying Zhang, Chinese Academy of Sciences
  • Caiming Xiong, Metamind Inc.

DOI:

https://doi.org/10.1609/aaai.v30i1.10197

Keywords:

Nonparametric clustering, maximum margin learning, online learning

Abstract

The Dirichlet process mixture (DPM) can automatically infer the model complexity from data. Hence it has attracted significant attention recently, and is widely used for model selection and clustering. As a generative model, it generally requires a prior base distribution and learns component parameters by maximizing the posterior probability. In contrast, discriminative classifiers model the conditional probability directly and have yielded better results than generative classifiers. In this paper, we propose a maximum margin Dirichlet process mixture for clustering, which differs from the traditional DPM in how parameters are modeled. Our model takes a discriminative clustering approach, maximizing a conditional likelihood to estimate parameters. In particular, we use an EM-like algorithm that leverages Gibbs sampling for inference, which in turn can be naturally embedded in an online maximum margin learning procedure to update the model parameters. We evaluate our model and report comparisons against the traditional DPM and other nonparametric clustering approaches.
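To make the procedure concrete, below is a minimal, self-contained sketch (not the authors' code) of the kind of loop the abstract describes: a Gibbs-style assignment step under a Chinese restaurant process prior, interleaved with an online max-margin update of per-cluster linear scorers. The exp(w·x) likelihood proxy, the passive-aggressive-style step size, and all names and parameters (mm_dpm_cluster, alpha, C, n_sweeps) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def mm_dpm_cluster(X, alpha=1.0, C=0.1, n_sweeps=20, seed=0):
    """Sketch of max-margin DPM-style clustering (illustrative, not the
    paper's exact model): sample assignments Gibbs-style under a CRP
    prior, then apply an online max-margin update to cluster scorers."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    z = np.zeros(n, dtype=int)   # all points start in one cluster
    W = [np.zeros(d)]            # one linear scorer per cluster
    counts = [n]                 # cluster occupancy for the CRP prior

    for _ in range(n_sweeps):
        for i in rng.permutation(n):
            x = X[i]
            counts[z[i]] -= 1    # remove point i from its cluster

            # Assignment probabilities: CRP prior times an exp(w.x)
            # likelihood proxy; a new cluster gets weight alpha and a
            # fresh scorer at zero (both assumptions for illustration).
            scores = np.array([w @ x for w in W] + [0.0])
            scores -= scores.max()                    # stabilize exp
            prior = np.array(counts + [alpha], dtype=float)
            probs = prior * np.exp(scores)
            probs /= probs.sum()

            k = rng.choice(len(probs), p=probs)
            if k == len(W):      # open a new cluster
                W.append(np.zeros(d))
                counts.append(0)
            z[i] = k
            counts[k] += 1

            # Online max-margin step (passive-aggressive-style): push
            # the assigned cluster's score above the best rival by 1.
            if len(W) > 1:
                margins = np.array([w @ x for w in W])
                margins[k] = -np.inf
                r = int(np.argmax(margins))
                loss = max(0.0, 1.0 - (W[k] @ x - W[r] @ x))
                if loss > 0:
                    tau = min(C, loss / (2 * (x @ x) + 1e-12))
                    W[k] = W[k] + tau * x
                    W[r] = W[r] - tau * x
    return z

if __name__ == "__main__":
    # Two well-separated blobs; the sketch should recover ~2 clusters.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(-2, 0.5, (50, 2)),
                   rng.normal(+2, 0.5, (50, 2))])
    z = mm_dpm_cluster(X)
    print("clusters found:", len(np.unique(z)))
```

Note the design point the abstract emphasizes: the Gibbs sampling step and the online max-margin update share the same per-point loop, so inference and parameter learning interleave naturally rather than alternating over full passes.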

Published

2016-02-21

How to Cite

Chen, G., Zhang, H., & Xiong, C. (2016). Maximum Margin Dirichlet Process Mixtures for Clustering. Proceedings of the AAAI Conference on Artificial Intelligence, 30(1). https://doi.org/10.1609/aaai.v30i1.10197

Issue

Vol. 30 No. 1 (2016)

Section

Technical Papers: Machine Learning Methods