Algorithms for Generalized Topic Modeling


  • Avrim Blum Toyota Technological Institute at Chicago
  • Nika Haghtalab Carnegie Mellon University



Keywords: Topic Modeling, Co-training, Multi-view Learning


Recently there has been significant activity in developing algorithms with provable guarantees for topic modeling. In this work we consider a broad generalization of the traditional topic modeling framework, in which we no longer assume that words are drawn i.i.d. and instead view a topic as a complex distribution over sequences of paragraphs. Since one could not hope even to represent such a distribution in general (even if paragraphs are given in some natural feature representation), we aim instead to directly learn a predictor that, given a new document, accurately predicts its topic mixture, without learning the distributions explicitly. We present several natural conditions under which one can do this from unlabeled data only, and give efficient algorithms for doing so, also discussing issues such as noise tolerance and sample complexity. More generally, our model can be viewed as a generalization of the multi-view or co-training setting in machine learning.




How to Cite

Blum, A., & Haghtalab, N. (2018). Algorithms for Generalized Topic Modeling. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1).