TY - JOUR
AU - Blum, Avrim
AU - Haghtalab, Nika
PY - 2018/04/29
Y2 - 2024/03/29
TI - Algorithms for Generalized Topic Modeling
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 32
IS - 1
SE - AAAI Technical Track: Machine Learning
DO - 10.1609/aaai.v32i1.11825
UR - https://ojs.aaai.org/index.php/AAAI/article/view/11825
SP -
AB - Recently there has been significant activity in developing algorithms with provable guarantees for topic modeling. In this work we consider a broad generalization of the traditional topic modeling framework, where we no longer assume that words are drawn i.i.d. and instead view a topic as a complex distribution over sequences of paragraphs. Since one could not hope to even represent such a distribution in general (even if paragraphs are given using some natural feature representation), we aim instead to directly learn a predictor that, given a new document, accurately predicts its topic mixture, without learning the distributions explicitly. We present several natural conditions under which one can do this from unlabeled data only, and give efficient algorithms to do so, also discussing issues such as noise tolerance and sample complexity. More generally, our model can be viewed as a generalization of the multi-view or co-training setting in machine learning.
ER -