Efficient and Accurate Learning of Mixtures of Plackett-Luce Models
DOI:
https://doi.org/10.1609/aaai.v37i8.26114
Keywords:
ML: Learning Preferences or Rankings, HAI: Crowdsourcing, KRR: Preferences, ML: Clustering, ML: Multimodal Learning
Abstract
Mixture models of Plackett-Luce (PL), one of the most fundamental ranking models, are an active research area of both theoretical and practical significance. Most previously proposed parameter estimation algorithms instantiate the EM algorithm, often with random initialization. However, such an initialization scheme may not yield a good initial estimate, and the algorithms require multiple restarts, incurring high time complexity. As for the EM procedure, while the E-step can be performed efficiently, maximizing the log-likelihood in the M-step is difficult due to the combinatorial nature of the PL likelihood function. Previous authors have therefore favored algorithms that maximize surrogate likelihood functions, but as a consequence the final estimate may deviate from the true maximum likelihood estimate. In this paper, we address these known limitations. We propose an initialization algorithm that provides a provably accurate initial estimate and an EM algorithm that maximizes the true log-likelihood function efficiently. Experiments on both synthetic and real datasets show that our algorithm is competitive with baseline algorithms in terms of accuracy and speed, especially on datasets with a large number of items.
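For reference, the Plackett-Luce likelihood whose combinatorial structure the abstract refers to can be written as follows; this is the standard formulation of the PL model and its mixture, and the notation (worth parameters theta, mixture weights alpha, K components) is ours rather than taken from the paper.
% Probability of a full ranking \sigma over m items under a PL model with
% positive worth parameters \theta_1, \dots, \theta_m: each factor is the
% probability that the item ranked i-th is chosen among the items not yet placed.
\[
  P(\sigma \mid \theta) = \prod_{i=1}^{m} \frac{\theta_{\sigma(i)}}{\sum_{j=i}^{m} \theta_{\sigma(j)}}
\]
% A K-component PL mixture with weights \alpha_1, \dots, \alpha_K (summing to 1):
\[
  P(\sigma \mid \alpha, \theta^{(1)}, \dots, \theta^{(K)}) = \sum_{k=1}^{K} \alpha_k \, P\bigl(\sigma \mid \theta^{(k)}\bigr)
\]
The nested normalizing sums over the not-yet-ranked items are what make direct maximization of the log-likelihood in the M-step difficult.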
Published
2023-06-26
How to Cite
Nguyen, D., & Zhang, A. Y. (2023). Efficient and Accurate Learning of Mixtures of Plackett-Luce Models. Proceedings of the AAAI Conference on Artificial Intelligence, 37(8), 9294-9301. https://doi.org/10.1609/aaai.v37i8.26114
Issue
Section
AAAI Technical Track on Machine Learning III