Non-Negative Inductive Matrix Completion for Discrete Dyadic Data

Authors

  • Piyush Rai, Indian Institute of Technology Kanpur

DOI:

https://doi.org/10.1609/aaai.v31i1.10925

Keywords:

matrix factorization, non-negative matrix factorization, matrix completion, recommender systems, Bayesian models, multi-label learning, inductive matrix completion

Abstract

We present a non-negative inductive latent factor model for binary- and count-valued matrices of dyadic data, with side information along the rows and/or columns of the matrix. The side information is incorporated by conditioning the row and column latent factors on it via a regression model. Our model not only performs matrix factorization and completion with side information, but also infers interpretable latent topics that explain and summarize the data. An appealing aspect of the model is the full local conjugacy of all of its parts, including both the main latent factor model and the regression model that leverages the side information. This enables us to design scalable, simple-to-implement Gibbs sampling and Expectation-Maximization algorithms for inference. The inference cost scales with the number of nonzeros in the data matrix, which makes the model particularly attractive for massive, sparse matrices. We demonstrate its effectiveness on several real-world data sets, comparing it with state-of-the-art baselines.
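
The abstract describes two ingredients: a non-negative factorization of a discrete (count-valued) matrix, and a regression from side information to the latent factors, which is what makes the model inductive (able to score rows or columns unseen at training time). Below is a minimal, illustrative sketch of that idea in NumPy; it is not the paper's conjugate Gibbs/EM algorithm. It uses standard KL-NMF multiplicative updates as a stand-in for the Poisson-style factorization, and a hypothetical ridge regression (fit_inductive_map) from row features to row factors. All function names, update rules, and hyperparameters here are assumptions for illustration only.

```python
# Illustrative sketch only: non-negative factorization of a count matrix plus
# a regression from row side information to the latent row factors, so that
# predictions can be made inductively for new rows. The paper's own model uses
# fully conjugate Bayesian inference (Gibbs sampling / EM); the simple
# multiplicative and least-squares updates below are assumptions.
import numpy as np

def nmf_poisson(Y, K=10, n_iters=200, eps=1e-10, seed=0):
    """Standard KL-NMF multiplicative updates for Y ~ approx U V^T, U, V >= 0."""
    rng = np.random.default_rng(seed)
    N, M = Y.shape
    U = rng.gamma(1.0, 1.0, size=(N, K))
    V = rng.gamma(1.0, 1.0, size=(M, K))
    for _ in range(n_iters):
        R = Y / (U @ V.T + eps)                # element-wise ratio
        U *= (R @ V) / (V.sum(axis=0) + eps)   # update row factors
        R = Y / (U @ V.T + eps)
        V *= (R.T @ U) / (U.sum(axis=0) + eps) # update column factors
    return U, V

def fit_inductive_map(X, U, ridge=1e-2):
    """Ridge regression from row features X to learned row factors U (illustrative)."""
    D = X.shape[1]
    return np.linalg.solve(X.T @ X + ridge * np.eye(D), X.T @ U)

# Toy usage: a 100 x 50 count matrix with 5-dimensional row side information.
rng = np.random.default_rng(1)
X = rng.random((100, 5))
Y = rng.poisson(2.0 * (X @ rng.random((5, 50))))

U, V = nmf_poisson(Y, K=5)
W = fit_inductive_map(X, U)

# Inductive prediction for a previously unseen row with features x_new:
x_new = rng.random(5)
u_new = np.maximum(x_new @ W, 0.0)   # clip to keep the factor non-negative
scores = u_new @ V.T                 # predicted interaction intensities
print(scores[:5])
```

In the paper's formulation, by contrast, the side information enters as a conditioning term in the prior on the latent factors, so the factorization and the regression are learned jointly under one conjugate model rather than in the two separate stages sketched here.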

Published

2017-02-13

How to Cite

Rai, P. (2017). Non-Negative Inductive Matrix Completion for Discrete Dyadic Data. Proceedings of the AAAI Conference on Artificial Intelligence, 31(1). https://doi.org/10.1609/aaai.v31i1.10925