Two-Sided Fairness in Non-Personalised Recommendations (Student Abstract)

Authors

  • Aadi Swadipto Mondal, Indian Institute of Technology Kharagpur
  • Rakesh Bal, Indian Institute of Technology Kharagpur
  • Sayan Sinha, Indian Institute of Technology Kharagpur
  • Gourab K Patro, Indian Institute of Technology Kharagpur

DOI:

https://doi.org/10.1609/aaai.v35i18.17922

Keywords:

Fairness, Recommender Systems, Social Choice Theory, Bias

Abstract

Recommender systems are among the most widely used services on online platforms, suggesting potential items to end-users. These services often rely on machine learning techniques for which fairness is a pressing concern, especially when the downstream services can cause social ramifications. Focusing on non-personalised (global) recommendations in news media platforms (e.g., top-k trending topics on Twitter, top-k news articles on a news platform), we discuss two specific fairness concerns together that have traditionally been studied separately---user fairness and organisational fairness. While user fairness captures the idea of representing the choices of all individual users in global recommendations, organisational fairness tries to ensure politically/ideologically balanced recommendation sets. This makes user fairness a user-side requirement and organisational fairness a platform-side requirement. For user fairness, we experiment with methods from social choice theory, i.e., various voting rules known to better represent user choices in their outcomes. When applying these voting rules to the recommendation setup, we observe high user satisfaction scores. For organisational fairness, we propose a bias metric which measures the aggregate ideological bias of a recommended set of items (articles). Analysing the results obtained from voting rule-based recommendation, we find that while the well-known voting rules perform better on the user side, they show high bias values and are clearly not suitable for the organisational requirements of the platforms. Thus, there is a need to build an encompassing mechanism that cohesively bridges the ideas of user fairness and organisational fairness. In this abstract paper, we intend to frame the elementary ideas along with the clear motivation behind the requirement of such a mechanism.
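
The abstract does not specify which voting rules or the exact form of the bias metric, so the following is only an illustrative sketch under assumed definitions: Borda count stands in for a social-choice voting rule producing a top-k non-personalised recommendation set, and aggregate bias is taken as the mean of hypothetical per-article ideological leaning scores in [-1, 1], where 0 would indicate a balanced set.

```python
# Illustrative sketch (not the authors' implementation): a Borda-count
# voting rule for selecting a top-k global recommendation set, plus a
# simple aggregate bias metric over assumed per-article leaning scores.
from collections import defaultdict

def borda_top_k(rankings, k):
    """rankings: per-user ranked lists of item ids (most preferred first)."""
    scores = defaultdict(float)
    for ranking in rankings:
        n = len(ranking)
        for position, item in enumerate(ranking):
            scores[item] += n - 1 - position  # Borda points for this position
    return sorted(scores, key=scores.get, reverse=True)[:k]

def aggregate_bias(recommended, leaning):
    """Mean signed leaning of the recommended set (assumed metric); 0 = balanced."""
    return sum(leaning[item] for item in recommended) / len(recommended)

# Toy usage with hypothetical users and leaning scores
rankings = [["a", "b", "c"], ["b", "a", "d"], ["a", "d", "b"]]
leaning = {"a": 0.8, "b": -0.2, "c": 0.1, "d": -0.9}
top2 = borda_top_k(rankings, k=2)
print(top2, aggregate_bias(top2, leaning))
```

In this toy setting, a set that satisfies most users (high Borda scores) can still carry a large aggregate leaning, which mirrors the tension between user-side and platform-side fairness described above.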

Published

2021-05-18

How to Cite

Mondal, A. S., Bal, R., Sinha, S., & Patro, G. K. (2021). Two-Sided Fairness in Non-Personalised Recommendations (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 35(18), 15851-15852. https://doi.org/10.1609/aaai.v35i18.17922

Section

AAAI Student Abstract and Poster Program