Learning Losses for Strategic Classification

Authors

  • Tosca Lechner, University of Waterloo, Waterloo, Canada
  • Ruth Urner, York University, Toronto, Canada

DOI:

https://doi.org/10.1609/aaai.v36i7.20696

Keywords:

Machine Learning (ML), Game Theory And Economic Paradigms (GTEP)

Abstract

Strategic classification, i.e., classification under possible strategic manipulations of features, has received a lot of attention from both the machine learning and game theory communities. Most works focus on analysing properties of the optimal decision rule under such manipulations. In our work, we take a learning-theoretic perspective, focusing on the sample complexity needed to learn a good decision rule that is robust to strategic manipulation. We perform this analysis by introducing a novel loss function, the strategic manipulation loss, which takes into account both the accuracy of the final decision rule and its vulnerability to manipulation. We analyse the sample complexity for a known graph of possible manipulations in terms of the complexity of the function class and the manipulation graph. Additionally, we initiate the study of learning under unknown manipulation capabilities of the involved agents. Using techniques from transfer learning theory, we define a similarity measure for manipulation graphs and show that learning outcomes are robust with respect to small changes in the manipulation graph. Lastly, we analyse the sample complexity of learning the manipulation capabilities of agents with respect to this similarity measure, providing novel guarantees for strategic classification under an unknown manipulation graph.
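To make the abstract's central object concrete, below is a minimal Python sketch of a 0/1 strategic manipulation loss over a known manipulation graph, following the abstract's description: a hypothesis is penalized both for misclassifying an example and for being vulnerable to manipulation at it. The graph encoding, the threshold hypothesis, and the helper names (strategic_loss, empirical_strategic_risk) are illustrative assumptions, not necessarily the paper's exact formulation.

```python
from typing import Callable, Dict, Hashable, Iterable, List, Tuple

# Manipulation graph: each node maps to the feature vectors an agent
# there could plausibly manipulate to (an assumed adjacency-list encoding).
Graph = Dict[Hashable, List[Hashable]]


def strategic_loss(h: Callable[[Hashable], int],
                   graph: Graph,
                   x: Hashable,
                   y: int) -> int:
    """0/1 strategic manipulation loss of hypothesis h on example (x, y)."""
    if h(x) != y:
        return 1  # plain classification error at the true features
    if h(x) == 0 and any(h(x2) == 1 for x2 in graph.get(x, [])):
        return 1  # a rejected agent can manipulate into the positive region
    return 0


def empirical_strategic_risk(h: Callable[[Hashable], int],
                             graph: Graph,
                             sample: Iterable[Tuple[Hashable, int]]) -> float:
    """Average strategic loss of h over a labeled sample."""
    losses = [strategic_loss(h, graph, x, y) for x, y in sample]
    return sum(losses) / len(losses)


if __name__ == "__main__":
    # Toy graph on five one-dimensional feature values (edges = feasible moves).
    G = {0: [1], 1: [2], 2: [], 3: [4], 4: []}
    h = lambda x: int(x >= 2)  # threshold classifier
    S = [(0, 0), (1, 0), (2, 1), (3, 1), (4, 1)]
    # (1, 0) is correctly rejected but can manipulate to 2, so risk is 0.2.
    print(empirical_strategic_risk(h, G, S))
```

Under this reading, the loss upper-bounds the ordinary 0/1 loss, so minimizing it trades off accuracy against disincentivizing manipulation, which is the trade-off the abstract attributes to the strategic manipulation loss.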

Published

2022-06-28

How to Cite

Lechner, T., & Urner, R. (2022). Learning Losses for Strategic Classification. Proceedings of the AAAI Conference on Artificial Intelligence, 36(7), 7337-7344. https://doi.org/10.1609/aaai.v36i7.20696

Issue

Vol. 36 No. 7 (2022)

Section

AAAI Technical Track on Machine Learning II