Learning Set Functions with Implicit Differentiation

Authors

  • Gözde Özcan, Northeastern University
  • Chengzhi Shi, Northeastern University
  • Stratis Ioannidis, Northeastern University

DOI:

https://doi.org/10.1609/aaai.v39i19.34178

Abstract

A recent work introduces the problem of learning set functions from data generated by a so-called optimal subset oracle. That work approximates the underlying utility function with an energy-based model, whose parameters are estimated via mean-field variational inference. This approximation reduces to fixed-point iterations; however, as the number of iterations increases, automatic differentiation quickly becomes computationally prohibitive due to the size of the Jacobians stacked during backpropagation. We address this challenge with implicit differentiation and examine the convergence conditions for the fixed-point iterations. We empirically demonstrate the efficiency of our method on synthetic and real-world subset selection applications, including product recommendation, set anomaly detection, and compound selection tasks.
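To illustrate the core idea of differentiating through a fixed point without backpropagating through every iteration, here is a minimal sketch on a toy scalar map. The map `x ↦ tanh(a·x + b)`, the parameters `a` and `b`, and all function names are hypothetical illustrations, not the paper's actual energy-based model; the sketch only shows the implicit function theorem applied to a fixed-point equation, checked against a finite-difference estimate.

```python
import numpy as np

def fixed_point(f, x0, tol=1e-10, max_iter=1000):
    # Iterate x <- f(x) until convergence; assumes f is a contraction.
    x = x0
    for _ in range(max_iter):
        x_new = f(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Toy fixed-point equation x* = tanh(a * x* + b) (hypothetical example).
a, b = 0.5, 0.3
x_star = fixed_point(lambda x: np.tanh(a * x + b), 0.0)

# Implicit differentiation: from F(x, b) = x - tanh(a*x + b) = 0,
# dx*/db = (df/db) / (1 - df/dx), evaluated at the fixed point.
sech2 = 1.0 - np.tanh(a * x_star + b) ** 2  # derivative of tanh
dx_db_implicit = sech2 / (1.0 - a * sech2)

# Sanity check against a finite-difference estimate.
eps = 1e-6
x_plus = fixed_point(lambda x: np.tanh(a * x + (b + eps)), 0.0)
dx_db_fd = (x_plus - x_star) / eps
```

The key point, mirroring the abstract, is that the gradient is obtained from the converged solution alone: no intermediate iterates or stacked Jacobians are stored, so the memory cost is independent of the number of fixed-point iterations.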

Published

2025-04-11

How to Cite

Özcan, G., Shi, C., & Ioannidis, S. (2025). Learning Set Functions with Implicit Differentiation. Proceedings of the AAAI Conference on Artificial Intelligence, 39(19), 19777–19785. https://doi.org/10.1609/aaai.v39i19.34178

Section

AAAI Technical Track on Machine Learning V