Cooperative Knowledge Distillation: A Learner Agnostic Approach

Authors

  • Michael Livanos, University of California, Davis
  • Ian Davidson, University of California, Davis
  • Stephen Wong, University of California, Davis

DOI:

https://doi.org/10.1609/aaai.v38i13.29322

Keywords:

ML: Transfer, Domain Adaptation, Multi-Task Learning

Abstract

Knowledge distillation is a simple but powerful way to transfer knowledge from a teacher model to a student model. Existing work restricts the direction and scope of transfer in at least one of the following ways: all knowledge is transferred from teacher to student regardless of whether that knowledge is useful, the student is the only party that learns in the exchange, and distillation typically moves knowledge from a single teacher to a single student. We formulate a novel form of knowledge distillation, which we call cooperative distillation, in which many models can act as both students and teachers. The models cooperate as follows: a model (the student) identifies specific deficiencies in its performance and searches for another model (the teacher), which encodes its learned knowledge into instructional virtual instances via counterfactual instance generation. Because different models may have different strengths and weaknesses, every model can act as either a student or a teacher when appropriate (cooperation) and distills knowledge only in areas where it is strong (focus). Since counterfactuals as a paradigm are not tied to any specific algorithm, this method can distill knowledge between learners with different architectures, algorithms, and even feature spaces. We demonstrate that our approach not only outperforms baselines such as transfer learning, self-supervised learning, and multiple knowledge distillation algorithms on several datasets, but can also be used in settings where those techniques cannot.
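
The abstract only outlines the cooperative loop at a high level. The sketch below is a minimal, hypothetical Python illustration of that loop under strong simplifying assumptions: the helper names (`weak_classes`, `virtual_instances`), the per-class accuracy threshold, and the naive perturbation-based generator standing in for the paper's counterfactual instance generation are all illustrative choices, not the authors' implementation.

```python
# Hypothetical sketch of cooperative distillation between two heterogeneous
# learners. The "virtual instance" generator is a naive perturbation-based
# stand-in for counterfactual instance generation, used only for illustration.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier


def weak_classes(model, X_val, y_val, threshold=0.8):
    """Classes on which the model's validation accuracy falls below threshold."""
    preds = model.predict(X_val)
    return [c for c in np.unique(y_val)
            if (preds[y_val == c] == c).mean() < threshold]


def virtual_instances(teacher, X, y, target_class, n=50, noise=0.1, rng=None):
    """Jitter teacher-correct examples of the target class and keep only the
    perturbed points the teacher still labels correctly (a crude proxy for
    counterfactual instructional instances)."""
    rng = rng or np.random.default_rng(0)
    correct = X[(y == target_class) & (teacher.predict(X) == y)]
    if len(correct) == 0:
        return np.empty((0, X.shape[1])), np.empty((0,), dtype=y.dtype)
    samples = correct[rng.integers(len(correct), size=n)]
    samples = samples + rng.normal(scale=noise * X.std(), size=samples.shape)
    keep = teacher.predict(samples) == target_class
    return samples[keep], np.full(keep.sum(), target_class)


X, y = load_digits(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Two learners with different algorithms, trained on disjoint halves of the
# data so that each develops different strengths and weaknesses.
half = len(X_train) // 2
models = [DecisionTreeClassifier(random_state=0).fit(X_train[:half], y_train[:half]),
          LogisticRegression(max_iter=2000).fit(X_train[half:], y_train[half:])]
pools = [(X_train[:half], y_train[:half]), (X_train[half:], y_train[half:])]

# Each model in turn acts as the student: it finds its weak classes and asks
# the other model (the teacher) for instructional virtual instances there.
for i, student in enumerate(models):
    teacher, (X_t, y_t) = models[1 - i], pools[1 - i]
    X_new, y_new = pools[i]
    for c in weak_classes(student, X_val, y_val):
        X_v, y_v = virtual_instances(teacher, X_t, y_t, c)
        X_new = np.vstack([X_new, X_v])
        y_new = np.concatenate([y_new, y_v])
    models[i] = student.fit(X_new, y_new)  # retrain student on augmented data

for name, m in zip(["tree", "logreg"], models):
    print(name, "val accuracy:", m.score(X_val, y_val))
```

Note that nothing here depends on the two learners sharing an architecture or training procedure; each only consumes the other's generated instances, which is what makes the approach learner agnostic.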

Published

2024-03-24

How to Cite

Livanos, M., Davidson, I., & Wong, S. (2024). Cooperative Knowledge Distillation: A Learner Agnostic Approach. Proceedings of the AAAI Conference on Artificial Intelligence, 38(13), 14124-14131. https://doi.org/10.1609/aaai.v38i13.29322

Issue

Vol. 38 No. 13 (2024)

Section

AAAI Technical Track on Machine Learning IV