TY - JOUR
AU - Shen, Chengchao
AU - Wang, Xinchao
AU - Song, Jie
AU - Sun, Li
AU - Song, Mingli
PY - 2019/07/17
Y2 - 2024/03/28
TI - Amalgamating Knowledge towards Comprehensive Classification
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 33
IS - 01
SE - AAAI Technical Track: Knowledge Representation and Reasoning
DO - 10.1609/aaai.v33i01.33013068
UR - https://ojs.aaai.org/index.php/AAAI/article/view/4165
SP - 3068-3075
AB - <p>With the rapid development of deep learning, an unprecedented number of trained deep network models have become available online. Reusing such trained models can significantly reduce the cost of training new models from scratch, which may even be infeasible, as the annotations used to train the original networks are often unavailable to the public. In this paper, we propose to study a new model-reusing task, which we term <em>knowledge amalgamation.</em> Given multiple trained teacher networks, each of which specializes in a different classification problem, the goal of knowledge amalgamation is to learn a lightweight student model capable of handling the comprehensive classification. We assume that no annotations other than the outputs of the teacher models are available, and thus focus on extracting and amalgamating knowledge from the multiple teachers. To this end, we propose a pilot two-step strategy to tackle the knowledge amalgamation task: we first learn compact feature representations from the teachers, and then learn the network parameters in a layer-wise manner to build the student model. We apply this approach to four public datasets and obtain very encouraging results: even without any human annotation, the obtained student model competently handles the comprehensive classification task and in most cases outperforms the teachers on their individual sub-tasks.</p>
ER -