Optimize Incompatible Parameters Through Compatibility-aware Knowledge Integration
DOI:
https://doi.org/10.1609/aaai.v39i18.34117
Abstract
Deep neural networks have become foundational to advances in many domains, including recommendation systems and natural language processing. Despite their successes, these models often contain incompatible parameters that are underutilized or even detrimental to performance, particularly when faced with specific, varying data distributions. Existing research either removes such parameters or merges the outputs of multiple pretrained models. However, the former focuses on efficiency rather than performance, while the latter requires several times more computing and storage resources at inference time. In this paper, we aim to explicitly optimize these incompatible parameters by leveraging the complementary strengths of different models, thereby directly enhancing a model without adding any parameters. Specifically, we propose Compatibility-aware Knowledge Integration (CKI), which consists of Parameter Compatibility Assessment and Parameter Splicing: the former evaluates the knowledge content of multiple models, and the latter integrates that knowledge into a single model. The integrated model can be used directly for inference or for further fine-tuning. Extensive experiments on various recommendation and language datasets show that CKI effectively optimizes incompatible parameters across multiple tasks and settings, surpassing the training limit of the original model without increasing inference cost.

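The abstract describes a two-stage pipeline: score how "compatible" each parameter is, then splice parameters from several models into one. The paper's actual scoring and splicing formulas are not given here, so the sketch below is purely illustrative: it uses a hypothetical magnitude-over-gradient heuristic as the compatibility score and an element-wise convex combination as the splice, just to make the overall structure concrete.

```python
import numpy as np

def compatibility_scores(weights, grads):
    # Hypothetical per-parameter compatibility heuristic (not the paper's
    # actual assessment): parameters with large gradients relative to their
    # magnitude are treated as underutilized/"incompatible" and scored low.
    return np.abs(weights) / (np.abs(grads) + 1e-8)

def splice(w_a, w_b, score_a, score_b):
    # Element-wise soft splicing: each parameter becomes a convex
    # combination of the two source models, weighted by normalized
    # compatibility, so the merged model has no extra parameters.
    alpha = score_a / (score_a + score_b + 1e-8)
    return alpha * w_a + (1 - alpha) * w_b
```

With equal compatibility scores this reduces to plain parameter averaging; skewed scores shift each entry toward the more compatible source model, which is the intuition behind integrating knowledge "into one model" without increasing inference cost.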
Published
2025-04-11
How to Cite
Lv, Z., Ye, K., Wei, Z., Tian, Q., Zhang, S., Zhang, W., … Wu, F. (2025). Optimize Incompatible Parameters Through Compatibility-aware Knowledge Integration. Proceedings of the AAAI Conference on Artificial Intelligence, 39(18), 19233–19241. https://doi.org/10.1609/aaai.v39i18.34117
Issue
Section
AAAI Technical Track on Machine Learning IV