TransFR: Transferable Federated Recommendation with Adapter Tuning on Pre-trained Language Models

Authors

  • Honglei Zhang, Key Laboratory of Big Data & Artificial Intelligence in Transportation (Ministry of Education), Beijing Jiaotong University, China
  • Zhiwei Li, University of Technology Sydney, Australia
  • Haoxuan Li, Peking University, China
  • Xin Zhou, Nanyang Technological University, Singapore
  • Jie Zhang, Nanyang Technological University, Singapore
  • Yidong Li, Key Laboratory of Big Data & Artificial Intelligence in Transportation (Ministry of Education), Beijing Jiaotong University, China

DOI:

https://doi.org/10.1609/aaai.v40i33.40048

Abstract

Federated recommendations (FRs), which enable multiple local clients to collaboratively learn a global model without disclosing users' private data, have emerged as a prevalent on-device service. In conventional FRs, the dominant paradigm is to represent clients and items with discrete identities, which are then mapped to domain-specific embeddings for model training. Despite their considerable performance, we reveal three inherent limitations that cannot be ignored in federated settings: non-transferability across domains, ineffectiveness in cold-start settings, and potential privacy violations during federated training. To this end, we propose a transferable federated recommendation model, TransFR, which delicately combines the general capabilities of pre-trained models with the personalization abilities obtained by fine-tuning on local private data. Specifically, it first learns domain-agnostic item representations by exploiting pre-trained models on public textual corpora. To tailor the model to FR tasks, we further introduce efficient federated adapter tuning and post-adaptation personalization, which yield a personalized adapter for each client by fitting its local private data. We theoretically prove the advantages of incorporating adapter tuning into FRs in terms of both effectiveness and privacy. Through extensive experiments, we show that TransFR surpasses state-of-the-art FRs in transferability.
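The adapter-on-frozen-PLM design summarized in the abstract can be sketched in a few lines of PyTorch. The following is a minimal illustration under stated assumptions, not the authors' released code: the names (ItemAdapter, TransFRClient, local_adapter_state, average_adapters) and the choice of BERT as the backbone are hypothetical choices for exposition. The key structure follows the abstract: the pre-trained encoder stays frozen, only a lightweight adapter is trained and shared across clients, and the user embedding never leaves the device.

import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class ItemAdapter(nn.Module):
    """Residual bottleneck adapter: the only trainable, shared module."""
    def __init__(self, hidden: int = 768, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden, bottleneck)
        self.up = nn.Linear(bottleneck, hidden)
        self.act = nn.ReLU()

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        return h + self.up(self.act(self.down(h)))

class TransFRClient(nn.Module):
    """One client: frozen PLM + trainable adapter + private user embedding."""
    def __init__(self, plm_name: str = "bert-base-uncased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(plm_name)
        for p in self.encoder.parameters():
            p.requires_grad = False          # the PLM backbone never updates
        hidden = self.encoder.config.hidden_size
        self.adapter = ItemAdapter(hidden)
        # Kept strictly on-device (post-adaptation personalization).
        self.user_emb = nn.Parameter(torch.randn(hidden) * 0.02)

    def item_repr(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        h = out.last_hidden_state[:, 0]      # [CLS] token as the item vector
        return self.adapter(h)               # domain adaptation via the adapter

    def score(self, input_ids, attention_mask):
        # Dot-product preference score between user and adapted item vectors.
        return self.item_repr(input_ids, attention_mask) @ self.user_emb

def local_adapter_state(client: TransFRClient) -> dict:
    # Only adapter weights ever leave the device.
    return {k: v.detach().cpu() for k, v in client.adapter.state_dict().items()}

def average_adapters(states: list) -> dict:
    # FedAvg-style aggregation over the clients' adapter weights.
    return {k: torch.stack([s[k] for s in states]).mean(dim=0)
            for k in states[0]}

if __name__ == "__main__":
    tok = AutoTokenizer.from_pretrained("bert-base-uncased")
    client = TransFRClient()
    batch = tok(["A sci-fi movie about deep-space travel"],
                return_tensors="pt", padding=True, truncation=True)
    print(client.score(batch["input_ids"], batch["attention_mask"]))

In a federated round under this sketch, each client would fit its adapter on local interactions, upload only local_adapter_state(client), and receive the result of average_adapters(...) back from the server; because item representations come from text rather than ID embeddings, the averaged adapter transfers to unseen domains and cold-start items, which is the transferability claim the abstract makes.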

Published

2026-03-14

How to Cite

Zhang, H., Li, Z., Li, H., Zhou, X., Zhang, J., & Li, Y. (2026). TransFR: Transferable Federated Recommendation with Adapter Tuning on Pre-trained Language Models. Proceedings of the AAAI Conference on Artificial Intelligence, 40(33), 28212–28220. https://doi.org/10.1609/aaai.v40i33.40048

Issue

Vol. 40 No. 33 (2026)

Section

AAAI Technical Track on Machine Learning X