Look Back for More: Harnessing Historical Sequential Updates for Personalized Federated Adapter Tuning
DOI: https://doi.org/10.1609/aaai.v39i19.34187
Abstract
Personalized federated learning (PFL) studies effective model personalization to address the data heterogeneity issue among clients in traditional federated learning (FL). Existing PFL approaches mainly generate personalized models by relying solely on the clients' latest updated models while ignoring their previous updates, which may result in suboptimal personalized models. To bridge this gap, we propose a novel framework termed pFedSeq, designed for personalizing adapters to fine-tune a foundation model in FL. In pFedSeq, the server maintains and trains a sequential learner, which processes a sequence of past adapter updates from clients and generates calibrations for personalized adapters. To effectively capture the cross-client and cross-step relations hidden in previous updates and generate high-performing personalized adapters, pFedSeq adopts the powerful selective state space model (SSM) as the architecture of the sequential learner. Through extensive experiments on four public benchmark datasets, we demonstrate the superiority of pFedSeq over state-of-the-art PFL methods.
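The sketch below illustrates the server-side idea described in the abstract: a sequential learner consumes each client's history of adapter updates and emits a per-client calibration that is applied to the latest adapter. It is a minimal conceptual sketch inferred from the abstract only; the paper's sequential learner is a selective SSM, for which a GRU stands in here, and all names, shapes, and the additive calibration rule are assumptions, not the authors' implementation.

```python
# Conceptual sketch of pFedSeq's server-side calibration step (hypothetical).
# The actual method uses a selective state space model as the sequential
# learner; nn.GRU is a stand-in. Shapes and names are illustrative only.
import torch
import torch.nn as nn

class SequentialLearner(nn.Module):
    """Maps a sequence of flattened adapter updates to a calibration vector."""
    def __init__(self, adapter_dim: int, hidden_dim: int = 128):
        super().__init__()
        self.seq = nn.GRU(adapter_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, adapter_dim)

    def forward(self, update_seq: torch.Tensor) -> torch.Tensor:
        # update_seq: (num_clients, num_steps, adapter_dim)
        out, _ = self.seq(update_seq)
        # Use the last step's representation to produce one calibration
        # per client: (num_clients, adapter_dim).
        return self.head(out[:, -1])

# Toy usage: 4 clients, 5 past rounds, 64-dim flattened adapters.
learner = SequentialLearner(adapter_dim=64)
history = torch.randn(4, 5, 64)                 # past adapter updates per client
calibration = learner(history)                  # (4, 64)
personalized = history[:, -1] + calibration     # assumed additive calibration
print(personalized.shape)                       # torch.Size([4, 64])
```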
Published
2025-04-11
How to Cite
Peng, D., Wang, Y., Fu, H., Jiang, J., Liu, Y., Goh, R. S. M., & Wei, Q. (2025). Look Back for More: Harnessing Historical Sequential Updates for Personalized Federated Adapter Tuning. Proceedings of the AAAI Conference on Artificial Intelligence, 39(19), 19857–19865. https://doi.org/10.1609/aaai.v39i19.34187
Issue
Vol. 39 No. 19 (2025)
Section
AAAI Technical Track on Machine Learning V