Beyond Adapter Retrieval: Latent Geometry-Preserving Composition via Sparse Task Projection

Authors

  • Pengfei Jin Center of Advanced Medical Computing and Analysis, Massachusetts General Hospital and Harvard Medical School
  • Peng Shu School of Computing, The University of Georgia
  • Sifan Song Center of Advanced Medical Computing and Analysis, Massachusetts General Hospital and Harvard Medical School
  • Sekeun Kim Center of Advanced Medical Computing and Analysis, Massachusetts General Hospital and Harvard Medical School
  • Qing Xiao Center of Advanced Medical Computing and Analysis, Massachusetts General Hospital and Harvard Medical School
  • Cheng Chen Department of Electrical and Electronic Engineering, The University of Hong Kong; School of Biomedical Engineering, The University of Hong Kong
  • Tianming Liu School of Computing, The University of Georgia
  • Xiang Li Center of Advanced Medical Computing and Analysis, Massachusetts General Hospital and Harvard Medical School
  • Quanzheng Li Center of Advanced Medical Computing and Analysis, Massachusetts General Hospital and Harvard Medical School

DOI:

https://doi.org/10.1609/aaai.v40i27.39400

Abstract

Recent advances in parameter-efficient transfer learning have demonstrated the utility of composing LoRA adapters from libraries of pretrained modules. However, most existing approaches rely on simple retrieval heuristics or uniform averaging, which overlook the latent structure of task relationships in representation space. We propose a new framework for adapter reuse that moves beyond retrieval, formulating adapter composition as a geometry-aware sparse reconstruction problem. Specifically, we represent each task by a latent prototype vector derived from the base model’s encoder and approximate the target task prototype as a sparse linear combination of retrieved reference prototypes, under an L1-regularized optimization objective. The resulting combination weights are then used to blend the corresponding LoRA adapters, yielding a composite adapter tailored to the target task. This formulation not only preserves the local geometric structure of the task representation manifold, but also promotes interpretability and efficient reuse by selecting a minimal set of relevant adapters. We demonstrate the effectiveness of our approach across multiple domains, including medical image segmentation, medical report generation, and image synthesis. Our results highlight the benefit of coupling retrieval with latent geometry-aware optimization for improved zero-shot generalization.
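The core formulation in the abstract — approximating a target task prototype as an L1-regularized sparse combination of reference prototypes, then blending the corresponding LoRA adapters with those weights — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the ISTA solver, the function names, and all hyperparameters (`lam`, `lr`, `steps`) are assumptions, and the "adapters" here are stand-in delta matrices rather than real LoRA factors.

```python
import numpy as np

def sparse_projection(refs, target, lam=0.01, lr=0.01, steps=3000):
    """Solve min_w ||refs @ w - target||^2 + lam * ||w||_1 via ISTA
    (proximal gradient descent with soft-thresholding).

    refs:   (d, k) matrix whose columns are reference task prototypes
    target: (d,)   target task prototype
    Returns a sparse weight vector w of shape (k,).
    """
    k = refs.shape[1]
    w = np.zeros(k)
    for _ in range(steps):
        # gradient of the squared reconstruction error
        grad = refs.T @ (refs @ w - target)
        w = w - lr * grad
        # proximal step for the L1 penalty: soft-thresholding
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)
    return w

def blend_adapters(adapters, w):
    """Combine adapter weight deltas as a w-weighted sum.

    adapters: list of k arrays of identical shape (stand-ins for LoRA deltas)
    w:        sparse combination weights from sparse_projection
    """
    return sum(wi * A for wi, A in zip(w, adapters))
```

Because most entries of `w` are driven to exactly zero by the soft-thresholding step, only a small subset of reference adapters contributes to the composite, which is what gives the method its interpretability and reuse efficiency.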

Published

2026-03-14

How to Cite

Jin, P., Shu, P., Song, S., Kim, S., Xiao, Q., Chen, C., Liu, T., Li, X., & Li, Q. (2026). Beyond Adapter Retrieval: Latent Geometry-Preserving Composition via Sparse Task Projection. Proceedings of the AAAI Conference on Artificial Intelligence, 40(27), 22417-22425. https://doi.org/10.1609/aaai.v40i27.39400

Section

AAAI Technical Track on Machine Learning IV