[1] Koike-Akino, T. et al. 2026. LatentLLM: Activation-Aware Transform to Multi-Head Latent Attention. Proceedings of the AAAI Conference on Artificial Intelligence. 40, 27 (Mar. 2026), 22644–22652. DOI: https://doi.org/10.1609/aaai.v40i27.39425.