1. Tian Y, Wang Z, Peng Y, Yuan A, Wang Z, Yi B, et al. KeepKV: Achieving Periodic Lossless KV Cache Compression for Efficient LLM Inference. AAAI [Internet]. 2026 Mar 14 [cited 2026 May 14];40(39):33259-67. Available from: https://ojs.aaai.org/index.php/AAAI/article/view/40611