Wasserstein Distance Constraint and Parameter Sparsification for Batched and Iterative Knowledge Editing

Authors

  • Shanbao Qiao, Center for Advanced Image and Information Technology, Department of Computer Science and Artificial Intelligence, Jeonbuk National University
  • Xuebing Liu, Center for Advanced Image and Information Technology, Department of Computer Science and Artificial Intelligence, Jeonbuk National University
  • Seung-Hoon Na, Center for Advanced Image and Information Technology, Department of Computer Science and Artificial Intelligence, Jeonbuk National University

DOI:

https://doi.org/10.1609/aaai.v39i23.34686

Abstract

Model knowledge editing has become a widely researched topic because it enables the efficient and rapid injection of new knowledge into language models and the correction of erroneous or outdated knowledge. Existing model knowledge editing methods are typically categorized into single-instance sequential editing and massive one-time editing. In practical applications, however, a batched and iterative editing manner better aligns with how models are actually updated. In this work, we explore the performance of parameter-update-based models on a new batched iterative editing benchmark. Our findings show that as the number of editing iterations increases, the accumulation of updated parameters causes a greater shift in the distribution of model parameters, making it more challenging to maintain editing performance and model stability. To address this degradation, we propose two methods: a Wasserstein distance constraint and update parameter sparsification. The Wasserstein distance constraint optimizes the transition of the parameter distribution before and after editing, while update parameter sparsification significantly reduces the number of updated parameters, thereby alleviating the instability in the parameter distribution caused by updates accumulating across iterations. Our methods can be applied generally to different parameter-update-based knowledge editing models. Experiments on the zsRE and CounterFact datasets demonstrate that our methods improve editing performance and enhance the later-stage stability of batched iterative editing across different models.
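The two ideas in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the functions below are hypothetical illustrations, using the closed form of the empirical 1-D Wasserstein-1 distance (mean absolute difference of sorted samples) to measure the distribution shift of a flattened weight matrix, and simple top-k magnitude selection as one possible form of update parameter sparsification.

```python
import numpy as np

def wasserstein_1d(p, q):
    # Empirical 1-D Wasserstein-1 distance between two equal-size samples:
    # the mean absolute difference of their sorted values.
    return float(np.mean(np.abs(np.sort(p) - np.sort(q))))

def sparsify_update(delta, keep_ratio=0.1):
    # Keep only the largest-magnitude fraction of update entries, zeroing the
    # rest, so fewer parameters are perturbed per editing iteration.
    flat = np.abs(delta).ravel()
    k = max(1, int(keep_ratio * flat.size))
    threshold = np.partition(flat, -k)[-k]  # k-th largest magnitude
    return np.where(np.abs(delta) >= threshold, delta, 0.0)

# Toy example: monitor the distribution shift caused by a dense vs. a
# sparsified parameter update on a small random weight matrix.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))
delta = rng.normal(scale=0.05, size=(8, 8))

sparse_delta = sparsify_update(delta, keep_ratio=0.1)
shift_dense = wasserstein_1d(W.ravel(), (W + delta).ravel())
shift_sparse = wasserstein_1d(W.ravel(), (W + sparse_delta).ravel())
```

In an actual editing loop, a term like `shift_dense` would be added to the editing loss as a regularizer, penalizing updates that move the post-edit parameter distribution far from the pre-edit one.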

Published

2025-04-11

How to Cite

Qiao, S., Liu, X., & Na, S.-H. (2025). Wasserstein Distance Constraint and Parameter Sparsification for Batched and Iterative Knowledge Editing. Proceedings of the AAAI Conference on Artificial Intelligence, 39(23), 25019-25028. https://doi.org/10.1609/aaai.v39i23.34686

Section

AAAI Technical Track on Natural Language Processing II