Knowledge Swapping via Learning and Unlearning

Mingyu Xing, Lechao Cheng, Shengeng Tang, Yaxiong Wang, Zhun Zhong, Meng Wang
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:68889-68899, 2025.

Abstract

We introduce Knowledge Swapping, a novel task designed to selectively regulate knowledge of a pretrained model by enabling the forgetting of user-specified information, retaining essential knowledge, and acquiring new knowledge simultaneously. By delving into the analysis of knock-on feature hierarchy, we find that incremental learning typically progresses from low-level representations to higher-level semantics, whereas forgetting tends to occur in the opposite direction—starting from high-level semantics and moving down to low-level features. Building upon this, we propose to benchmark the knowledge swapping task with the strategy of Learning Before Forgetting. Comprehensive experiments on various tasks like image classification, object detection, and semantic segmentation validate the effectiveness of the proposed strategy. The source code is available at https://github.com/xingmingyu123456/KnowledgeSwapping.

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-xing25a,
  title     = {Knowledge Swapping via Learning and Unlearning},
  author    = {Xing, Mingyu and Cheng, Lechao and Tang, Shengeng and Wang, Yaxiong and Zhong, Zhun and Wang, Meng},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {68889--68899},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/xing25a/xing25a.pdf},
  url       = {https://proceedings.mlr.press/v267/xing25a.html},
  abstract  = {We introduce Knowledge Swapping, a novel task designed to selectively regulate knowledge of a pretrained model by enabling the forgetting of user-specified information, retaining essential knowledge, and acquiring new knowledge simultaneously. By delving into the analysis of knock-on feature hierarchy, we find that incremental learning typically progresses from low-level representations to higher-level semantics, whereas forgetting tends to occur in the opposite direction—starting from high-level semantics and moving down to low-level features. Building upon this, we propose to benchmark the knowledge swapping task with the strategy of Learning Before Forgetting. Comprehensive experiments on various tasks like image classification, object detection, and semantic segmentation validate the effectiveness of the proposed strategy. The source code is available at https://github.com/xingmingyu123456/KnowledgeSwapping.}
}
Endnote
%0 Conference Paper
%T Knowledge Swapping via Learning and Unlearning
%A Mingyu Xing
%A Lechao Cheng
%A Shengeng Tang
%A Yaxiong Wang
%A Zhun Zhong
%A Meng Wang
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-xing25a
%I PMLR
%P 68889--68899
%U https://proceedings.mlr.press/v267/xing25a.html
%V 267
%X We introduce Knowledge Swapping, a novel task designed to selectively regulate knowledge of a pretrained model by enabling the forgetting of user-specified information, retaining essential knowledge, and acquiring new knowledge simultaneously. By delving into the analysis of knock-on feature hierarchy, we find that incremental learning typically progresses from low-level representations to higher-level semantics, whereas forgetting tends to occur in the opposite direction—starting from high-level semantics and moving down to low-level features. Building upon this, we propose to benchmark the knowledge swapping task with the strategy of Learning Before Forgetting. Comprehensive experiments on various tasks like image classification, object detection, and semantic segmentation validate the effectiveness of the proposed strategy. The source code is available at https://github.com/xingmingyu123456/KnowledgeSwapping.
APA
Xing, M., Cheng, L., Tang, S., Wang, Y., Zhong, Z. & Wang, M. (2025). Knowledge Swapping via Learning and Unlearning. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:68889-68899. Available from https://proceedings.mlr.press/v267/xing25a.html.