Representation Surgery for Multi-Task Model Merging

Enneng Yang, Li Shen, Zhenyi Wang, Guibing Guo, Xiaojun Chen, Xingwei Wang, Dacheng Tao
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:56332-56356, 2024.

Abstract

Multi-task learning (MTL) compresses the information from multiple tasks into a unified backbone to improve computational efficiency and generalization. Recent work merges multiple independently trained models directly to perform MTL instead of collecting their raw data for joint training, greatly expanding the application scenarios of MTL. However, by visualizing the representation distributions of existing model merging schemes, we find that the merged model often suffers from a dilemma of representation bias: there is a significant discrepancy between the representation distributions of the merged and individual models, resulting in poor performance of merged MTL. In this paper, we propose a representation surgery solution called “Surgery” to reduce representation bias in the merged model. Specifically, Surgery is a lightweight task-specific plugin that takes the representation of the merged model as input and attempts to output the bias contained in that representation. We then design an unsupervised optimization objective that updates the Surgery plugin by minimizing the distance between the merged model’s representation and the individual model’s representation. Extensive experiments demonstrate significant MTL performance improvements when our Surgery plugin is applied to state-of-the-art (SOTA) model merging schemes.
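
To make the idea concrete, below is a minimal PyTorch sketch of the mechanism the abstract describes: a lightweight task-specific plugin that predicts and subtracts the representation bias of the merged model, trained without labels by pulling the corrected representation toward the individual model's representation. The low-rank adapter parameterization, the L1 distance, and all names (SurgeryPlugin, surgery_step, feat_dim, rank) are illustrative assumptions, not the paper's exact implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SurgeryPlugin(nn.Module):
    """Lightweight task-specific plugin that estimates the representation
    bias of a merged model and removes it (illustrative sketch)."""
    def __init__(self, feat_dim: int, rank: int = 16):
        super().__init__()
        # Assumed low-rank parameterization: feat_dim -> rank -> feat_dim.
        self.down = nn.Linear(feat_dim, rank, bias=False)
        self.up = nn.Linear(rank, feat_dim, bias=False)
        nn.init.zeros_(self.up.weight)  # start with zero bias correction

    def forward(self, z_merged: torch.Tensor) -> torch.Tensor:
        bias = self.up(torch.relu(self.down(z_merged)))  # predicted representation bias
        return z_merged - bias  # corrected ("surgered") representation fed to the task head

def surgery_step(plugin, z_merged, z_individual, optimizer):
    """One unsupervised update: minimize the distance between the corrected
    merged representation and the individual (expert) model's representation.
    No task labels are required."""
    optimizer.zero_grad()
    z_corrected = plugin(z_merged)
    loss = F.l1_loss(z_corrected, z_individual)  # distance objective (L1 assumed here)
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    torch.manual_seed(0)
    feat_dim = 512
    plugin = SurgeryPlugin(feat_dim)
    opt = torch.optim.Adam(plugin.parameters(), lr=1e-3)
    # Toy tensors standing in for features extracted by the merged and individual models.
    z_merged = torch.randn(8, feat_dim)
    z_individual = torch.randn(8, feat_dim)
    print(surgery_step(plugin, z_merged, z_individual, opt))

In use, one such plugin would be attached per task on top of a single merged backbone, so the added parameter and compute cost stays small relative to keeping separate individual models.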

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-yang24t,
  title     = {Representation Surgery for Multi-Task Model Merging},
  author    = {Yang, Enneng and Shen, Li and Wang, Zhenyi and Guo, Guibing and Chen, Xiaojun and Wang, Xingwei and Tao, Dacheng},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {56332--56356},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/yang24t/yang24t.pdf},
  url       = {https://proceedings.mlr.press/v235/yang24t.html}
}
Endnote
%0 Conference Paper
%T Representation Surgery for Multi-Task Model Merging
%A Enneng Yang
%A Li Shen
%A Zhenyi Wang
%A Guibing Guo
%A Xiaojun Chen
%A Xingwei Wang
%A Dacheng Tao
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-yang24t
%I PMLR
%P 56332--56356
%U https://proceedings.mlr.press/v235/yang24t.html
%V 235
APA
Yang, E., Shen, L., Wang, Z., Guo, G., Chen, X., Wang, X. & Tao, D. (2024). Representation Surgery for Multi-Task Model Merging. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:56332-56356. Available from https://proceedings.mlr.press/v235/yang24t.html.