FedPHA: Federated Prompt Learning for Heterogeneous Client Adaptation

Chengying Fang, Wenke Huang, Guancheng Wan, Yihao Yang, Mang Ye
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:15960-15975, 2025.

Abstract

Federated Prompt Learning (FPL) adapts pre-trained Vision-Language Models (VLMs) to federated learning through prompt tuning, leveraging their transferable representations and strong generalization capabilities. Traditional methods often require uniform prompt lengths for federated aggregation, limiting adaptability to clients with diverse prompt lengths and distribution biases. In this paper, we propose Federated Prompt Learning for Heterogeneous Client Adaptation (FedPHA), a novel framework that combines a fixed-length global prompt for efficient aggregation with local prompts of varying lengths to capture client-specific data characteristics. Additionally, FedPHA designs Singular Value Decomposition (SVD) based projection and bidirectional alignment to disentangle global conflicts arising from client heterogeneity, ensuring that personalized client tasks effectively utilize non-harmful global knowledge. This approach ensures that global knowledge improves model generalization while local knowledge preserves local optimization. Experimental results validate the effectiveness of FedPHA in achieving a balance between global and personalized knowledge in federated learning scenarios.
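The paper's exact SVD-based projection is not specified on this page, but the idea of projecting a global prompt onto the dominant subspace of a (possibly shorter) local prompt can be sketched as follows. This is a minimal illustrative sketch, assuming prompts are token-by-dimension matrices and that the projection keeps only the components of the global prompt lying in the span of the local prompt's top-k right-singular vectors; the function name and shapes are hypothetical, not from the paper.

```python
import numpy as np

def svd_project(global_prompt, local_prompt, k=4):
    """Project the global prompt onto the top-k right-singular
    subspace of the local prompt (illustrative sketch only)."""
    # SVD of the local prompt: rows are prompt tokens, columns embedding dims.
    _, _, vt = np.linalg.svd(local_prompt, full_matrices=False)
    basis = vt[:k]                       # top-k directions of local variation
    # Keep only the global-prompt components lying in that subspace,
    # discarding directions that may conflict with local knowledge.
    return global_prompt @ basis.T @ basis

rng = np.random.default_rng(0)
g = rng.normal(size=(8, 16))   # fixed-length global prompt: 8 tokens, dim 16
l = rng.normal(size=(5, 16))   # shorter local prompt: 5 tokens, dim 16
g_proj = svd_project(g, l, k=4)
assert g_proj.shape == g.shape  # projection preserves the global prompt shape
```

Note that the global and local prompts may have different numbers of tokens; the projection acts in the shared embedding dimension, which is what lets a fixed-length global prompt coexist with variable-length local prompts.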

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-fang25e,
  title     = {{F}ed{PHA}: Federated Prompt Learning for Heterogeneous Client Adaptation},
  author    = {Fang, Chengying and Huang, Wenke and Wan, Guancheng and Yang, Yihao and Ye, Mang},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {15960--15975},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/fang25e/fang25e.pdf},
  url       = {https://proceedings.mlr.press/v267/fang25e.html},
  abstract  = {Federated Prompt Learning (FPL) adapts pre-trained Vision-Language Models (VLMs) to federated learning through prompt tuning, leveraging their transferable representations and strong generalization capabilities. Traditional methods often require uniform prompt lengths for federated aggregation, limiting adaptability to clients with diverse prompt lengths and distribution biases. In this paper, we propose Federated Prompt Learning for Heterogeneous Client Adaptation (FedPHA), a novel framework that combines a fixed-length global prompt for efficient aggregation with local prompts of varying lengths to capture client-specific data characteristics. Additionally, FedPHA designs Singular Value Decomposition (SVD) based projection and bidirectional alignment to disentangle global conflicts arising from client heterogeneity, ensuring that personalized client tasks effectively utilize non-harmful global knowledge. This approach ensures that global knowledge improves model generalization while local knowledge preserves local optimization. Experimental results validate the effectiveness of FedPHA in achieving a balance between global and personalized knowledge in federated learning scenarios.}
}
Endnote
%0 Conference Paper
%T FedPHA: Federated Prompt Learning for Heterogeneous Client Adaptation
%A Chengying Fang
%A Wenke Huang
%A Guancheng Wan
%A Yihao Yang
%A Mang Ye
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-fang25e
%I PMLR
%P 15960--15975
%U https://proceedings.mlr.press/v267/fang25e.html
%V 267
%X Federated Prompt Learning (FPL) adapts pre-trained Vision-Language Models (VLMs) to federated learning through prompt tuning, leveraging their transferable representations and strong generalization capabilities. Traditional methods often require uniform prompt lengths for federated aggregation, limiting adaptability to clients with diverse prompt lengths and distribution biases. In this paper, we propose Federated Prompt Learning for Heterogeneous Client Adaptation (FedPHA), a novel framework that combines a fixed-length global prompt for efficient aggregation with local prompts of varying lengths to capture client-specific data characteristics. Additionally, FedPHA designs Singular Value Decomposition (SVD) based projection and bidirectional alignment to disentangle global conflicts arising from client heterogeneity, ensuring that personalized client tasks effectively utilize non-harmful global knowledge. This approach ensures that global knowledge improves model generalization while local knowledge preserves local optimization. Experimental results validate the effectiveness of FedPHA in achieving a balance between global and personalized knowledge in federated learning scenarios.
APA
Fang, C., Huang, W., Wan, G., Yang, Y. & Ye, M. (2025). FedPHA: Federated Prompt Learning for Heterogeneous Client Adaptation. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:15960-15975. Available from https://proceedings.mlr.press/v267/fang25e.html.