Mega-CE$^2$: A Multimodal Heterogeneous Aggregation Framework for End-Edge-Cloud Computing

Zinuo Cheng, Haodi Wang, Rongfang Bie
Proceedings of the 17th Asian Conference on Machine Learning, PMLR 304:750-765, 2025.

Abstract

End-Edge-Cloud Computing (EECC) has emerged as the mainstream computing paradigm, integrating edge computing to overcome the limitations of traditional federated learning in communication efficiency and resource scheduling. However, existing studies reveal that most frameworks still struggle with challenges such as computing resource allocation and high end-to-end latency in EECC. To address these issues, we propose Mega-CE$^2$, a novel multimodal heterogeneous aggregation framework. Mega-CE$^2$ establishes a closed-loop feedback mechanism from the bottom up to the top down through end-device data serialization, edge-server model personalization, and cloud-based optimization. Notably, Mega-CE$^2$ incorporates lightweight adapters for fine-tuning, enabling efficient deployment while preserving local model personalization. These adapters, with fewer parameters than the global model, optimize model parameters during edge-to-cloud aggregation, thereby achieving both lightweight and personalized capabilities. In experiments on three open-source standard datasets, we show that the performance of Mega-CE$^2$ improves by 3%–5%, while maintaining scalability with lightweight and low-latency characteristics.

Cite this Paper


BibTeX
@InProceedings{pmlr-v304-cheng25a,
  title     = {Mega-CE$^2$: A Multimodal Heterogeneous Aggregation Framework for End-Edge-Cloud Computing},
  author    = {Cheng, Zinuo and Wang, Haodi and Bie, Rongfang},
  booktitle = {Proceedings of the 17th Asian Conference on Machine Learning},
  pages     = {750--765},
  year      = {2025},
  editor    = {Lee, Hung-yi and Liu, Tongliang},
  volume    = {304},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--12 Dec},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v304/main/assets/cheng25a/cheng25a.pdf},
  url       = {https://proceedings.mlr.press/v304/cheng25a.html},
  abstract  = {End-Edge-Cloud Computing (EECC) has emerged as the mainstream computing paradigm, integrating edge computing to overcome the limitations of traditional federated learning in communication efficiency and resource scheduling. However, existing studies reveal that most frameworks still struggle with challenges such as computing resource allocation and high end-to-end latency in EECC. To address these issues, we propose Mega-CE$^2$, a novel multimodal heterogeneous aggregation framework. Mega-CE$^2$ establishes a closed-loop feedback mechanism from the bottom up to the top down through end-device data serialization, edge-server model personalization, and cloud-based optimization. Notably, Mega-CE$^2$ incorporates lightweight adapters for fine-tuning, enabling efficient deployment while preserving local model personalization. These adapters, with fewer parameters than the global model, optimize model parameters during edge-to-cloud aggregation, thereby achieving both lightweight and personalized capabilities. In experiments on three open-source standard datasets, we show that the performance of Mega-CE$^2$ improves by 3%–5%, while maintaining scalability with lightweight and low-latency characteristics.}
}
Endnote
%0 Conference Paper %T Mega-CE$^2$: A Multimodal Heterogeneous Aggregation Framework for End-Edge-Cloud Computing %A Zinuo Cheng %A Haodi Wang %A Rongfang Bie %B Proceedings of the 17th Asian Conference on Machine Learning %C Proceedings of Machine Learning Research %D 2025 %E Hung-yi Lee %E Tongliang Liu %F pmlr-v304-cheng25a %I PMLR %P 750--765 %U https://proceedings.mlr.press/v304/cheng25a.html %V 304 %X End-Edge-Cloud Computing (EECC) has emerged as the mainstream computing paradigm, integrating edge computing to overcome the limitations of traditional federated learning in communication efficiency and resource scheduling. However, existing studies reveal that most frameworks still struggle with challenges such as computing resource allocation and high end-to-end latency in EECC. To address these issues, we propose Mega-CE$^2$, a novel multimodal heterogeneous aggregation framework. Mega-CE$^2$ establishes a closed-loop feedback mechanism from the bottom up to the top down through end-device data serialization, edge-server model personalization, and cloud-based optimization. Notably, Mega-CE$^2$ incorporates lightweight adapters for fine-tuning, enabling efficient deployment while preserving local model personalization. These adapters, with fewer parameters than the global model, optimize model parameters during edge-to-cloud aggregation, thereby achieving both lightweight and personalized capabilities. In experiments on three open-source standard datasets, we show that the performance of Mega-CE$^2$ improves by 3%–5%, while maintaining scalability with lightweight and low-latency characteristics.
APA
Cheng, Z., Wang, H. & Bie, R. (2025). Mega-CE$^2$: A Multimodal Heterogeneous Aggregation Framework for End-Edge-Cloud Computing. Proceedings of the 17th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 304:750-765. Available from https://proceedings.mlr.press/v304/cheng25a.html.