Advancing Personalized Learning with Neural Collapse for Long-Tail Challenge

Hanglei Hu, Yingying Guo, Zhikang Chen, Sen Cui, Fei Wu, Kun Kuang, Min Zhang, Bo Jiang
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:24314-24327, 2025.

Abstract

Personalized learning, especially data-based methods, has garnered widespread attention in recent years, aiming to meet individual student needs. However, many works rely on the implicit assumption that benchmarks are high-quality and well-annotated, which limits their practical applicability. In real-world scenarios, these benchmarks often exhibit long-tail distributions, significantly impacting model performance. To address this challenge, we propose a novel method called Neural-Collapse-Advanced personalized Learning (NCAL), designed to learn features that conform to the same simplex equiangular tight frame (ETF) structure. NCAL introduces Text-modality Collapse (TC) regularization to optimize the distribution of text embeddings within the large language model (LLM) representation space. Notably, NCAL is model-agnostic, making it compatible with various architectures and approaches, thereby ensuring broad applicability. Extensive experiments demonstrate that NCAL effectively enhances existing works, achieving new state-of-the-art performance. Additionally, NCAL mitigates class imbalance, significantly improving the model’s generalization ability.
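As background for the abstract's claim that NCAL learns features conforming to a simplex equiangular tight frame (ETF), the standard definition from the neural collapse literature can be written as follows; the symbols ($K$ classes, rotation $\mathbf{U}$) are generic notation, not drawn from this paper:

```latex
% A simplex ETF for K classes is a matrix of K unit vectors
%   M = [m_1, ..., m_K] \in R^{d x K}
% obtained by centering and rescaling the identity:
\mathbf{M} \;=\; \sqrt{\tfrac{K}{K-1}}\;\mathbf{U}\!\left(\mathbf{I}_K - \tfrac{1}{K}\mathbf{1}_K\mathbf{1}_K^{\top}\right),
% where U \in R^{d x K} (d >= K-1) has orthonormal columns.
% Its columns have equal norms and maximally separated pairwise angles:
\mathbf{m}_i^{\top}\mathbf{m}_j \;=\; \tfrac{K}{K-1}\,\delta_{ij} \;-\; \tfrac{1}{K-1}.
```

Under neural collapse, class-mean features converge toward such a configuration; the abstract's TC regularization encourages text embeddings toward this same structure.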

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-hu25f,
  title     = {Advancing Personalized Learning with Neural Collapse for Long-Tail Challenge},
  author    = {Hu, Hanglei and Guo, Yingying and Chen, Zhikang and Cui, Sen and Wu, Fei and Kuang, Kun and Zhang, Min and Jiang, Bo},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {24314--24327},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/hu25f/hu25f.pdf},
  url       = {https://proceedings.mlr.press/v267/hu25f.html},
  abstract  = {Personalized learning, especially data-based methods, has garnered widespread attention in recent years, aiming to meet individual student needs. However, many works rely on the implicit assumption that benchmarks are high-quality and well-annotated, which limits their practical applicability. In real-world scenarios, these benchmarks often exhibit long-tail distributions, significantly impacting model performance. To address this challenge, we propose a novel method called Neural-Collapse-Advanced personalized Learning (NCAL), designed to learn features that conform to the same simplex equiangular tight frame (ETF) structure. NCAL introduces Text-modality Collapse (TC) regularization to optimize the distribution of text embeddings within the large language model (LLM) representation space. Notably, NCAL is model-agnostic, making it compatible with various architectures and approaches, thereby ensuring broad applicability. Extensive experiments demonstrate that NCAL effectively enhances existing works, achieving new state-of-the-art performance. Additionally, NCAL mitigates class imbalance, significantly improving the model's generalization ability.}
}
Endnote
%0 Conference Paper
%T Advancing Personalized Learning with Neural Collapse for Long-Tail Challenge
%A Hanglei Hu
%A Yingying Guo
%A Zhikang Chen
%A Sen Cui
%A Fei Wu
%A Kun Kuang
%A Min Zhang
%A Bo Jiang
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-hu25f
%I PMLR
%P 24314--24327
%U https://proceedings.mlr.press/v267/hu25f.html
%V 267
%X Personalized learning, especially data-based methods, has garnered widespread attention in recent years, aiming to meet individual student needs. However, many works rely on the implicit assumption that benchmarks are high-quality and well-annotated, which limits their practical applicability. In real-world scenarios, these benchmarks often exhibit long-tail distributions, significantly impacting model performance. To address this challenge, we propose a novel method called Neural-Collapse-Advanced personalized Learning (NCAL), designed to learn features that conform to the same simplex equiangular tight frame (ETF) structure. NCAL introduces Text-modality Collapse (TC) regularization to optimize the distribution of text embeddings within the large language model (LLM) representation space. Notably, NCAL is model-agnostic, making it compatible with various architectures and approaches, thereby ensuring broad applicability. Extensive experiments demonstrate that NCAL effectively enhances existing works, achieving new state-of-the-art performance. Additionally, NCAL mitigates class imbalance, significantly improving the model's generalization ability.
APA
Hu, H., Guo, Y., Chen, Z., Cui, S., Wu, F., Kuang, K., Zhang, M., & Jiang, B. (2025). Advancing Personalized Learning with Neural Collapse for Long-Tail Challenge. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:24314-24327. Available from https://proceedings.mlr.press/v267/hu25f.html.