Distribution Alignment Optimization through Neural Collapse for Long-tailed Classification

Jintong Gao, He Zhao, Dan Dan Guo, Hongyuan Zha
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:14969-14987, 2024.

Abstract

A deep neural network well trained on a balanced dataset usually exhibits the Neural Collapse (NC) phenomenon, which is an informative indicator that the model has achieved good performance. However, NC is usually hard to achieve for a model trained on long-tailed datasets, leading to degraded performance on test data. This work aims to induce the NC phenomenon in imbalanced learning from the perspective of distribution matching. By enforcing the distribution of last-layer representations to align with the ideal distribution induced by the ETF (simplex equiangular tight frame) structure, we develop a Distribution Alignment Optimization (DisA) loss. Acting as a plug-and-play method, DisA can be combined with most existing long-tailed methods; we further instantiate it for the cases of a fixed classifier and a learnable classifier. Extensive experiments show the effectiveness of DisA, providing a promising solution to the imbalance issue. Our code is available at DisA.
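The simplex ETF target mentioned in the abstract has a standard closed form: K unit-norm class vectors whose pairwise cosine similarity is -1/(K-1), the maximal mutual separation possible for K directions. The sketch below constructs such an ETF and a simple cosine-alignment penalty between features and their class vectors. This is an illustrative NumPy sketch of the general ETF construction, not the authors' DisA implementation; the function names (`etf_classifier`, `alignment_loss`) are our own.

```python
import numpy as np

def etf_classifier(K, d, seed=0):
    """Construct a K-class simplex ETF in d dimensions (requires d >= K).

    Columns are unit-norm class vectors with pairwise cosine
    similarity exactly -1/(K-1).
    """
    rng = np.random.default_rng(seed)
    # Random orthonormal basis U (d x K) via QR decomposition.
    U, _ = np.linalg.qr(rng.standard_normal((d, K)))
    # Standard simplex ETF: M = sqrt(K/(K-1)) * U (I_K - (1/K) 1 1^T).
    M = np.sqrt(K / (K - 1)) * U @ (np.eye(K) - np.ones((K, K)) / K)
    return M

def alignment_loss(features, labels, M):
    """Mean (1 - cosine similarity) between each feature vector and
    the ETF vector of its class; zero iff features collapse onto the ETF."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    return float(np.mean(1.0 - np.sum(f * M[:, labels].T, axis=1)))
```

For example, with `K = 4` classes in `d = 8` dimensions, `M.T @ M` has ones on the diagonal and -1/3 off the diagonal; features pointing exactly along their class vectors incur zero loss.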

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-gao24s,
  title     = {Distribution Alignment Optimization through Neural Collapse for Long-tailed Classification},
  author    = {Gao, Jintong and Zhao, He and Guo, Dan Dan and Zha, Hongyuan},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {14969--14987},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/gao24s/gao24s.pdf},
  url       = {https://proceedings.mlr.press/v235/gao24s.html},
  abstract  = {A well-trained deep neural network on balanced datasets usually exhibits the Neural Collapse (NC) phenomenon, which is an informative indicator of the model achieving good performance. However, NC is usually hard to be achieved for a model trained on long-tailed datasets, leading to the deteriorated performance of test data. This work aims to induce the NC phenomenon in imbalanced learning from the perspective of distribution matching. By enforcing the distribution of last-layer representations to align the ideal distribution of the ETF structure, we develop a Distribution Alignment Optimization (DisA) loss, acting as a plug-and-play method can be combined with most of the existing long-tailed methods, we further instantiate it to the cases of fixing classifier and learning classifier. The extensive experiments show the effectiveness of DisA, providing a promising solution to the imbalanced issue. Our code is available at DisA.}
}
Endnote
%0 Conference Paper
%T Distribution Alignment Optimization through Neural Collapse for Long-tailed Classification
%A Jintong Gao
%A He Zhao
%A Dan Dan Guo
%A Hongyuan Zha
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-gao24s
%I PMLR
%P 14969--14987
%U https://proceedings.mlr.press/v235/gao24s.html
%V 235
%X A well-trained deep neural network on balanced datasets usually exhibits the Neural Collapse (NC) phenomenon, which is an informative indicator of the model achieving good performance. However, NC is usually hard to be achieved for a model trained on long-tailed datasets, leading to the deteriorated performance of test data. This work aims to induce the NC phenomenon in imbalanced learning from the perspective of distribution matching. By enforcing the distribution of last-layer representations to align the ideal distribution of the ETF structure, we develop a Distribution Alignment Optimization (DisA) loss, acting as a plug-and-play method can be combined with most of the existing long-tailed methods, we further instantiate it to the cases of fixing classifier and learning classifier. The extensive experiments show the effectiveness of DisA, providing a promising solution to the imbalanced issue. Our code is available at DisA.
APA
Gao, J., Zhao, H., Guo, D.D. & Zha, H. (2024). Distribution Alignment Optimization through Neural Collapse for Long-tailed Classification. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:14969-14987. Available from https://proceedings.mlr.press/v235/gao24s.html.