Inducing Neural Collapse in Deep Long-tailed Learning

Xuantong Liu, Jianfeng Zhang, Tianyang Hu, He Cao, Yuan Yao, Lujia Pan
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:11534-11544, 2023.

Abstract

Although deep neural networks achieve tremendous success on various classification tasks, their generalization ability drops sharply when training datasets exhibit long-tailed distributions. One of the reasons is that the representations (i.e., features) learned from imbalanced datasets are less effective than those learned from balanced datasets. Specifically, representations learned under a class-balanced distribution exhibit the Neural Collapse (NC) phenomenon: features from the same category are close to one another, while features from different categories are maximally distant, an optimal linearly separable state for classification. This pattern breaks down on imbalanced datasets, however, and is partially responsible for the reduced performance of the model. In this work, we propose two explicit feature regularization terms to learn high-quality representations for class-imbalanced data. With the proposed regularization, the NC phenomenon appears even under class-imbalanced distributions, and generalization ability is significantly improved. Our method is easily implemented, highly effective, and can be plugged into most existing methods. Extensive experimental results on widely used benchmarks show the effectiveness of our method.
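The abstract does not spell out the two regularization terms, so the sketch below is only a rough illustration of the general idea, not the paper's actual formulation: it penalizes (1) within-class feature scatter and (2) misalignment between each class's feature mean and a vertex of a fixed simplex equiangular tight frame (ETF), the geometry that class means form under NC. All function names, the particular decomposition into these two penalties, and the loss weights are illustrative assumptions.

import torch
import torch.nn.functional as F

def simplex_etf(num_classes: int, feat_dim: int) -> torch.Tensor:
    # Rows are unit vectors with pairwise cosine -1/(K-1): the simplex ETF
    # geometry of NC class means (requires feat_dim >= num_classes).
    K = num_classes
    Q = torch.linalg.qr(torch.randn(feat_dim, K)).Q   # d x K, orthonormal columns
    M = Q @ (torch.eye(K) - torch.ones(K, K) / K)     # center the columns
    return (M / M.norm(dim=0, keepdim=True)).t()      # K x d, unit-norm rows

def nc_regularizers(feats, labels, etf):
    # feats: (batch, d) penultimate-layer features; labels: (batch,) class ids;
    # etf: (K, d) fixed targets from simplex_etf.
    feats = F.normalize(feats, dim=1)
    within = feats.new_zeros(())
    align = feats.new_zeros(())
    classes = labels.unique()
    for c in classes:
        fc = feats[labels == c]
        mu = fc.mean(dim=0)
        # (1) NC1-style term: collapse each class's features onto their mean.
        within = within + ((fc - mu) ** 2).sum(dim=1).mean()
        # (2) NC2-style term: align the class mean with its fixed ETF vertex,
        # which keeps different classes maximally separated.
        align = align + (1.0 - F.cosine_similarity(mu, etf[c], dim=0))
    n = classes.numel()
    return within / n, align / n

# Hypothetical usage inside a training step, with lam1 and lam2 as tunable weights:
#   within, align = nc_regularizers(feats, labels, etf)
#   loss = F.cross_entropy(logits, labels) + lam1 * within + lam2 * align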

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-liu23i,
  title     = {Inducing Neural Collapse in Deep Long-tailed Learning},
  author    = {Liu, Xuantong and Zhang, Jianfeng and Hu, Tianyang and Cao, He and Yao, Yuan and Pan, Lujia},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {11534--11544},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/liu23i/liu23i.pdf},
  url       = {https://proceedings.mlr.press/v206/liu23i.html}
}
Endnote
%0 Conference Paper
%T Inducing Neural Collapse in Deep Long-tailed Learning
%A Xuantong Liu
%A Jianfeng Zhang
%A Tianyang Hu
%A He Cao
%A Yuan Yao
%A Lujia Pan
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-liu23i
%I PMLR
%P 11534--11544
%U https://proceedings.mlr.press/v206/liu23i.html
%V 206
APA
Liu, X., Zhang, J., Hu, T., Cao, H., Yao, Y. & Pan, L. (2023). Inducing Neural Collapse in Deep Long-tailed Learning. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:11534-11544. Available from https://proceedings.mlr.press/v206/liu23i.html.