Weight matrices compression based on PDB model in deep neural networks

Xiaoling Wu, Junpeng Zhu, Zeng Li
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:68078-68093, 2025.

Abstract

Weight matrix compression has been demonstrated to effectively reduce overfitting and improve the generalization performance of deep neural networks. Compression is primarily achieved by filtering out noisy eigenvalues of the weight matrix. In this work, a novel Population Double Bulk (PDB) model is proposed to characterize the eigenvalue behavior of the weight matrix, which is more general than the existing Population Unit Bulk (PUB) model. Based on the PDB model and Random Matrix Theory (RMT), we develop a new PDBLS algorithm for determining the boundary between noisy and informative eigenvalues. A PDB Noise-Filtering algorithm is further introduced to reduce the rank of the weight matrix for compression. Experiments show that our PDB model fits the empirical distribution of eigenvalues of the weight matrix better than the PUB model, and our compressed weight matrices have lower rank at the same level of test accuracy. In some cases, our compression method can even improve generalization performance when labels contain noise. The code is available at https://github.com/xlwu571/PDBLS.
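As a rough illustration of the noise-filtering idea described in the abstract (not the paper's PDBLS boundary estimator), the sketch below rank-reduces a weight matrix by truncating its singular value decomposition at an assumed Marchenko-Pastur-style bulk edge. The function name, the default threshold, and the use of the matrix's own entry variance are assumptions made only for this example.

import numpy as np

def filter_noisy_spectrum(W, threshold=None):
    # Generic spectrum-based compression sketch: drop singular values that
    # fall inside an assumed noise bulk and rebuild a lower-rank matrix.
    # This is NOT the paper's PDBLS algorithm; `threshold` stands in for the
    # noise/information boundary that PDBLS would estimate.
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    if threshold is None:
        # Assumed default: approximate largest singular value of an i.i.d.
        # matrix with the same shape and entry variance as W.
        p, n = W.shape
        threshold = np.sqrt(W.var()) * (np.sqrt(p) + np.sqrt(n))
    keep = s > threshold
    W_compressed = (U[:, keep] * s[keep]) @ Vt[keep, :]
    return W_compressed, int(keep.sum())

# Toy usage: a rank-3 signal plus noise should be compressed back to rank ~3.
rng = np.random.default_rng(0)
signal = rng.standard_normal((512, 3)) @ rng.standard_normal((3, 256))
W = signal + 0.1 * rng.standard_normal((512, 256))
W_hat, rank = filter_noisy_spectrum(W)
print(rank)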

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-wu25at, title = {Weight matrices compression based on {PDB} model in deep neural networks}, author = {Wu, Xiaoling and Zhu, Junpeng and Li, Zeng}, booktitle = {Proceedings of the 42nd International Conference on Machine Learning}, pages = {68078--68093}, year = {2025}, editor = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry}, volume = {267}, series = {Proceedings of Machine Learning Research}, month = {13--19 Jul}, publisher = {PMLR}, pdf = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/wu25at/wu25at.pdf}, url = {https://proceedings.mlr.press/v267/wu25at.html}, abstract = {Weight matrix compression has been demonstrated to effectively reduce overfitting and improve the generalization performance of deep neural networks. Compression is primarily achieved by filtering out noisy eigenvalues of the weight matrix. In this work, a novel Population Double Bulk (PDB) model is proposed to characterize the eigenvalue behavior of the weight matrix, which is more general than the existing Population Unit Bulk (PUB) model. Based on the PDB model and Random Matrix Theory (RMT), we develop a new PDBLS algorithm for determining the boundary between noisy and informative eigenvalues. A PDB Noise-Filtering algorithm is further introduced to reduce the rank of the weight matrix for compression. Experiments show that our PDB model fits the empirical distribution of eigenvalues of the weight matrix better than the PUB model, and our compressed weight matrices have lower rank at the same level of test accuracy. In some cases, our compression method can even improve generalization performance when labels contain noise. The code is available at https://github.com/xlwu571/PDBLS.} }
Endnote
%0 Conference Paper %T Weight matrices compression based on PDB model in deep neural networks %A Xiaoling Wu %A Junpeng Zhu %A Zeng Li %B Proceedings of the 42nd International Conference on Machine Learning %C Proceedings of Machine Learning Research %D 2025 %E Aarti Singh %E Maryam Fazel %E Daniel Hsu %E Simon Lacoste-Julien %E Felix Berkenkamp %E Tegan Maharaj %E Kiri Wagstaff %E Jerry Zhu %F pmlr-v267-wu25at %I PMLR %P 68078--68093 %U https://proceedings.mlr.press/v267/wu25at.html %V 267 %X Weight matrix compression has been demonstrated to effectively reduce overfitting and improve the generalization performance of deep neural networks. Compression is primarily achieved by filtering out noisy eigenvalues of the weight matrix. In this work, a novel Population Double Bulk (PDB) model is proposed to characterize the eigenvalue behavior of the weight matrix, which is more general than the existing Population Unit Bulk (PUB) model. Based on the PDB model and Random Matrix Theory (RMT), we develop a new PDBLS algorithm for determining the boundary between noisy and informative eigenvalues. A PDB Noise-Filtering algorithm is further introduced to reduce the rank of the weight matrix for compression. Experiments show that our PDB model fits the empirical distribution of eigenvalues of the weight matrix better than the PUB model, and our compressed weight matrices have lower rank at the same level of test accuracy. In some cases, our compression method can even improve generalization performance when labels contain noise. The code is available at https://github.com/xlwu571/PDBLS.
APA
Wu, X., Zhu, J. & Li, Z. (2025). Weight matrices compression based on PDB model in deep neural networks. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:68078-68093. Available from https://proceedings.mlr.press/v267/wu25at.html.