Investigating the relationship between diversity and generalization in deep neural networks

Ruan P. Van der Spoel, Randle Rabe
Proceedings of the 7th Northern Lights Deep Learning Conference (NLDL), PMLR 307:375-387, 2026.

Abstract

In ensembles, improved generalization is frequently attributed to \emph{diversity} among members of the ensemble. By viewing a single neural network as an \emph{implicit ensemble}, we perform an exploratory investigation that applies well-known ensemble diversity measures to a neural network in order to study the relationship between diversity and generalization in the over-parameterized regime. Our results show that i) deeper layers of the network generally have higher levels of diversity—particularly for MLPs—and ii) layer-wise accuracy positively correlates with diversity. Additionally, we study the effects of well-known regularizers such as Dropout, DropConnect, and batch size on diversity and generalization. We generally find that increasing the strength of the regularizer increases the diversity in the neural network, and this increase in diversity is positively correlated with model accuracy. We show that these results hold for several benchmark datasets (such as Fashion-MNIST and CIFAR-10) and architectures (MLPs and CNNs). Our findings suggest new avenues of research into the generalization ability of deep neural networks.
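For readers unfamiliar with the measures the abstract refers to, the sketch below computes a standard pairwise disagreement diversity measure from the ensemble literature over an "oracle" matrix of correct/incorrect member outputs. The abstract does not specify how per-member votes are derived from a single network's hidden units, so the sign-thresholding readout used here is purely an illustrative assumption, not the paper's construction.

    import numpy as np

    def pairwise_disagreement(oracle):
        """Mean pairwise disagreement diversity.

        oracle: (n_members, n_examples) binary array, 1 where a member
        classifies the example correctly, 0 where it fails. Returns the
        fraction of examples on which exactly one member of a pair is
        correct, averaged over all member pairs.
        """
        m, _ = oracle.shape
        total = 0.0
        for i in range(m):
            for j in range(i + 1, m):
                total += np.mean(oracle[i] != oracle[j])
        return total / (m * (m - 1) / 2)

    # Hypothetical usage: treat each hidden unit of one layer as an
    # implicit ensemble member, turn its activations into binary votes,
    # and score those votes against binary labels. All shapes and the
    # thresholding scheme are assumptions for illustration only.
    rng = np.random.default_rng(0)
    acts = rng.standard_normal((64, 1000))    # 64 units, 1000 examples
    labels = rng.integers(0, 2, size=1000)    # binary task for simplicity
    votes = (acts > 0).astype(int)            # each unit votes via its sign
    oracle = (votes == labels).astype(int)    # 1 where the unit is "correct"
    print(f"layer diversity (disagreement): {pairwise_disagreement(oracle):.3f}")

Under this reading, the paper's layer-wise comparisons would amount to computing such a statistic per layer and relating it to layer-wise probe accuracy; the exact measures and readouts used are detailed in the paper itself.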

Cite this Paper

BibTeX
@InProceedings{pmlr-v307-spoel26a,
  title     = {Investigating the relationship between diversity and generalization in deep neural networks},
  author    = {Van der Spoel, Ruan P. and Rabe, Randle},
  booktitle = {Proceedings of the 7th Northern Lights Deep Learning Conference (NLDL)},
  pages     = {375--387},
  year      = {2026},
  editor    = {Kim, Hyeongji and Ramírez Rivera, Adín and Ricaud, Benjamin},
  volume    = {307},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--08 Jan},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v307/main/assets/spoel26a/spoel26a.pdf},
  url       = {https://proceedings.mlr.press/v307/spoel26a.html},
  abstract  = {In ensembles, improved generalization is frequently attributed to \emph{diversity} among members of the ensemble. By viewing a single neural network as an \emph{implicit ensemble}, we perform an exploratory investigation that applies well-known ensemble diversity measures to a neural network in order to study the relationship between diversity and generalization in the over-parameterized regime. Our results show that i) deeper layers of the network generally have higher levels of diversity—particularly for MLPs—and ii) layer-wise accuracy positively correlates with diversity. Additionally, we study the effects of well-known regularizers such as Dropout, DropConnect, and batch size on diversity and generalization. We generally find that increasing the strength of the regularizer increases the diversity in the neural network, and this increase in diversity is positively correlated with model accuracy. We show that these results hold for several benchmark datasets (such as Fashion-MNIST and CIFAR-10) and architectures (MLPs and CNNs). Our findings suggest new avenues of research into the generalization ability of deep neural networks.}
}
Endnote
%0 Conference Paper
%T Investigating the relationship between diversity and generalization in deep neural networks
%A Ruan P. Van der Spoel
%A Randle Rabe
%B Proceedings of the 7th Northern Lights Deep Learning Conference (NLDL)
%C Proceedings of Machine Learning Research
%D 2026
%E Hyeongji Kim
%E Adín Ramírez Rivera
%E Benjamin Ricaud
%F pmlr-v307-spoel26a
%I PMLR
%P 375--387
%U https://proceedings.mlr.press/v307/spoel26a.html
%V 307
%X In ensembles, improved generalization is frequently attributed to \emph{diversity} among members of the ensemble. By viewing a single neural network as an \emph{implicit ensemble}, we perform an exploratory investigation that applies well-known ensemble diversity measures to a neural network in order to study the relationship between diversity and generalization in the over-parameterized regime. Our results show that i) deeper layers of the network generally have higher levels of diversity—particularly for MLPs—and ii) layer-wise accuracy positively correlates with diversity. Additionally, we study the effects of well-known regularizers such as Dropout, DropConnect, and batch size on diversity and generalization. We generally find that increasing the strength of the regularizer increases the diversity in the neural network, and this increase in diversity is positively correlated with model accuracy. We show that these results hold for several benchmark datasets (such as Fashion-MNIST and CIFAR-10) and architectures (MLPs and CNNs). Our findings suggest new avenues of research into the generalization ability of deep neural networks.
APA
Van der Spoel, R.P. & Rabe, R. (2026). Investigating the relationship between diversity and generalization in deep neural networks. Proceedings of the 7th Northern Lights Deep Learning Conference (NLDL), in Proceedings of Machine Learning Research 307:375-387. Available from https://proceedings.mlr.press/v307/spoel26a.html.
