Loss Surface Simplexes for Mode Connecting Volumes and Fast Ensembling

Gregory Benton, Wesley Maddox, Sanae Lotfi, Andrew Gordon Wilson
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:769-779, 2021.

Abstract

With a better understanding of the loss surfaces for multilayer networks, we can build more robust and accurate training procedures. Recently it was discovered that independently trained SGD solutions can be connected along one-dimensional paths of near-constant training loss. In this paper, we in fact demonstrate the existence of mode-connecting simplicial complexes that form multi-dimensional manifolds of low loss, connecting many independently trained models. Building on this discovery, we show how to efficiently construct simplicial complexes for fast ensembling, outperforming independently trained deep ensembles in accuracy, calibration, and robustness to dataset shift. Notably, our approach is easy to apply and only requires a few training epochs to discover a low-loss simplex.

Cite this Paper
BibTeX
@InProceedings{pmlr-v139-benton21a,
  title     = {Loss Surface Simplexes for Mode Connecting Volumes and Fast Ensembling},
  author    = {Benton, Gregory and Maddox, Wesley and Lotfi, Sanae and Wilson, Andrew Gordon},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {769--779},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/benton21a/benton21a.pdf},
  url       = {https://proceedings.mlr.press/v139/benton21a.html},
  abstract  = {With a better understanding of the loss surfaces for multilayer networks, we can build more robust and accurate training procedures. Recently it was discovered that independently trained SGD solutions can be connected along one-dimensional paths of near-constant training loss. In this paper, we in fact demonstrate the existence of mode-connecting simplicial complexes that form multi-dimensional manifolds of low loss, connecting many independently trained models. Building on this discovery, we show how to efficiently construct simplicial complexes for fast ensembling, outperforming independently trained deep ensembles in accuracy, calibration, and robustness to dataset shift. Notably, our approach is easy to apply and only requires a few training epochs to discover a low-loss simplex.}
}
Endnote
%0 Conference Paper
%T Loss Surface Simplexes for Mode Connecting Volumes and Fast Ensembling
%A Gregory Benton
%A Wesley Maddox
%A Sanae Lotfi
%A Andrew Gordon Wilson
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-benton21a
%I PMLR
%P 769--779
%U https://proceedings.mlr.press/v139/benton21a.html
%V 139
%X With a better understanding of the loss surfaces for multilayer networks, we can build more robust and accurate training procedures. Recently it was discovered that independently trained SGD solutions can be connected along one-dimensional paths of near-constant training loss. In this paper, we in fact demonstrate the existence of mode-connecting simplicial complexes that form multi-dimensional manifolds of low loss, connecting many independently trained models. Building on this discovery, we show how to efficiently construct simplicial complexes for fast ensembling, outperforming independently trained deep ensembles in accuracy, calibration, and robustness to dataset shift. Notably, our approach is easy to apply and only requires a few training epochs to discover a low-loss simplex.
APA
Benton, G., Maddox, W., Lotfi, S. & Wilson, A.G. (2021). Loss Surface Simplexes for Mode Connecting Volumes and Fast Ensembling. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:769-779. Available from https://proceedings.mlr.press/v139/benton21a.html.