Active Importance Sampling for Variational Objectives Dominated by Rare Events: Consequences for Optimization and Generalization

Grant M Rotskoff, Andrew R Mitchell, Eric Vanden-Eijnden
Proceedings of the 2nd Mathematical and Scientific Machine Learning Conference, PMLR 145:757-780, 2022.

Abstract

Deep neural networks, when optimized with sufficient data, provide accurate representations of high-dimensional functions; in contrast, function approximation techniques that have predominated in scientific computing do not scale well with dimensionality. As a result, many high-dimensional sampling and approximation problems once thought intractable are being revisited through the lens of machine learning. While the promise of unparalleled accuracy may suggest a renaissance for applications that require parameterizing representations of complex systems, in many applications gathering sufficient data to develop such a representation remains a significant challenge. Here we introduce an approach that combines rare events sampling techniques with neural network training to optimize objective functions that are dominated by rare events. We show that importance sampling reduces the asymptotic variance of the solution to a learning problem, suggesting benefits for generalization. We study our algorithm in the context of solving high-dimensional PDEs that admit a variational formulation, a problem with applications in statistical physics and implications in machine learning theory. Our numerical experiments demonstrate that we can successfully learn even with the compounding difficulties of high-dimension and rare data.
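To see why importance sampling reduces the variance of such estimates, consider the sketch below. It is a minimal NumPy illustration under toy assumptions (a one-dimensional Gaussian, an integrand g supported on a rare set, and a hand-picked shifted proposal), not the paper's active sampling scheme: reweighting samples drawn from a proposal concentrated on the rare event gives an unbiased estimator whose variance is orders of magnitude smaller than that of naive Monte Carlo.

import numpy as np

# Toy setting (illustrative assumption, not the paper's algorithm):
# estimate I = E_{x ~ N(0,1)}[ g(x) ] where g is supported on the rare
# set {x > a}. With a = 4, P(x > a) ~ 3e-5, so a naive Monte Carlo
# estimate of I is dominated by a handful of rare samples.

rng = np.random.default_rng(0)
a = 4.0

def g(x):
    # stand-in "loss" concentrated on the rare event x > a
    return (x > a) * np.exp(-0.5 * (x - a) ** 2)

def naive_estimate(n):
    # sample from the nominal density p = N(0, 1)
    x = rng.standard_normal(n)
    return g(x).mean()

def is_estimate(n):
    # sample from a proposal tilted onto the rare set, q = N(a, 1),
    # and reweight each sample by the likelihood ratio p(x) / q(x)
    x = rng.normal(loc=a, scale=1.0, size=n)
    log_w = -0.5 * x**2 + 0.5 * (x - a) ** 2  # log p(x) - log q(x)
    return (np.exp(log_w) * g(x)).mean()

n, reps = 10_000, 200
naive = np.array([naive_estimate(n) for _ in range(reps)])
tilted = np.array([is_estimate(n) for _ in range(reps)])
print(f"naive MC : mean={naive.mean():.2e}  std={naive.std():.1e}")
print(f"IS       : mean={tilted.mean():.2e}  std={tilted.std():.1e}")

Both estimators target the same expectation, but the naive one is near zero in most replicates and spikes when a rare sample lands past a, while the reweighted one concentrates tightly around the true value. In the paper's setting the analogous reweighting would enter the stochastic estimates of the variational objective during training, with the proposal adapted rather than fixed; the Gaussian shift here merely stands in for that.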

Cite this Paper


BibTeX
@InProceedings{pmlr-v145-rotskoff22a,
  title = {Active Importance Sampling for Variational Objectives Dominated by Rare Events: Consequences for Optimization and Generalization},
  author = {Rotskoff, Grant M and Mitchell, Andrew R and Vanden-Eijnden, Eric},
  booktitle = {Proceedings of the 2nd Mathematical and Scientific Machine Learning Conference},
  pages = {757--780},
  year = {2022},
  editor = {Bruna, Joan and Hesthaven, Jan and Zdeborova, Lenka},
  volume = {145},
  series = {Proceedings of Machine Learning Research},
  month = {16--19 Aug},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v145/rotskoff22a/rotskoff22a.pdf},
  url = {https://proceedings.mlr.press/v145/rotskoff22a.html},
  abstract = {Deep neural networks, when optimized with sufficient data, provide accurate representations of high-dimensional functions; in contrast, function approximation techniques that have predominated in scientific computing do not scale well with dimensionality. As a result, many high-dimensional sampling and approximation problems once thought intractable are being revisited through the lens of machine learning. While the promise of unparalleled accuracy may suggest a renaissance for applications that require parameterizing representations of complex systems, in many applications gathering sufficient data to develop such a representation remains a significant challenge. Here we introduce an approach that combines rare events sampling techniques with neural network training to optimize objective functions that are dominated by rare events. We show that importance sampling reduces the asymptotic variance of the solution to a learning problem, suggesting benefits for generalization. We study our algorithm in the context of solving high-dimensional PDEs that admit a variational formulation, a problem with applications in statistical physics and implications in machine learning theory. Our numerical experiments demonstrate that we can successfully learn even with the compounding difficulties of high-dimension and rare data.}
}
Endnote
%0 Conference Paper
%T Active Importance Sampling for Variational Objectives Dominated by Rare Events: Consequences for Optimization and Generalization
%A Grant M Rotskoff
%A Andrew R Mitchell
%A Eric Vanden-Eijnden
%B Proceedings of the 2nd Mathematical and Scientific Machine Learning Conference
%C Proceedings of Machine Learning Research
%D 2022
%E Joan Bruna
%E Jan Hesthaven
%E Lenka Zdeborova
%F pmlr-v145-rotskoff22a
%I PMLR
%P 757--780
%U https://proceedings.mlr.press/v145/rotskoff22a.html
%V 145
%X Deep neural networks, when optimized with sufficient data, provide accurate representations of high-dimensional functions; in contrast, function approximation techniques that have predominated in scientific computing do not scale well with dimensionality. As a result, many high-dimensional sampling and approximation problems once thought intractable are being revisited through the lens of machine learning. While the promise of unparalleled accuracy may suggest a renaissance for applications that require parameterizing representations of complex systems, in many applications gathering sufficient data to develop such a representation remains a significant challenge. Here we introduce an approach that combines rare events sampling techniques with neural network training to optimize objective functions that are dominated by rare events. We show that importance sampling reduces the asymptotic variance of the solution to a learning problem, suggesting benefits for generalization. We study our algorithm in the context of solving high-dimensional PDEs that admit a variational formulation, a problem with applications in statistical physics and implications in machine learning theory. Our numerical experiments demonstrate that we can successfully learn even with the compounding difficulties of high-dimension and rare data.
APA
Rotskoff, G.M., Mitchell, A.R. & Vanden-Eijnden, E. (2022). Active Importance Sampling for Variational Objectives Dominated by Rare Events: Consequences for Optimization and Generalization. Proceedings of the 2nd Mathematical and Scientific Machine Learning Conference, in Proceedings of Machine Learning Research 145:757-780. Available from https://proceedings.mlr.press/v145/rotskoff22a.html.