Constructive Universal High-Dimensional Distribution Generation through Deep ReLU Networks

Dmytro Perekrestenko, Stephan Müller, Helmut Bölcskei
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:7610-7619, 2020.

Abstract

We present an explicit deep neural network construction that transforms uniformly distributed one-dimensional noise into an arbitrarily close approximation of any two-dimensional Lipschitz-continuous target distribution. The key ingredient of our design is a generalization of the "space-filling" property of sawtooth functions discovered by Bailey & Telgarsky (2018). We elicit the importance of depth in our neural network construction for driving the Wasserstein distance between the target distribution and the approximation realized by the network to zero. An extension to output distributions of arbitrary dimension is outlined. Finally, we show that the proposed construction does not incur a cost, in terms of error measured in Wasserstein distance, relative to generating $d$-dimensional target distributions from $d$ independent random variables.
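
For intuition only (this sketch is not the paper's construction; the helper names are hypothetical), the sawtooth functions behind the "space-filling" property can be realized by composing a single two-neuron ReLU layer: g(x) = 2 ReLU(x) - 4 ReLU(x - 1/2) is the tent map on [0, 1], and its k-fold composition is a sawtooth with 2^(k-1) teeth, so depth buys exponentially many oscillations at linear parameter cost. The NumPy snippet below checks that pushing uniform noise through such a sawtooth again yields (approximately) uniform output, which is what lets the graph of the map sweep the unit square ever more densely as depth grows; it also reports a crude 1-Wasserstein distance to Uniform(0, 1).

import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def hat(x):
    # One two-neuron ReLU layer realizing the tent map on [0, 1].
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

def sawtooth(x, depth):
    # depth-fold composition: a ReLU network of depth `depth` whose graph
    # has 2**(depth - 1) teeth on [0, 1] (Telgarsky-style sawtooth).
    for _ in range(depth):
        x = hat(x)
    return x

if __name__ == "__main__":
    u = np.random.default_rng(0).uniform(0.0, 1.0, size=200_000)
    for k in (1, 3, 6):
        z = sawtooth(u, k)
        # The tent map preserves the uniform distribution, so the histogram of
        # the output stays flat while the curve (u, z) fills the unit square
        # more densely with depth -- the "space-filling" behaviour.
        hist, _ = np.histogram(z, bins=20, range=(0.0, 1.0), density=True)
        # Crude 1-Wasserstein distance to Uniform(0, 1) via quantile matching.
        w1 = np.abs(np.sort(z) - (np.arange(z.size) + 0.5) / z.size).mean()
        print(f"depth {k}: histogram density in [{hist.min():.2f}, {hist.max():.2f}] "
              f"(uniform = 1.00), W1 to Uniform(0,1) = {w1:.4f}")

This only illustrates the sawtooth ingredient; the paper's full construction goes further, transporting the one-dimensional noise onto a two-dimensional (and, by extension, d-dimensional) target distribution.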

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-perekrestenko20a,
  title     = {Constructive Universal High-Dimensional Distribution Generation through Deep {R}e{LU} Networks},
  author    = {Perekrestenko, Dmytro and M{\"u}ller, Stephan and B{\"o}lcskei, Helmut},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {7610--7619},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/perekrestenko20a/perekrestenko20a.pdf},
  url       = {https://proceedings.mlr.press/v119/perekrestenko20a.html}
}
Endnote
%0 Conference Paper
%T Constructive Universal High-Dimensional Distribution Generation through Deep ReLU Networks
%A Dmytro Perekrestenko
%A Stephan Müller
%A Helmut Bölcskei
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-perekrestenko20a
%I PMLR
%P 7610--7619
%U https://proceedings.mlr.press/v119/perekrestenko20a.html
%V 119
APA
Perekrestenko, D., Müller, S. & Bölcskei, H. (2020). Constructive Universal High-Dimensional Distribution Generation through Deep ReLU Networks. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:7610-7619. Available from https://proceedings.mlr.press/v119/perekrestenko20a.html.