Provable Lipschitz Certification for Generative Models

Matt Jordan, Alex Dimakis
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:5118-5126, 2021.

Abstract

We present a scalable technique for upper bounding the Lipschitz constant of generative models. We relate this quantity to the maximal norm over the set of attainable vector-Jacobian products of a given generative model, and we approximate that set layerwise by convex over-approximations using zonotopes. Our approach generalizes and improves upon prior work on zonotope transformers, and we extend it to Lipschitz estimation of neural networks with large output dimension. The method gives efficient and tight bounds on small networks and scales to generative models built on VAE and DCGAN architectures.
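To make the central quantity concrete: for a generator G, the Lipschitz constant over a region is the supremum of the Jacobian operator norm, which by duality equals the maximal norm over attainable vector-Jacobian products v^T J_G(z), with v ranging over a dual unit ball. The snippet below is a minimal sketch of the zonotope idea, not the paper's implementation: it assumes a toy two-layer ReLU generator, relaxes the attainable activation patterns d in {0,1}^h to the box [0,1]^h, propagates the resulting set layerwise as a zonotope, and reads off a certified upper bound via the triangle inequality. All dimensions and the choice of norms (an l_inf ball over v, the l2 norm of the VJP) are illustrative assumptions.

# A minimal sketch of the layerwise zonotope relaxation described above;
# toy two-layer ReLU generator, l_inf ball over v, l2 norm of the VJP.
# Biases are omitted because they do not enter the Jacobian.
import numpy as np

rng = np.random.default_rng(0)
k, h, d = 4, 16, 32                       # latent, hidden, output dims (toy)
W1 = rng.normal(size=(h, k)) / np.sqrt(k)
W2 = rng.normal(size=(d, h)) / np.sqrt(h)

# For G(z) = W2 relu(W1 z), every VJP has the form v^T W2 D W1 with
# D = diag(dd), dd in {0,1}^h; relaxing dd to [0,1]^h gives a convex superset
# pushed through the layers as a zonotope {c + Gm @ eps : ||eps||_inf <= 1}.
c, Gm = np.zeros(d), np.eye(d)            # unit l_inf ball of output vectors v

# Exact image under the linear map v -> W2^T v.
c, Gm = W2.T @ c, W2.T @ Gm

# Sound transformer for the coordinatewise product with dd_i in [0, 1]:
# dd_i * x_i = 0.5 x_i + (dd_i - 0.5) x_i, and |(dd_i - 0.5) x_i| <= 0.5 r_i
# with r_i = |c_i| + sum_j |Gm_ij| >= max |x_i|, absorbed as a fresh generator.
r = np.abs(c) + np.abs(Gm).sum(axis=1)
c, Gm = 0.5 * c, np.hstack([0.5 * Gm, np.diag(0.5 * r)])

# Exact image under x -> W1^T x.
c, Gm = W1.T @ c, W1.T @ Gm

# Every attainable VJP with ||v||_inf <= 1 lies in the final zonotope, so the
# triangle inequality caps its maximal l2 norm, and hence the Lipschitz bound.
lip_upper = np.linalg.norm(c) + np.linalg.norm(Gm, axis=0).sum()
print("certified Lipschitz upper bound:", lip_upper)

Because the box relaxation contains every attainable activation pattern, the printed value is a sound global upper bound for this toy network; the paper's contribution lies in making such layerwise zonotope propagation tight and scalable, including for networks with large output dimension.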

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-jordan21a,
  title     = {Provable Lipschitz Certification for Generative Models},
  author    = {Jordan, Matt and Dimakis, Alex},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {5118--5126},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/jordan21a/jordan21a.pdf},
  url       = {https://proceedings.mlr.press/v139/jordan21a.html}
}
APA
Jordan, M. & Dimakis, A. (2021). Provable Lipschitz Certification for Generative Models. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:5118-5126. Available from https://proceedings.mlr.press/v139/jordan21a.html.
