Adversarially Robust Kernel Smoothing

Jia-Jie Zhu, Christina Kouridi, Yassine Nemmour, Bernhard Schölkopf
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:4972-4994, 2022.

Abstract

We propose a scalable robust learning algorithm combining kernel smoothing and robust optimization. Our method is motivated by the convex analysis perspective of distributionally robust optimization based on probability metrics, such as the Wasserstein distance and the maximum mean discrepancy. We adapt the integral operator using supremal convolution in convex analysis to form a novel function majorant used for enforcing robustness. Our method is simple in form and applies to general loss functions and machine learning models. Exploiting a connection with optimal transport, we prove theoretical guarantees for certified robustness under distribution shift. Furthermore, we report experiments with general machine learning models, such as deep neural networks, to demonstrate competitive performance with the state-of-the-art certifiable robust learning algorithms based on the Wasserstein distance.
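To illustrate the core idea described above, here is a hedged, simplified sketch of a kernel-smoothed robust loss: the loss at a data point is replaced by a kernel-weighted supremum over perturbed inputs, which majorizes the original loss whenever the kernel equals one on the diagonal. This is an illustrative approximation of the general technique, not the authors' exact algorithm; the function names, the toy squared loss, and the Monte Carlo sampling scheme are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(x, z, sigma):
    """Gaussian kernel k(x, z) = exp(-||x - z||^2 / (2 sigma^2)); k(x, x) = 1."""
    return np.exp(-np.sum((x - z) ** 2, axis=-1) / (2.0 * sigma ** 2))

def squared_loss(theta, x, y):
    """Toy linear-model loss; stands in for a general loss function."""
    return (x @ theta - y) ** 2

def smoothed_robust_loss(theta, x, y, sigma=0.5, n_samples=64):
    """Approximate a kernel-smoothed majorant of the loss at x:
    sup_z loss(z) * k(x, z), estimated over sampled perturbations of x.
    Including z = x among the candidates guarantees the estimate is at
    least the nominal loss, since k(x, x) = 1."""
    z = np.vstack([x, x + sigma * rng.standard_normal((n_samples, x.shape[0]))])
    losses = squared_loss(theta, z, y)   # loss at each candidate input
    weights = rbf_kernel(x, z, sigma)    # kernel weight of each candidate
    return np.max(weights * losses)      # sampled sup-convolution-style majorant

theta = np.array([1.0, -0.5])
x, y = np.array([0.3, 0.7]), 0.1
print(smoothed_robust_loss(theta, x, y))
```

Minimizing this smoothed majorant over `theta` (e.g. with gradient descent over a dataset) then enforces robustness to local perturbations, which is the role the function majorant plays in the method described above.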

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-zhu22d,
  title     = {Adversarially Robust Kernel Smoothing},
  author    = {Zhu, Jia-Jie and Kouridi, Christina and Nemmour, Yassine and Sch\"olkopf, Bernhard},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {4972--4994},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/zhu22d/zhu22d.pdf},
  url       = {https://proceedings.mlr.press/v151/zhu22d.html},
  abstract  = {We propose a scalable robust learning algorithm combining kernel smoothing and robust optimization. Our method is motivated by the convex analysis perspective of distributionally robust optimization based on probability metrics, such as the Wasserstein distance and the maximum mean discrepancy. We adapt the integral operator using supremal convolution in convex analysis to form a novel function majorant used for enforcing robustness. Our method is simple in form and applies to general loss functions and machine learning models. Exploiting a connection with optimal transport, we prove theoretical guarantees for certified robustness under distribution shift. Furthermore, we report experiments with general machine learning models, such as deep neural networks, to demonstrate competitive performance with the state-of-the-art certifiable robust learning algorithms based on the Wasserstein distance.}
}
Endnote
%0 Conference Paper
%T Adversarially Robust Kernel Smoothing
%A Jia-Jie Zhu
%A Christina Kouridi
%A Yassine Nemmour
%A Bernhard Schölkopf
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-zhu22d
%I PMLR
%P 4972--4994
%U https://proceedings.mlr.press/v151/zhu22d.html
%V 151
%X We propose a scalable robust learning algorithm combining kernel smoothing and robust optimization. Our method is motivated by the convex analysis perspective of distributionally robust optimization based on probability metrics, such as the Wasserstein distance and the maximum mean discrepancy. We adapt the integral operator using supremal convolution in convex analysis to form a novel function majorant used for enforcing robustness. Our method is simple in form and applies to general loss functions and machine learning models. Exploiting a connection with optimal transport, we prove theoretical guarantees for certified robustness under distribution shift. Furthermore, we report experiments with general machine learning models, such as deep neural networks, to demonstrate competitive performance with the state-of-the-art certifiable robust learning algorithms based on the Wasserstein distance.
APA
Zhu, J., Kouridi, C., Nemmour, Y. & Schölkopf, B. (2022). Adversarially Robust Kernel Smoothing. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:4972-4994. Available from https://proceedings.mlr.press/v151/zhu22d.html.