Min-Max Optimization without Gradients: Convergence and Applications to Black-Box Evasion and Poisoning Attacks

Sijia Liu, Songtao Lu, Xiangyi Chen, Yao Feng, Kaidi Xu, Abdullah Al-Dujaili, Mingyi Hong, Una-May O’Reilly
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:6282-6293, 2020.

Abstract

In this paper, we study the problem of constrained min-max optimization in a black-box setting, where the desired optimizer cannot access the gradients of the objective function but may query its values. We present a principled optimization framework, integrating a zeroth-order (ZO) gradient estimator with an alternating projected stochastic gradient descent-ascent method, where the former only requires a small number of function queries and the latter needs just a one-step descent/ascent update. We show that the proposed framework, referred to as ZO-Min-Max, has a sublinear convergence rate under mild conditions and scales gracefully with problem size. We also explore a promising connection between black-box min-max optimization and black-box evasion and poisoning attacks in adversarial machine learning (ML). Our empirical evaluations on these use cases demonstrate the effectiveness of our approach and its scalability to dimensions that prohibit using recent black-box solvers.
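The two ingredients named in the abstract can be sketched together in a few lines. The following is a minimal illustrative sketch, not the paper's implementation: it assumes a forward-difference random-direction ZO gradient estimator and Euclidean-ball constraint sets, and the toy objective, step sizes, and query budget are all chosen here for illustration only.

```python
import numpy as np

np.random.seed(0)  # for a reproducible run of this sketch

def zo_grad(f, z, q=10, mu=1e-3):
    """Forward-difference ZO gradient estimate of f at z using q random
    unit directions; each direction costs one extra function query."""
    d = z.size
    g = np.zeros(d)
    for _ in range(q):
        u = np.random.randn(d)
        u /= np.linalg.norm(u)
        g += (f(z + mu * u) - f(z)) / mu * u
    return (d / q) * g

def project_ball(z, radius=1.0):
    """Euclidean projection onto an l2 ball (stand-in constraint set)."""
    n = np.linalg.norm(z)
    return z if n <= radius else z * (radius / n)

def zo_min_max(f, x0, y0, steps=200, lr_x=0.05, lr_y=0.05):
    """Alternating projected ZO updates: one descent step on x,
    then one ascent step on y, per iteration."""
    x, y = x0.copy(), y0.copy()
    for _ in range(steps):
        gx = zo_grad(lambda x_: f(x_, y), x)
        x = project_ball(x - lr_x * gx)   # one-step descent on x
        gy = zo_grad(lambda y_: f(x, y_), y)
        y = project_ball(y + lr_y * gy)   # one-step ascent on y
    return x, y

# Toy saddle problem with solution at the origin:
# min_x max_y  x.y + 0.1||x||^2 - 0.1||y||^2
f = lambda x, y: float(x @ y + 0.1 * x @ x - 0.1 * y @ y)
x_star, y_star = zo_min_max(f, np.ones(3), np.ones(3))
```

On this strongly-convex-strongly-concave toy problem, both iterates drift toward the saddle point at the origin using only function queries, mirroring the query-efficient, gradient-free setting the paper targets.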

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-liu20j,
  title     = {Min-Max Optimization without Gradients: Convergence and Applications to Black-Box Evasion and Poisoning Attacks},
  author    = {Liu, Sijia and Lu, Songtao and Chen, Xiangyi and Feng, Yao and Xu, Kaidi and Al-Dujaili, Abdullah and Hong, Mingyi and O'Reilly, Una-May},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {6282--6293},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/liu20j/liu20j.pdf},
  url       = {https://proceedings.mlr.press/v119/liu20j.html},
  abstract  = {In this paper, we study the problem of constrained min-max optimization in a black-box setting, where the desired optimizer cannot access the gradients of the objective function but may query its values. We present a principled optimization framework, integrating a zeroth-order (ZO) gradient estimator with an alternating projected stochastic gradient descent-ascent method, where the former only requires a small number of function queries and the latter needs just a one-step descent/ascent update. We show that the proposed framework, referred to as ZO-Min-Max, has a sublinear convergence rate under mild conditions and scales gracefully with problem size. We also explore a promising connection between black-box min-max optimization and black-box evasion and poisoning attacks in adversarial machine learning (ML). Our empirical evaluations on these use cases demonstrate the effectiveness of our approach and its scalability to dimensions that prohibit using recent black-box solvers.}
}
Endnote
%0 Conference Paper
%T Min-Max Optimization without Gradients: Convergence and Applications to Black-Box Evasion and Poisoning Attacks
%A Sijia Liu
%A Songtao Lu
%A Xiangyi Chen
%A Yao Feng
%A Kaidi Xu
%A Abdullah Al-Dujaili
%A Mingyi Hong
%A Una-May O’Reilly
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-liu20j
%I PMLR
%P 6282--6293
%U https://proceedings.mlr.press/v119/liu20j.html
%V 119
%X In this paper, we study the problem of constrained min-max optimization in a black-box setting, where the desired optimizer cannot access the gradients of the objective function but may query its values. We present a principled optimization framework, integrating a zeroth-order (ZO) gradient estimator with an alternating projected stochastic gradient descent-ascent method, where the former only requires a small number of function queries and the latter needs just a one-step descent/ascent update. We show that the proposed framework, referred to as ZO-Min-Max, has a sublinear convergence rate under mild conditions and scales gracefully with problem size. We also explore a promising connection between black-box min-max optimization and black-box evasion and poisoning attacks in adversarial machine learning (ML). Our empirical evaluations on these use cases demonstrate the effectiveness of our approach and its scalability to dimensions that prohibit using recent black-box solvers.
APA
Liu, S., Lu, S., Chen, X., Feng, Y., Xu, K., Al-Dujaili, A., Hong, M. & O’Reilly, U. (2020). Min-Max Optimization without Gradients: Convergence and Applications to Black-Box Evasion and Poisoning Attacks. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:6282-6293. Available from https://proceedings.mlr.press/v119/liu20j.html.