Black-Box Methods for Restoring Monotonicity

Evangelia Gergatsouli, Brendan Lucier, Christos Tzamos
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:3463-3473, 2020.

Abstract

In many practical applications, heuristic or approximation algorithms are used to efficiently solve the task at hand. However, their solutions frequently do not satisfy natural monotonicity properties expected to hold in the optimum. In this work we develop algorithms that restore monotonicity in the parameters of interest. Specifically, given oracle access to a possibly non-monotone function, we provide an algorithm that restores monotonicity while degrading the expected value of the function by at most $\epsilon$. The number of queries required is at most logarithmic in $1/\epsilon$ and exponential in the number of parameters. We also give a lower bound showing that this exponential dependence is necessary. Finally, we obtain improved query complexity bounds for restoring the weaker property of $k$-marginal monotonicity. Under this property, every $k$-dimensional projection of the function is required to be monotone. The query complexity we obtain only scales exponentially with $k$ and is polynomial in the number of parameters.
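As a toy illustration of what "restoring monotonicity" means (this is not the paper's algorithm, which works with oracle access in many dimensions): on a 1-D grid, replacing a function by its monotone lower envelope $g(x) = \min_{y \ge x} f(y)$ yields a nondecreasing function that never exceeds $f$, so any degradation is a pointwise lowering of values.

```python
# Toy 1-D example (illustrative only; not the algorithm from the paper):
# given the values of a possibly non-monotone function f on a finite grid,
# the monotone lower envelope g(x) = min_{y >= x} f(y) is nondecreasing
# and pointwise at most f, so it "restores monotonicity" by lowering
# some values.

def monotone_lower_envelope(values):
    """Return g with g[i] = min(values[i:]), via one backward pass."""
    g = list(values)
    for i in range(len(g) - 2, -1, -1):
        g[i] = min(g[i], g[i + 1])
    return g

f_vals = [1.0, 3.0, 2.0, 5.0, 4.0]   # non-monotone input
g_vals = monotone_lower_envelope(f_vals)
print(g_vals)  # [1.0, 2.0, 2.0, 4.0, 4.0] -- nondecreasing, never above f
```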

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-gergatsouli20a,
  title     = {Black-Box Methods for Restoring Monotonicity},
  author    = {Gergatsouli, Evangelia and Lucier, Brendan and Tzamos, Christos},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {3463--3473},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/gergatsouli20a/gergatsouli20a.pdf},
  url       = {https://proceedings.mlr.press/v119/gergatsouli20a.html},
  abstract  = {In many practical applications, heuristic or approximation algorithms are used to efficiently solve the task at hand. However their solutions frequently do not satisfy natural monotonicity properties expected to hold in the optimum. In this work we develop algorithms that are able to restore monotonicity in the parameters of interest. Specifically, given oracle access to a possibly non monotone function, we provide an algorithm that restores monotonicity while degrading the expected value of the function by at most $\epsilon$. The number of queries required is at most logarithmic in $1/\epsilon$ and exponential in the number of parameters. We also give a lower bound showing that this exponential dependence is necessary. Finally, we obtain improved query complexity bounds for restoring the weaker property of $k$-marginal monotonicity. Under this property, every $k$-dimensional projection of the function is required to be monotone. The query complexity we obtain only scales exponentially with $k$ and is polynomial in the number of parameters.}
}
Endnote
%0 Conference Paper
%T Black-Box Methods for Restoring Monotonicity
%A Evangelia Gergatsouli
%A Brendan Lucier
%A Christos Tzamos
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-gergatsouli20a
%I PMLR
%P 3463--3473
%U https://proceedings.mlr.press/v119/gergatsouli20a.html
%V 119
%X In many practical applications, heuristic or approximation algorithms are used to efficiently solve the task at hand. However their solutions frequently do not satisfy natural monotonicity properties expected to hold in the optimum. In this work we develop algorithms that are able to restore monotonicity in the parameters of interest. Specifically, given oracle access to a possibly non monotone function, we provide an algorithm that restores monotonicity while degrading the expected value of the function by at most $\epsilon$. The number of queries required is at most logarithmic in $1/\epsilon$ and exponential in the number of parameters. We also give a lower bound showing that this exponential dependence is necessary. Finally, we obtain improved query complexity bounds for restoring the weaker property of $k$-marginal monotonicity. Under this property, every $k$-dimensional projection of the function is required to be monotone. The query complexity we obtain only scales exponentially with $k$ and is polynomial in the number of parameters.
APA
Gergatsouli, E., Lucier, B. & Tzamos, C. (2020). Black-Box Methods for Restoring Monotonicity. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:3463-3473. Available from https://proceedings.mlr.press/v119/gergatsouli20a.html.