Two-Sided Weak Submodularity for Matroid Constrained Optimization and Regression

Theophile Thiery, Justin Ward
Proceedings of Thirty Fifth Conference on Learning Theory, PMLR 178:3605-3634, 2022.

Abstract

We study the following problem: Given a variable of interest, we would like to find a best linear predictor for it by choosing a subset of k relevant variables obeying a matroid constraint. This problem is a natural generalization of subset-selection problems in which observations must be spread among multiple different classes. We derive new, strengthened guarantees for this problem by improving the analysis of the residual random greedy algorithm and by developing a novel distorted local-search algorithm. To quantify our approximation guarantees, we refine the definition of weak submodularity due to Das and Kempe (2011) and introduce the notion of an upper submodularity ratio, which we connect to the minimum k-sparse eigenvalue of the covariance matrix. More generally, we consider the problem of maximizing a set function $f$ with lower and upper submodularity ratios $\gamma$ and $\beta$ under a matroid constraint. For this problem, our algorithms achieve approximation guarantees that approach 1/2 and (1 - 1/e), respectively, as the function becomes closer to submodular. As a second application, we show that the Bayesian A-optimal design objective falls within our framework, yielding new guarantees for that problem as well.
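To make the setup concrete: Das and Kempe's submodularity ratio is, roughly, the largest $\gamma$ such that $\sum_{x \in T \setminus S} \big(f(S \cup \{x\}) - f(S)\big) \ge \gamma \, \big(f(S \cup T) - f(S)\big)$ over the relevant sets $S, T$, and the upper ratio $\beta$ introduced here bounds the same quantity from the other side. Below is a minimal Python sketch (not the paper's pseudocode) of the residual random greedy algorithm specialized to a partition matroid, applied to the classical subset-selection objective: the R^2 of the best linear predictor of the target. It assumes standardized data, so C is the covariance matrix of the candidate variables and b the vector of their covariances with the target; the names r2 and residual_random_greedy are illustrative.

import numpy as np

def r2(C, b, S):
    # Squared multiple correlation of the target with the variables in S,
    # assuming standardized data: C is the covariance matrix of the
    # candidate variables and b their covariances with the target.
    if not S:
        return 0.0
    idx = list(S)
    return float(b[idx] @ np.linalg.solve(C[np.ix_(idx, idx)], b[idx]))

def residual_random_greedy(C, b, parts, caps, k, seed=0):
    # parts[x]: class of variable x; caps[j]: capacity of class j,
    # together encoding a partition matroid; k: total budget.
    rng = np.random.default_rng(seed)
    S, used = set(), {j: 0 for j in caps}
    for _ in range(k):
        base = r2(C, b, S)
        gains = {x: r2(C, b, S | {x}) - base
                 for x in range(len(b))
                 if x not in S and used[parts[x]] < caps[parts[x]]}
        if not gains:
            break
        # Max-weight basis of the contracted matroid: from each class with
        # remaining capacity c, keep its c largest marginal gains.
        M = []
        for j in caps:
            ranked = sorted((x for x in gains if parts[x] == j),
                            key=gains.get, reverse=True)
            M.extend(ranked[:caps[j] - used[j]])
        x = M[rng.integers(len(M))]  # add a uniformly random basis element
        S.add(x)
        used[parts[x]] += 1
    return S

For instance, residual_random_greedy(C, b, parts=[0, 0, 1, 1], caps={0: 1, 1: 1}, k=2) forces one variable from each of two classes. Note that the paper's contribution for this algorithm is a sharpened analysis in terms of $\gamma$ and $\beta$, not a change to the procedure itself.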

Cite this Paper


BibTeX
@InProceedings{pmlr-v178-thiery22a,
  title     = {Two-Sided Weak Submodularity for Matroid Constrained Optimization and Regression},
  author    = {Thiery, Theophile and Ward, Justin},
  booktitle = {Proceedings of Thirty Fifth Conference on Learning Theory},
  pages     = {3605--3634},
  year      = {2022},
  editor    = {Loh, Po-Ling and Raginsky, Maxim},
  volume    = {178},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--05 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v178/thiery22a/thiery22a.pdf},
  url       = {https://proceedings.mlr.press/v178/thiery22a.html},
}
APA
Thiery, T. & Ward, J. (2022). Two-Sided Weak Submodularity for Matroid Constrained Optimization and Regression. Proceedings of Thirty Fifth Conference on Learning Theory, in Proceedings of Machine Learning Research 178:3605-3634. Available from https://proceedings.mlr.press/v178/thiery22a.html.
