Difference of submodular minimization via DC programming

Marwa El Halabi, George Orfanides, Tim Hoheisel
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:9172-9201, 2023.

Abstract

Minimizing the difference of two submodular (DS) functions is a problem that naturally occurs in various machine learning applications. Although it is well known that a DS problem can be equivalently formulated as the minimization of the difference of two convex (DC) functions, existing algorithms do not fully exploit this connection. A classical algorithm for DC problems is called the DC algorithm (DCA). We introduce variants of DCA and its complete form (CDCA) that we apply to the DC program corresponding to DS minimization. We extend existing convergence properties of DCA, and connect them to convergence properties on the DS problem. Our results on DCA match the theoretical guarantees satisfied by existing DS algorithms, while providing a more complete characterization of convergence properties. In the case of CDCA, we obtain a stronger local minimality guarantee. Our numerical results show that our proposed algorithms outperform existing baselines on two applications: speech corpus selection and feature selection.
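As a concrete illustration of the DC connection the abstract refers to, here is a minimal sketch in standard notation (the symbols F, G, f_L, g_L and the generic iteration below are illustrative choices, not necessarily the exact formulation used in the paper). For submodular functions F, G on a ground set V with |V| = n, the Lovász extension is linear in the set function and convex exactly when the set function is submodular, so the DS problem becomes a DC program over the hypercube:

\[
\min_{S \subseteq V} \, F(S) - G(S) \;=\; \min_{x \in [0,1]^n} \, f_L(x) - g_L(x),
\]

where f_L and g_L are the Lovász extensions of F and G. The two minima agree because the Lovász extension of any set function interpolates it at the vertices of [0,1]^n and its hypercube minimum is attained at a vertex. A generic DCA step then linearizes the subtracted convex term at the current iterate and solves the resulting convex problem:

\[
y^k \in \partial g_L(x^k), \qquad x^{k+1} \in \operatorname*{arg\,min}_{x \in [0,1]^n} \, f_L(x) - \langle y^k, x \rangle,
\]

where a subgradient y^k is obtained by sorting the coordinates of x^k and applying Edmonds' greedy procedure for Lovász extensions.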

Cite this Paper

BibTeX
@InProceedings{pmlr-v202-el-halabi23b,
  title     = {Difference of submodular minimization via {DC} programming},
  author    = {El Halabi, Marwa and Orfanides, George and Hoheisel, Tim},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {9172--9201},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/el-halabi23b/el-halabi23b.pdf},
  url       = {https://proceedings.mlr.press/v202/el-halabi23b.html},
  abstract  = {Minimizing the difference of two submodular (DS) functions is a problem that naturally occurs in various machine learning applications. Although it is well known that a DS problem can be equivalently formulated as the minimization of the difference of two convex (DC) functions, existing algorithms do not fully exploit this connection. A classical algorithm for DC problems is called the DC algorithm (DCA). We introduce variants of DCA and its complete form (CDCA) that we apply to the DC program corresponding to DS minimization. We extend existing convergence properties of DCA, and connect them to convergence properties on the DS problem. Our results on DCA match the theoretical guarantees satisfied by existing DS algorithms, while providing a more complete characterization of convergence properties. In the case of CDCA, we obtain a stronger local minimality guarantee. Our numerical results show that our proposed algorithms outperform existing baselines on two applications: speech corpus selection and feature selection.}
}
Endnote
%0 Conference Paper
%T Difference of submodular minimization via DC programming
%A Marwa El Halabi
%A George Orfanides
%A Tim Hoheisel
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-el-halabi23b
%I PMLR
%P 9172--9201
%U https://proceedings.mlr.press/v202/el-halabi23b.html
%V 202
%X Minimizing the difference of two submodular (DS) functions is a problem that naturally occurs in various machine learning applications. Although it is well known that a DS problem can be equivalently formulated as the minimization of the difference of two convex (DC) functions, existing algorithms do not fully exploit this connection. A classical algorithm for DC problems is called the DC algorithm (DCA). We introduce variants of DCA and its complete form (CDCA) that we apply to the DC program corresponding to DS minimization. We extend existing convergence properties of DCA, and connect them to convergence properties on the DS problem. Our results on DCA match the theoretical guarantees satisfied by existing DS algorithms, while providing a more complete characterization of convergence properties. In the case of CDCA, we obtain a stronger local minimality guarantee. Our numerical results show that our proposed algorithms outperform existing baselines on two applications: speech corpus selection and feature selection.
APA
El Halabi, M., Orfanides, G. & Hoheisel, T. (2023). Difference of submodular minimization via DC programming. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:9172-9201. Available from https://proceedings.mlr.press/v202/el-halabi23b.html.

Related Material

Download PDF: https://proceedings.mlr.press/v202/el-halabi23b/el-halabi23b.pdf