Fast Semidifferential-based Submodular Function Optimization

Rishabh Iyer, Stefanie Jegelka, Jeff Bilmes
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(3):855-863, 2013.

Abstract

We present a practical and powerful new framework for both unconstrained and constrained submodular function optimization based on discrete semidifferentials (sub- and super-differentials). The resulting algorithms, which repeatedly compute and then efficiently optimize submodular semigradients, offer new methods and generalize many existing ones for submodular optimization. Our approach, moreover, takes steps towards providing a unifying paradigm applicable to both submodular minimization and maximization, problems that historically have been treated quite distinctly. The practicality of our algorithms is important since interest in submodularity, owing to its natural and wide applicability, has recently been in ascendance within machine learning. We analyze theoretical properties of our algorithms for minimization and maximization, and show that many state-of-the-art maximization algorithms are special cases. Lastly, we complement our theoretical analyses with supporting empirical experiments.
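
To make the compute-then-optimize loop in the abstract concrete, below is a minimal Python sketch of the subgradient-based maximization idea. The permutation-based modular lower bound is the standard construction for discrete subgradients of a submodular function; the coverage objective, the cardinality-constrained variant, and all names here (subgradient, mmax_cardinality) are illustrative choices, not the authors' code.

import random

def subgradient(f, X, V):
    # A vertex of the subdifferential of f at X: take a permutation of V
    # in which the elements of X come first; the chain of marginal gains
    # along it defines a modular function h with
    #   f(Y) >= f(X) + h(Y) - h(X)  for all Y,  with equality at Y = X.
    prefix, rest = list(X), [v for v in V if v not in X]
    random.shuffle(prefix)
    random.shuffle(rest)
    h, chain, prev = {}, [], f([])
    for v in prefix + rest:
        chain.append(v)
        cur = f(chain)
        h[v] = cur - prev  # marginal gain of v given the chain so far
        prev = cur
    return h

def mmax_cardinality(f, V, k, max_iter=20):
    # Repeatedly maximize the modular lower bound under |X| <= k by keeping
    # the k largest h-values; since the bound is tight at the current X,
    # f(X) can never decrease from one iteration to the next.
    X = set()
    for _ in range(max_iter):
        h = subgradient(f, X, V)
        X_new = set(sorted(V, key=lambda v: h[v], reverse=True)[:k])
        if X_new == X:  # fixed point: the bound cannot improve X further
            break
        X = X_new
    return X

# Toy monotone submodular objective: coverage of ground elements.
sets = {0: {1, 2, 3}, 1: {3, 4}, 2: {5}, 3: {1, 5, 6}, 4: {2}}
f = lambda S: len(set().union(*(sets[i] for i in S)))
print(mmax_cardinality(f, list(sets), k=2))  # e.g. {0, 3}, covering 5 of 6 elements

The same outer loop handles minimization by instead swapping in a supergradient, a modular upper bound tight at X, and minimizing it; this is what lets one framework address both problem families.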

Cite this Paper

BibTeX
@InProceedings{pmlr-v28-iyer13,
  title     = {Fast Semidifferential-based Submodular Function Optimization},
  author    = {Iyer, Rishabh and Jegelka, Stefanie and Bilmes, Jeff},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {855--863},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {3},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/iyer13.pdf},
  url       = {https://proceedings.mlr.press/v28/iyer13.html}
}
Endnote
%0 Conference Paper
%T Fast Semidifferential-based Submodular Function Optimization
%A Rishabh Iyer
%A Stefanie Jegelka
%A Jeff Bilmes
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-iyer13
%I PMLR
%P 855--863
%U https://proceedings.mlr.press/v28/iyer13.html
%V 28
%N 3
RIS
TY - CPAPER
TI - Fast Semidifferential-based Submodular Function Optimization
AU - Rishabh Iyer
AU - Stefanie Jegelka
AU - Jeff Bilmes
BT - Proceedings of the 30th International Conference on Machine Learning
DA - 2013/05/26
ED - Sanjoy Dasgupta
ED - David McAllester
ID - pmlr-v28-iyer13
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 28
IS - 3
SP - 855
EP - 863
L1 - http://proceedings.mlr.press/v28/iyer13.pdf
UR - https://proceedings.mlr.press/v28/iyer13.html
ER -
APA
Iyer, R., Jegelka, S. & Bilmes, J. (2013). Fast Semidifferential-based Submodular Function Optimization. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(3):855-863. Available from https://proceedings.mlr.press/v28/iyer13.html.

Related Material

Download PDF: http://proceedings.mlr.press/v28/iyer13.pdf