On Bisubmodular Maximization

Ajit Singh, Andrew Guillory, Jeff Bilmes
Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:1055-1063, 2012.

Abstract

Bisubmodularity extends the concept of submodularity to set functions with two arguments. We show how bisubmodular maximization leads to richer value-of-information problems, using examples in sensor placement and feature selection. We present the first constant-factor approximation algorithm for a wide class of bisubmodular maximizations.
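
For context, the standard notion of bisubmodularity in the combinatorial optimization literature (the paper may use an equivalent or more general form) considers functions f defined on pairs of disjoint subsets (X, Y) of a ground set V. Such an f is bisubmodular if, for all pairs (X_1, Y_1) and (X_2, Y_2),

    f(X_1, Y_1) + f(X_2, Y_2) \;\ge\; f\bigl((X_1, Y_1) \sqcap (X_2, Y_2)\bigr) + f\bigl((X_1, Y_1) \sqcup (X_2, Y_2)\bigr),

where the meet and join are

    (X_1, Y_1) \sqcap (X_2, Y_2) = (X_1 \cap X_2,\; Y_1 \cap Y_2),
    (X_1, Y_1) \sqcup (X_2, Y_2) = \bigl((X_1 \cup X_2) \setminus (Y_1 \cup Y_2),\; (Y_1 \cup Y_2) \setminus (X_1 \cup X_2)\bigr).

As a rough illustration of the maximization problem only (a minimal sketch, not the constant-factor approximation algorithm presented in the paper), a naive "signed greedy" heuristic repeatedly assigns the element and side with the largest marginal gain. The names signed_greedy, f, V, and k below are illustrative assumptions, with f a user-supplied value oracle:

    def signed_greedy(f, V, k):
        """Naive greedy heuristic for maximizing a bisubmodular f(X, Y).

        f: oracle taking two disjoint frozensets and returning a float.
        V: iterable ground set.
        k: maximum number of elements to assign.
        """
        X, Y = set(), set()
        for _ in range(k):
            current = f(frozenset(X), frozenset(Y))
            best_gain, best_move = 0.0, None
            for v in V:
                if v in X or v in Y:
                    continue
                # Marginal gain of assigning v to the first (X) side ...
                gain_x = f(frozenset(X | {v}), frozenset(Y)) - current
                # ... and to the second (Y) side.
                gain_y = f(frozenset(X), frozenset(Y | {v})) - current
                if gain_x > best_gain:
                    best_gain, best_move = gain_x, (v, 'X')
                if gain_y > best_gain:
                    best_gain, best_move = gain_y, (v, 'Y')
            if best_move is None:  # no strictly positive marginal gain remains
                break
            v, side = best_move
            (X if side == 'X' else Y).add(v)
        return X, Y

In an application such as sensor placement or feature selection, the oracle and the budget k would come from the problem instance; this sketch only fixes the shape of the search space, namely pairs of disjoint subsets of V.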

Cite this Paper


BibTeX
@InProceedings{pmlr-v22-singh12,
  title     = {On Bisubmodular Maximization},
  author    = {Singh, Ajit and Guillory, Andrew and Bilmes, Jeff},
  booktitle = {Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {1055--1063},
  year      = {2012},
  editor    = {Lawrence, Neil D. and Girolami, Mark},
  volume    = {22},
  series    = {Proceedings of Machine Learning Research},
  address   = {La Palma, Canary Islands},
  month     = {21--23 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v22/singh12/singh12.pdf},
  url       = {https://proceedings.mlr.press/v22/singh12.html},
  abstract  = {Bisubmodularity extends the concept of submodularity to set functions with two arguments. We show how bisubmodular maximization leads to richer value-of-information problems, using examples in sensor placement and feature selection. We present the first constant-factor approximation algorithm for a wide class of bisubmodular maximizations.}
}
Endnote
%0 Conference Paper
%T On Bisubmodular Maximization
%A Ajit Singh
%A Andrew Guillory
%A Jeff Bilmes
%B Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2012
%E Neil D. Lawrence
%E Mark Girolami
%F pmlr-v22-singh12
%I PMLR
%P 1055--1063
%U https://proceedings.mlr.press/v22/singh12.html
%V 22
%X Bisubmodularity extends the concept of submodularity to set functions with two arguments. We show how bisubmodular maximization leads to richer value-of-information problems, using examples in sensor placement and feature selection. We present the first constant-factor approximation algorithm for a wide class of bisubmodular maximizations.
RIS
TY - CPAPER
TI - On Bisubmodular Maximization
AU - Ajit Singh
AU - Andrew Guillory
AU - Jeff Bilmes
BT - Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics
DA - 2012/04/21
ED - Neil D. Lawrence
ED - Mark Girolami
ID - pmlr-v22-singh12
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 22
SP - 1055
EP - 1063
L1 - http://proceedings.mlr.press/v22/singh12/singh12.pdf
UR - https://proceedings.mlr.press/v22/singh12.html
AB - Bisubmodularity extends the concept of submodularity to set functions with two arguments. We show how bisubmodular maximization leads to richer value-of-information problems, using examples in sensor placement and feature selection. We present the first constant-factor approximation algorithm for a wide class of bisubmodular maximizations.
ER -
APA
Singh, A., Guillory, A. & Bilmes, J. (2012). On Bisubmodular Maximization. Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 22:1055-1063. Available from https://proceedings.mlr.press/v22/singh12.html.
