Constrained Discrete Black-Box Optimization using Mixed-Integer Programming

Theodore P Papalexopoulos, Christian Tjandraatmadja, Ross Anderson, Juan Pablo Vielma, David Belanger
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:17295-17322, 2022.

Abstract

Discrete black-box optimization problems are challenging for model-based optimization (MBO) algorithms, such as Bayesian optimization, due to the size of the search space and the need to satisfy combinatorial constraints. In particular, these methods require repeatedly solving a complex discrete global optimization problem in the inner loop, where popular heuristic inner-loop solvers introduce approximations and are difficult to adapt to combinatorial constraints. In response, we propose NN+MILP, a general discrete MBO framework using piecewise-linear neural networks as surrogate models and mixed-integer linear programming (MILP) to optimize the acquisition function. MILP provides optimality guarantees and a versatile declarative language for domain-specific constraints. We test our approach on a range of unconstrained and constrained problems, including DNA binding, constrained binary quadratic problems from the MINLPLib benchmark, and the NAS-Bench-101 neural architecture search benchmark. NN+MILP surpasses or matches the performance of black-box algorithms tailored to the constraints at hand, with global optimization of the acquisition problem running in a few minutes using only standard software packages and hardware.
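The key observation behind NN+MILP is that a trained ReLU network is piecewise linear, so maximizing it over a discrete feasible set is an exactly solvable MILP (e.g., via the standard big-M encoding of each ReLU). As a hedged illustration — the toy network, weights, and constraint below are invented for this sketch, not taken from the paper — on a tiny binary domain we can recover the same certified global optimum by brute-force enumeration that a MILP solver would return:

```python
from itertools import product

# Toy one-hidden-layer ReLU surrogate (weights are illustrative only).
W1 = [[1.0, -2.0, 0.5], [-1.5, 1.0, 1.0]]  # 2 hidden units, 3 binary inputs
b1 = [0.0, -0.5]
w2 = [1.0, 2.0]                            # output layer weights
b2 = 0.0

def surrogate(x):
    """Piecewise-linear acquisition value f(x) for a 0/1 input vector x."""
    hidden = [max(0.0, sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W1, b1)]
    return sum(w * h for w, h in zip(w2, hidden)) + b2

def feasible(x):
    """Example combinatorial constraint: at most two coordinates set to 1.
    In NN+MILP this would be stated declaratively as linear constraints."""
    return sum(x) <= 2

# Exact inner-loop optimization: on a domain this small, enumeration
# reproduces the global optimum a MILP solver would certify.
best = max((x for x in product((0, 1), repeat=3) if feasible(x)),
           key=surrogate)
print(best, surrogate(best))  # → (0, 1, 1) 3.0
```

On realistic input dimensions enumeration is intractable, which is precisely why the paper formulates the acquisition problem as a MILP: the ReLU activations become binary indicator variables with big-M linear constraints, and domain constraints are added in the same declarative language.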

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-papalexopoulos22a,
  title     = {Constrained Discrete Black-Box Optimization using Mixed-Integer Programming},
  author    = {Papalexopoulos, Theodore P and Tjandraatmadja, Christian and Anderson, Ross and Vielma, Juan Pablo and Belanger, David},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {17295--17322},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/papalexopoulos22a/papalexopoulos22a.pdf},
  url       = {https://proceedings.mlr.press/v162/papalexopoulos22a.html},
  abstract  = {Discrete black-box optimization problems are challenging for model-based optimization (MBO) algorithms, such as Bayesian optimization, due to the size of the search space and the need to satisfy combinatorial constraints. In particular, these methods require repeatedly solving a complex discrete global optimization problem in the inner loop, where popular heuristic inner-loop solvers introduce approximations and are difficult to adapt to combinatorial constraints. In response, we propose NN+MILP, a general discrete MBO framework using piecewise-linear neural networks as surrogate models and mixed-integer linear programming (MILP) to optimize the acquisition function. MILP provides optimality guarantees and a versatile declarative language for domain-specific constraints. We test our approach on a range of unconstrained and constrained problems, including DNA binding, constrained binary quadratic problems from the MINLPLib benchmark, and the NAS-Bench-101 neural architecture search benchmark. NN+MILP surpasses or matches the performance of black-box algorithms tailored to the constraints at hand, with global optimization of the acquisition problem running in a few minutes using only standard software packages and hardware.}
}
Endnote
%0 Conference Paper
%T Constrained Discrete Black-Box Optimization using Mixed-Integer Programming
%A Theodore P Papalexopoulos
%A Christian Tjandraatmadja
%A Ross Anderson
%A Juan Pablo Vielma
%A David Belanger
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-papalexopoulos22a
%I PMLR
%P 17295--17322
%U https://proceedings.mlr.press/v162/papalexopoulos22a.html
%V 162
%X Discrete black-box optimization problems are challenging for model-based optimization (MBO) algorithms, such as Bayesian optimization, due to the size of the search space and the need to satisfy combinatorial constraints. In particular, these methods require repeatedly solving a complex discrete global optimization problem in the inner loop, where popular heuristic inner-loop solvers introduce approximations and are difficult to adapt to combinatorial constraints. In response, we propose NN+MILP, a general discrete MBO framework using piecewise-linear neural networks as surrogate models and mixed-integer linear programming (MILP) to optimize the acquisition function. MILP provides optimality guarantees and a versatile declarative language for domain-specific constraints. We test our approach on a range of unconstrained and constrained problems, including DNA binding, constrained binary quadratic problems from the MINLPLib benchmark, and the NAS-Bench-101 neural architecture search benchmark. NN+MILP surpasses or matches the performance of black-box algorithms tailored to the constraints at hand, with global optimization of the acquisition problem running in a few minutes using only standard software packages and hardware.
APA
Papalexopoulos, T.P., Tjandraatmadja, C., Anderson, R., Vielma, J.P. & Belanger, D. (2022). Constrained Discrete Black-Box Optimization using Mixed-Integer Programming. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:17295-17322. Available from https://proceedings.mlr.press/v162/papalexopoulos22a.html.
