Black-box Optimization with Unknown Constraints via Overparameterized Deep Neural Networks

Dat Phan Trong, Hung The Tran, Sunil Gupta
Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence, PMLR 286:4266-4289, 2025.

Abstract

Optimizing expensive black-box functions under unknown constraints is a fundamental challenge across a range of real-world domains, such as hyperparameter tuning in machine learning, safe control in robotics, and material or drug discovery. In these settings, each function evaluation may be costly or time-consuming, and the system may need to operate within unknown or difficult-to-specify safety boundaries. We apply the Expected Improvement (EI) acquisition function to select the next samples within a feasible region, determined by Lower Confidence Bound (LCB) conditions for all constraints. The LCB approach guarantees constraint feasibility, while EI efficiently balances exploration and exploitation, especially when the feasible regions are much smaller than the overall search space. To model both the objective function and constraints, we use Deep Neural Networks (DNNs) instead of Gaussian Processes (GPs) to improve scalability and handle complex structured data. We provide a theoretical analysis showing our method’s convergence using recent Neural Tangent Kernel (NTK) theory. Under regularity conditions, both cumulative regret and constraint violation are bounded by the maximum information gain, with equivalent upper bounds to GP-based methods. To validate our algorithm, we conduct experiments on synthetic and real-world benchmarks, showing its benefit over recent methods in black-box optimization with unknown constraints.
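The selection rule described above — maximize Expected Improvement only over candidates whose constraint Lower Confidence Bounds indicate plausible feasibility — can be illustrated with a minimal sketch. This is not the paper's implementation: the toy posteriors below stand in for the DNN surrogates, and the names (`select_next`, `obj_post`, `cons_posts`, `beta`) and the candidate grid are hypothetical choices for illustration only.

```python
import math

def norm_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_improvement(mu, sigma, best):
    """EI for minimization: E[max(best - f(x), 0)] under a Gaussian posterior."""
    if sigma <= 0.0:
        return max(best - mu, 0.0)
    z = (best - mu) / sigma
    return (best - mu) * norm_cdf(z) + sigma * norm_pdf(z)

def select_next(candidates, obj_post, cons_posts, best_feasible, beta=2.0):
    """Return the candidate maximizing EI among points whose constraint
    LCBs (mu_c - beta * sigma_c) are all <= 0, i.e. optimistically feasible."""
    best_x, best_ei = None, -math.inf
    for x in candidates:
        # Feasibility filter: every constraint's lower confidence bound <= 0.
        if any(mu_c - beta * sig_c > 0.0
               for mu_c, sig_c in (post(x) for post in cons_posts)):
            continue
        mu, sigma = obj_post(x)
        ei = expected_improvement(mu, sigma, best_feasible)
        if ei > best_ei:
            best_x, best_ei = x, ei
    return best_x

# Toy posteriors standing in for the surrogate models (purely illustrative):
obj = lambda x: (x * x, 0.2)        # objective to minimize, f(x) = x^2
con = lambda x: (x - 1.0, 0.1)      # constraint g(x) <= 0, feasible roughly for x <= 1
grid = [i / 50.0 - 2.0 for i in range(201)]  # candidates in [-2, 2]
x_next = select_next(grid, obj, [con], best_feasible=0.5)
```

With these toy posteriors, points near x = 2 are discarded by the LCB filter, and EI picks the candidate near the objective's minimum among the remaining plausibly feasible points.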

Cite this Paper


BibTeX
@InProceedings{pmlr-v286-phan-trong25a,
  title     = {Black-box Optimization with Unknown Constraints via Overparameterized Deep Neural Networks},
  author    = {Phan Trong, Dat and Tran, Hung The and Gupta, Sunil},
  booktitle = {Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence},
  pages     = {4266--4289},
  year      = {2025},
  editor    = {Chiappa, Silvia and Magliacane, Sara},
  volume    = {286},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--25 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v286/main/assets/phan-trong25a/phan-trong25a.pdf},
  url       = {https://proceedings.mlr.press/v286/phan-trong25a.html},
  abstract  = {Optimizing expensive black-box functions under unknown constraints is a fundamental challenge across a range of real-world domains, such as hyperparameter tuning in machine learning, safe control in robotics, and material or drug discovery. In these settings, each function evaluation may be costly or time-consuming, and the system may need to operate within unknown or difficult-to-specify safety boundaries. We apply the Expected Improvement (EI) acquisition function to select the next samples within a feasible region, determined by Lower Confidence Bound (LCB) conditions for all constraints. The LCB approach guarantees constraint feasibility, while EI efficiently balances exploration and exploitation, especially when the feasible regions are much smaller than the overall search space. To model both the objective function and constraints, we use Deep Neural Networks (DNNs) instead of Gaussian Processes (GPs) to improve scalability and handle complex structured data. We provide a theoretical analysis showing our method’s convergence using recent Neural Tangent Kernel (NTK) theory. Under regularity conditions, both cumulative regret and constraint violation are bounded by the maximum information gain, with equivalent upper bounds to GP-based methods. To validate our algorithm, we conduct experiments on synthetic and real-world benchmarks, showing its benefit over recent methods in black-box optimization with unknown constraints.}
}
Endnote
%0 Conference Paper
%T Black-box Optimization with Unknown Constraints via Overparameterized Deep Neural Networks
%A Dat Phan Trong
%A Hung The Tran
%A Sunil Gupta
%B Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence
%C Proceedings of Machine Learning Research
%D 2025
%E Silvia Chiappa
%E Sara Magliacane
%F pmlr-v286-phan-trong25a
%I PMLR
%P 4266--4289
%U https://proceedings.mlr.press/v286/phan-trong25a.html
%V 286
%X Optimizing expensive black-box functions under unknown constraints is a fundamental challenge across a range of real-world domains, such as hyperparameter tuning in machine learning, safe control in robotics, and material or drug discovery. In these settings, each function evaluation may be costly or time-consuming, and the system may need to operate within unknown or difficult-to-specify safety boundaries. We apply the Expected Improvement (EI) acquisition function to select the next samples within a feasible region, determined by Lower Confidence Bound (LCB) conditions for all constraints. The LCB approach guarantees constraint feasibility, while EI efficiently balances exploration and exploitation, especially when the feasible regions are much smaller than the overall search space. To model both the objective function and constraints, we use Deep Neural Networks (DNNs) instead of Gaussian Processes (GPs) to improve scalability and handle complex structured data. We provide a theoretical analysis showing our method’s convergence using recent Neural Tangent Kernel (NTK) theory. Under regularity conditions, both cumulative regret and constraint violation are bounded by the maximum information gain, with equivalent upper bounds to GP-based methods. To validate our algorithm, we conduct experiments on synthetic and real-world benchmarks, showing its benefit over recent methods in black-box optimization with unknown constraints.
APA
Phan Trong, D., Tran, H.T. & Gupta, S. (2025). Black-box Optimization with Unknown Constraints via Overparameterized Deep Neural Networks. Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research 286:4266-4289. Available from https://proceedings.mlr.press/v286/phan-trong25a.html.
