Diversity By Design: Leveraging Distribution Matching for Offline Model-Based Optimization

Michael S Yao, James Gee, Osbert Bastani
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:71687-71738, 2025.

Abstract

The goal of offline model-based optimization (MBO) is to propose new designs that maximize a reward function given only an offline dataset. However, an important desideratum is to also propose a diverse set of final candidates that capture many optimal and near-optimal design configurations. We propose Diversity In Adversarial Model-based Optimization (DynAMO) as a novel method to introduce design diversity as an explicit objective into any MBO problem. Our key insight is to formulate diversity as a distribution matching problem where the distribution of generated designs captures the inherent diversity contained within the offline dataset. Extensive experiments spanning multiple scientific domains show that DynAMO can be used with common optimization methods to significantly improve the diversity of proposed designs while still discovering high-quality candidates.
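To make the "diversity as distribution matching" idea concrete, here is a minimal sketch of one possible instantiation: penalize a candidate batch by its (squared) maximum mean discrepancy to the offline dataset, so batches that collapse onto a few near-duplicate designs score worse than batches spread like the data. The kernel choice, the MMD penalty, and the combined objective `diversity_regularized_score` are illustrative assumptions, not the paper's actual DynAMO formulation.

```python
import numpy as np

def rbf_kernel(a, b, sigma=1.0):
    # Pairwise RBF kernel between rows of a (n, d) and b (m, d).
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def mmd2(x, y, sigma=1.0):
    # Squared maximum mean discrepancy (biased V-statistic estimate):
    # small when the batch x is distributed like the reference set y.
    return (rbf_kernel(x, x, sigma).mean()
            + rbf_kernel(y, y, sigma).mean()
            - 2.0 * rbf_kernel(x, y, sigma).mean())

def diversity_regularized_score(candidates, dataset, reward_fn, lam=1.0):
    # Hypothetical combined objective: mean surrogate reward minus a
    # distribution-matching penalty. One way to add diversity as an
    # explicit objective; not the authors' exact loss.
    return reward_fn(candidates).mean() - lam * mmd2(candidates, dataset)

rng = np.random.default_rng(0)
dataset = rng.normal(size=(64, 2))                      # offline designs
collapsed = np.tile(rng.normal(size=(1, 2)), (16, 1))   # mode-collapsed batch
spread = rng.normal(size=(16, 2))                       # batch matching the data

toy_reward = lambda x: -np.abs(x).sum(axis=1)           # toy surrogate reward
score = diversity_regularized_score(spread, dataset, toy_reward)

# A batch that matches the dataset distribution incurs a smaller
# distribution-matching penalty than a batch of near-duplicates.
assert mmd2(spread, dataset) < mmd2(collapsed, dataset)
```

In practice such a penalty would be folded into whichever base optimizer proposes the candidates; the point of the sketch is only that distribution matching turns "be diverse" into a differentiable, dataset-anchored term.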

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-yao25b,
  title = {Diversity By Design: Leveraging Distribution Matching for Offline Model-Based Optimization},
  author = {Yao, Michael S and Gee, James and Bastani, Osbert},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages = {71687--71738},
  year = {2025},
  editor = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume = {267},
  series = {Proceedings of Machine Learning Research},
  month = {13--19 Jul},
  publisher = {PMLR},
  pdf = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/yao25b/yao25b.pdf},
  url = {https://proceedings.mlr.press/v267/yao25b.html},
  abstract = {The goal of offline model-based optimization (MBO) is to propose new designs that maximize a reward function given only an offline dataset. However, an important desideratum is to also propose a diverse set of final candidates that capture many optimal and near-optimal design configurations. We propose Diversity In Adversarial Model-based Optimization (DynAMO) as a novel method to introduce design diversity as an explicit objective into any MBO problem. Our key insight is to formulate diversity as a distribution matching problem where the distribution of generated designs captures the inherent diversity contained within the offline dataset. Extensive experiments spanning multiple scientific domains show that DynAMO can be used with common optimization methods to significantly improve the diversity of proposed designs while still discovering high-quality candidates.}
}
Endnote
%0 Conference Paper
%T Diversity By Design: Leveraging Distribution Matching for Offline Model-Based Optimization
%A Michael S Yao
%A James Gee
%A Osbert Bastani
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-yao25b
%I PMLR
%P 71687--71738
%U https://proceedings.mlr.press/v267/yao25b.html
%V 267
%X The goal of offline model-based optimization (MBO) is to propose new designs that maximize a reward function given only an offline dataset. However, an important desideratum is to also propose a diverse set of final candidates that capture many optimal and near-optimal design configurations. We propose Diversity In Adversarial Model-based Optimization (DynAMO) as a novel method to introduce design diversity as an explicit objective into any MBO problem. Our key insight is to formulate diversity as a distribution matching problem where the distribution of generated designs captures the inherent diversity contained within the offline dataset. Extensive experiments spanning multiple scientific domains show that DynAMO can be used with common optimization methods to significantly improve the diversity of proposed designs while still discovering high-quality candidates.
APA
Yao, M.S., Gee, J. & Bastani, O. (2025). Diversity By Design: Leveraging Distribution Matching for Offline Model-Based Optimization. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:71687-71738. Available from https://proceedings.mlr.press/v267/yao25b.html.