MITIGATING OVER-EXPLORATION IN LATENT SPACE OPTIMIZATION USING LES

Omer Ronen, Ahmed Imtiaz Humayun, Richard Baraniuk, Randall Balestriero, Bin Yu
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:51996-52024, 2025.

Abstract

We develop the Latent Exploration Score (LES) to mitigate over-exploration in Latent Space Optimization (LSO), a popular method for solving black-box discrete optimization problems. LSO performs continuous optimization within the latent space of a Variational Autoencoder (VAE) and is known to be susceptible to over-exploration, which manifests in unrealistic solutions that reduce its practicality. LES leverages the trained decoder's approximation of the data distribution and can be employed with any VAE decoder, including pretrained ones, without additional training, architectural changes, or access to the training data. Our evaluation across five LSO benchmark tasks and twenty-two VAE models demonstrates that LES always enhances the quality of the solutions while maintaining high objective values, leading to improvements over existing solutions in most cases. We believe that LES's ability to identify out-of-distribution areas, its differentiability, and its computational tractability will open new avenues for LSO.
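
The abstract describes LSO as continuous optimization in a VAE's latent space, with LES acting as a decoder-based, differentiable score that discourages moving into unrealistic (out-of-distribution) latent regions. As a rough illustration only, the following minimal Python sketch shows penalized latent-space optimization; the decode, objective, and realism_penalty callables, the lam weight, and all other names are hypothetical placeholders, and the penalty term merely stands in for a score such as LES, whose actual definition appears in the paper rather than here.

    # Minimal sketch of penalized latent space optimization (illustrative only).
    # Assumes: `decode` maps latent z to data space, `objective` is a differentiable
    # surrogate of the black-box objective, and `realism_penalty` is a differentiable
    # stand-in for a score such as LES (not the paper's actual definition).
    import torch

    def penalized_lso(decode, objective, realism_penalty, z_init,
                      lam=1.0, steps=100, lr=0.05):
        """Gradient ascent on the surrogate objective in latent space,
        regularized to discourage over-exploration of unrealistic regions."""
        z = z_init.clone().requires_grad_(True)
        opt = torch.optim.Adam([z], lr=lr)
        for _ in range(steps):
            x = decode(z)                                   # decode latent point
            loss = -objective(x) + lam * realism_penalty(z) # penalize unrealistic z
            opt.zero_grad()
            loss.backward()
            opt.step()
        return z.detach(), decode(z.detach())

In this sketch, setting lam to zero recovers unpenalized LSO, while larger values keep the optimized latent point closer to regions the decoder models well; how the penalty is actually computed from the decoder is the subject of the paper.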

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-ronen25a,
  title     = {{MITIGATING OVER-EXPLORATION IN LATENT SPACE OPTIMIZATION USING LES}},
  author    = {Ronen, Omer and Humayun, Ahmed Imtiaz and Baraniuk, Richard and Balestriero, Randall and Yu, Bin},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {51996--52024},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/ronen25a/ronen25a.pdf},
  url       = {https://proceedings.mlr.press/v267/ronen25a.html}
}
Endnote
%0 Conference Paper
%T MITIGATING OVER-EXPLORATION IN LATENT SPACE OPTIMIZATION USING LES
%A Omer Ronen
%A Ahmed Imtiaz Humayun
%A Richard Baraniuk
%A Randall Balestriero
%A Bin Yu
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-ronen25a
%I PMLR
%P 51996--52024
%U https://proceedings.mlr.press/v267/ronen25a.html
%V 267
APA
Ronen, O., Humayun, A.I., Baraniuk, R., Balestriero, R. & Yu, B. (2025). MITIGATING OVER-EXPLORATION IN LATENT SPACE OPTIMIZATION USING LES. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:51996-52024. Available from https://proceedings.mlr.press/v267/ronen25a.html.