Neural Genetic Search in Discrete Spaces

Hyeonah Kim, Sanghyeok Choi, Jiwoo Son, Jinkyoo Park, Changhyun Kwon
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:29969-29987, 2025.

Abstract

Effective search methods are crucial for improving the performance of deep generative models at test time. In this paper, we introduce a novel test-time search method, Neural Genetic Search (NGS), which incorporates the evolutionary mechanism of genetic algorithms into the generation procedure of deep models. The core idea behind NGS is its crossover, which is defined as parent-conditioned generation using trained generative models. This approach offers a versatile and easy-to-implement search algorithm for deep generative models. We demonstrate the effectiveness and flexibility of NGS through experiments across three distinct domains: routing problems, adversarial prompt generation for language models, and molecular design.

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-kim25b,
  title     = {Neural Genetic Search in Discrete Spaces},
  author    = {Kim, Hyeonah and Choi, Sanghyeok and Son, Jiwoo and Park, Jinkyoo and Kwon, Changhyun},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {29969--29987},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/kim25b/kim25b.pdf},
  url       = {https://proceedings.mlr.press/v267/kim25b.html},
  abstract  = {Effective search methods are crucial for improving the performance of deep generative models at test time. In this paper, we introduce a novel test-time search method, Neural Genetic Search (NGS), which incorporates the evolutionary mechanism of genetic algorithms into the generation procedure of deep models. The core idea behind NGS is its crossover, which is defined as parent-conditioned generation using trained generative models. This approach offers a versatile and easy-to-implement search algorithm for deep generative models. We demonstrate the effectiveness and flexibility of NGS through experiments across three distinct domains: routing problems, adversarial prompt generation for language models, and molecular design.}
}
Endnote
%0 Conference Paper
%T Neural Genetic Search in Discrete Spaces
%A Hyeonah Kim
%A Sanghyeok Choi
%A Jiwoo Son
%A Jinkyoo Park
%A Changhyun Kwon
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-kim25b
%I PMLR
%P 29969--29987
%U https://proceedings.mlr.press/v267/kim25b.html
%V 267
%X Effective search methods are crucial for improving the performance of deep generative models at test time. In this paper, we introduce a novel test-time search method, Neural Genetic Search (NGS), which incorporates the evolutionary mechanism of genetic algorithms into the generation procedure of deep models. The core idea behind NGS is its crossover, which is defined as parent-conditioned generation using trained generative models. This approach offers a versatile and easy-to-implement search algorithm for deep generative models. We demonstrate the effectiveness and flexibility of NGS through experiments across three distinct domains: routing problems, adversarial prompt generation for language models, and molecular design.
APA
Kim, H., Choi, S., Son, J., Park, J., & Kwon, C. (2025). Neural Genetic Search in Discrete Spaces. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:29969-29987. Available from https://proceedings.mlr.press/v267/kim25b.html.