A Chance-Constrained Generative Framework for Sequence Optimization

Xianggen Liu, Qiang Liu, Sen Song, Jian Peng
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:6271-6281, 2020.

Abstract

Deep generative modeling has achieved many successes for continuous data generation, such as producing realistic images and controlling their properties (e.g., styles). However, the development of generative modeling techniques for optimizing discrete data, such as sequences or strings, still lags behind largely due to the challenges in modeling complex and long-range constraints, including both syntax and semantics, in discrete structures. In this paper, we formulate the sequence optimization task as a chance-constrained optimization problem. The key idea is to enforce a high probability of generating valid sequences and also optimize the property of interest. We propose a novel minimax algorithm to simultaneously tighten a bound of the valid chance and optimize the expected property. Extensive experimental results in three domains demonstrate the superiority of our approach over the existing sequence optimization methods.
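The chance-constrained formulation described in the abstract can be sketched generically as follows (the symbols here are illustrative, not the paper's own notation): a generator with parameters $\theta$ maximizes the expected property score while keeping the probability of emitting a valid sequence above a threshold.

```latex
% Illustrative chance-constrained sequence optimization problem.
% p_theta : generative distribution over sequences x
% s(x)    : property score to optimize
% V       : set of valid (syntactically/semantically correct) sequences
% epsilon : allowed probability of generating an invalid sequence
\begin{equation*}
\max_{\theta} \;\; \mathbb{E}_{x \sim p_{\theta}}\left[ s(x) \right]
\quad \text{subject to} \quad
\Pr_{x \sim p_{\theta}}\left( x \in V \right) \geq 1 - \epsilon .
\end{equation*}
```

Per the abstract, the paper handles the (generally intractable) validity constraint by tightening a bound on the valid chance via a minimax algorithm while optimizing the expected property.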

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-liu20i,
  title =     {A Chance-Constrained Generative Framework for Sequence Optimization},
  author =    {Liu, Xianggen and Liu, Qiang and Song, Sen and Peng, Jian},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages =     {6271--6281},
  year =      {2020},
  editor =    {III, Hal Daumé and Singh, Aarti},
  volume =    {119},
  series =    {Proceedings of Machine Learning Research},
  month =     {13--18 Jul},
  publisher = {PMLR},
  pdf =       {http://proceedings.mlr.press/v119/liu20i/liu20i.pdf},
  url =       {https://proceedings.mlr.press/v119/liu20i.html},
  abstract =  {Deep generative modeling has achieved many successes for continuous data generation, such as producing realistic images and controlling their properties (e.g., styles). However, the development of generative modeling techniques for optimizing discrete data, such as sequences or strings, still lags behind largely due to the challenges in modeling complex and long-range constraints, including both syntax and semantics, in discrete structures. In this paper, we formulate the sequence optimization task as a chance-constrained optimization problem. The key idea is to enforce a high probability of generating valid sequences and also optimize the property of interest. We propose a novel minimax algorithm to simultaneously tighten a bound of the valid chance and optimize the expected property. Extensive experimental results in three domains demonstrate the superiority of our approach over the existing sequence optimization methods.}
}
Endnote
%0 Conference Paper
%T A Chance-Constrained Generative Framework for Sequence Optimization
%A Xianggen Liu
%A Qiang Liu
%A Sen Song
%A Jian Peng
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-liu20i
%I PMLR
%P 6271--6281
%U https://proceedings.mlr.press/v119/liu20i.html
%V 119
%X Deep generative modeling has achieved many successes for continuous data generation, such as producing realistic images and controlling their properties (e.g., styles). However, the development of generative modeling techniques for optimizing discrete data, such as sequences or strings, still lags behind largely due to the challenges in modeling complex and long-range constraints, including both syntax and semantics, in discrete structures. In this paper, we formulate the sequence optimization task as a chance-constrained optimization problem. The key idea is to enforce a high probability of generating valid sequences and also optimize the property of interest. We propose a novel minimax algorithm to simultaneously tighten a bound of the valid chance and optimize the expected property. Extensive experimental results in three domains demonstrate the superiority of our approach over the existing sequence optimization methods.
APA
Liu, X., Liu, Q., Song, S. & Peng, J. (2020). A Chance-Constrained Generative Framework for Sequence Optimization. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:6271-6281. Available from https://proceedings.mlr.press/v119/liu20i.html.