Learning Monotonic Probabilities with a Generative Cost Model

Yongxiang Tang, Yanhua Cheng, Xiaocheng Liu, Chenchen Jiao, Yanxiang Zeng, Ning Luo, Pengjia Yuan, Xialong Liu, Peng Jiang
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:58789-58804, 2025.

Abstract

In many machine learning tasks, it is often necessary for the relationship between input and output variables to be monotonic, including both strictly monotonic and implicitly monotonic relationships. Traditional methods for maintaining monotonicity mainly rely on construction or regularization techniques, whereas this paper shows that the issue of strict monotonic probability can be viewed as a partial order between an observable revenue variable and a latent cost variable. This perspective enables us to reformulate the monotonicity challenge into modeling the latent cost variable. To tackle this, we introduce a generative network for the latent cost variable, termed the Generative Cost Model (GCM), which inherently addresses the strict monotonic problem, and propose the Implicit Generative Cost Model (IGCM) to address the implicit monotonic problem. We further validate our approach with a numerical simulation of quantile regression and conduct multiple experiments on public datasets, showing that our method significantly outperforms existing monotonic modeling techniques. The code for our experiments can be found at https://github.com/tyxaaron/GCM.
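The abstract's core idea, reading a monotonic probability as P(latent cost ≤ revenue), can be illustrated with a toy sketch. This is a hypothetical illustration based only on the abstract, not the paper's actual architecture: `sample_costs` stands in for the learned generative cost network, and the predicted probability is the empirical CDF of the generated costs, which is non-decreasing in the revenue variable by construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_costs(x, n_samples=10_000):
    # Toy stand-in for the generative cost network: features x shift the
    # location of a log-normal latent cost. (Hypothetical form; in GCM this
    # mapping from features and noise to cost would be a learned network.)
    noise = rng.standard_normal(n_samples)
    return np.exp(0.5 * np.sum(x) + 0.3 * noise)

def monotone_prob(revenue, cost_samples):
    # P(y=1 | x, revenue) modeled as P(C <= revenue): the empirical CDF of
    # the generated costs, hence non-decreasing in revenue by construction.
    return float(np.mean(cost_samples <= revenue))

x = np.array([0.2, -0.1])
costs = sample_costs(x)
probs = [monotone_prob(r, costs) for r in (0.5, 1.0, 2.0)]
assert probs == sorted(probs)  # monotonicity holds for any cost samples
```

Because the output is a CDF evaluated at the revenue value, no monotonicity regularizer or constrained architecture is needed; the constraint is inherent, which is the point the abstract makes.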

Cite this Paper

BibTeX
@InProceedings{pmlr-v267-tang25c,
  title     = {Learning Monotonic Probabilities with a Generative Cost Model},
  author    = {Tang, Yongxiang and Cheng, Yanhua and Liu, Xiaocheng and Jiao, Chenchen and Zeng, Yanxiang and Luo, Ning and Yuan, Pengjia and Liu, Xialong and Jiang, Peng},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {58789--58804},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/tang25c/tang25c.pdf},
  url       = {https://proceedings.mlr.press/v267/tang25c.html},
  abstract  = {In many machine learning tasks, it is often necessary for the relationship between input and output variables to be monotonic, including both strictly monotonic and implicitly monotonic relationships. Traditional methods for maintaining monotonicity mainly rely on construction or regularization techniques, whereas this paper shows that the issue of strict monotonic probability can be viewed as a partial order between an observable revenue variable and a latent cost variable. This perspective enables us to reformulate the monotonicity challenge into modeling the latent cost variable. To tackle this, we introduce a generative network for the latent cost variable, termed the Generative Cost Model (GCM), which inherently addresses the strict monotonic problem, and propose the Implicit Generative Cost Model (IGCM) to address the implicit monotonic problem. We further validate our approach with a numerical simulation of quantile regression and conduct multiple experiments on public datasets, showing that our method significantly outperforms existing monotonic modeling techniques. The code for our experiments can be found at https://github.com/tyxaaron/GCM.}
}
Endnote
%0 Conference Paper
%T Learning Monotonic Probabilities with a Generative Cost Model
%A Yongxiang Tang
%A Yanhua Cheng
%A Xiaocheng Liu
%A Chenchen Jiao
%A Yanxiang Zeng
%A Ning Luo
%A Pengjia Yuan
%A Xialong Liu
%A Peng Jiang
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-tang25c
%I PMLR
%P 58789--58804
%U https://proceedings.mlr.press/v267/tang25c.html
%V 267
%X In many machine learning tasks, it is often necessary for the relationship between input and output variables to be monotonic, including both strictly monotonic and implicitly monotonic relationships. Traditional methods for maintaining monotonicity mainly rely on construction or regularization techniques, whereas this paper shows that the issue of strict monotonic probability can be viewed as a partial order between an observable revenue variable and a latent cost variable. This perspective enables us to reformulate the monotonicity challenge into modeling the latent cost variable. To tackle this, we introduce a generative network for the latent cost variable, termed the Generative Cost Model (GCM), which inherently addresses the strict monotonic problem, and propose the Implicit Generative Cost Model (IGCM) to address the implicit monotonic problem. We further validate our approach with a numerical simulation of quantile regression and conduct multiple experiments on public datasets, showing that our method significantly outperforms existing monotonic modeling techniques. The code for our experiments can be found at https://github.com/tyxaaron/GCM.
APA
Tang, Y., Cheng, Y., Liu, X., Jiao, C., Zeng, Y., Luo, N., Yuan, P., Liu, X., & Jiang, P. (2025). Learning Monotonic Probabilities with a Generative Cost Model. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:58789-58804. Available from https://proceedings.mlr.press/v267/tang25c.html.