Consistent Amortized Clustering via Generative Flow Networks

Irit Chelly, Roy Uziel, Oren Freifeld, Ari Pakman
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:1729-1737, 2025.

Abstract

Neural models for amortized probabilistic clustering yield samples of cluster labels given a set-structured input, while avoiding lengthy Markov chain runs and the need for explicit data likelihoods. Existing methods that label each data point sequentially, like the Neural Clustering Process, often lead to cluster assignments that depend strongly on the data order. Alternatively, methods that sequentially create full clusters do not provide assignment probabilities. In this paper, we introduce GFNCP, a novel framework for amortized clustering. GFNCP is formulated as a Generative Flow Network with a shared energy-based parametrization of policy and reward. We show that the flow matching conditions are equivalent to consistency of the clustering posterior under marginalization, which in turn implies order invariance. GFNCP also outperforms existing methods in clustering performance on both synthetic and real-world data.
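For context, the flow matching condition referenced above is the standard Generative Flow Network constraint: at every non-terminal state (here, a partial clustering of the data), the total flow in equals the total flow out. The notation below is the generic one from the GFlowNet literature, not necessarily the paper's own:

```latex
% Flow matching: for each non-terminal state s,
% incoming flow equals outgoing flow,
\sum_{s' : s' \rightarrow s} F(s' \rightarrow s)
  \;=\;
\sum_{s'' : s \rightarrow s''} F(s \rightarrow s''),
% and the forward policy is the normalized outgoing flow:
P_F(s'' \mid s) \;=\; \frac{F(s \rightarrow s'')}{\sum_{\tilde{s} : s \rightarrow \tilde{s}} F(s \rightarrow \tilde{s})}.
```

Under this constraint, the probability of reaching a final clustering is proportional to its reward regardless of the order in which trajectories assign points, which is the sense in which flow matching yields order invariance.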

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-chelly25a,
  title     = {Consistent Amortized Clustering via Generative Flow Networks},
  author    = {Chelly, Irit and Uziel, Roy and Freifeld, Oren and Pakman, Ari},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {1729--1737},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/chelly25a/chelly25a.pdf},
  url       = {https://proceedings.mlr.press/v258/chelly25a.html},
  abstract  = {Neural models for amortized probabilistic clustering yield samples of cluster labels given a set-structured input, while avoiding lengthy Markov chain runs and the need for explicit data likelihoods. Existing methods which label each data point sequentially, like the Neural Clustering Process, often lead to cluster assignments highly dependent on the data order. Alternatively, methods that sequentially create full clusters, do not provide assignment probabilities. In this paper, we introduce GFNCP, a novel framework for amortized clustering. GFNCP is formulated as a Generative Flow Network with a shared energy-based parametrization of policy and reward. We show that the flow matching conditions are equivalent to consistency of the clustering posterior under marginalization, which in turn implies order invariance. GFNCP also outperforms existing methods in clustering performance on both synthetic and real-world data.}
}
Endnote
%0 Conference Paper
%T Consistent Amortized Clustering via Generative Flow Networks
%A Irit Chelly
%A Roy Uziel
%A Oren Freifeld
%A Ari Pakman
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-chelly25a
%I PMLR
%P 1729--1737
%U https://proceedings.mlr.press/v258/chelly25a.html
%V 258
%X Neural models for amortized probabilistic clustering yield samples of cluster labels given a set-structured input, while avoiding lengthy Markov chain runs and the need for explicit data likelihoods. Existing methods which label each data point sequentially, like the Neural Clustering Process, often lead to cluster assignments highly dependent on the data order. Alternatively, methods that sequentially create full clusters, do not provide assignment probabilities. In this paper, we introduce GFNCP, a novel framework for amortized clustering. GFNCP is formulated as a Generative Flow Network with a shared energy-based parametrization of policy and reward. We show that the flow matching conditions are equivalent to consistency of the clustering posterior under marginalization, which in turn implies order invariance. GFNCP also outperforms existing methods in clustering performance on both synthetic and real-world data.
APA
Chelly, I., Uziel, R., Freifeld, O. & Pakman, A. (2025). Consistent Amortized Clustering via Generative Flow Networks. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:1729-1737. Available from https://proceedings.mlr.press/v258/chelly25a.html.