DDEQs: Distributional Deep Equilibrium Models through Wasserstein Gradient Flows

Jonathan Geuter, Clément Bonet, Anna Korba, David Alvarez-Melis
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:3988-3996, 2025.

Abstract

Deep Equilibrium Models (DEQs) are a class of implicit neural networks that solve for a fixed point of a neural network in their forward pass. Traditionally, DEQs take sequences as inputs, but have since been applied to a variety of data. In this work, we present Distributional Deep Equilibrium Models (DDEQs), extending DEQs to discrete measure inputs, such as sets or point clouds. We provide a theoretically grounded framework for DDEQs. Leveraging Wasserstein gradient flows, we show how the forward pass of the DEQ can be adapted to find fixed points of discrete measures under permutation-invariance, and derive adequate network architectures for DDEQs. In experiments, we show that they can compete with state-of-the-art models in tasks such as point cloud classification and point cloud completion, while being significantly more parameter-efficient.
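To make the forward-pass idea concrete, below is a minimal, hypothetical sketch (not the authors' implementation): the hidden state is itself a point cloud z, the update f_theta is permutation-equivariant, and an approximate fixed point z* = f(z*, x) is found by damped fixed-point iteration. The names EquivariantBlock and ddeq_forward are illustrative only, and the plain damped iteration stands in for the Wasserstein gradient flow derived in the paper.

import torch
import torch.nn as nn


class EquivariantBlock(nn.Module):
    """Toy permutation-equivariant update: per-point MLP plus a mean-pooled interaction."""

    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.local = nn.Sequential(
            nn.Linear(2 * dim, hidden), nn.Tanh(), nn.Linear(hidden, dim)
        )

    def forward(self, z: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        # z, x: (n, dim). The input injection x keeps the map conditioned on the
        # data, as in standard DEQs where z* = f(z*, x); the mean pooling makes
        # the update invariant to reordering the points.
        pooled = z.mean(dim=0, keepdim=True).expand_as(z)
        return self.local(torch.cat([z + pooled, x], dim=-1))


def ddeq_forward(f: nn.Module, x: torch.Tensor, n_iter: int = 50,
                 step: float = 0.5, tol: float = 1e-4) -> torch.Tensor:
    """Damped fixed-point iteration z <- (1 - step) * z + step * f(z, x)."""
    z = x.clone()
    for _ in range(n_iter):
        z_next = (1 - step) * z + step * f(z, x)
        if torch.linalg.norm(z_next - z) / (torch.linalg.norm(z) + 1e-8) < tol:
            return z_next
        z = z_next
    return z  # approximate fixed point; a full DEQ would backpropagate implicitly


if __name__ == "__main__":
    x = torch.randn(128, 3)      # a point cloud with 128 points in R^3
    f = EquivariantBlock(dim=3)
    z_star = ddeq_forward(f, x)
    print(z_star.shape)          # torch.Size([128, 3])

In a trained model, z_star would feed a task head (e.g., a pooled classifier for point cloud classification), while the paper's method additionally treats the iteration as a gradient flow in Wasserstein space rather than this simple damped update.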

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-geuter25a,
  title     = {DDEQs: Distributional Deep Equilibrium Models through Wasserstein Gradient Flows},
  author    = {Geuter, Jonathan and Bonet, Cl{\'e}ment and Korba, Anna and Alvarez-Melis, David},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {3988--3996},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/geuter25a/geuter25a.pdf},
  url       = {https://proceedings.mlr.press/v258/geuter25a.html},
  abstract  = {Deep Equilibrium Models (DEQs) are a class of implicit neural networks that solve for a fixed point of a neural network in their forward pass. Traditionally, DEQs take sequences as inputs, but have since been applied to a variety of data. In this work, we present Distributional Deep Equilibrium Models (DDEQs), extending DEQs to discrete measure inputs, such as sets or point clouds. We provide a theoretically grounded framework for DDEQs. Leveraging Wasserstein gradient flows, we show how the forward pass of the DEQ can be adapted to find fixed points of discrete measures under permutation-invariance, and derive adequate network architectures for DDEQs. In experiments, we show that they can compete with state-of-the-art models in tasks such as point cloud classification and point cloud completion, while being significantly more parameter-efficient.}
}
Endnote
%0 Conference Paper
%T DDEQs: Distributional Deep Equilibrium Models through Wasserstein Gradient Flows
%A Jonathan Geuter
%A Clément Bonet
%A Anna Korba
%A David Alvarez-Melis
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-geuter25a
%I PMLR
%P 3988--3996
%U https://proceedings.mlr.press/v258/geuter25a.html
%V 258
%X Deep Equilibrium Models (DEQs) are a class of implicit neural networks that solve for a fixed point of a neural network in their forward pass. Traditionally, DEQs take sequences as inputs, but have since been applied to a variety of data. In this work, we present Distributional Deep Equilibrium Models (DDEQs), extending DEQs to discrete measure inputs, such as sets or point clouds. We provide a theoretically grounded framework for DDEQs. Leveraging Wasserstein gradient flows, we show how the forward pass of the DEQ can be adapted to find fixed points of discrete measures under permutation-invariance, and derive adequate network architectures for DDEQs. In experiments, we show that they can compete with state-of-the-art models in tasks such as point cloud classification and point cloud completion, while being significantly more parameter-efficient.
APA
Geuter, J., Bonet, C., Korba, A. & Alvarez-Melis, D. (2025). DDEQs: Distributional Deep Equilibrium Models through Wasserstein Gradient Flows. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:3988-3996. Available from https://proceedings.mlr.press/v258/geuter25a.html.
