Stacking Deep Set Networks and Pooling by Quantiles

Zhuojun Chen, Xinghua Zhu, Dongzhe Su, Justin C. I. Chuang
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:7953-7971, 2024.

Abstract

We propose Stacked Deep Sets and Quantile Pooling for learning tasks on set data. We introduce Quantile Pooling, a novel permutation-invariant pooling operation that synergizes max and average pooling. Just like max pooling, quantile pooling emphasizes the most salient features of the data. Like average pooling, it captures the overall distribution and subtle features of the data. Like both, it is lightweight and fast. We demonstrate the effectiveness of our approach in a variety of tasks, showing that quantile pooling can outperform both max and average pooling in each of their respective strengths. We also introduce a variant of deep set networks that is more expressive and universal. While Quantile Pooling balances robustness and sensitivity, Stacked Deep Sets enhances learning with depth.
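The abstract does not give a formula, but the idea it describes can be illustrated with a minimal sketch: pool a set by taking the q-th quantile of each feature across the set's elements. Everything below (the function name `quantile_pool`, the linear-interpolation choice, the toy data) is my own illustration, not the paper's implementation; it only shows how a quantile smoothly bridges max-like and median-like behaviour while staying permutation-invariant.

```python
def quantile_pool(rows, q):
    """Pool a set of feature vectors into one vector via per-feature quantiles.

    Sorting each feature column makes the result independent of element
    order (permutation invariance). q = 1.0 recovers max pooling, while
    q = 0.5 yields the median, a robust analogue of average pooling.
    """
    n_features = len(rows[0])
    pooled = []
    for j in range(n_features):
        col = sorted(row[j] for row in rows)
        pos = q * (len(col) - 1)            # fractional rank of the quantile
        lo, frac = int(pos), pos - int(pos)
        hi = min(lo + 1, len(col) - 1)
        # linear interpolation between the two nearest order statistics
        pooled.append(col[lo] * (1 - frac) + col[hi] * frac)
    return pooled

# A set of four 2-feature elements; the set axis is the outer list.
x = [[0, 1], [2, 3], [4, 5], [6, 7]]
print(quantile_pool(x, 1.0))  # → [6.0, 7.0], identical to max pooling
print(quantile_pool(x, 0.5))  # → [3.0, 4.0], the per-feature median
```

Like max and average pooling, this costs only a sort per feature and uses no learned parameters, which is consistent with the abstract's claim that the operation is lightweight and fast.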

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-chen24bo,
  title     = {Stacking Deep Set Networks and Pooling by Quantiles},
  author    = {Chen, Zhuojun and Zhu, Xinghua and Su, Dongzhe and Chuang, Justin C. I.},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {7953--7971},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/chen24bo/chen24bo.pdf},
  url       = {https://proceedings.mlr.press/v235/chen24bo.html},
  abstract  = {We propose Stacked Deep Sets and Quantile Pooling for learning tasks on set data. We introduce Quantile Pooling, a novel permutation-invariant pooling operation that synergizes max and average pooling. Just like max pooling, quantile pooling emphasizes the most salient features of the data. Like average pooling, it captures the overall distribution and subtle features of the data. Like both, it is lightweight and fast. We demonstrate the effectiveness of our approach in a variety of tasks, showing that quantile pooling can outperform both max and average pooling in each of their respective strengths. We also introduce a variant of deep set networks that is more expressive and universal. While Quantile Pooling balances robustness and sensitivity, Stacked Deep Sets enhances learning with depth.}
}
Endnote
%0 Conference Paper
%T Stacking Deep Set Networks and Pooling by Quantiles
%A Zhuojun Chen
%A Xinghua Zhu
%A Dongzhe Su
%A Justin C. I. Chuang
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-chen24bo
%I PMLR
%P 7953--7971
%U https://proceedings.mlr.press/v235/chen24bo.html
%V 235
%X We propose Stacked Deep Sets and Quantile Pooling for learning tasks on set data. We introduce Quantile Pooling, a novel permutation-invariant pooling operation that synergizes max and average pooling. Just like max pooling, quantile pooling emphasizes the most salient features of the data. Like average pooling, it captures the overall distribution and subtle features of the data. Like both, it is lightweight and fast. We demonstrate the effectiveness of our approach in a variety of tasks, showing that quantile pooling can outperform both max and average pooling in each of their respective strengths. We also introduce a variant of deep set networks that is more expressive and universal. While Quantile Pooling balances robustness and sensitivity, Stacked Deep Sets enhances learning with depth.
APA
Chen, Z., Zhu, X., Su, D. & Chuang, J. C. I. (2024). Stacking Deep Set Networks and Pooling by Quantiles. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:7953-7971. Available from https://proceedings.mlr.press/v235/chen24bo.html.

Related Material