Differentiable Sorting Networks for Scalable Sorting and Ranking Supervision

Felix Petersen, Christian Borgelt, Hilde Kuehne, Oliver Deussen
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:8546-8555, 2021.

Abstract

Sorting and ranking supervision is a method for training neural networks end-to-end based on ordering constraints. That is, the ground truth order of sets of samples is known, while their absolute values remain unsupervised. For that, we propose differentiable sorting networks by relaxing their pairwise conditional swap operations. To address the problems of vanishing gradients and extensive blurring that arise with larger numbers of layers, we propose mapping activations to regions with moderate gradients. We consider odd-even as well as bitonic sorting networks, which outperform existing relaxations of the sorting operation. We show that bitonic sorting networks can achieve stable training on large input sets of up to 1024 elements.
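The core idea of the abstract, relaxing the pairwise conditional swap of a sorting network so that gradients can flow through it, can be illustrated with a minimal Python sketch. Here a logistic sigmoid with temperature `t` replaces the hard comparison, and the swaps are arranged as an odd-even transposition network; this is an illustrative relaxation under assumed conventions, not the paper's exact formulation (function names `soft_cswap` and `soft_odd_even_sort` are hypothetical).

```python
import math

def soft_cswap(a, b, t=0.1):
    """Differentiable relaxation of (min(a, b), max(a, b)).

    s ~ sigmoid((b - a) / t): close to 1 when b > a, close to 0 when b < a.
    As t -> 0 this approaches the hard conditional swap.
    """
    s = 1.0 / (1.0 + math.exp(-(b - a) / t))
    lo = s * a + (1.0 - s) * b  # soft minimum
    hi = s * b + (1.0 - s) * a  # soft maximum
    return lo, hi

def soft_odd_even_sort(xs, t=0.1):
    """Odd-even transposition network with relaxed swaps.

    n rounds alternate between comparing even-indexed and odd-indexed
    adjacent pairs; every operation is smooth in the inputs.
    """
    xs = list(xs)
    n = len(xs)
    for r in range(n):
        for i in range(r % 2, n - 1, 2):
            xs[i], xs[i + 1] = soft_cswap(xs[i], xs[i + 1], t)
    return xs
```

With a small temperature the relaxed network closely approximates a hard sort, e.g. `soft_odd_even_sort([3.0, 1.0, 2.0], t=0.01)` is approximately `[1.0, 2.0, 3.0]`; with larger temperatures the outputs blur toward each other, which is the gradient/blurring trade-off the abstract refers to.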

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-petersen21a,
  title     = {Differentiable Sorting Networks for Scalable Sorting and Ranking Supervision},
  author    = {Petersen, Felix and Borgelt, Christian and Kuehne, Hilde and Deussen, Oliver},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {8546--8555},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/petersen21a/petersen21a.pdf},
  url       = {https://proceedings.mlr.press/v139/petersen21a.html},
  abstract  = {Sorting and ranking supervision is a method for training neural networks end-to-end based on ordering constraints. That is, the ground truth order of sets of samples is known, while their absolute values remain unsupervised. For that, we propose differentiable sorting networks by relaxing their pairwise conditional swap operations. To address the problems of vanishing gradients and extensive blurring that arise with larger numbers of layers, we propose mapping activations to regions with moderate gradients. We consider odd-even as well as bitonic sorting networks, which outperform existing relaxations of the sorting operation. We show that bitonic sorting networks can achieve stable training on large input sets of up to 1024 elements.}
}
Endnote
%0 Conference Paper
%T Differentiable Sorting Networks for Scalable Sorting and Ranking Supervision
%A Felix Petersen
%A Christian Borgelt
%A Hilde Kuehne
%A Oliver Deussen
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-petersen21a
%I PMLR
%P 8546--8555
%U https://proceedings.mlr.press/v139/petersen21a.html
%V 139
%X Sorting and ranking supervision is a method for training neural networks end-to-end based on ordering constraints. That is, the ground truth order of sets of samples is known, while their absolute values remain unsupervised. For that, we propose differentiable sorting networks by relaxing their pairwise conditional swap operations. To address the problems of vanishing gradients and extensive blurring that arise with larger numbers of layers, we propose mapping activations to regions with moderate gradients. We consider odd-even as well as bitonic sorting networks, which outperform existing relaxations of the sorting operation. We show that bitonic sorting networks can achieve stable training on large input sets of up to 1024 elements.
APA
Petersen, F., Borgelt, C., Kuehne, H. & Deussen, O. (2021). Differentiable Sorting Networks for Scalable Sorting and Ranking Supervision. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:8546-8555. Available from https://proceedings.mlr.press/v139/petersen21a.html.