DICOD: Distributed Convolutional Coordinate Descent for Convolutional Sparse Coding

Thomas Moreau, Laurent Oudre, Nicolas Vayatis
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3626-3634, 2018.

Abstract

In this paper, we introduce DICOD, a convolutional sparse coding algorithm which builds shift-invariant representations for long signals. The algorithm is designed to run in a distributed setting, with local message passing, making it communication-efficient. It is based on coordinate descent and uses locally greedy updates, which accelerate convergence compared to greedy coordinate selection. We prove the convergence of this algorithm and show that its computational speed-up is super-linear in the number of cores used. We also provide empirical evidence of its acceleration compared to state-of-the-art methods.
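To make the locally greedy coordinate-selection strategy concrete, here is a minimal single-process NumPy sketch for 1-D convolutional sparse coding. It cycles over contiguous segments sequentially rather than assigning them to parallel workers as DICOD does, and the function names, defaults, and selection rule (largest coordinate move per segment) are illustrative assumptions, not the paper's implementation.

import numpy as np

def soft_threshold(u, lam):
    """Proximal operator of the l1 norm."""
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def locally_greedy_cd(x, D, lam, n_segments=4, max_iter=1000, tol=1e-8):
    """Locally greedy coordinate descent for 1-D convolutional sparse coding.

    Sketch of a solver for
        min_z  0.5 * || x - sum_k conv(z_k, d_k) ||_2^2 + lam * ||z||_1
    where each atom d_k has length L and z_k has one coefficient per
    valid position. Single process: segments are visited cyclically
    instead of being handled by parallel workers as in DICOD.
    """
    K, L = D.shape                  # K atoms of length L
    T = len(x) - L + 1              # valid positions for each atom
    z = np.zeros((K, T))
    norms = (D ** 2).sum(axis=1)    # ||d_k||^2, assumed nonzero
    r = x.astype(float).copy()      # residual x - sum_k conv(z_k, d_k)

    bounds = np.linspace(0, T, n_segments + 1).astype(int)
    for _ in range(max_iter):
        biggest_move = 0.0
        for m in range(n_segments):
            lo, hi = bounds[m], bounds[m + 1]
            # beta[j, t] = <d_j, r>_t + ||d_j||^2 * z[j, t]; the 1-D
            # optimum for coordinate (j, t) is ST(beta, lam) / ||d_j||^2.
            corr = np.array([np.correlate(r[lo:hi + L - 1], D[j], 'valid')
                             for j in range(K)])
            beta = corr + norms[:, None] * z[:, lo:hi]
            z_opt = soft_threshold(beta, lam) / norms[:, None]
            # Locally greedy step: largest coordinate move in this segment.
            moves = np.abs(z_opt - z[:, lo:hi])
            k, t = np.unravel_index(np.argmax(moves), moves.shape)
            delta = z_opt[k, t] - z[k, lo + t]
            if abs(delta) <= tol:
                continue                         # segment locally optimal
            z[k, lo + t] += delta
            r[lo + t: lo + t + L] -= delta * D[k]  # keep residual in sync
            biggest_move = max(biggest_move, abs(delta))
        if biggest_move <= tol:
            break
    return z

In DICOD itself, each segment is handled by its own worker in parallel, and an update at position t only affects coordinates within one atom length of t, so a worker only needs to notify its neighbors when an update lands near a segment boundary; this is what keeps the message passing local and the communication cost low.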

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-moreau18a,
  title     = {{DICOD}: Distributed Convolutional Coordinate Descent for Convolutional Sparse Coding},
  author    = {Moreau, Thomas and Oudre, Laurent and Vayatis, Nicolas},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {3626--3634},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/moreau18a/moreau18a.pdf},
  url       = {https://proceedings.mlr.press/v80/moreau18a.html}
}
Endnote
%0 Conference Paper
%T DICOD: Distributed Convolutional Coordinate Descent for Convolutional Sparse Coding
%A Thomas Moreau
%A Laurent Oudre
%A Nicolas Vayatis
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-moreau18a
%I PMLR
%P 3626--3634
%U https://proceedings.mlr.press/v80/moreau18a.html
%V 80
APA
Moreau, T., Oudre, L. & Vayatis, N. (2018). DICOD: Distributed Convolutional Coordinate Descent for Convolutional Sparse Coding. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:3626-3634. Available from https://proceedings.mlr.press/v80/moreau18a.html.
