Swept Approximate Message Passing for Sparse Estimation

Andre Manoel, Florent Krzakala, Eric Tramel, Lenka Zdeborová
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:1123-1132, 2015.

Abstract

Approximate Message Passing (AMP) has been shown to be a superior method for inference problems, such as the recovery of signals from sets of noisy, lower-dimensional measurements, in terms of both reconstruction accuracy and computational efficiency. However, AMP suffers from serious convergence issues in contexts that do not exactly match its assumptions. We propose a new approach to stabilizing AMP in these contexts by applying AMP updates to individual coefficients sequentially rather than in parallel. Our results show that this change to the AMP iteration can provide theoretically expected, but hitherto unobtainable, performance for problems on which the standard AMP iteration diverges. Additionally, we find that the computational cost of this swept coefficient update scheme is not unduly burdensome, allowing it to be applied efficiently to signals of large dimensionality.
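To make the parallel-versus-swept distinction concrete, the sketch below illustrates the sequential-update idea on a toy sparse-recovery problem in Python. It is a minimal illustration, not the paper's SwAMP algorithm: it uses a plain soft-thresholding denoiser and omits the Onsager correction and variance updates of the full AMP equations, so the loop amounts to randomized coordinate-wise thresholding for the LASSO. The names (swept_amp, soft_threshold) and the threshold parameter theta are illustrative choices, not notation from the paper.

    import numpy as np

    def soft_threshold(u, t):
        # Scalar soft-thresholding denoiser: shrinks u toward zero by t.
        return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

    def swept_amp(A, y, theta=0.02, n_iter=200, seed=0):
        # Swept (sequential) coefficient updates: each coefficient is
        # re-estimated and denoised one at a time, and the residual is
        # refreshed immediately, instead of updating all coefficients
        # in parallel as the standard AMP iteration does.
        rng = np.random.default_rng(seed)
        m, n = A.shape
        x = np.zeros(n)
        r = y.copy()                        # residual y - A @ x (x starts at 0)
        col_norms = np.sum(A * A, axis=0)   # ||a_i||^2 for each column
        for _ in range(n_iter):
            for i in rng.permutation(n):    # one sweep over the coefficients
                u = x[i] + A[:, i] @ r / col_norms[i]  # local linear estimate
                x_new = soft_threshold(u, theta)       # scalar denoising step
                r += A[:, i] * (x[i] - x_new)          # immediate residual refresh
                x[i] = x_new
        return x

    # Toy problem: recover a k-sparse signal from m < n Gaussian measurements.
    rng = np.random.default_rng(1)
    m, n, k = 100, 250, 10
    A = rng.normal(size=(m, n)) / np.sqrt(m)
    x0 = np.zeros(n)
    x0[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)
    y = A @ x0
    x_hat = swept_amp(A, y)
    print("relative error:", np.linalg.norm(x_hat - x0) / np.linalg.norm(x0))

The structurally important line is the residual refresh inside the inner loop: each coefficient's update is felt by all subsequent updates within the same sweep, which is the kind of incremental correction the abstract credits with stabilizing iterations that diverge when all coefficients move at once.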

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-manoel15,
  title = {Swept Approximate Message Passing for Sparse Estimation},
  author = {Manoel, Andre and Krzakala, Florent and Tramel, Eric and Zdeborová, Lenka},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages = {1123--1132},
  year = {2015},
  editor = {Bach, Francis and Blei, David},
  volume = {37},
  series = {Proceedings of Machine Learning Research},
  address = {Lille, France},
  month = {07--09 Jul},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v37/manoel15.pdf},
  url = {https://proceedings.mlr.press/v37/manoel15.html},
  abstract = {Approximate Message Passing (AMP) has been shown to be a superior method for inference problems, such as the recovery of signals from sets of noisy, lower-dimensional measurements, in terms of both reconstruction accuracy and computational efficiency. However, AMP suffers from serious convergence issues in contexts that do not exactly match its assumptions. We propose a new approach to stabilizing AMP in these contexts by applying AMP updates to individual coefficients sequentially rather than in parallel. Our results show that this change to the AMP iteration can provide theoretically expected, but hitherto unobtainable, performance for problems on which the standard AMP iteration diverges. Additionally, we find that the computational cost of this swept coefficient update scheme is not unduly burdensome, allowing it to be applied efficiently to signals of large dimensionality.}
}
Endnote
%0 Conference Paper
%T Swept Approximate Message Passing for Sparse Estimation
%A Andre Manoel
%A Florent Krzakala
%A Eric Tramel
%A Lenka Zdeborová
%B Proceedings of the 32nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2015
%E Francis Bach
%E David Blei
%F pmlr-v37-manoel15
%I PMLR
%P 1123--1132
%U https://proceedings.mlr.press/v37/manoel15.html
%V 37
%X Approximate Message Passing (AMP) has been shown to be a superior method for inference problems, such as the recovery of signals from sets of noisy, lower-dimensional measurements, in terms of both reconstruction accuracy and computational efficiency. However, AMP suffers from serious convergence issues in contexts that do not exactly match its assumptions. We propose a new approach to stabilizing AMP in these contexts by applying AMP updates to individual coefficients sequentially rather than in parallel. Our results show that this change to the AMP iteration can provide theoretically expected, but hitherto unobtainable, performance for problems on which the standard AMP iteration diverges. Additionally, we find that the computational cost of this swept coefficient update scheme is not unduly burdensome, allowing it to be applied efficiently to signals of large dimensionality.
RIS
TY  - CPAPER
TI  - Swept Approximate Message Passing for Sparse Estimation
AU  - Andre Manoel
AU  - Florent Krzakala
AU  - Eric Tramel
AU  - Lenka Zdeborová
BT  - Proceedings of the 32nd International Conference on Machine Learning
DA  - 2015/06/01
ED  - Francis Bach
ED  - David Blei
ID  - pmlr-v37-manoel15
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 37
SP  - 1123
EP  - 1132
L1  - http://proceedings.mlr.press/v37/manoel15.pdf
UR  - https://proceedings.mlr.press/v37/manoel15.html
AB  - Approximate Message Passing (AMP) has been shown to be a superior method for inference problems, such as the recovery of signals from sets of noisy, lower-dimensional measurements, in terms of both reconstruction accuracy and computational efficiency. However, AMP suffers from serious convergence issues in contexts that do not exactly match its assumptions. We propose a new approach to stabilizing AMP in these contexts by applying AMP updates to individual coefficients sequentially rather than in parallel. Our results show that this change to the AMP iteration can provide theoretically expected, but hitherto unobtainable, performance for problems on which the standard AMP iteration diverges. Additionally, we find that the computational cost of this swept coefficient update scheme is not unduly burdensome, allowing it to be applied efficiently to signals of large dimensionality.
ER  -
APA
Manoel, A., Krzakala, F., Tramel, E. & Zdeborová, L. (2015). Swept Approximate Message Passing for Sparse Estimation. Proceedings of the 32nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 37:1123-1132. Available from https://proceedings.mlr.press/v37/manoel15.html.
