A Highly Scalable Parallel Algorithm for Isotropic Total Variation Models

Jie Wang, Qingyang Li, Sen Yang, Wei Fan, Peter Wonka, Jieping Ye
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):235-243, 2014.

Abstract

Total variation (TV) models are among the most popular and successful tools in signal processing. However, due to the complex nature of the TV term, it is challenging to efficiently compute a solution for large-scale problems. State-of-the-art algorithms that are based on the alternating direction method of multipliers (ADMM) often involve solving large-size linear systems. In this paper, we propose a highly scalable parallel algorithm for TV models that is based on a novel decomposition strategy of the problem domain. As a result, the TV models can be decoupled into a set of small and independent subproblems, which admit closed form solutions. This makes our approach particularly suitable for parallel implementation. Our algorithm is guaranteed to converge to its global minimum. With N variables and n_p processes, the time complexity is O(N/(εn_p)) to reach an ε-optimal solution. Extensive experiments demonstrate that our approach outperforms existing state-of-the-art algorithms, especially in dealing with high-resolution, mega-size images.
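The abstract states that the decomposition yields small, independent subproblems with closed-form solutions. As a hedged illustration only (not the authors' algorithm; the function names are ours), the sketch below shows the isotropic TV term itself and the standard group soft-thresholding operator, the classic closed-form proximal map of the ℓ2 norm, which is the kind of per-subproblem update such decompositions typically rely on:

```python
import numpy as np

def isotropic_tv(u):
    """Isotropic total variation of a 2-D array u:
    sum over pixels of sqrt(dx^2 + dy^2), with replicated
    (Neumann) boundary so the last differences are zero."""
    dx = np.diff(u, axis=1, append=u[:, -1:])
    dy = np.diff(u, axis=0, append=u[-1:, :])
    return np.sum(np.sqrt(dx**2 + dy**2))

def prox_group_l2(v, t):
    """Closed-form proximal operator of t * ||.||_2
    (group soft-thresholding): shrink v toward 0 by t in norm."""
    norm = np.linalg.norm(v)
    if norm <= t:
        return np.zeros_like(v)
    return (1.0 - t / norm) * v

# Example: a gradient pair (3, 4) has norm 5; shrinking by t = 1
# scales it by (1 - 1/5) = 0.8.
print(prox_group_l2(np.array([3.0, 4.0]), 1.0))  # -> [2.4 3.2]
```

Because each such prox acts on a single small group of variables, many of them can be evaluated independently, which is what makes this style of decomposition amenable to parallel execution.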

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-wangb14,
  title     = {A Highly Scalable Parallel Algorithm for Isotropic Total Variation Models},
  author    = {Jie Wang and Qingyang Li and Sen Yang and Wei Fan and Peter Wonka and Jieping Ye},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {235--243},
  year      = {2014},
  editor    = {Eric P. Xing and Tony Jebara},
  volume    = {32},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/wangb14.pdf},
  url       = {http://proceedings.mlr.press/v32/wangb14.html},
  abstract  = {Total variation (TV) models are among the most popular and successful tools in signal processing. However, due to the complex nature of the TV term, it is challenging to efficiently compute a solution for large-scale problems. State-of-the-art algorithms that are based on the alternating direction method of multipliers (ADMM) often involve solving large-size linear systems. In this paper, we propose a highly scalable parallel algorithm for TV models that is based on a novel decomposition strategy of the problem domain. As a result, the TV models can be decoupled into a set of small and independent subproblems, which admit closed form solutions. This makes our approach particularly suitable for parallel implementation. Our algorithm is guaranteed to converge to its global minimum. With N variables and n_p processes, the time complexity is O(N/(εn_p)) to reach an ε-optimal solution. Extensive experiments demonstrate that our approach outperforms existing state-of-the-art algorithms, especially in dealing with high-resolution, mega-size images.}
}
Endnote
%0 Conference Paper
%T A Highly Scalable Parallel Algorithm for Isotropic Total Variation Models
%A Jie Wang
%A Qingyang Li
%A Sen Yang
%A Wei Fan
%A Peter Wonka
%A Jieping Ye
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-wangb14
%I PMLR
%J Proceedings of Machine Learning Research
%P 235--243
%U http://proceedings.mlr.press
%V 32
%N 2
%W PMLR
%X Total variation (TV) models are among the most popular and successful tools in signal processing. However, due to the complex nature of the TV term, it is challenging to efficiently compute a solution for large-scale problems. State-of-the-art algorithms that are based on the alternating direction method of multipliers (ADMM) often involve solving large-size linear systems. In this paper, we propose a highly scalable parallel algorithm for TV models that is based on a novel decomposition strategy of the problem domain. As a result, the TV models can be decoupled into a set of small and independent subproblems, which admit closed form solutions. This makes our approach particularly suitable for parallel implementation. Our algorithm is guaranteed to converge to its global minimum. With N variables and n_p processes, the time complexity is O(N/(εn_p)) to reach an ε-optimal solution. Extensive experiments demonstrate that our approach outperforms existing state-of-the-art algorithms, especially in dealing with high-resolution, mega-size images.
RIS
TY  - CPAPER
TI  - A Highly Scalable Parallel Algorithm for Isotropic Total Variation Models
AU  - Jie Wang
AU  - Qingyang Li
AU  - Sen Yang
AU  - Wei Fan
AU  - Peter Wonka
AU  - Jieping Ye
BT  - Proceedings of the 31st International Conference on Machine Learning
PY  - 2014/01/27
DA  - 2014/01/27
ED  - Eric P. Xing
ED  - Tony Jebara
ID  - pmlr-v32-wangb14
PB  - PMLR
DP  - PMLR
SP  - 235
EP  - 243
L1  - http://proceedings.mlr.press/v32/wangb14.pdf
UR  - http://proceedings.mlr.press/v32/wangb14.html
AB  - Total variation (TV) models are among the most popular and successful tools in signal processing. However, due to the complex nature of the TV term, it is challenging to efficiently compute a solution for large-scale problems. State-of-the-art algorithms that are based on the alternating direction method of multipliers (ADMM) often involve solving large-size linear systems. In this paper, we propose a highly scalable parallel algorithm for TV models that is based on a novel decomposition strategy of the problem domain. As a result, the TV models can be decoupled into a set of small and independent subproblems, which admit closed form solutions. This makes our approach particularly suitable for parallel implementation. Our algorithm is guaranteed to converge to its global minimum. With N variables and n_p processes, the time complexity is O(N/(εn_p)) to reach an ε-optimal solution. Extensive experiments demonstrate that our approach outperforms existing state-of-the-art algorithms, especially in dealing with high-resolution, mega-size images.
ER  -
APA
Wang, J., Li, Q., Yang, S., Fan, W., Wonka, P., & Ye, J. (2014). A Highly Scalable Parallel Algorithm for Isotropic Total Variation Models. Proceedings of the 31st International Conference on Machine Learning, in PMLR 32(2):235-243.