Convolutional Dictionary Learning through Tensor Factorization

Furong Huang, Animashree Anandkumar
Proceedings of the 1st International Workshop on Feature Extraction: Modern Questions and Challenges at NIPS 2015, PMLR 44:116-129, 2015.

Abstract

Tensor methods have emerged as a powerful paradigm for the consistent learning of many latent variable models, such as topic models, independent component analysis, and dictionary learning. Model parameters are estimated via a CP decomposition of the observed higher-order input moments. In this paper, we extend the tensor decomposition framework to models with invariances, such as convolutional dictionary models. Our tensor decomposition algorithm is based on the popular alternating least squares (ALS) method, but with additional shift-invariance constraints on the factors. We demonstrate that each ALS update can be computed efficiently using simple operations such as fast Fourier transforms and matrix multiplications. Our algorithm converges to models with lower reconstruction error and runs much faster than the popular alternating minimization heuristic, in which the filters and activation maps are updated alternately.
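The abstract names two computational ingredients: the alternating least squares (ALS) update for CP decomposition, and the fact that products with circulant (shift-invariant) matrices reduce to fast Fourier transforms. Below is a minimal NumPy sketch of these unconstrained building blocks only, not the authors' shift-constrained algorithm; all function names and the toy check are illustrative assumptions.

import numpy as np

def khatri_rao(B, C):
    # Column-wise Kronecker product: row (j, k) of the result is B[j, :] * C[k, :].
    return np.einsum('jr,kr->jkr', B, C).reshape(B.shape[0] * C.shape[0], -1)

def als_update_mode1(T, B, C):
    # One unconstrained ALS update for the mode-1 factor A of a third-order
    # tensor T with CP factors (A, B, C):
    #   A = T_(1) (B kr C) ((B^T B) had (C^T C))^+,
    # where T_(1) is the mode-1 unfolding, kr the Khatri-Rao product, had the
    # Hadamard product, and ^+ the pseudoinverse.
    T1 = T.reshape(T.shape[0], -1)                    # mode-1 unfolding (C-order)
    G = (B.T @ B) * (C.T @ C)                         # r x r Gram matrix
    return T1 @ khatri_rao(B, C) @ np.linalg.pinv(G)

def circulant_matvec(f, w):
    # Multiplying the circulant matrix generated by a filter f against an
    # activation map w is circular convolution, so it costs O(n log n) via the
    # FFT instead of O(n^2): the "simple operations" the abstract refers to.
    return np.real(np.fft.ifft(np.fft.fft(f) * np.fft.fft(w)))

# Toy check (illustrative): on an exactly rank-3 tensor, one ALS update with
# the true B and C recovers A.
rng = np.random.default_rng(0)
A, B, C = (rng.normal(size=(n, 3)) for n in (6, 5, 4))
T = np.einsum('ir,jr,kr->ijk', A, B, C)
assert np.allclose(als_update_mode1(T, B, C), A)

Per the abstract, the paper's algorithm additionally constrains the CP factors to be circulant shifts of the filters, which is what allows an ALS update of this form to be carried out entirely with FFTs and matrix multiplications.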

Cite this Paper


BibTeX
@InProceedings{pmlr-v44-huang15convolutional,
  title     = {Convolutional Dictionary Learning through Tensor Factorization},
  author    = {Huang, Furong and Anandkumar, Animashree},
  booktitle = {Proceedings of the 1st International Workshop on Feature Extraction: Modern Questions and Challenges at NIPS 2015},
  pages     = {116--129},
  year      = {2015},
  editor    = {Storcheus, Dmitry and Rostamizadeh, Afshin and Kumar, Sanjiv},
  volume    = {44},
  series    = {Proceedings of Machine Learning Research},
  address   = {Montreal, Canada},
  month     = {11 Dec},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v44/huang15convolutional.pdf},
  url       = {https://proceedings.mlr.press/v44/huang15convolutional.html},
  abstract  = {Tensor methods have emerged as a powerful paradigm for consistent learning of many latent variable models such as topic models, independent component analysis and dictionary learning. Model parameters are estimated via CP decomposition of the observed higher order input moments. In this paper, we extend tensor decomposition framework to models with invariances, such as convolutional dictionary models. Our tensor decomposition algorithm is based on the popular alternating least squares (ALS) method, but with additional shift invariance constraints on the factors. We demonstrate that each ALS update can be computed efficiently using simple operations such as fast Fourier transforms and matrix multiplications. Our algorithm converges to models with better reconstruction error and is much faster, compared to the popular alternating minimization heuristic, where the filters and activation maps are alternately updated.}
}
Endnote
%0 Conference Paper
%T Convolutional Dictionary Learning through Tensor Factorization
%A Furong Huang
%A Animashree Anandkumar
%B Proceedings of the 1st International Workshop on Feature Extraction: Modern Questions and Challenges at NIPS 2015
%C Proceedings of Machine Learning Research
%D 2015
%E Dmitry Storcheus
%E Afshin Rostamizadeh
%E Sanjiv Kumar
%F pmlr-v44-huang15convolutional
%I PMLR
%P 116--129
%U https://proceedings.mlr.press/v44/huang15convolutional.html
%V 44
%X Tensor methods have emerged as a powerful paradigm for consistent learning of many latent variable models such as topic models, independent component analysis and dictionary learning. Model parameters are estimated via CP decomposition of the observed higher order input moments. In this paper, we extend tensor decomposition framework to models with invariances, such as convolutional dictionary models. Our tensor decomposition algorithm is based on the popular alternating least squares (ALS) method, but with additional shift invariance constraints on the factors. We demonstrate that each ALS update can be computed efficiently using simple operations such as fast Fourier transforms and matrix multiplications. Our algorithm converges to models with better reconstruction error and is much faster, compared to the popular alternating minimization heuristic, where the filters and activation maps are alternately updated.
RIS
TY - CPAPER
TI - Convolutional Dictionary Learning through Tensor Factorization
AU - Furong Huang
AU - Animashree Anandkumar
BT - Proceedings of the 1st International Workshop on Feature Extraction: Modern Questions and Challenges at NIPS 2015
DA - 2015/12/08
ED - Dmitry Storcheus
ED - Afshin Rostamizadeh
ED - Sanjiv Kumar
ID - pmlr-v44-huang15convolutional
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 44
SP - 116
EP - 129
L1 - http://proceedings.mlr.press/v44/huang15convolutional.pdf
UR - https://proceedings.mlr.press/v44/huang15convolutional.html
AB - Tensor methods have emerged as a powerful paradigm for consistent learning of many latent variable models such as topic models, independent component analysis and dictionary learning. Model parameters are estimated via CP decomposition of the observed higher order input moments. In this paper, we extend tensor decomposition framework to models with invariances, such as convolutional dictionary models. Our tensor decomposition algorithm is based on the popular alternating least squares (ALS) method, but with additional shift invariance constraints on the factors. We demonstrate that each ALS update can be computed efficiently using simple operations such as fast Fourier transforms and matrix multiplications. Our algorithm converges to models with better reconstruction error and is much faster, compared to the popular alternating minimization heuristic, where the filters and activation maps are alternately updated.
ER -
APA
Huang, F. & Anandkumar, A. (2015). Convolutional Dictionary Learning through Tensor Factorization. Proceedings of the 1st International Workshop on Feature Extraction: Modern Questions and Challenges at NIPS 2015, in Proceedings of Machine Learning Research 44:116-129. Available from https://proceedings.mlr.press/v44/huang15convolutional.html.